What are the delays on the video input cards?
When we talk of the delay (also known as latency) of an input card, we mean the time it takes for a signal received at the input to be output by the card.
For a media server, the important value is the throughput latency – the total time between the signal being received at the input and it being displayed on an output, including the processing time of the software that is running – in this case, Ai.
Under testing with Datapath cards, we regularly see a total throughput latency of between 1 and 2 frames for the whole system. At 25 fps (40 ms per frame), this equates to between 40 and 80 ms.
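The frame-to-millisecond conversion used above can be sketched as follows. This is a minimal illustration, assuming a 25 fps signal (which matches the 40–80 ms figure quoted); the function name is ours, not part of Ai or the Datapath SDK.

```python
def frames_to_ms(frames: float, fps: float = 25.0) -> float:
    """Convert a latency expressed in frames to milliseconds
    for a given frame rate. One frame lasts 1000/fps ms."""
    return frames * 1000.0 / fps

# 1-2 frames of throughput latency at 25 fps:
print(frames_to_ms(1))  # 40.0 ms
print(frames_to_ms(2))  # 80.0 ms
```

At higher refresh rates the same frame count corresponds to less wall-clock time, e.g. `frames_to_ms(2, fps=60)` is roughly 33 ms.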