How to Measure Latency from Radar Interfacing to Display
For radar video, latency is the time taken for data to travel from the radar source to the display screen. It is typically measured in milliseconds, and the closer it is to zero the better.
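In principle the measurement is simple: timestamp the data when it is captured, timestamp it again when it is displayed, and take the difference. A minimal sketch of that idea in Python (the function name and the placeholder stages are illustrative, not part of any Cambridge Pixel API):

```python
import time

def measure_latency_ms(capture_time: float, display_time: float) -> float:
    """Latency is the elapsed time between capture and display, in ms."""
    return (display_time - capture_time) * 1000.0

t_capture = time.monotonic()   # moment the radar return is digitised
# ... acquisition, processing and rendering would happen here ...
t_display = time.monotonic()   # moment the frame showing it is presented

print(f"latency: {measure_latency_ms(t_capture, t_display):.2f} ms")
```

A monotonic clock is used deliberately: unlike wall-clock time it cannot jump backwards, so the difference is always a valid elapsed interval.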
Why measure it?
A large latency can produce a noticeable lag between the radar antenna's rotation and its representation on the screen: the true positions of targets may have changed significantly by the time they are drawn. Here at Cambridge Pixel our engineers are occasionally asked: “How can I measure the latency between receipt of a radar video signal into an HPx card (our hardware for interfacing to analogue radar video signals) and its presentation on a display?” The answer depends on a number of considerations:
The acquisition and input buffering within the HPx hardware
Processing and packetization within the sending software
Scan conversion buffering and refresh timing
Considering each of these stages in broad terms, one might expect around 40ms of latency at the analogue acquisition and buffering stage, followed by a few milliseconds of processing latency, then some non-deterministic network latency (maybe...
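The stage-by-stage estimates above can be combined into a simple end-to-end latency budget. The figures below are illustrative assumptions drawn from the text (only the 40ms acquisition figure is stated; the others are placeholders for the kind of values one might measure):

```python
# Hypothetical latency budget for the stages described above.
# Only the 40ms acquisition figure comes from the article; the
# remaining values are assumed for illustration.
latency_budget_ms = {
    "hpx_acquisition_and_buffering": 40.0,  # analogue capture into the HPx card
    "processing_and_packetisation": 3.0,    # sending-software overhead (assumed)
    "network_transfer": 5.0,                # non-deterministic; assumed typical
    "scan_conversion_and_refresh": 16.7,    # roughly one 60 Hz display refresh
}

total_ms = sum(latency_budget_ms.values())
print(f"estimated end-to-end latency: {total_ms:.1f} ms")
```

A budget like this is useful for identifying which stage dominates: here the analogue acquisition and buffering stage accounts for well over half of the total.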