

How to Measure Latency from Radar Interfacing to Display


In the case of radar video, latency is the time it takes for data to travel from the radar source to the display screen. It is typically measured in milliseconds, and the closer it is to zero the better. Why measure it? A large latency can produce a noticeable lag between the radar antenna's rotation and its representation on the screen, so the true positions of targets may have changed significantly by the time they are drawn. Here at Cambridge Pixel our engineers are occasionally asked: "How can I measure the latency between receipt of a radar video signal into an HPx card (our hardware for interfacing to analogue radar video signals) and its presentation on a display?"

The answer depends on a number of considerations:

- The acquisition and input buffering within the HPx hardware
- Processing and packetization within the sending software
- Network delays
- Scan conversion buffering and refresh timing

Thinking about each of these stages in broad terms, one might expect around 40ms of latency at the analogue acquisition and buffering stage, followed by a few milliseconds of processing latency, some non-deterministic network latency (maybe...
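One practical way to approach the measurement question is to timestamp each radar return as it enters the software and again when it has been drawn, then report the difference. The C++ sketch below illustrates the idea under stated assumptions: RadarReturn, onCapture and onDisplay are hypothetical names invented for this example, not part of the HPx interface or any Cambridge Pixel API, and the 50ms sleep simply stands in for the processing, network and scan-conversion stages.

```cpp
// Hypothetical sketch: measuring end-to-end radar video latency by
// timestamping each radar return at capture and again at display.
// RadarReturn, onCapture and onDisplay are illustrative names only.

#include <chrono>
#include <cstdint>
#include <iostream>
#include <thread>

using Clock = std::chrono::steady_clock;

// One radar return (spoke) tagged with the time software first saw it.
struct RadarReturn {
    uint32_t azimuth;            // antenna azimuth for this return
    Clock::time_point captured;  // time the spoke entered the software
};

// Called when a spoke arrives from the acquisition hardware.
RadarReturn onCapture(uint32_t azimuth) {
    return RadarReturn{azimuth, Clock::now()};
}

// Called once the scan converter has drawn the spoke; reports latency.
void onDisplay(const RadarReturn& r) {
    auto latency = std::chrono::duration_cast<std::chrono::milliseconds>(
        Clock::now() - r.captured);
    std::cout << "azimuth " << r.azimuth
              << " latency " << latency.count() << " ms\n";
}

int main() {
    // Simulate one spoke passing through the pipeline with ~50ms of
    // processing, network and scan-conversion delay.
    RadarReturn spoke = onCapture(1024);
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    onDisplay(spoke);
    return 0;
}
```

Note that instrumentation of this kind only captures latency from the point where software first sees the data; the analogue acquisition and input buffering inside the hardware (the roughly 40ms mentioned above) happens before that point and must be accounted for separately.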
