The low-resolution CLK_getltime API is used instead of the high-resolution CLK_gethtime
because the latency is known to be on the order of one or more frame times, where a frame
time is 33.33 ms in NTSC systems. CLK_getltime is more cycle-efficient, and its tick period
here is 1 ms, the same unit in which the latency is displayed, so the lower-resolution time
base gives a cheaper measurement with accuracy that is sufficient for the latency benchmark.
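The capture task starts the benchmark by storing a timestamp when a frame leaves capture. A
minimal sketch of that start (not taken verbatim from the application) is shown below; the field
names mirror the display-side snippet that follows, and frameCnt stands in for whatever frame
counter the capture task maintains:
    if (benchCapVid.captodisplay.done) {                     // previous measurement finished
        benchCapVid.captodisplay.latency  = CLK_getltime();  // store start timestamp
        benchCapVid.captodisplay.frameNum = frameCnt;        // hypothetical frame counter
        benchCapVid.captodisplay.done     = 0;               // measurement now in progress
    }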
The corresponding code in the video output task finishes the benchmark once the frame has
propagated through the system:
    if (!benchCapVid.captodisplay.done) {    // benchVideoDisRta.captodisplay
        benchCapVid.captodisplay.latency
            = CLK_getltime() - benchCapVid.captodisplay.latency;
        // current time - last captured frame timestamp = latency
        UTL_logDebug2("Latency = %d [ms], for frame %d",
                      benchCapVid.captodisplay.latency,
                      benchCapVid.captodisplay.frameNum);
        benchCapVid.captodisplay.done = 1;
    }
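The logged latency values are written to the application's debug LOG and can be viewed in the
Code Composer Studio Message Log window while the target is running.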
Note that this measurement does not include the latency introduced by the capture and display
drivers. Similar techniques could be applied, using the UTL or STS APIs, to measure the driver
latency; however, this would require modifying and rebuilding the driver, which is outside the
scope of this application note. To measure the total input-to-output latency, add the driver
latencies to the benchmark reported here.
4.4 Measuring the Frame Rate
Frame rate is the rate, in frames per second (Hz), at which the system captures, processes, or
displays video frames. In video systems the display frame rate can exceed the capture and/or
processing frame rate, so it is often important to measure the frame rate separately for the
capture, processing, and display stages in the data stream.
In this example application, the actual frame rate is measured at each stage, and user control of
the frame rate is provided for the processing stage.
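As a general illustration of one way to obtain a per-stage rate (a sketch under assumptions, not
code from the example application), the period between successive frames at a stage can be
accumulated in a DSP/BIOS STS object and read from the Statistics View; the STS object
stsDispPeriod and the function recordFramePeriod below are hypothetical:
    #include <std.h>
    #include <clk.h>
    #include <sts.h>

    extern far STS_Obj stsDispPeriod;  // hypothetical STS object added to the configuration

    // Called once per frame at the stage being measured. The Statistics View then shows
    // the average inter-frame period in low-resolution clock ticks (1 ms here), so the
    // frame rate in Hz is 1000 divided by that average.
    Void recordFramePeriod(Void)
    {
        static LgUns lastTime = 0;
        LgUns now = CLK_getltime();

        if (lastTime != 0) {
            STS_add(&stsDispPeriod, now - lastTime);  // accumulate period since previous frame
        }
        lastTime = now;
    }
Calling a function like this from the capture, processing, and display paths gives three
independent STS entries, one per stage.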
During periods of peak CPU loading, the processing rate of the DSP can fall below the display
rate of the output device, resulting in dropped frames. Dropped frames are frames that were
received during capture or decode but not displayed, or frames that were captured but not
encoded. Frame dropping can occur when the CPU is overloaded by the processing required for
real-time encoding or decoding.
The VPORT display driver from the DDK is written to handle this condition gracefully. If a new
frame is not received from the application in time for the video port to display it, the device driver
continues to show the previously displayed frame. With high-motion video, this condition can
sometimes result in noticeable “jerkiness”. At other times, dropped frames are difficult to
detect or quantify, so a way to detect them is useful during development, debugging, and
demonstrations. Such a method is implemented in this application using the UTL and CLK
services.
The following code from the tskProcess function measures the number of dropped frames by
subtracting the reference time from the actual time required to capture 30 frames (NTSC) or
25 frames (PAL); in either case the reference time is approximately 1 second.