These functions allow you to fully integrate common video functions with video interfaces,
processors, and external memory controllers. The example design uses the Altera Cyclone® IV E
EP4CE115F29 device featured on the tPad board.
A video source is input through the CMOS sensor on the tPad, which generates a digital output in RGB
format. A number of common video functions are performed on this input stream in the FPGA.
These functions include clipping, chroma resampling, motion adaptive deinterlacing, color space
conversion, picture-in-picture mixing, and polyphase scaling.
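As a conceptual illustration of one of these functions, the following sketch shows the kind of RGB-to-YCbCr arithmetic that a color space conversion stage performs. This is not the FPGA implementation itself; BT.601 studio-swing coefficients are assumed here, whereas the actual Color Space Converter MegaCore function is configured with its own coefficient set and conversion direction in SOPC Builder.

/* Conceptual model of an RGB-to-YCbCr color space conversion.
 * BT.601 studio-swing coefficients are assumed; the hardware IP core
 * uses whichever coefficients it is parameterized with. */
#include <stdint.h>

typedef struct { uint8_t y, cb, cr; } ycbcr_t;

static uint8_t clamp_u8(int v)
{
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (uint8_t)v;
}

ycbcr_t rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b)
{
    ycbcr_t out;
    /* 8-bit fixed-point BT.601 approximation with rounding */
    out.y  = clamp_u8((( 66 * r + 129 * g +  25 * b + 128) >> 8) + 16);
    out.cb = clamp_u8(((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128);
    out.cr = clamp_u8(((112 * r -  94 * g -  18 * b + 128) >> 8) + 128);
    return out;
}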
The input and output video interfaces on the tPad are configured and initialized by software running
on a Nios® II processor. Nios II software that demonstrates how to control the clocked video input,
clocked video output, and mixer functions at run time is also provided. The video system is
implemented using the SOPC Builder system-level design tool. This abstracted design tool provides
an easy path to system integration of the video processing data path with an NTSC or PAL video
input, a VGA output, and a Nios II processor for configuration and control. The Video and Image
Processing Suite MegaCore functions have common open Avalon-ST data interfaces and Avalon
Memory-Mapped (Avalon-MM) control interfaces to facilitate connection of a chain of video
functions and video system modeling. In addition, video data is transmitted between the Video and
Image Processing Suite functions using the Avalon-ST Video protocol, which facilitates building
run-time controllable systems and error recovery.
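The sketch below outlines how Nios II software could exercise this run-time control over a function's Avalon-MM slave interface, using the picture-in-picture mixer as an example. The base-address macro name and the register offsets shown are placeholders for illustration only; the real names are generated into system.h by SOPC Builder, and the real register map is documented in the Video and Image Processing Suite user guide.

/* Sketch of run-time control of a VIP Suite function over Avalon-MM.
 * MIXER_BASE and the register offsets are assumptions, not the actual
 * register map of the Alpha Blending Mixer. */
#include <io.h>
#include "system.h"          /* generated by SOPC Builder */

#define MIXER_BASE           ALPHA_BLENDING_MIXER_0_BASE  /* assumed macro name */
#define MIXER_REG_CONTROL    0   /* word offset: control/Go bit   (assumed) */
#define MIXER_REG_LAYER1_X   2   /* word offset: layer 1 X offset (assumed) */
#define MIXER_REG_LAYER1_Y   3   /* word offset: layer 1 Y offset (assumed) */

/* Move the picture-in-picture layer to a new screen position. */
static void mixer_move_pip(int x, int y)
{
    IOWR_32DIRECT(MIXER_BASE, MIXER_REG_LAYER1_X * 4, x);
    IOWR_32DIRECT(MIXER_BASE, MIXER_REG_LAYER1_Y * 4, y);
    IOWR_32DIRECT(MIXER_BASE, MIXER_REG_CONTROL * 4, 1);  /* start the core */
}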
To achieve a better visual effect, the CMOS sensor is configured with the left-right mirror mode
enabled. Users can disable this feature by modifying the corresponding register value written to the
CMOS sensor controller, as sketched below.
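The following sketch only illustrates the idea of clearing a mirror-mode flag in the sensor configuration. The i2c_sensor_read/i2c_sensor_write helpers, the register address, and the bit mask are hypothetical; use the register values found in the demonstration source code and the sensor datasheet.

/* Sketch: disabling the left-right mirror mode in the CMOS sensor.
 * Register address, bit mask, and I2C helper API are placeholders. */
#include <stdint.h>

#define SENSOR_REG_READ_MODE   0x20      /* hypothetical register address */
#define SENSOR_MIRROR_BIT      (1 << 14) /* hypothetical mirror-enable bit */

/* Provided by the demonstration's I2C driver (assumed interface). */
extern void     i2c_sensor_write(uint8_t reg, uint16_t value);
extern uint16_t i2c_sensor_read(uint8_t reg);

static void sensor_disable_mirror(void)
{
    uint16_t mode = i2c_sensor_read(SENSOR_REG_READ_MODE);
    mode &= ~SENSOR_MIRROR_BIT;                 /* clear the mirror flag */
    i2c_sensor_write(SENSOR_REG_READ_MODE, mode);
}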
Figure 4-14 shows the Video and Image Processing block diagram.