AN75779
How to Implement an Image Sensor Interface with EZ-USB® FX3™ in a USB Video
Class (UVC) Framework
Author: Karnik Shah
Associated Project: Yes
Software Version: FX3 SDK 1.2.3
Related Application Notes: None
If you have a question or need help with this application note, contact the author at
[email protected].
Cypress’s EZ-USB® FX3™ is a USB 3.0 peripheral controller with a general programmable interface, called GPIF™ II,
which can connect easily to a range of devices. For example, AN75779 describes how FX3 works with USB video class
(UVC) to allow a video-streaming device, such as a webcam, to connect to a USB host to stream video.
Table of Contents

Introduction
Understanding the UVC Protocol
Designing the GPIF II Interface
  Image Sensor Interface
  Requirements for GPIF II Descriptor
  Pin Mapping of Image Sensor Interface
  GPIF II DMA Capabilities
  GPIF II State Machine Design
  Correlation Between Image Sensor Waveforms, Data Path, and State Machine
  Integration of the GPIF II Descriptor
USB Video Class Requirements
  USB Descriptors for UVC
  UVC-Specific Requests
    Control requests – Brightness and PTZ control
    Streaming requests – Probe and Commit control
  Video Data Format
  UVC Video Data Header
Creating a DMA Channel to Stream Data from GPIF II to USB
Execution Path of the Firmware Application
  Application Threads
  Enumeration
  Starting the Streaming
  Handling the Buffers During Streaming
  Clean Up
  Aborting the Streaming
Firmware Example Project File Details
  Debug Interface
  Debug Interface Details
  Using the Debug Interface
Hardware Setup
  Hardware Procurement
  Hardware Setup
UVC Host Applications
  Running the Demo
Troubleshooting
Summary
About the Author
Appendix A
  Designing with the GPIF II Designer
    Creating the Project
    Choosing the Interface Definition
    Drawing the State Machine on the Canvas
    Basic Drawing Actions on the Canvas
    Drawing Image Sensor Interface State Machine for GPIF II
  Editing the GPIF II Interface Details
Document History
Worldwide Sales and Design Support
Products
PSoC® Solutions

Introduction
EZ-USB® FX3™ lets developers add USB 3.0 device
functionality to any system. Its GPIF II can create an
interface with virtually any processor, ASIC, image sensor,
or FPGA. AN75779 introduces the basics of UVC and
Document No. 001-75779 Rev. *B
shows you how to design an application compatible with
UVC by creating an FX3 interface with the following
signals: frame valid, line valid, pixel clock, and an 8-bit to 32-bit parallel data bus.
The firmware project associated with AN75779 has been
designed for an image sensor with the following
properties:
- 8-bit synchronous parallel data interface
- 16 bits per pixel
- YUY2 color space
- 1280 x 720 pixel resolution (720p)
- 30 frames per second
- Active-high frame valid and line valid signals
- Positive clock edge polarity
The firmware project showcases a 720p, 30-fps streaming
mode when enumerated as a USB 3.0 device. It switches
to VGA, 15-fps streaming mode when enumerated as a
USB 2.0 device. To make this change, use image sensor
register settings. The change is not made by cropping
from within FX3. Nevertheless, you can crop images by
modifying the GPIF II state machine design. With minor
customization, you can create an interface with a variety of
sensors that have a similar interface. In Appendix A, see
the section Editing the GPIF II Interface Details to learn
how to change the data bus width as an example of
customization.
After introducing the basics of UVC, AN75779 discusses
the flow of data from the image sensor to the internal
buffers and then to the USB 3.0 host. Included is an
explanation of the parts of the design required to
implement this data flow:
- Designing an interface waveform for the GPIF II
- Creating a USB descriptor to enumerate FX3 as a UVC device
- Connecting the GPIF II to the USB through a DMA channel for streaming
In addition, brightness and PTZ (Pan-Tilt-Zoom) controls
are provided as examples of UVC control.
An overview of the system is shown in Figure 1, a system
block diagram. A host application, such as AMCap, talks
through the UVC driver to configure the image sensor over
video control interface and to receive video data over the
video streaming interface. FX3 firmware translates the
video control settings to the image sensor over the GPIO
or I2C interface. The FX3 DMA channel streams data from
the image sensor to internal buffers, where the UVC
header is added to the image data. Then, this video data
is sent to the video streaming endpoint. The next section
briefly discusses the UVC protocol to show you how the
UVC driver talks to the firmware and what is required of
the FX3 firmware to reach a point at which it can start
streaming video.
Figure 1. System Block Diagram
Understanding the UVC Protocol
Find documentation for the UVC specification at usb.org.
This section, which repeats basic information from the
specification, describes how a standard UVC driver or
UVC application detects the device capabilities and
streams video. As with any other standard USB class, all
the capabilities of a device are reported to the host
through USB descriptors. The capabilities include the
frame properties (such as width, height, frame rate,
resolution, and bit depth) and control properties (such as
brightness, exposure, gain, contrast, and pan-tilt-zoom
control). After enumeration, the UVC driver polls for details
regarding these capabilities. This phase happens before
the UVC application starts to stream video. This polling of
information is executed by UVC class-specific requests,
called the stream and control requests, over endpoint 0
(USB control endpoint).
General Exchange of UVC-Class Requests

    HOST                           DEVICE
      | --- GET_CUR request  (X) --> |
      | --- GET_MIN request  (X) --> |
      | --- GET_MAX request  (X) --> |
      | --- GET_INFO request (X) --> |
      | --- SET_CUR request  (X) --> |

The figure above shows the general approach a UVC driver takes to discover the details of the device capabilities (X is the capability in the figure). There are two types of capabilities and, therefore, two kinds of requests: control requests and streaming requests. The control requests relate to brightness, contrast, and pan-tilt-zoom-like controls. The streaming requests relate to bandwidth (frame rate/resolution/image format) and streaming mode selections.

For example, if the UVC device indicates in the USB descriptor that it supports brightness control, the UVC driver will try to discover the maximum brightness value (GET_MAX), the minimum brightness value (GET_MIN), the resolution of the steps possible between the min and max values (GET_RES, not shown), the default brightness value (GET_DEF, not shown), and the current brightness value (GET_CUR) using GET control requests for brightness control. When the user or UVC application requests a change to the brightness value, the UVC driver issues a SET control request for brightness control to change the brightness value (SET_CUR).

Similarly, when the UVC application chooses to stream a supported video format/frame rate/resolution, it issues streaming requests. There are two types: PROBE and COMMIT. PROBE requests are used to discover the minimum, maximum, and step values between minimum and maximum. In addition, PROBE requests are used to obtain default values of the video parameters (format, frame rate, image resolution). This process is called bandwidth negotiation.

After the device has accepted the SET current value of the PROBE request by looping back the same values (as the host sent in SET current) in the GET current request of the PROBE request, the host sends a COMMIT request of the type SET current. This indicates that the next thing the UVC application will do is request video data for streaming. At this point, the firmware must prepare the image data for the USB host to pull out of FX3. Given this background on the UVC protocol, the next sections of the application note explore the way an image sensor is interfaced with FX3 using its programmable GPIF II state machine, the way video data is presented to the UVC application using DMA, and the way control requests are handled.

Designing the GPIF II Interface
The GPIF II block of FX3 is a flexible state machine on which an interface waveform can be designed to interface with virtually any device. To design an interface waveform, you need to understand the interface requirements and DMA capabilities of FX3. In addition, you need to meet the class-specific requirements of UVC to make the application compatible with UVC standards.
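The probe/commit negotiation above operates on a fixed-layout data block. As a hedged illustration (the field names follow the UVC 1.1 specification's video probe/commit control layout; this struct and the example values are not taken from the associated firmware project), the 26-byte control block can be modeled as:

```c
#include <stdint.h>

/* UVC 1.1 video probe/commit control block (26 bytes, little-endian on
 * the wire). The host and device exchange this structure in PROBE/COMMIT
 * SET_CUR and GET_CUR requests during bandwidth negotiation. Packed so
 * the in-memory layout matches the wire format. */
#pragma pack(push, 1)
typedef struct {
    uint16_t bmHint;                   /* which fields to keep fixed      */
    uint8_t  bFormatIndex;             /* video format descriptor index   */
    uint8_t  bFrameIndex;              /* frame (resolution) index        */
    uint32_t dwFrameInterval;          /* frame interval in 100-ns units  */
    uint16_t wKeyFrameRate;
    uint16_t wPFrameRate;
    uint16_t wCompQuality;
    uint16_t wCompWindowSize;
    uint16_t wDelay;                   /* internal latency in ms          */
    uint32_t dwMaxVideoFrameSize;      /* max bytes in one frame          */
    uint32_t dwMaxPayloadTransferSize; /* max bytes in one USB transfer   */
} uvc_probe_control_t;
#pragma pack(pop)

/* Illustrative values a 720p YUY2 device might loop back to the host:
 * 1280 x 720 x 2 bytes/pixel = 1,843,200 bytes per frame, and a
 * 33.33-ms frame period = 333333 in 100-ns units (30 fps). */
static uvc_probe_control_t example_probe(void) {
    uvc_probe_control_t c = {0};
    c.bFormatIndex            = 1;
    c.bFrameIndex             = 1;
    c.dwFrameInterval         = 333333;
    c.dwMaxVideoFrameSize     = 1280u * 720u * 2u;
    c.dwMaxPayloadTransferSize = 16384; /* one 16-KB DMA buffer */
    return c;
}
```

The device "accepts" a PROBE by looping these values back unchanged in the GET_CUR response.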
Image Sensor Interface
A standard image sensor interface looks like the interface
between the image sensor and FX3 shown in Figure 1.
Usually, the image sensor requires a reset signal from the
FX3 controller. This can be handled by using an FX3
GPIO. Image sensors typically use an I2C connection to
allow a controller to configure the image sensor
parameters. The I2C block of FX3 can act as an I2C
master to configure the image sensor with the correct
parameters. The various signals (unidirectional from the
image sensor to FX3) associated with transferring an
image are as follows:
1. FV – Frame Valid (indicates start and stop of a frame)
2. LV – Line Valid (indicates start and stop of a line)
3. PCLK – Pixel clock (clock for the synchronous interface)
4. Data – 8- to 32-bit data lines for image data
Figure 2 shows the timing diagram of the FV, LV, PCLK,
and Data signals. The FV signal is asserted to indicate the
start of a frame. Then, the image data is transferred line
by line. The LV signal is asserted during each line transfer
when the data is driven by the image sensor. This data
can be an 8-bit to 32-bit simultaneous transfer from the
image sensor.
Note This parallel data bus width is defined by the data
width as defined in the image sensor datasheet. Do not
confuse the data bus width with the bit depth/resolution of
the image sensor.
The FX3 GPIF II bus can be configured only in 8-, 16- or
32-bit data interfaces. Therefore, the data bus width from
FX3 should be set such that it is larger than or equal to the
image sensor data width. This size will be the interface
width over GPIF II. If the interface width is larger than the
image sensor data width, the additional padded bits
transferred should be discarded by the end application –
usually the software running on the host PC.
For example, if the image sensor sends out 12 bits in
parallel, the interface width should be set to 16 bits and
the unconnected 4 bits will behave as padding. The time
between each line is called horizontal blanking. During
horizontal blanking, no data is transferred and the LV
signal is deasserted. After all the lines are transferred from
the image sensor, the FV signal is deasserted, and it
remains deasserted for the time equivalent to the vertical
blanking. You can implement a slave state machine for
this interface by using the GPIF II to receive the image
data from the image sensor.
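As a hedged host-side sketch of discarding such padding (the function name is illustrative and not part of the associated project; it assumes the four unconnected lines are simply masked off by the end application):

```c
#include <stdint.h>
#include <stddef.h>

/* Mask a buffer of 16-bit interface words down to the 12 valid sensor
 * bits. The 4 unconnected bus lines carry padding that the end
 * application (typically software on the host PC) must discard. */
static void strip_padding_12in16(uint16_t *words, size_t count) {
    for (size_t i = 0; i < count; i++) {
        words[i] &= 0x0FFF; /* keep the 12 valid data bits */
    }
}
```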
Requirements for GPIF II Descriptor
The previous section described a typical image sensor interface. This section describes the design requirements for the GPIF II state machine, which can be derived from the image sensor interface. Looking at the image sensor interface timing diagram, we can infer the following:
- Consider an image resolution of C x R, where C is the number of columns and R is the number of rows of pixels. If P is the number of bytes per pixel (including the padding), the image sensor continuously sends C x P bytes of image data, R times, with short pauses. So once a line starts, the GPIF II design must be able to move C x P bytes without a break, because there is no flow control on the image sensor interface.
- At the end of the frame, a signal must be generated from GPIF II to the FX3 CPU so that the CPU can indicate to the host that the current frame data transfer is complete.
- FX3 can operate only in 8-bit, 16-bit, or 32-bit modes. Therefore, the data bus width for the GPIF II must be set so that it is greater than or equal to the parallel data width coming out of the image sensor.
An example GPIF II design for an image sensor, created using the GPIF II Designer tool, is attached with this application note. Find step-by-step instructions to build it in the Designing with the GPIF II Designer section (Appendix A).
The following section shows the pin mapping used in the GPIF II implementation of the image sensor interface in this application note.
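To make the first requirement concrete with a little arithmetic: for the 720p YUY2 mode in this note, C = 1280, R = 720, and P = 2 (these helpers are illustrative, not from the associated project):

```c
#include <stdint.h>

/* Bytes per line (C x P) and per frame (C x P x R) for an image of
 * C columns, R rows, and P bytes per pixel (padding included).
 * GPIF II must move a full line of C x P bytes without a break,
 * because the sensor interface has no flow control. */
static uint32_t bytes_per_line(uint32_t c, uint32_t p) {
    return c * p;
}

static uint32_t bytes_per_frame(uint32_t c, uint32_t r, uint32_t p) {
    return c * p * r;
}
```

For 720p YUY2, each line is 2560 bytes and each frame is 1,843,200 bytes.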
Figure 2. Image Sensor Interface Timing Diagram
Pin Mapping of Image Sensor Interface
The pin mapping used in the GPIF II implementation of the parallel image sensor interface in this application note is shown in
Table 1. Also shown are the GPIO pins and other serial interfaces (UART/SPI/I2S) available when you use the GPIF II design
available with this application note.
Table 1. Pin Mapping for Parallel Image Sensor Interface Descriptor

EZ-USB FX3 Pin | Synchronous Parallel Interface, 16-bit Data Bus | Synchronous Parallel Interface, 8-bit Data Bus
GPIO[28]       | LV                        | LV
GPIO[29]       | FV                        | FV
GPIO[0:7]      | DQ[0:7]                   | DQ[0:7]
GPIO[8:15]     | DQ[8:15]                  | Unused / Available as GPIOs
GPIO[16]       | PCLK                      | PCLK
GPIO[17:27]    | Available as GPIOs        | Available as GPIOs
GPIO[33:45]    | Available as GPIOs        | Available as GPIOs
GPIO[46]       | GPIO/UART_RTS             | GPIO/UART_RTS
GPIO[47]       | GPIO/UART_CTS             | GPIO/UART_CTS
GPIO[48]       | GPIO/UART_TX              | GPIO/UART_TX
GPIO[49]       | GPIO/UART_RX              | GPIO/UART_RX
GPIO[50]       | GPIO/I2S_CLK              | GPIO/I2S_CLK
GPIO[51]       | GPIO/I2S_SD               | GPIO/I2S_SD
GPIO[52]       | GPIO/I2S_WS               | GPIO/I2S_WS
GPIO[53]       | GPIO/SPI_SCK/UART_RTS     | GPIO/SPI_SCK/UART_RTS
GPIO[54]       | GPIO/SPI_SSN/UART_CTS     | GPIO/SPI_SSN/UART_CTS
GPIO[55]       | GPIO/SPI_MISO/UART_TX     | GPIO/SPI_MISO/UART_TX
GPIO[56]       | GPIO/SPI_MOSI/UART_RX     | GPIO/SPI_MOSI/UART_RX
GPIO[57]       | GPIO/I2S_MCLK             | GPIO/I2S_MCLK
I2C_GPIO[58]   | I2C SCL                   | I2C SCL
I2C_GPIO[59]   | I2C SDA                   | I2C SDA

Note For the complete pin mapping of EZ-USB FX3, refer to the datasheet "EZ-USB FX3 SuperSpeed USB Controller."
The following section explains the DMA configuration used
to achieve video streaming from FX3’s GPIF II to the USB
interface.
Figure 3. Default Settings for GPIF II Threads
and P Block Sockets
GPIF II DMA Capabilities
The GPIF II block, as a part of the P (processor port)
block, can run at up to 100 MHz with 32 bits of data (400
MBps). To transfer the data to internal buffers, GPIF II
uses threads connected to DMA producer sockets of the P
block. The sockets point to a DMA descriptor that sets
DMA buffer size, count, and chaining sequence. The GPIF
II has four threads, which can be associated with any four
sockets of the 32 P block sockets. Default settings are
used for this application. The default threads are
associated with the sockets of the same number. As
shown in Figure 3, GPIF II has four threads through which
it can transfer data. The switching between these threads
is accomplished by in-state thread switching, as explained
in the GPIF II State Machine Design section.
The DMA data transfer mechanism is shown in Figure 4.
The example is of a DMA channel in which there is a
single producer socket (consumer side not shown) and
three buffers, of length L, chained in a linear circular loop.
The figure shows the internal memory of FX3. The left
column shows the memory offset, and the right column
shows what is stored in that memory location. The red
arrows (data path) indicate how the buffers will be
accessed by the socket. The blue arrows (connecting the
DMA descriptor chain to the socket) show how the socket
loads the descriptors from memory. The following
execution steps show the process or mechanism of data
moving from a socket to the internal buffers. The steps are
also marked in Figure 4 for reference.
Figure 4. Example of DMA Data Transfer Operation

Step 1: Load DMA Descriptor 1 from memory into the socket. Get the buffer location (A1), buffer size (L), and next descriptor (DMA Descriptor 2) information. Go to step 2.

Step 2: Transfer data to the buffer location starting at A1. After transferring buffer size L amount of data, go to step 3.

Step 3: Load DMA Descriptor 2 as pointed to by the current DMA Descriptor 1. Get the buffer location (A2), buffer size (L), and next descriptor (DMA Descriptor 3) information. Go to step 4.

Step 4: Transfer data to the buffer location starting at A2. After transferring buffer size L amount of data, go to step 5.

Step 5: Load DMA Descriptor 3 as pointed to by the current DMA Descriptor 2. Get the buffer location (A3), buffer size (L), and next descriptor (DMA Descriptor 1) information. Go to step 6.

Step 6: Transfer data to the buffer location starting at A3. After transferring buffer size L amount of data, go back to step 1.

Notice that in this execution path, the socket does not transfer data continuously. The socket pauses to load its configuration between two buffers; there is a non-zero (typically 1 µs) delay when the socket switches buffers. With a single-socket implementation, the first requirement from the requirements section can fail at buffer boundaries if the buffers are not a multiple of the line length.

One obvious solution is to use buffer sizes equal to a multiple of the line size. Under such an implementation, if the resolution of the image changes, the buffer size needs to be changed. Setting the buffer size equal to the line size does not yield the maximum throughput, either. USB 3.0 allows a maximum burst of 16 packets of 1024 bytes each over a bulk endpoint. Use this feature to maximize throughput. To enable a 16-KB USB burst, the USB DMA buffer size (consumer side) should be set to 16 KB. The P block DMA buffer size (producer side) will be similar, as explained in the USB Video Class Requirements section.

An alternative method can be implemented in which two sockets are used on the GPIF II side to write data in an interleaved fashion. Because switching sockets has no time latency for GPIF II, the sockets can be switched when the buffer associated with the active socket is full (that is, socket switching at the exact buffer boundary rather than at the line boundary). The data transfer using dual sockets is described in Figure 5, with the execution steps labeled. Socket 0 and Socket 1 accesses to DMA buffers are differentiated with red and green arrows (data paths for the individual sockets/threads), respectively. The 'a' and 'b' parts of each step occur simultaneously. This parallel operation of the hardware helps mask the buffer-switching time and allows GPIF II to stream data continuously into internal memory.
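The single-socket mechanism in Steps 1 through 6 amounts to a socket walking a circular descriptor ring. This host-side simulation of that bookkeeping is a sketch (the types and names are illustrative, not the FX3 hardware registers):

```c
#include <stdint.h>

/* A DMA descriptor: where the buffer starts, how long it is, and
 * which descriptor the socket loads next. Three descriptors chained
 * in a circular loop model the channel in Figure 4. */
typedef struct {
    uint32_t buffer_addr; /* A1, A2, A3, ...              */
    uint32_t buffer_size; /* L                            */
    int      next;        /* index of the next descriptor */
} dma_descriptor_t;

/* Simulate the socket: "transfer" one full buffer using the current
 * descriptor, then pause and load the next descriptor's configuration.
 * Returns the address of the buffer just filled and advances *current. */
static uint32_t socket_fill_one_buffer(const dma_descriptor_t *ring,
                                       int *current) {
    uint32_t addr = ring[*current].buffer_addr; /* steps 2/4/6: transfer */
    *current = ring[*current].next;             /* steps 1/3/5: reload   */
    return addr;
}
```

With a ring A1 -> A2 -> A3 -> A1, repeated calls cycle through the three buffer addresses, mirroring the circular loop in Figure 4.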
Figure 5. Dual Socket Data Transfer Architecture
(The figure shows Socket 0 and Socket 1 alternating across four DMA buffers of size L at addresses A1 through A4. DMA Descriptor 1 points to buffer A1 and chains to Descriptor 3; Descriptor 2 points to A2 and chains to Descriptor 4; Descriptor 3 points to A3 and chains to Descriptor 1; Descriptor 4 points to A4 and chains to Descriptor 2. Steps 1a/1b through 6a/6b mark the execution order described below.)
Step 1: At initialization of the sockets, both Socket 0 and
Socket 1 load the DMA Descriptor 1 and DMA Descriptor
2, respectively.
Step 2: As soon as the data is available, Socket 0
transfers data to DMA buffer 1. The transfer length is L. At
the end of this transfer, go to step 3.
Step 3: GPIF II switches the thread and, therefore, the
socket for data transfer. Socket 1 starts to transfer data to
DMA buffer 2, and, at the same time, Socket 0 loads the
DMA Descriptor 3. By the time Socket 1 finishes
transferring L amount of data, Socket 0 will be ready to
transfer data into DMA buffer 3.
Step 4: GPIF II now switches back to the original thread.
So, Socket 0 will now transfer the data of length L into
DMA buffer 3. At the same time, Socket 1 will load the
DMA Descriptor 4 and be ready to transfer data to DMA
buffer 4. After Socket 0 finishes transferring the data of
length L, go to step 5.
Step 5: GPIF II switches thread and Socket 1 transfers
data of length L into DMA buffer 4. Socket 0 loads DMA
Descriptor 1 and gets ready to transfer data into DMA
buffer 1 at the same time. Notice that Step 5a is the same
as Step 1a except that Socket 1 is not initializing but,
rather, transferring data simultaneously.
Step 6: GPIF II switches back the thread, and Socket 0
starts to transfer data of length L into DMA buffer 1. It is
assumed that by now, the buffer is empty (consumed by a
consumer socket—UIB socket, because USB is the usual
consumer). At the same time, Socket 1 loads the DMA
Descriptor 2 and is ready to transfer data into DMA buffer
2. The cycle now goes to Step 3 in the execution path.
Assume that the buffers are consumed by the consumer
(USB), so that when the execution path comes to fill the
buffers, there is no data loss. The GPIF II state machine
can implement the in-state thread switching at the buffer
boundary through counters. The counter value needs to
be set according to the buffer size on the producer side.
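The six steps above reduce to a simple alternation: the active socket flips at every buffer boundary while the idle socket preloads its next descriptor. A sketch of the resulting fill order (an illustrative simulation, not firmware code):

```c
/* Per the descriptor chains in Figure 5, Socket 0 fills buffers
 * 1, 3, 1, 3, ... and Socket 1 fills buffers 2, 4, 2, 4, ...
 * GPIF II switches the active thread at each buffer boundary, so
 * the combined fill order is 1, 2, 3, 4, 1, 2, ... with no pause
 * between buffers. 'step' is the 0-based count of completed
 * buffer transfers. */
static int dual_socket_fill_order(int step) {
    static const int order[4] = {1, 2, 3, 4};
    return order[step % 4];
}

/* The active socket alternates with each transfer: buffers 1 and 3
 * belong to Socket 0, buffers 2 and 4 to Socket 1. */
static int dual_socket_active(int step) {
    return step % 2; /* 0 = Socket 0, 1 = Socket 1 */
}
```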
If the consumer is not fast enough, the sockets would drop
data as the buffer, to which the producer tries to write, is
inaccessible. If there is a data drop, the counters continue
to increment while the data is not transferred to the buffer,
resulting in premature thread switching. This implies a
break in the data transfer streaming sequence and image
data loss during transfer. The current frame will be
discarded due to this condition. If this condition is not
corrected, the misalignment will propagate to the next
frame. Therefore, you need to place a cleanup mechanism
at the end of every frame so that such misalignment does
not carry forward into the next frame transfer. This is
described in the Clean Up section.
With the execution path described in Figure 5, the data transfer ends when a frame ends, in one of four possible states:
- Socket 0 has transferred a full buffer
- Socket 1 has transferred a full buffer
- Socket 0 has transferred a partial buffer
- Socket 1 has transferred a partial buffer
If either Socket 0 or Socket 1 has filled a buffer only partially, the CPU needs to commit this partial buffer so that the consumer socket can consume its data. The next step is to create a state machine based on the understanding gained in the previous sections. This is described in the following sections.

GPIF II State Machine Design
The GPIF II block is a versatile state machine with 256 states. In each state, you can perform a multitude of actions, including the following:
- driving multiple control lines
- sending or receiving data and/or addresses
- sending interrupts to the internal CPU
State transitions can use internal or external signals, such as DMA ready and Frame/Line Valid, as deciding factors.

To begin designing, choose a point on the image sensor waveform where the state machine is to start. The start of a frame is the most convenient position, and that is indicated by a positive transition on the frame valid signal. GPIF II can detect only the state of an input, not its edge. Positive edge detection of the frame valid signal is required to ensure that FX3 does not start transferring frame data from the middle of a frame. So, the start of the state machine (see Figure 6) should wait for FV to be deasserted (State 1), and then it should wait for the FV signal to be asserted (State 2). The counter should also be initialized in State 2 to keep track of the data transferred, so that the state machine can switch threads at the buffer boundary.

Then, the state machine should transfer data into the threads (State 3 and State 4). While the GPIF II state machine is transferring data into a thread, two conditions can occur:
1. The current line being transferred completes, so the LV signal is deasserted. The data transfer can then be in one of two states:
   a. The data transfer ended at the buffer boundary. In this case, the next data transfer needs to happen in the next thread (State 7 and State 8).
   b. The data transfer did not end at the buffer boundary. In this case, the state machine waits for the LV signal to be asserted again, to resume transferring data into the same thread (State 5 and State 6).
2. While the data for a line is still being transferred, the buffer associated with the current thread becomes full. As a result, the state machine needs to switch to transferring the data through the next thread (transitions between State 3 and State 4).

Under any circumstances, whenever the GPIF II state machine has to switch threads, the counter needs to be re-initialized. The GPIF II state machine can automatically reload the counter value; however, the reload takes one clock cycle. Therefore, the actual limit of the counter should be one less than the number of interface-width words per buffer. Equation 1 calculates the value of the counter.

Equation 1
count = (producer_buffer_size (L) / interface_width_in_bytes) - 1

For example: the UVC example, as described, has a producer buffer size of 16,368 bytes. With an 8-bit interface width, the counter value to be set is 16,367 (0x3FEF). If the interface width is 16 bits, this value should be 8183 (0x1FF7). If the interface width is 32 bits, this value should be 4091 (0xFFB). Note that the buffer size is independent of the interface width; however, the count value that goes into the GPIF II descriptor design depends on both. The Editing the GPIF II Interface Details section in Appendix A shows how to change the data bus width using the GPIF II Designer tool.
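Equation 1 can be written as a small helper to check the worked values (illustrative; the resulting count goes into the GPIF II descriptor design, not into run-time firmware):

```c
#include <stdint.h>

/* Equation 1: the GPIF II counter limit for in-state thread switching.
 * The counter counts interface-width words, and the automatic counter
 * reload costs one clock cycle, so the limit is one less than the
 * number of words per producer buffer. */
static uint32_t gpif_counter_limit(uint32_t producer_buffer_size_bytes,
                                   uint32_t interface_width_bits) {
    uint32_t width_bytes = interface_width_bits / 8u;
    return producer_buffer_size_bytes / width_bytes - 1u;
}
```

For the 16,368-byte producer buffer used here, this gives 16,367 (8-bit bus), 8183 (16-bit bus), and 4091 (32-bit bus), matching the values above.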
Figure 6. GPIF II State Machine Diagram
Note With such a state machine design, the CPU gets an interrupt at the end of every frame. This can be used to invoke a callback function that enables the CPU to handle several tasks, including the following:
1. Commit the last buffer if there is a partial buffer at the end of a frame (State 9 and State 10).
2. Let the consumer socket drain all data, and then reset the channel and the GPIF II state machine (reset the state machine to State 0 so that it can start streaming data from the start of a frame).
3. Handle any special application-specific tasks to indicate the change of frame to the consumer. For example, the UVC implementation requires indicating the change-of-frame event by toggling a bit in the 12-byte header.
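Task 3, toggling a bit in the 12-byte UVC payload header at every new frame, can be sketched as follows. The header layout follows the UVC specification (byte 0 is the header length; byte 1 is a bitfield with Frame ID in bit 0, End of Frame in bit 1, and End of Header in bit 7); the helper name and structure are illustrative, not taken from the associated project:

```c
#include <stdint.h>
#include <string.h>

#define UVC_HEADER_LEN 12u
#define UVC_BFH_FID    0x01u /* frame ID: toggles every frame        */
#define UVC_BFH_EOF    0x02u /* end of frame: set on last payload    */
#define UVC_BFH_EOH    0x80u /* end of header: always set            */

/* Build the 12-byte header placed in front of every buffer of image
 * payload. 'fid' alternates 0/1 between frames so the host can detect
 * frame boundaries; 'end_of_frame' marks the buffer carrying the last
 * bytes of a frame (the committed partial buffer). */
static void uvc_make_header(uint8_t hdr[UVC_HEADER_LEN],
                            uint8_t fid, int end_of_frame) {
    memset(hdr, 0, UVC_HEADER_LEN);
    hdr[0] = UVC_HEADER_LEN;
    hdr[1] = (uint8_t)(UVC_BFH_EOH
                       | (fid ? UVC_BFH_FID : 0u)
                       | (end_of_frame ? UVC_BFH_EOF : 0u));
}
```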
Correlation Between Image Sensor
Waveforms, Data Path, and State Machine
The previous section showed the GPIF II state machine
design for an image sensor interface. This section
explains the relationship among the image sensor
interface, the GPIF II state machine, and the DMA data
path diagrams. Figure 7 shows how the following three
components are correlated: the image sensor waveform
described in Figure 2, the FX3 data path described in
Figure 5, and the state machine described in Figure 6. To
explain the correlation, let’s use a dummy image sensor
as an example. In the example, the buffer size L is equal
to 1.5 times the line size C. R = 6. FV and LV waveforms
show the image sensor timing. “Step” shows how the data
is being routed from Figure 5. “State” shows how the state
machine changes states in response to FV and LV from
Figure 6. With CPU intervention the data pipe is cleared
and GPIF II state machine is reset.
Figure 7. Image Sensor Interface, Data Path Execution, and State Machine Correlation
Refer to Appendix A to see how to create the GPIF II state
machine described in the previous sections using the
GPIF II designer tool. The sample project created in the
Designing with the GPIF II Designer section (Appendix A)
is attached with this application note for reference. The
project name is ImageSensorInterface.cyfx and is stored
under the folder ImageSensorInterface.cydsn.
Integration of the GPIF II Descriptor
After a GPIF II Designer project is created, building the project generates a header file called cyfxgpif2config.h. To integrate the GPIF II descriptor into the firmware project, copy the cyfxgpif2config.h file into the Eclipse project directory. This file contains a structure called “CyFxGpifConfig”.
In the firmware application that is included with this application note, this structure is passed in the “UVCAppThread_Entry → CyFxUvcAppGpifInit → CyU3PGpifLoad” call to load the GPIF II state machine descriptor into memory. Once the descriptor is loaded, the “CyFxUvcAppGpifInit → CyU3PGpifSMStart” call starts the GPIF II state machine. “CyU3PGpifSMStart” requires the start state as a parameter. Find the start state declaration in the cyfxgpif2config.h file. Refer to the FX3 SDK API Guide for more information on how to use the GPIF II-related functions, including “CyU3PGpifLoad” and “CyU3PGpifSMStart”.
Note All functions are in uvc.c unless otherwise specified. The naming convention used for functions that reside outside uvc.c is filename/function. “→” is used to indicate the function call hierarchy. Example: If a function F1 in uvc.c calls a function F2 in sensor.c, it is indicated as F1 → sensor.c/F2, where the function of interest is F2.
The next sections explain the UVC requirements, the DMA channel used to stream data, and the details of the firmware that supports UVC.
USB Video Class Requirements
There is an associated example firmware project available
with this application note that supports the UVC class.
This section explains the details of that example project.
The UVC class requires the following:
•	A device that enumerates with the UVC class-specific USB descriptors
•	Firmware to handle SET/GET class-specific requests for the UVC control and stream capabilities reported in the USB descriptors
•	An uncompressed video data format
•	A video stream with a header for every image payload, where the header follows a particular format
Details of these requirements are found in the UVC
specification.
USB Descriptors for UVC
The “cyfxuvcdscr.c” file in the associated project has all the USB descriptor values. “CyFxUSBHSConfigDscr” and “CyFxUSBSSConfigDscr” contain the USB 2.0 and USB 3.0 UVC-specific descriptors. The class-specific descriptors contain the Interface Association Descriptor (IAD), the video control interface descriptor (and sub-descriptors), and the video streaming interface descriptor (and sub-descriptors). Note that having two separate interfaces makes the UVC device a composite device, by definition.
The video control interface descriptor and sub-descriptors report all of the control interface-related capabilities. Examples include brightness, contrast, hue, exposure, and pan-tilt-zoom controls. According to the UVC specification, a camera module has the following blocks in a video-processing pipeline: input terminal (camera unit) → processing unit → extension unit → output terminal.
The camera unit controls mechanical (or equivalent digital) features (like exposure and pan-tilt-zoom) of the device
component that transmits the video stream. The
processing unit controls image attributes (such as
brightness, contrast, and hue) of the video being streamed
through it. The extension unit adds vendor-specific
controls. A complete list is available in the UVC
specification.
The UVC specification has divided these functionalities so that you can easily structure the implementation of the class-specific control requests. However, the actual implementation of the functionalities is application-specific. The supported control capabilities are reported in the bit field “bmControls” of the respective terminal/unit by setting the corresponding bit to ‘1’. The UVC device driver polls for details about these controls on enumeration. The polling is carried out over class-specific endpoint 0 (EP0) requests. All such requests, including the video streaming requests, are handled by the “UVCAppEP0Thread_Entry” function.
The video streaming interface descriptor and sub-descriptors report the various frame formats (uncompressed, MPEG, H.264), frame resolutions (width, height, and bit depth), and frame rates (discrete values or continuous intervals). Based on the values reported, the UVC application user can choose to switch streaming modes by selecting supported combinations of frame formats, frame resolutions, and frame rates.
Before starting to stream, the UVC application/driver issues a set of probe requests to discover the possible streaming modes. After the default streaming mode is decided (it is negotiated between the host and device), the UVC application/driver issues a commit request. Then, based on the type of streaming endpoint (isochronous or bulk), it may or may not issue a select alternate setting command. At this point, the UVC application is ready to stream video from the UVC device. Let’s explore the details of how to handle these requests in the next section.
UVC-Specific Requests
The UVC specification uses USB control endpoint EP0 to communicate with the UVC device for control and streaming requests. Class-specific requests are used to SET and GET video-related controls (to change image properties or video streaming).
Here is a list of class-specific requests:
•	SET_CUR is the only type of SET request, and it is used to change the current value of an attribute of a capability.
•	GET reads an attribute of a capability:
•	GET_CUR reads the current value.
•	GET_MIN reads the minimum supported value.
•	GET_MAX reads the maximum supported value.
•	GET_RES reads the resolution (the step value indicating the supported values between min and max).
•	GET_DEF reads the default value.
•	GET_LEN reads the size of the attribute in bytes.
•	GET_INFO queries status/support for a specific capability.
Requests may be mandatory or optional, and they are listed as such for every control (refer to the UVC specification for details). For example, if the SET_CUR request is optional for a particular capability, its presence is determined through GET_INFO. If a video function does not support a certain request, it must indicate that by stalling the control pipe (EP0) when that request is issued to the function.
Control requests – Brightness and PTZ control
Brightness and PTZ (pan-tilt-zoom) controls are implemented in the associated project. PTZ is optional and can be turned on by defining “UVC_PTZ_SUPPORT” in the “uvc.h” file. Because the implementation of these controls is system-specific, only placeholder functions are supplied.
For example, the brightness control is implemented in the “UVCHandleProcessingUnitRqts” function. The “CyFxUVCApplnUSBSetupCB” function detects whether the host has sent a UVC-specific request (control or stream) and then sets an event flag (“CY_FX_UVC_VIDEO_CONTROL_REQUEST_EVENT” or “CY_FX_UVC_VIDEO_STREAM_REQUEST_EVENT”, respectively).
The EP0 thread receives these events and processes the request accordingly. On enumeration, the UVC driver will issue GET_INFO, GET_MIN, GET_MAX, GET_DEF, GET_RES, and GET_CUR requests for the brightness control. “UVCHandleProcessingUnitRqts” is set up to send back the corresponding values. When implementing for a specific system, you can predefine the min, max, def, and res values based on the image sensor capabilities. The GET_CUR request should read the actual value from the image sensor and report it back to the host. The SET_CUR request should write the value to the image sensor.
Use “UVCHandleProcessingUnitRqts” to implement all the
processing unit-related controls (brightness, contrast, hue,
and so on). Use “UVCHandleCameraTerminalRqts” to
implement all the camera terminal-related controls. Use
“UVCHandleExtensionUnitRqts” to implement all the
vendor-specific requests. To enable support for any of
these controls, you must set corresponding bits in the USB
descriptors. See the UVC specification for details on
Camera Terminal, Processing Unit, and Extension Unit
USB descriptors.
Standard UVC applications do not provide a method to manipulate the extension unit-related controls, but standard UVC driver APIs allow an application the same level of access to the extension unit controls as to the other two units. Therefore, to implement and use a vendor-specific control, you need to write or modify a host application.
Streaming requests – Probe and Commit control
“UVCHandleVideoStreamingRqts” handles streaming-related requests. Let’s explore how they work. When a UVC application needs to stream video from a UVC device, the first step is negotiation. In that phase, the application sends PROBE requests, such as GET_MIN, GET_MAX, GET_RES, and GET_DEF. These return a probe structure. The structure contains the USB descriptor indices of the video format and video frame, the frame rate, the maximum frame size, and the payload size (the number of bytes the UVC driver can fetch in one transfer). Correct responses to these requests are required for the UVC driver to move from the negotiation to the streaming phase.
The switch case for “CY_FX_UVC_PROBE_CTRL” handles the negotiation phase of streaming for either a USB 2.0 or USB 3.0 connection (the properties of the supported video in these modes differ). Note that the reported values for GET_MIN, GET_MAX, GET_DEF, and GET_CUR are all the same because only one streaming mode is supported in either USB 2.0 or USB 3.0. These values would differ if multiple streaming modes needed to be supported. The switch case for “CY_FX_UVC_COMMIT_CTRL” handles the start of the streaming phase. The SET_CUR request for commit control indicates that the host will start streaming video next. Therefore, SET_CUR for commit control sets the “CY_FX_UVC_STREAM_EVENT” event, which signals the main thread “UVCAppThread_Entry” to start the GPIF II state machine for video streaming.
Video Data Format
The UVC specification supports limited color formats. YUY2 is a common format used to transfer the image. The YUY2 color format is a 4:2:2 downsampled version of the YUV color format, in which each pair of pixels shares one U and one V sample. Four pixels are represented by {Y1, U1, Y2, V1, Y3, U3, Y4, V3} image data, where Y is luminance, U/V are chrominance, and 1-4 are pixel numbers. The data sequence expected by the UVC application is in the following order: {Y, U, Y, V, Y, U, Y, V, …}. Each Y/U/V channel is 8 bits wide. The RGB format is not supported. However, 8-bit monochrome can be supported by sending the image sensor data over the Y channel and setting 0x80 for both the U and V channels (grayscale image).
UVC Video Data Header
The UVC class requires a 12-byte header (for uncompressed payloads) that describes some properties of the image data being transferred. For example, it has a new-frame bit that needs to be toggled every frame. You can set an error bit to indicate a problem in the streaming of the current frame. This header is required for every USB transfer.
For example, for a burst of 16 KB, include one 12-byte header in the 16-KB USB payload. The FX3 CPU needs to add this header. Therefore, the DMA channel must be designed as a manual (rather than automatic) channel, in which the FX3 CPU can intervene to add a header and then commit every packet to USB. The manual multi-DMA channel is defined in the “CyFxUVCApplnInit” function. The manual operation of adding a header and committing a buffer to USB is implemented in the firmware in the “UVCAppThread_Entry” function. It uses a combination of “CyU3PDmaMultiChannelGetBuffer”, “CyFxUVCAddHeader”, and “CyU3PDmaMultiChannelCommitBuffer” function calls.
There is a case in which GPIF II may not have filled the last buffer at the end of a frame. In this case, the firmware must wrap up (“CyFxGpifCB → CyFxUvcAppCommitEOF → CyU3PDmaMultiChannelSetWrapUp”) the partial buffer that was being filled at the end of the frame transmission. This wrap-up action causes a partial buffer (short packet) to be produced on the producer side. This partial buffer is acquired by the “CyU3PDmaMultiChannelGetBuffer” function. Then, the CPU should add an end-of-frame header by passing “CY_FX_UVC_HEADER_EOF” to the “CyFxUVCAddHeader” function. The CPU should also commit this packet with a byte count determined as the number of bytes present in the produced buffer + 12 bytes (for the header).
The 12-byte header that the UVC class requires also influences the size of the DMA buffer associated with the P block socket. The architecture of FX3 requires that any buffer associated with a descriptor be a multiple of 16 bytes.
To accommodate the 12-byte header offset, the DMA descriptor buffer for the P block (PIB DMA descriptor) should point to a memory location 12 bytes after the memory location where the USB DMA descriptor buffer points. Because we are using 16 KB (16,384 bytes) as the size of the USB buffer to maximize throughput, the PIB buffer can be at most (16,384 bytes – 12 bytes) = 16,372 bytes. However, because of the restriction that the buffer size must be a multiple of 16 bytes, the PIB buffer needs to be set to (16,372 – (16,372 modulo 16)) = 16,368 bytes. This means that USB will transfer 16,368 + 12 = 16,380 bytes in one transaction, except when the buffer is partially filled at the end of a frame. Figure 8 explains this setup. In the firmware, this is done in the “CyFxUVCApplnInit → CyU3PDmaMultiChannelCreate” function. Details are explained in the next section.
Figure 8. USB and PIB DMA Descriptors for UVC
[Figure 8 shows two chained descriptor pairs over the same memory: USB DMA descriptor 1 (buffer address A1, size 16,384, next descriptor 3) and USB DMA descriptor 3 (buffer address A3, size 16,384, next descriptor 1), interleaved with PIB DMA descriptor 2 (buffer address A1 + 12, size 16,368, next descriptor 4) and PIB DMA descriptor 4 (buffer address A3 + 12, size 16,368, next descriptor 2). Each USB buffer holds a 12-byte header, 16,368 bytes of image data, and 4 unused bytes at the end.]
Creating a DMA Channel to Stream Data from GPIF II to USB
You can configure the 12-byte offset and the 16-byte boundary condition in the PIB buffer using the header and footer settings of a DMA channel configuration. The “CyFxUVCApplnInit” function initializes the following:
•	GPIO for the sensor reset
•	PIB block to enable the GPIF II function
•	I2C block to configure the image sensor
•	Image sensor to start streaming image data
•	USB block to enumerate as a UVC device
•	USB endpoints to stream data to the USB host
•	A DMA channel to connect the data pipe from GPIF II to the USB endpoint
We need to meet the following requirements for the DMA channel:
1.	The USB buffer size is 16,384 bytes to maximize throughput
2.	The PIB buffer starts 12 bytes after the USB buffer to accommodate the 12-byte UVC header
3.	The PIB buffer size is 16,368 bytes to ensure it is smaller than the USB buffer size, is a multiple of 16 bytes, and allows space for the 12-byte UVC header
4.	There are two interleaved producer sockets (PIB is the producer here) to make sure buffer switching time does not cause a data drop
5.	There is one consumer socket (USB is the consumer) that can read all data off the memory in the order in which it was stored by the producer
6.	The FX3 CPU needs to manually commit data from GPIF II to USB
This is implemented by setting the elements of the “dmaMultiConfig” structure in the “CyFxUVCApplnInit” function and creating a “MANUAL_MANY_TO_ONE” channel. In addition, the USB endpoint that will stream data to the USB 3.0 host needs to be configured to enable a burst of 16 over the 1024-byte bulk endpoint. This is set using the “endPointConfig” structure passed in the “CyFxUVCApplnInit → CyU3PSetEpConfig” function for the “CY_FX_EP_BULK_VIDEO” endpoint.
Execution Path of the Firmware Application
Let’s examine the UVC application from a functional standpoint. This section describes the flow of the firmware to explain how the application works. The FX3 firmware runs on top of the ThreadX real-time operating system (RTOS). In the uvc.c/main function, the firmware defines the configuration of the FX3 I/Os. Here, FX3 is configured to enable the use of particular peripherals and to set the data width of the GPIF II interface to 32 bits or less. These settings can be changed on the fly later, but do so carefully to avoid any I/O conflicts. Then the firmware creates two threads: the main application thread, which runs in the “UVCAppThread_Entry” function, and the control request handler thread, which runs in the “UVCAppEP0Thread_Entry” function. These are software application threads that run on the ThreadX operating system; they should not be confused with the GPIF II hardware thread implementation.
Application Threads
These need to be separate threads to enable concurrent
functionality. The main thread is responsible for waiting for
a stream event to occur before the streaming starts,
committing buffers after the streaming has started, and
cleaning up the FIFO after a frame is transmitted or if the
streaming stops.
When the main thread is waiting, the firmware also needs
to handle some UVC class-specific control requests
(SET_CUR, GET_CUR, GET_MIN, and GET_MAX) over
EP0, such as brightness, PTZ, and probe/commit control.
Any class-specific control request is received by the “CyFxUVCApplnUSBSetupCB” function. This function is implemented in the main application thread. Whenever one of these control requests is received by FX3, this function sets the corresponding events and immediately returns, freeing the main application thread to perform its other concurrent tasks. The EP0 thread triggers on these events to serve the class-specific requests.
Enumeration
“UVCAppThread_Entry” calls “CyFxUVCApplnDebugInit” to initialize the UART debugging capability, “CyFxUVCApplnI2CInit” to initialize the I2C block of FX3, and “CyFxUVCApplnInit” to initialize the rest of the required blocks, DMA channels, and USB endpoints. In the “CyFxUVCApplnInit” function, “CyU3PUsbSetDesc” function calls ensure that FX3 enumerates as a UVC device. The UVC descriptors are defined in the cyfxuvcdscr.c file. These descriptors are defined for an image sensor sending 16 bits per pixel on average, the uncompressed YUY2 image format (4:2:2 downsampled), and 1280 x 720 pixels at 30 frames per second. Refer to the UVC specification if you need to change these settings.
Starting the Streaming
The USB host application (such as AMCAP or VirtualDub), which sits on top of the UVC driver to stream the images, sets the USB interface and alternate setting to the one that streams the video (usually Interface 0, Alternate Setting 1) and sends a probe/commit control. This is the indication of a stream event. On a stream event, the USB host application starts requesting image data from FX3, and FX3 is supposed to start sending the image data from the image sensor to the USB 3.0 host. In the firmware, the “UVCAppThread_Entry” function has an infinite for loop. When there is no streaming, the application thread waits in this for loop until there is a stream event.
Note If there is no stream event, FX3 does not need to
transfer any data. So, the GPIF II state machine need not
be initialized to transfer data. Otherwise, all the buffers
would be full before the host application starts to pull data
out of the buffers, and FX3 would transmit a bad frame.
Hence, the GPIF II state machine should be initialized only if there is a stream event.
When FX3 receives a stream event, the main application thread starts the GPIF II state machine. On power-up, the GPIF II waveform descriptor is not loaded into memory, so the firmware loads the GPIF II waveform into memory using the “UVCAppThread_Entry → CyFxUvcAppGpifInit → CyU3PGpifLoad” function. Then, the firmware starts the GPIF II state machine using the “UVCAppThread_Entry → CyFxUvcAppGpifInit → CyU3PGpifSMStart” function.
Handling the Buffers During Streaming
When designing the DMA channel in the “CyFxUVCApplnInit” function, the firmware created a manual channel with callback notification for the consume events. This notification is used to track the amount of data read by the host. At the end of frame transmission, the DMA channel is reset as a part of the clean-up process. It is safe to reset the DMA channel only after all data has been drained out of the pipe. If the DMA channel were reset with valid image frame data still in the pipe, that data would be lost. Therefore, this notification plays an important role in keeping the firmware operational.
The firmware manually handles the buffers when
streaming the data. In the main thread it checks for a
produced buffer via “CyU3PDmaMultiChannelGetBuffer”.
A buffer is produced when the producer buffer (PIB buffer
in our case) is committed from the GPIF II side or if it is
forcefully wrapped up by the FX3 CPU. During an active
frame (frame valid signal asserted) period, the image
sensor is streaming data and GPIF II will produce full PIB
buffers. At this time, the FX3 CPU has to commit 16,380
bytes of data to the USB.
At the end of a frame, the last buffer is usually partially filled. In this case, the firmware must forcefully wrap up the buffer on the producer side to trigger a produce event and then commit the buffer to USB with the appropriate byte count. The forceful wrap-up of the PIB buffer is executed in the GPIF II callback function (“CyFxGpifCB → CyFxUvcAppCommitEOF → CyU3PDmaMultiChannelSetWrapUp”). The “CyFxGpifCB” function is triggered when GPIF II raises a CPU interrupt. As shown in Figure 6 from the GPIF II State Machine Design section, this interrupt is generated when the frame valid signal is deasserted (that is, at the end of the frame). Note that the hitFV variable is set to indicate that frame capture from the image sensor has ended.
Note The UVC header carries the frame identifier and the end-of-frame marker. At the end of a frame, the second least significant bit (the EOF bit) of the second byte of the UVC header needs to be set to ‘1’ (refer to “CyFxUVCAddHeader”), and at every start of a new frame, the least significant bit (the FID bit) of the second byte of the UVC header needs to be toggled (see the Clean Up section).
The variables prodCount and consCount track every
produce and consume event, respectively. These
variables help keep track of the buffers that were
produced and consumed, to ensure that all the data has
been drained from the FX3 FIFO. This is useful to know
for the firmware so that it can reset the channel at the end
of a frame transmission from FX3 to host.
Clean Up
At the end of a frame, the GPIF II state machine generates a CPU interrupt, resulting in a callback. This callback helps to facilitate committing the last buffer to USB. Now, the firmware waits for USB to drain all the data in the ‘for’ loop of the “UVCAppThread_Entry” function. Here it checks for hitFV (which is set by the GPIF callback function) and for equality of prodCount and consCount.
As soon as both conditions are met, the firmware cleans up the FIFO, resets the channel, toggles the UVC header bit for the frame index, and calls the “CyU3PGpifSMSwitch” function. As the state machine diagram in Figure 6 shows, after the GPIF II state machine issues a CPU interrupt, it stops transferring data. The “CyU3PGpifSMSwitch” function switches the GPIF II state machine to the start state, so it can begin to stream data on the next frame start event (positive edge on the frame valid signal).
Aborting the Streaming
There are three ways to abort image streaming: FX3 can be disconnected from the host, the USB host utility may be closed, or the USB host may issue a reset or suspend command to FX3. All of these actions trigger the “CY_FX_UVC_STREAM_ABORT_EVENT” event (refer to the “CyFxUVCApplnAbortHdlr” function). This does not always happen when there is no data in the FX3 FIFO, which means proper clean-up is required. The firmware resets the streaming-related variables, cleans up the FIFO, and resets the channel in the “UVCAppThread_Entry” for loop. It does not call the “CyU3PGpifSMSwitch” function, because no streaming is required; it then waits for the next stream event to occur. When the application is closed, it issues a clear feature request on the Windows platform or a set interface request with alternate setting = 0 on the Mac platform. Streaming stops when this request is received. This request is handled in the “CyFxUVCApplnUSBSetupCB” function under the switch case “CY_U3P_USB_TARGET_ENDPT” and the request “CY_U3P_USB_SC_CLEAR_FEATURE”.
this application note. Use this GPIF II Designer project for
any required customization to the descriptor.
uvc.c file
This file illustrates the UVC application example. The
video streaming is accomplished by configuring a manyto-one manual DMA channel (two producer sockets at PIB
side and one consumer socket at UIB side). The two
producer sockets alternate transferring data from the
sensor to the DMA buffer for consumption by USB. This
allows continuous video streaming. The UVC-specific
header is added to the video payload data in the call-back
function CyFxDMACallback(). The call-back function is
called when a DMA produce event occurs. It contains all
the functions (except sensor module initialization) that
enable FX3 to run the UVC application.
uvc.h file
This file contains switches to enable particular
functionalities in the UVC application project.

UVC_PTZ_SUPPORT switch can be used to
enable pan-tilt-zoom function placeholders.

BACKFLOW_DETECT switch will turn on the
GPIF overflow error code detection when GPIF II
block would overflow FX3 buffers. (A backflow
occurs when the image sensors send data at a
rate that exceeds the speed at which the USB
host can read, causing FX3 buffers to overflow.)
At this point, you might be tempted to implement
a recovery mechanism (stop streaming – flushing
endpoints and resetting pipe – start streaming),
but by doing so you cannot recover lost data. The
FX3 data streaming sequence would be
misaligned when backflow occurs and it would
send incomplete frames to the UVC host. UVC
host applications would always discard
incomplete frames, which will cause glitching on
the display. Nonetheless, the cleanup at the end
of a frame would correct this misalignment and
would not cause a firmware lock-up. However,
extended or frequent backflow conditions can
cause a reduction in frame rates.

DEBUG_PRINT_FRAME_COUNT switch can be
used to enable UART prints of frame count to
check if FX3 is streaming.

USB_DEBUG_INTERFACE switch can be used
to enable the debug interface. Refer to the Debug
Interface section for details.
Firmware Example Project File
Details
This section describes the different files and placeholders
contained in the example firmware project associated with
this application note.
The example firmware project incorporates the GPIF II
descriptor for an image sensor interface and supports
UVC. To use this project, first install the FX3 SDK and
then import the project into the Eclipse IDE workspace.
Please note this is not a complete project and may not
function as it is. Certain sections of code specific to the
image sensor may need to be filled out if the application is
such that FX3 needs to configure the image sensor. In
case the configuration is handled by a different controller
and FX3 only needs to stream the image data, no
modifications may be required. The sections that need to
be completed are pointed out below. The following are the
main components of the firmware project:
cyfxgpif2config.h
This is the header file generated by the GPIF II Designer
tool. It contains the GPIF II descriptor for the image sensor
interface. The GPIF II Designer project is available with
www.cypress.com
cyfxuvcdscr.c file
This file contains the USB enumeration descriptors for the
UVC application example. Using these descriptors, the
FX3 device enumerates as a UVC device on the USB
host. The class specific descriptors indicate the supported
configurations of the image sensor to the host. These
Document No. 001-75779 Rev. *B
15
®
How to Implement an Image Sensor Interface with EZ-USB FX3™ in a USB Video Class (UVC) Framework
configurations include image resolution, frame rate, and
video control support, such as brightness or contrast.
sensor.c file
This part of the firmware is specific to the image sensor
being used. Certain sections of the code specific to the
sensor being used must be filled in if FX3 needs to control
or configure the image sensor. A placeholder is added for
the sensor_init() function in the sensor.c file. Example
implementations of I2C write and read commands to the
sensor are provided. The user must define the value of
SensorSlaveAddress in the sensor.h header file. This file
also contains the placeholder for the get and set functions
of brightness value from the image sensor. These need to
be populated with code specific to brightness control.
and EP 4 IN is configured as the debug response bulk
endpoint. When the firmware (with debug interface
enabled) is loaded, FX3 will enumerate three interfaces.
The first two are standard UVC interfaces for control and
streaming; the third is the debug interface. The third
interface needs to be bound to CyUSB3.sys driver, which
is provided as a part of the FX3 SDK. You can install the
driver according to the following instructions (the 64-bit
Windows 7 system is used as an example. XP users will
have to modify the .inf file to include the VID/PID and then
use the modified .inf file to bind the cyusb3.sys driver to
this interface.)
1.
Open Device Manager, right-click on FX3 (or
equivalent) under “Other devices”, and choose
“Update Driver Software…”
2.
Choose “Browse my computer for driver
software” on the next screen
3.
Choose “Let me pick from a list of device drivers
on my computer” on the following screen
camera_ptzcontrol.c file
This file contains PTZ control-related parameters and
placeholder functions that need to be completed per the
user’s application setup for PTZ control implementation.
“uvc.h/ UVC_PTZ_SUPPORT” should be defined to
enable PTZ control code.
Because the following components in the firmware
example are specific to the GPIF II descriptor
implementation, incorporate them as is:
DMA Channel Configuration
The P-to-U DMA channel must be set up as a many-toone channel, as shown in the uvc.c file in the firmware
project. The firmware uses PIB sockets 0 and 1 as the two
producer sockets. Note that these socket numbers are
configured by the GPIF II descriptor. The descriptor writes
data into thread 0 and thread 1, which are mapped to
socket 0 and socket 1. Do not change the PIB socket
numbers in the DMA channel configuration. You can
change the UIB consumer socket number to whichever IN
socket is used for video streaming.
Interrupt handling
The GPIF II descriptor generates several CPU interrupts
that the firmware must manage. The CyFxGpifCB() in the
uvc.c file is the call-back function that handles the GPIF
interrupts.
Debug Interface
Because the UVC application itself does not have a
mechanism to properly debug the firmware, you must add
a
non-UVC
interface.
The
#define
USB_DEBUG_INTERFACE switch in the uvc.h file
enables the relevant code in cyfxuvcdscr.c, uvc.h and
uvc.c files to implement the debug interface. The current
implementation of the debug interface allows reading from
and writing to image sensor registers over I2C as an
example of a debug mechanism.
Debug Interface Details
Two bulk endpoints are defined for this interface. EP 4
OUT is configured as the debug command bulk endpoint,
www.cypress.com
Document No. 001-75779 Rev. *B
16
®
How to Implement an Image Sensor Interface with EZ-USB FX3™ in a USB Video Class (UVC) Framework
4.
Click “Next” on the following screen
8.
5.
Click “Have Disk…” to choose the driver
6.
Browse for the cyusb3.inf file in the <SDK
installation>\<version>\driver\bin\<OS>\x86 or
\x64 folder, and click “OK”
7.
Choose a model and click “Next” to install
Click “Yes” on the warning dialogue box if it
appears
After the driver is bound to the third interface, the device
will show up in Control Center. You can access the FX3
firmware from here as a vendor-specific device. The
current implementation of the debug interface allows
reading and writing of the image sensor registers by using
the command/response bulk endpoints. EP 4 OUT is used
as the command bulk endpoint and EP 4 IN as the
response bulk endpoint. These endpoints can be
accessed in the Control Center under <FX3 device name> -> Configuration 1 -> Interface 2 -> Alternate Setting 0, as
shown in the following figure. Use the Data Transfer tab to
send a command or get a response after sending a
command.
Using the Debug Interface
Four commands are implemented in the debug interface: single read, sequential read, single write, and sequential write. There is no error checking in this part of the code, so the commands must be used exactly as specified; you can add error checks to ensure proper functionality. The register is assumed to be 16 bits wide and addressed with 16 bits.
Single Read:
1. Choose the command endpoint and type the command in hex under Data Transfers. The format of a single read is 0x00 <register address high byte> <register address low byte>. The figure shows the read command for register address 0x3002. Do not use spaces while typing in the hex data box. For example, click the hex data field and type "003002".
2. Click "Transfer Data-Out" to send the command.
3. Choose the response endpoint and set the Bytes to Transfer field to 3 to read out the response of the single read command.
4. Click "Transfer Data-IN" to receive the response.
5. The first byte of the response is the status. Status = 0 means the command passed; Status != 0 means the command failed. The following bytes indicate that the value of the register read back is 0x0004. Here is an example that shows a failed transfer, where the status is nonzero and the rest of the bytes are stale values from a previous transfer.

Sequential Read:
1. Choose the command endpoint and type the command in hex under Data Transfers. The format of a sequential read is 0x00 <register address high byte> <register address low byte> <N>. The figure shows a read command for four (N=4) registers starting at register address 0x3002.
2. Choose the response endpoint. The Bytes to Transfer for the response is (N*2+1) = 9 for this case. The figure shows the values read by FX3.
Single Write:
1. Choose the command endpoint and type the command in hex under Data Transfers. The format of a single write is 0x01 <register address high byte> <register address low byte> <register value high byte> <register value low byte>. The figure shows the write command to write a value of 0x0006 at register address 0x3002.
2. The response for a single write contains three bytes: <Status> <register value high byte> <register value low byte>. These register values are read back after writing, which means you will see the same values sent in the command.

Sequential Write:
1. Choose the command endpoint and type the command in hex under Data Transfers. The format of a sequential write is 0x01 <register address high byte> <register address low byte> ((<register value high byte> <register value low byte>) * N times) to write N registers. The figure shows writing values 0x0006, 0x0008, and 0x03C0 to registers 0x3002, 0x3004, and 0x3006 sequentially (N=3).
2. The response is <Status> (<register value high byte> <register value low byte>) * N values. For this example, the total bytes to transfer is (2*N+1) = 7.
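The four command layouts above can be summarized in a few lines of C. This is an illustrative host-side encoder (the helper names are not part of the firmware); it follows the wire format given in the text: opcode 0x00 for reads and 0x01 for writes, with the 16-bit register address and values sent high byte first.

```c
#include <assert.h>
#include <stddef.h>

/* Builds a read command: 0x00 <addr hi> <addr lo> [N].
   Returns the command length (3 for single read, 4 for sequential). */
static size_t build_read_cmd(unsigned char *cmd, unsigned short addr, int n) {
    cmd[0] = 0x00;
    cmd[1] = (unsigned char)(addr >> 8);
    cmd[2] = (unsigned char)(addr & 0xFF);
    if (n <= 1) return 3;              /* single read */
    cmd[3] = (unsigned char)n;         /* sequential read of N registers */
    return 4;
}

/* Builds a write command: 0x01 <addr hi> <addr lo> (<val hi> <val lo>) * N. */
static size_t build_write_cmd(unsigned char *cmd, unsigned short addr,
                              const unsigned short *vals, int n) {
    size_t len = 3;
    cmd[0] = 0x01;
    cmd[1] = (unsigned char)(addr >> 8);
    cmd[2] = (unsigned char)(addr & 0xFF);
    for (int i = 0; i < n; i++) {
        cmd[len++] = (unsigned char)(vals[i] >> 8);
        cmd[len++] = (unsigned char)(vals[i] & 0xFF);
    }
    return len;
}

/* Expected response length: 1 status byte + 2 bytes per register. */
static size_t response_len(int n) { return 2 * (size_t)n + 1; }

/* Sanity checks against the examples in the text. */
static int debug_cmd_selftest(void) {
    unsigned char c[16];
    unsigned short vals[3] = {0x0006, 0x0008, 0x03C0};
    if (build_read_cmd(c, 0x3002, 1) != 3) return 0;
    if (c[0] != 0x00 || c[1] != 0x30 || c[2] != 0x02) return 0; /* "003002" */
    if (build_read_cmd(c, 0x3002, 4) != 4 || c[3] != 4) return 0;
    if (response_len(4) != 9) return 0;   /* sequential read, N = 4 */
    if (build_write_cmd(c, 0x3002, vals, 3) != 9) return 0;
    if (response_len(3) != 7) return 0;   /* sequential write, N = 3 */
    return 1;
}
```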
Thus, the debug interface can be used to write or read data to or
from FX3 while it is functioning as a UVC device. This data can be
sensor settings, rolling debug buffers about events, counts of
committed packets per frame, or more, depending on the debug
requirement. Table 2 summarizes the functionalities of the main
files of the project. Descriptions of the hardware and software
setups to test this application follow.
Table 2. Summary of the Main Files Included in the Project Associated with This Application Note

File: sensor.c
Sections to be added/changed: The Sensor_init() function needs to be completed. I2C commands to initialize the sensor need to be added (examples of I2C writes and reads are provided). The set and get brightness functions need to be completed.

File: camera_ptzcontrol.c
Sections to be added/changed: This file has the placeholder functions for the PTZ control. If needed, complete these functions as required by the application setup. "uvc.h/UVC_PTZ_SUPPORT" should be defined to enable the PTZ control code.

File: cyfxtx.c
Sections to be added/changed: No changes needed. Use this file as provided with the project associated with this application note. Note that the default cyfxtx.c file provided with other example projects in the SDK may be different.

File: cyfxgpif2config.h
Sections to be added/changed: Header file generated by the GPIF II Designer tool. No direct changes are required. If the interface needs to be changed, generate a new header file with the GPIF II Designer tool.

File: uvc.c
Sections to be added/changed: Main source file for the UVC application. Changes are needed when modifying the code to support controls other than brightness and PTZ, and when adding support for multiple video modes.

File: cyfxuvcdscr.c
Sections to be added/changed: Contains the USB enumeration descriptors for the UVC application. This file needs to be changed if the frame rate, image resolution, bit depth, or supported video controls change. The UVC specification has all the required details.

File: uvc.h
Sections to be added/changed: Contains switches that modify the application behavior to turn on or off Pan-Tilt-Zoom support, the debug interface, or backflow detection.
Hardware Setup
The current project has been tested on a setup that
includes FX3 DVK and the Aptina image sensor MT9M114
along with an interconnect board to connect the two.
Details about this setup (along with the firmware source)
can be obtained from Cypress only after an NDA with
Aptina has been verified. All the details regarding
obtaining this setup are posted on Cypress’s website at
EZ-USB® FX3™ HD 720p Camera Kit and are repeated in
the following section for your convenience.
Hardware Procurement
1. Sign an NDA with Aptina (email your request to [email protected] for an expedited process). If you already have the NDA, please send it to [email protected]. We will provide the Aptina-specific source files for the project after NDA verification.
2. Buy the Aptina MT9M114 image sensor headboard from Aptina distributors.
3. Buy the EZ-USB FX3 Development Kit (CYUSB3KIT-001).
4. Buy the interconnection board from a Cypress Design Partner (scroll halfway down the landing page on this website).
5. Make sure to have a USB 3.0 host-enabled computer for evaluating the best performance.
Hardware Setup
1. Connect the jumpers on the FX3 DVK, as shown in the figure.
2. Connect the interconnect board to J77 on the FX3 DVK. The connector types on the interconnect board are unique, and the sockets are keyed so that they fit only in the correct orientation.
3. Connect the image sensor module to the interconnect board.
4. Plug the FX3 DVK board into the computer using the USB 3.0 cable (provided with the DVK).
5. Load the firmware on the FX3 using the Control Center application provided as part of the FX3 SDK.
6. Choose Device -> FX3 (Direct Show); this starts streaming images.
7. The bottom right shows the actual frame rate.
8. Use Video -> Capture Pin to select between the supported resolutions and to see which resolution is currently active.
9. Use Video -> Levels to change brightness (move the slider to change the value) or other supported control commands. Find additional control commands under Video -> Capture Filter.
UVC Host Applications
Various host applications allow you to preview and capture video from a UVC device. One of the most widely used on Windows is AMCap; version 8.0 in particular is stable when streaming, whereas later versions of AMCap render the stream slowly. Two other Windows applications are VirtualDub (an open-source application) and Debut Video Capture. Linux systems can use the V4L2 driver and the VLC media player to stream video. Mac platforms can use FaceTime, iChat, Photo Booth, and Debut Video Capture to interface with the UVC device and stream video.
Running the Demo
1. Compile the firmware after making the required changes to the firmware project to initialize the image sensor in the correct configuration.
2. Load the firmware image on the FX3 UVC setup (see the Hardware Setup section for connection details).
3. At this point, the setup re-enumerates as a UVC device. The operating system installs the UVC drivers for this device; no additional drivers are required.
4. Open the UVC host application (for example, VirtualDub).
5. Choose File -> Capture AVI.
www.cypress.com
Document No. 001-75779 Rev. *B
21
®
How to Implement an Image Sensor Interface with EZ-USB FX3™ in a USB Video Class (UVC) Framework
Troubleshooting
If you have a black screen, follow these debugging steps:
1. There is a "DEBUG_PRINT_FRAME_COUNT" switch in the uvc.h file. Enable it to find out whether FX3 is streaming images. This switch enables UART prints of the frame count. The UART settings are 115,200 baud, no parity, 1 stop bit, no flow control, and 8-bit data. If you do not see the incrementing frame counter prints, there is probably a problem with the interface between FX3 and the image sensor (GPIF or sensor control/initialization).
2. If you do see the incrementing frame counter prints, verify the image data being sent out. A USB trace can show the amount of data transferred per frame.
3. To check the total amount of data sent per frame, find the data packets that have the end-of-frame bit set in the header. (The second byte of the header is either 0x8E or 0x8F for the end-of-frame transfer.) The total image data transferred in a frame (not including the UVC header) should be width * height * 2.
4. If this does not match the amount seen in the USB trace, there is an issue with the image sensor settings or with the GPIF interface.
5. If the total amount of image data is correct and a UVC host application still does not show any images, try a different host computer.
6. If the problems persist, please create a technical support case.
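The checks in steps 3 and 4 can be expressed directly in C (a minimal sketch; the function names are illustrative): bit 1 of the UVC payload header's second byte is the end-of-frame flag, so 0x8E and 0x8F are its two frame-ID variants, and the expected payload per frame is width * height * 2 for a 16-bit-per-pixel format such as the YUY2 output used here.

```c
#include <assert.h>

/* Bit 1 of the second UVC payload header byte is the end-of-frame (EOF) bit. */
static int is_end_of_frame(unsigned char header_byte1) {
    return (header_byte1 & 0x02) != 0;
}

/* Expected image bytes per frame for a 2-byte-per-pixel format. */
static unsigned long expected_frame_bytes(unsigned width, unsigned height) {
    return (unsigned long)width * height * 2;
}
```

For a 720p stream, expected_frame_bytes(1280, 720) gives the byte count to compare against the USB trace.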
Summary
AN75779 described how a UVC webcam works.
Specifically, it showed:
- How the UVC host application and driver interact with a UVC device
- How the UVC device manages UVC-specific requests
- How to program an FX3 interface to get data from typical image sensors
- How to display video streams and change webcam properties in a host application
- How to find host applications available on different platforms, including an open-source host application project
In addition, AN75779 described how to troubleshoot and
debug the FX3 firmware if required.
About the Author
Name: Karnik Shah
Title: Applications Engineer
Contact: [email protected]
Appendix A

Designing with the GPIF II Designer
After a state machine is fixed, as Figure 6 shows, you can create the configuration file easily using the GPIF II Designer. The example project shown here creates a GPIF II descriptor for an image sensor interface, as described in the Introduction of this application note, while also adhering to UVC requirements. This process involves three steps:
1. Create a project using the GPIF II Designer.
2. Choose the interface definition.
3. Draw the state machine on the canvas.

Creating the Project
Start GPIF II Designer. The GUI appears as in Figure 9. Follow the step-by-step instructions indicated in the following figures; each figure includes sub-steps. For advanced information on any of the steps, refer to the GPIF II Designer user guide.

Figure 9. Start GPIF II Designer
Figure 10. Open the File Menu and Select New Project
Figure 11. Enter Project Name and Location

The project creation is now complete, and the GPIF II Designer opens up access to the Interface Definition and State Machine tabs. The next step is to set the interface.
Choosing the Interface Definition
In this project, the image sensor connected to the FX3 device has an 8-bit data bus interface. It uses GPIO 28 for the line valid (LV) signal, GPIO 29 for the frame valid (FV) signal, and GPIO 22 to drive the sensor reset, an active-low signal. This image sensor also uses the I2C connection, over which FX3 loads the register values that configure the sensor in 720p mode.

Selecting the choices on the Interface Definition page sets the direction and polarity of the GPIO signals. In addition, the indicated input signals become available in the next phase to create transition equations. Figure 12 shows which interface settings (marked in blue boxes or circles) to choose on the Interface Definition tab. These are listed in order:
1. Choose 2 inputs (for LV and FV).
2. Choose 1 output (for nSensor_Reset).
3. Select I2C under "FX3 peripherals used".
4. Select "Interface type" as "Slave".
5. Select "Communication type" as "Synchronous".
6. Select "Clock settings" as "External".
7. Select "Active clock edge" as "Positive".
8. Select "Endianness" as "Little endian" (unless the bytes need to be flipped).
9. Select "Address/data bus usage" as "8 Bit".

Figure 12. Selecting the Interface Settings

After all these settings are completed, the "I/O Matrix configuration" should look like Figure 13. Now you can modify the properties of the input and output signals. The properties include the name of the signal, the pin mapping (i.e., which GPIO acts as the input or the output), the polarity of the signal, and the initial value of the output signal.

Figure 13. I/O Matrix Configuration without Modification

Double-click the "INPUT0" text in the Application Processor area to open the properties for that input signal, as Figure 14 shows.

Figure 14. "INPUT0" Default Properties

Change the name of the signal to "LV" (for "line valid"). Change "GPIO to use" to the line used for LV; in this design, that is GPIO_28. Keep the polarity "Active High". The properties appear as shown in Figure 15. Click OK.

Figure 15. Changing Properties of Input0 Signal

Next, open the properties box of the "INPUT1" signal and change the properties as follows: "Name" -> "FV" (for frame valid), "GPIO to use" -> GPIO_29, "Polarity" -> "Active High". The properties box appears as shown in Figure 16.
Figure 16. Changing Properties of Input1 Signal

Change the properties of the "OUTPUT0" signal to the following: "Name" -> "nSensor_Reset", "GPIO to use" -> "GPIO_22", "Initial value" -> "High", "Polarity" -> "Active Low", and "Signal mode" -> Assert. The properties for the output signal appear as shown in Figure 17.

Figure 17. Changing Properties of Output0 Signal

Note If the interface width is 32 bits, the I/O matrix configuration in the SDK should set "isDQ32bit" to "CyTrue". Refer to the FX3 SDK API guide to set this parameter. In all other cases, "isDQ32bit" can be set to false to free the pins for other purposes.

This completes the required interface settings. With these settings in place, the state machine canvas offers the LV and FV input signals in transition equations and the nSensor_Reset output signal in the DR_GPIO action.

Drawing the State Machine on the Canvas
Click the State Machine tab to open the state machine canvas. Drawing on the canvas is easy. It involves creating new states, changing state properties, creating transitions between states, creating transition equations for each transition, and adding actions for the state machine to perform in each state. Figure 18 through Figure 39 show how to design the complete waveform, which satisfies the requirements described in the Image Sensor Interface section while keeping in mind the DMA capabilities of FX3. Usually, the interface width, the GPIO line numbers for the FV/LV/Reset signals, and the count limit for the counter are the only values specific to a given interface; the other values remain the same from design to design. The final design should appear similar to the design in Figure 6, except for the small modifications explained in step 23 of the Drawing Image Sensor Interface State Machine for GPIF II section.

Basic Drawing Actions on the Canvas
The following steps are basic actions you can use to draw on the canvas.

1. Adding a new state
To add a new state, right-click the empty space in the canvas and choose Add State from the menu, as Figure 18 shows. Figure 19 shows the newly added state.

Figure 18. Right-Click to Open Menu to Add New State
Figure 19. New State Added to the Canvas

2. Changing the properties of a state
Double-click the highlighted part (blue rectangle) of the state in Figure 20 to open the properties dialog box for that state. Make the required changes and click OK to apply them.

Figure 20. Change the State Properties

3. Moving a state on the canvas
Drag and drop the state using the same area indicated by the blue rectangle in Figure 20 to move the state on the canvas. This action does not change any properties or connections associated with the state.

4. Connecting two states – drawing a transition
To connect two states, draw a transition line from the center of one state to the center of another. A direction is associated with this transition line: it runs from the state where the line starts to the state where the line is dropped. Figure 21 shows how the transition from IDLE to WAIT_FOR_FRAME is drawn. Step 1: Point the mouse exactly at the square in the center of the IDLE state; when the mouse reaches the center, it turns into a black plus sign. Step 2: Drag and drop a line from the black plus sign to the center of the WAIT_FOR_FRAME state.

Figure 21. Drawing Transition between Two States

Note While drawing the transition, you must connect the center squares (or any other small squares around the states) of the states. If the mouse click is dropped somewhere other than these small squares, the transition line will not appear.

The transition between the states appears as in Figure 22. Notice that the transition has an arrow (circled in blue) pointing from IDLE to WAIT_FOR_FRAME. This indicates that the transition is unidirectional. To have a transition in the other direction (i.e., WAIT_FOR_FRAME to IDLE), another transition must be drawn in the reverse direction.

Figure 22. Transition between Two States
5. Changing the properties of a transition – adding the transition equation
Double-clicking the transition arrow, highlighted by the blue circle in Figure 22, opens the properties dialog box for the transition, called the Transition Equation Entry dialog box. Add here the transition equation that GPIF II uses to switch from one state to the other linked by the transition line. This dialog box gives access to all trigger options and to basic logic operations. The triggers include the inputs defined in the Interface Definition tab. Figure 23 shows a sample Transition Equation Entry dialog box.

Figure 23. Transition Equation Entry Dialog Box

To add an entry in the Equation Entry field, type the equation in the box or click to add it. Triggers are added by double-clicks, and logic operations are added by single clicks. For example, to add an entry of "(FV) and (not LV)", take the following steps:
Step 1: Double-click FV in the Triggers.
Step 2: Single-click the And button.
Step 3: Single-click the ( button on the right.
Step 4: Single-click the Not button.
Step 5: Double-click LV in the Triggers.
Step 6: Single-click the ) button. The result is shown in Figure 24.

Figure 24. Transition Equation "(FV) and (not LV)"

6. Adding actions to states and modifying them
Various actions are available on the right side of the GPIF II Designer in the State Machine tab under the Action List pane. These actions, which the GPIF II state machine can perform while in a particular state, include the following:
- reading data into buffers from the data bus, or writing data onto the data bus from the buffers
- loading or counting the counters
- driving outputs
- interrupting the CPU

To add one of these actions to a state, drag and drop the action from the Action List into the marked area (blue rectangle in Figure 25) inside the state.

Figure 25. Adding Action to State
After adding an action, it appears inside the state. For example, if the IN_DATA action is added to the state in Figure 25, the result is Figure 26.

Figure 26. Action IN_DATA Added to the State

Some actions have properties associated with them. To change the properties, double-click the action inside the state (marked by a blue rectangle in Figure 26).

All of these basic actions are described in detail in the GPIF II Designer user guide. With these basic actions, you can create the state machine diagram described in Figure 6.

Drawing Image Sensor Interface State Machine for GPIF II
Click the State Machine tab to open the canvas. A blank canvas has only two states: a START state with a transition to STATE0 using a LOGIC_ONE transition equation, as shown in Figure 27.

Figure 27. Blank Canvas at Start

1. Edit the name of STATE0 to IDLE.
2. Add a new state and change the name to WAIT_FOR_FRAME_START.
3. Create a transition from IDLE to WAIT_FOR_FRAME_START.

Figure 28. Result of Steps 1, 2, and 3

4. Edit the equation for the transition from IDLE to WAIT_FOR_FRAME_START to "not FV", as shown in Figure 29. The transition appears as in Figure 30.

Figure 29. Transition Equation Entry from IDLE State to WAIT_FOR_FRAME_START State
Figure 30. Transition from IDLE to WAIT_FOR_FRAME_START
5. Add the actions "LD_DATA_COUNT" and "LD_ADDR_COUNT" to the WAIT_FOR_FRAME_START state. Edit the properties of both actions, as shown in Figure 31. The limit value for the example project is calculated according to Equation 1. The DATA counter counts data to switch thread0/socket0 at the buffer boundary, and the ADDR counter counts data to switch thread1/socket1 at the buffer boundary. As a result, these counters must be reset before entering the state that increments them.

Figure 31. LD_DATA_COUNT/LD_ADDR_COUNT Action Settings

6. Add a new state with the name PUSH_DATA_SCK0.
7. Create a transition from the WAIT_FOR_FRAME_START state to the PUSH_DATA_SCK0 state.
8. Edit this transition equation entry to "FV and LV". The resulting state machine is shown in Figure 32.

Figure 32. PUSH_DATA_SCK0 State Created

9. Add the action COUNT_DATA to the PUSH_DATA_SCK0 state. This increments the counter value.
10. Add the action IN_DATA to the PUSH_DATA_SCK0 state. This action reads the data from the data bus into the DMA buffers.
11. Add the action LD_ADDR_COUNT to the PUSH_DATA_SCK0 state to reload the ADDR counter. This action is added here because, in one of the next states, the ADDR counter is used to count the amount of data transferred into SCK1.
12. Edit the properties of the IN_DATA action in the PUSH_DATA_SCK0 state, as shown in Figure 33.

Figure 33. IN_DATA Action for PUSH_DATA_SCK0

13. Add a new state with the name PUSH_DATA_SCK1.
14. Add the action COUNT_ADDR to the PUSH_DATA_SCK1 state.
15. Add the action IN_DATA to the PUSH_DATA_SCK1 state.
16. Add the action LD_DATA_COUNT to the PUSH_DATA_SCK1 state to reload the DATA counter.
17. Edit the properties of the IN_DATA action of the PUSH_DATA_SCK1 state to use Thread1, as shown in Figure 34.

Figure 34. IN_DATA Action for PUSH_DATA_SCK1
18. Create a transition from the PUSH_DATA_SCK0 state to the PUSH_DATA_SCK1 state.
19. Edit this transition's equation entry with the equation "LV and DATA_CNT_HIT".
20. Create a transition from the PUSH_DATA_SCK1 state to the PUSH_DATA_SCK0 state (reverse direction).
21. Edit this transition's equation entry with the equation "LV and ADDR_CNT_HIT".
These transitions occur when the state machine has to switch between sockets or threads at buffer boundaries during an active line to prevent data loss. Figure 35 shows the resulting state machine diagram.

Figure 35. PUSH_DATA_SCK0 and PUSH_DATA_SCK1

22. Add a new state "LINE_END_SCK0" to the left of the PUSH_DATA_SCK0 state.
23. Add a new state "LINE_END_SCK1" to the right of the PUSH_DATA_SCK1 state.
These two states are entered when the line valid signal is deasserted (i.e., the image sensor is switching to the next line). Because the same operation is executed alternately on different sockets/threads, the states and transitions must be duplicated on both the socket0 and socket1 sides.
As discussed in the GPIF II State Machine Design section and shown in Figure 6, there are three transitions out of the PUSH_DATA_ states, but the GPIF II hardware block implements only two transitions out of any single state. To accommodate this limitation, add a dummy state to the state machine; the LINE_END_ states serve as this dummy state. A transition scenario of the type "A->B, A->C, A->D" is thus converted to "A->B, A->E, E->C, E->D", where A and B are PUSH_DATA_ states, C represents WAIT_TO_FILL_ states, D represents WAIT_FULL_ states, and E represents LINE_END_ states.
24. Create a transition from the PUSH_DATA_SCK0 state to the LINE_END_SCK0 state with the transition equation "(not LV)".
25. Create a transition from the PUSH_DATA_SCK1 state to the LINE_END_SCK1 state with the transition equation "(not LV)".
26. Add a new state "WAIT_TO_FILL_SCK0" below the LINE_END_SCK0 state.
27. Add a new state "WAIT_TO_FILL_SCK1" below the LINE_END_SCK1 state.
These two states are entered when the buffers are not full but line valid is deasserted.
28. Create a transition from the LINE_END_SCK0 state to the WAIT_TO_FILL_SCK0 state with the transition equation "(not DATA_CNT_HIT)", as Figure 36 shows.

Figure 36. LINE_END_SCK0 to WAIT_TO_FILL_SCK0
29. Create a transition back from the "WAIT_TO_FILL_SCK0" state to the "PUSH_DATA_SCK0" state with the equation "LV". The data transfer resumes in the same socket as soon as the line is active.
30. Create a transition from the "LINE_END_SCK1" state to the "WAIT_TO_FILL_SCK1" state with the transition equation "(not ADDR_CNT_HIT)".
31. Create a transition back from the "WAIT_TO_FILL_SCK1" state to the "PUSH_DATA_SCK1" state with the equation "LV".
32. Add a new state "WAIT_FULL_SCK0_NEXT_SCK1" below the "PUSH_DATA_SCK0" state.
33. Add a new state "WAIT_FULL_SCK1_NEXT_SCK0" below the "PUSH_DATA_SCK1" state.
34. During these two states (WAIT_FULL_), the image sensor is switching lines at buffer boundaries. Therefore, the next data transfer has to happen through the other socket/thread (not the currently active one).
35. Create a transition from the "LINE_END_SCK0" state to the "WAIT_FULL_SCK0_NEXT_SCK1" state with the equation "DATA_CNT_HIT". The state machine appears as in Figure 37.
Figure 37. WAIT_FULL_ States Added

36. Create a transition from the "LINE_END_SCK1" state to the "WAIT_FULL_SCK1_NEXT_SCK0" state with the equation "ADDR_CNT_HIT".
37. Create a transition from the "WAIT_FULL_SCK0_NEXT_SCK1" state to the "PUSH_DATA_SCK1" state with the equation "LV". Notice that doing so creates a cross link in the diagram.
38. Create a transition from the "WAIT_FULL_SCK1_NEXT_SCK0" state to the "PUSH_DATA_SCK0" state with the equation "LV". The resultant diagram appears as Figure 38.

Figure 38. Wait When Buffers Are Full

39. Add a new state "PARTIAL_BUF_IN_SCK0" below the "WAIT_TO_FILL_SCK0" state.
40. Add a new state "PARTIAL_BUF_IN_SCK1" below the "WAIT_TO_FILL_SCK1" state.
The PARTIAL_BUF_ states indicate the end of the frame, where the last buffer being written to holds less data than the full length L. The CPU must commit this partial buffer manually.
41. Add a new state "FULL_BUF_IN_SCK0" below the "WAIT_FULL_SCK0_NEXT_SCK1" state.
42. Add a new state "FULL_BUF_IN_SCK1" below the "WAIT_FULL_SCK1_NEXT_SCK0" state.
The FULL_BUF_ states indicate that the last byte of frame data exactly fills the buffer associated with the corresponding socket/thread. Depending on the application, no special handling may be required.
43. Create a transition from the "WAIT_TO_FILL_SCK0" state to the "PARTIAL_BUF_IN_SCK0" state with the equation "not FV".
44. Create a transition from the "WAIT_TO_FILL_SCK1" state to the "PARTIAL_BUF_IN_SCK1" state with the equation "not FV".
45. Create a transition from the "WAIT_FULL_SCK0_NEXT_SCK1" state to the "FULL_BUF_IN_SCK0" state with the equation "not FV".
46. Create a transition from the "WAIT_FULL_SCK1_NEXT_SCK0" state to the "FULL_BUF_IN_SCK1" state with the equation "not FV".
47. Add the action "Intr_CPU" in each of the states "PARTIAL_BUF_IN_SCK0", "PARTIAL_BUF_IN_SCK1", "FULL_BUF_IN_SCK0", and "FULL_BUF_IN_SCK1".

The final state machine appears as Figure 39. This is the final image that you need to create. It can be compared directly with the state machine created in Figure 6, except that there is an additional state out of the PUSH_DATA_ states to ensure there are only two transitions out of each PUSH_DATA_ state.

48. Save the project.
Figure 39. Final State Machine Diagram for Image Sensor Interface

49. Build the project using the Build icon highlighted in Figure 40.

Figure 40. Building the Project

50. Check the output of the project. The output is a header file called cyfxgpif2config.h, generated in the project directory as shown in Figure 41. The project output window indicates that the project was built successfully.

Figure 41. Project Output in the GPIF II Designer Output Window (Located Below)
Figure 42. Project Output on the Hard Drive
Editing the GPIF II Interface Details
This section describes how to change the interface
settings, if required. As an example, if the image
sensor/ASIC has a 16-bit-wide data bus, you would need
to change the GPIF II interface to accommodate the data
bus. To accomplish this, take the following steps:
1. Open the ImageSensorInterface.cyfx project in the GPIF II Designer. (This project may not be directly compiled.)
2. Go to File->Save Project As.
3. Save the project in a convenient location with a convenient name in the following dialog box.
4. Close the currently open project (File->Close Project).
5. Open the project that was saved in Step 3.
6. Go to the Interface Definition tab and choose the 16 Bit option for the Address/Data Bus Usage setting.
7. Go to the State Machine tab.
8. In the state machine canvas, double-click the LD_DATA_COUNT action inside the WAIT_FOR_FRAME_START state. Change the counter limit value to 8183, as stated in the GPIF II State Machine Design section based on Equation 1.
9. Do the same for the LD_ADDR_COUNT action.
10. Save the project.
11. Build the project.
12. Copy the newly generated cyfxgpif2config.h header file from the location selected in Step 3 to the firmware project directory. (You might need to overwrite the existing cyfxgpif2config.h file.)
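The value 8183 in Step 8 comes from Equation 1 in the GPIF II State Machine Design section, which ties the counter limit to the DMA buffer size and the GPIF II bus width. As a sanity check, the following sketch reproduces the arithmetic under stated assumptions: a 16,384-byte DMA buffer with 16 bytes reserved for the UVC header is assumed here for illustration (the exact reservation is defined in the associated firmware project), and the helper name `counter_limit` is hypothetical.

```c
#include <stdint.h>

/* Hypothetical helper sketching Equation 1: usable data bytes per
 * DMA buffer divided by the bus width in bytes, minus one (the
 * counter counts from zero). */
static uint32_t counter_limit(uint32_t buf_bytes,
                              uint32_t reserved_bytes,
                              uint32_t bus_width_bytes)
{
    return (buf_bytes - reserved_bytes) / bus_width_bytes - 1;
}
```

With these assumptions, a 16-bit (2-byte) bus gives `counter_limit(16384, 16, 2)` = 8183, matching the value entered in Step 8.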
Document History
Document Title: How to Implement an Image Sensor Interface with EZ-USB® FX3™ in a USB Video Class (UVC)
Framework
Document Number: 001-75779
Revision: **  |  ECN: 3591590  |  Orig. of Change: SHAH  |  Submission Date: 04/19/2012
Description of Change: New Application Note

Revision: *A  |  ECN: 3646722  |  Orig. of Change: SHAH  |  Submission Date: 06/14/2012
Description of Change:
Changed the AN title to match the scope of the new version of the AN
Added the firmware project
Added an explanation of the firmware project
Added the UVC application-related details
Revised the functional block diagram
Moved the steps to generate the GPIF II descriptor using the GPIF II Designer tool to Appendix A
Added a section in the Appendix to show users how to modify the given GPIF II Designer project
Clarified certain topics with explicit information
Updated all the links in the document to point to the correct locations within and outside the document
Removed references to AN75310
Removed references to the Slave FIFO application note

Revision: *B  |  ECN: 3938382  |  Orig. of Change: SHAH  |  Submission Date: 03/20/2013
Description of Change:
Updated the AN title
Updated the required software version
Updated the abstract with information on newly added features
Updated the table of contents
Added more description of how the UVC application works
Added a general block diagram of UVC class requests
Modified the description of the file structure based on the new structure in the associated project for ease of use
Added a section on USB descriptors for the UVC application
Added a detailed section on UVC class requests
Added a description of the sample control requests (brightness and PTZ) included as new features in the updated associated project
Added a section on the UVC streaming requests
Added a section on the UVC video format and UVC header insertion
Updated the firmware application description section to reflect the changes in the associated firmware project
Added a section describing an optional debug interface implemented as a new feature, with documentation on how to use it
Added a section on the hardware setup instructions
Added a section on host applications available in the market for viewing video over UVC
Added a section on basic troubleshooting
Updated the GPIF II state machine design steps in Appendix A to accommodate the updated state machine used in the associated project
Worldwide Sales and Design Support
Cypress maintains a worldwide network of offices, solution centers, manufacturer’s representatives, and distributors. To find
the office closest to you, visit us at Cypress Locations.
Products
Automotive: cypress.com/go/automotive
Clocks & Buffers: cypress.com/go/clocks
Interface: cypress.com/go/interface
Lighting & Power Control: cypress.com/go/powerpsoc | cypress.com/go/plc
Memory: cypress.com/go/memory
Optical Navigation Sensors: cypress.com/go/ons
PSoC: cypress.com/go/psoc
Touch Sensing: cypress.com/go/touch
USB Controllers: cypress.com/go/usb
Wireless/RF: cypress.com/go/wireless
PSoC® Solutions
psoc.cypress.com/solutions
PSoC 1 | PSoC 3 | PSoC 5
Cypress Developer Community
Community | Forums | Blogs | Video | Training
Technical Support
cypress.com/go/support
EZ-USB® and FX3™ are registered trademarks of Cypress Semiconductor Corp. All other trademarks or registered trademarks referenced herein are
the property of their respective owners.
Cypress Semiconductor
198 Champion Court
San Jose, CA 95134-1709
Phone: 408-943-2600
Fax: 408-943-4730
Website: www.cypress.com
© Cypress Semiconductor Corporation, 2012-2013. The information contained herein is subject to change without notice. Cypress Semiconductor
Corporation assumes no responsibility for the use of any circuitry other than circuitry embodied in a Cypress product. Nor does it convey or imply any
license under patent or other rights. Cypress products are not warranted nor intended to be used for medical, life support, life saving, critical control or
safety applications, unless pursuant to an express written agreement with Cypress. Furthermore, Cypress does not authorize its products for use as
critical components in life-support systems where a malfunction or failure may reasonably be expected to result in significant injury to the user. The
inclusion of Cypress products in life-support systems application implies that the manufacturer assumes all risk of such use and in doing so indemnifies
Cypress against all charges.
This Source Code (software and/or firmware) is owned by Cypress Semiconductor Corporation (Cypress) and is protected by and subject to worldwide
patent protection (United States and foreign), United States copyright laws and international treaty provisions. Cypress hereby grants to licensee a
personal, non-exclusive, non-transferable license to copy, use, modify, create derivative works of, and compile the Cypress Source Code and derivative
works for the sole purpose of creating custom software and or firmware in support of licensee product to be used only in conjunction with a Cypress
integrated circuit as specified in the applicable agreement. Any reproduction, modification, translation, compilation, or representation of this Source
Code except as specified above is prohibited without the express written permission of Cypress.
Disclaimer: CYPRESS MAKES NO WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, WITH REGARD TO THIS MATERIAL, INCLUDING, BUT
NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. Cypress reserves the
right to make changes without further notice to the materials described herein. Cypress does not assume any liability arising out of the application or
use of any product or circuit described herein. Cypress does not authorize its products for use as critical components in life-support systems where a
malfunction or failure may reasonably be expected to result in significant injury to the user. The inclusion of Cypress’ product in a life-support systems
application implies that the manufacturer assumes all risk of such use and in doing so indemnifies Cypress against all charges.
Use may be limited by and subject to the applicable Cypress software license agreement.