OV6620 image sensor headaches

Started by john orlando May 11, 2004
Hello,
I am currently working on a project using the Omnivision OV6620 CMOS
color image sensor (the same one used by the CMUcam).  I have the
OV6620 interfaced to an Atmel AVR microcontroller, and I am able to
extract images from the camera and display them on a PC (through a PC
app I wrote...the images get sent up over the serial port).

Anyway...my question has to do with the Bayer image format used by the
camera.  The datasheets claim that the camera is a progressive scan
camera  (some of the later models such as the OV7620 offer both
interlaced and progressive).  However, there is an FODD output of the
camera (which is used to indicate when the "odd" portion of the frame
is being output).  Further, this output toggles with every tick of
VSYNC (vertical synch pulse), which would seem to indicate that the
camera is really doing an interlaced output.  The datasheet even makes
mention in its register definition section of several things that are
active in only progressive or interlaced mode.  Also, the first Note
on page 20 of the data sheet makes reference to the Even/Odd field
timing when interlaced mode is active, and my oscilloscope shows the
same timing pulses as they mention for interlaced mode.

Incidentally, I am using the camera in its 16-bit mode, utilizing both
the Y and the UV busses of the camera.  Thus, for each tick of the
pixel clock, I read one red and one green (or one green and one blue)
pixel, depending on where I'm at in the current line.  This is
different than the CMUcam, which uses the 8-bit mode.
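The two-bytes-per-clock pairing described above can be sketched as a tiny lookup. The G,R / B,G row ordering below is an assumption to verify against the OV6620 datasheet and hardware, not a confirmed fact; note that if the assumed phase is off by one row, red data gets rendered as green.

```c
#include <assert.h>

/* Sketch of 16-bit-mode Bayer pairing: which two color channels arrive
 * (one on the Y bus, one on the UV bus) for a given pixel-clock tick.
 * The row assignment here is an ASSUMPTION -- if it is flipped relative
 * to the sensor's real Bayer start phase, red pixels end up displayed
 * as green, which resembles the symptom described above. */
typedef enum { RED, GREEN, BLUE } channel_t;

void bayer_pair(int row, channel_t *y_bus, channel_t *uv_bus)
{
    if ((row & 1) == 0) {          /* assumed: even rows carry G,R pairs */
        *y_bus = GREEN; *uv_bus = RED;
    } else {                       /* assumed: odd rows carry B,G pairs */
        *y_bus = BLUE;  *uv_bus = GREEN;
    }
}
```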

This whole thing is causing my headaches because the color images I am
extracting from the camera are not quite right.  For example, when I
hold a red object up to the camera, the image I capture with it has
green lines every fourth line that shouldn't be there (I believe my
display code thinks that green pixels should be displayed for these
lines, but is somehow out of synch and is reading red values, thus
ending up with bright green lines).  I've viewed both the raw Bayer
data as well as a bi-linear interpolation of the data (to smooth
things out) and neither looks quite right.
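For reference, the bilinear step mentioned above is the generic textbook interpolation, not anything OmniVision-specific. A minimal sketch of filling one missing channel at an interior pixel from its four axial neighbours:

```c
#include <assert.h>
#include <stdint.h>

/* Minimal bilinear fill for one missing channel at interior pixel (x, y)
 * of a raw Bayer plane stored row-major with width w: average the four
 * axial neighbours. The caller is responsible for picking the variant
 * that matches the local Bayer phase; this is only the averaging step. */
uint8_t bilinear_cross(const uint8_t *raw, int w, int x, int y)
{
    int sum = raw[(y - 1) * w + x] + raw[(y + 1) * w + x]
            + raw[y * w + (x - 1)] + raw[y * w + (x + 1)];
    return (uint8_t)(sum / 4);
}
```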

One possible culprit is that I am sampling a line of pixels (actually,
two lines of pixels), sending them up over serial, waiting for the
next frame, sampling the next two lines of pixels, etc...which is why
I am wondering if there is something I'm not understanding regarding
the possibility of the images being interlaced.

Are there any OV6620 experts out there?  I've emailed Omnivision but
haven't heard back from them.  If they only knew...oh well...

Any help would be appreciated.  I'd be more than happy to give as much
detail as needed regarding this project if someone could offer some
help.

Thanks in advance!

John
john@jrobot.net (john orlando) wrote in 
news:cc4b5d44.0405110517.4271d58b@posting.google.com:

> This whole thing is causing my headaches because the color images I am
> extracting from the camera are not quite right. For example, when I
> hold a red object up to the camera, the image I capture with it has
> green lines every fourth line that shouldn't be there (I believe my
> display code thinks that green pixels should be displayed for these
> lines, but is somehow out of synch and is reading red values, thus
> ending up with bright green lines). I've viewed both the raw Bayer
> data as well as a bi-linear interpolation of the data (to smooth
> things out) and neither looks quite right.
Hi John,

This is a complete stab in the dark; is it possible you're not quite
meeting the pixel-rate requirements in your software, and occasionally
missing a pixel and reading the rest of the line with the red-green
pixels swapped?

I'm guessing that you resynchronize with the horizontal clock at the
beginning of every line, which would stop the error propagating for more
than one line?

Peter.
"john orlando" <john@jrobot.net> wrote in message
news:cc4b5d44.0405110517.4271d58b@posting.google.com...
> Hello,
> I am currently working on a project using the Omnivision OV6620 CMOS
> color image sensor (the same one used by the CMUcam). I have the
> OV6620 interfaced to an Atmel AVR microcontroller, and I am able to
> extract images from the camera and display them on a PC (through a PC
> app I wrote...the images get sent up over the serial port).
What I remember from working with VLSI Vision sensors is that the ODD
signal just indicates an ODD frame with respect to the pattern of the
sync pulses on the video output, which are different for Even and Odd
frames. The sensor I used didn't even have an interlaced mode; it just
output the same frame two times and shifted the sync pulses in the video
to be compatible with a CCIR signal. The ODD signal indicated whether a
new frame was output or just a copy of the previous one.

Meindert
"CodeSprite" <debate@codesprite.com> wrote in message
news:Xns94E66056A6F32oHcRiKeYwotID@63.218.45.215...
> john@jrobot.net (john orlando) wrote in
> news:cc4b5d44.0405110517.4271d58b@posting.google.com:
>
> This is a complete stab in the dark; is it possible you're not quite
> meeting the pixel-rate requirements in your software, and occasionally
> missing a pixel and reading the rest of the line with the red-green
> pixels swapped?
>
> I'm guessing that you resynchronize with the horizontal clock at the
> beginning of every line, which would stop the error propagating for
> more than one line?
Only if he does not use the pixel clock to sample his data....

Meindert
"Meindert Sprang" <mhsprang@NOcustomSPAMware.nl> wrote in message
news:10a1lh8bh1e8cbb@corp.supernews.com...

So, to make a long story short: from a digital sampling point of view,
there was no such thing as interlacing. When I simply used the Frame
Start to reset my image pointer and sampled the right amount of pixels
using the PCLOCK, I got a frame. Period.

Meindert
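Meindert's frame-start/PCLOCK scheme can be modelled off-target as a small state machine fed one sample at a time; the function and field names here are illustrative sketches, not actual sensor signals or registers.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Sketch of "frame start resets the pointer, PCLOCK clocks every pixel":
 * a vsync edge resets the write index, and each rising pixel-clock edge
 * latches one byte. Names and edge conventions are hypothetical. */
typedef struct {
    uint8_t *buf;
    size_t   len;       /* expected pixels per frame */
    size_t   idx;       /* write pointer, reset at frame start */
    int      last_pclk; /* for rising-edge detection */
} grab_t;

void grab_init(grab_t *g, uint8_t *buf, size_t len)
{
    g->buf = buf; g->len = len; g->idx = 0; g->last_pclk = 0;
}

/* Call once per sample; returns 1 once a full frame is in buf. */
int grab_step(grab_t *g, int vsync, int pclk, uint8_t data)
{
    if (vsync) g->idx = 0;                    /* frame start: reset pointer */
    if (pclk && !g->last_pclk && g->idx < g->len)
        g->buf[g->idx++] = data;              /* rising edge: latch a pixel */
    g->last_pclk = pclk;
    return g->idx == g->len;
}
```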


"Meindert Sprang" <mhsprang@NOcustomSPAMware.nl> wrote in
news:10a1ljlhv63lof6@corp.supernews.com: 

> "CodeSprite" <debate@codesprite.com> wrote in message > news:Xns94E66056A6F32oHcRiKeYwotID@63.218.45.215... >> john@jrobot.net (john orlando) wrote in >> news:cc4b5d44.0405110517.4271d58b@posting.google.com: >> >> This is a complete stab in the dark; is it possible you're not quite >> meeting the pixel-rate requirements in your software, and >> occasionally missing a pixel and reading the rest of the line with >> the red-green pixels swapped? >> >> I'm guessing that you resynchronize with the horizontal clock at the >> beginning of every line, which would stop the error propogating for >> more than one line? > > Only if he does not use the pixel clock to sample his data.... > > Meindert > >
Mostly true... but the sampling rate may be marginal to catch one of the
edges, in which case you could occasionally miss a whole pixel quite
easily, even when checking the pixel clock. Been there, done that :(

John - can you temporarily reduce the pixel clock rate to see if the
problem goes away?

Peter.
"Meindert Sprang" <mhsprang@NOcustomSPAMware.nl> wrote in message news:<10a1lh8bh1e8cbb@corp.supernews.com>...
> "john orlando" <john@jrobot.net> wrote in message > news:cc4b5d44.0405110517.4271d58b@posting.google.com... > > Hello, > > I am currently working on a project using the Omnivision OV6620 CMOS > > color image sensor (the same one used by the CMUcam). I have the > > OV6620 interfaced to an Atmel AVR microcontroller, and I am able to > > extract images from the camera and display them on a PC (through a PC > > app I wrote...the images get sent up over the serial port). > > What I remember from working with VLSI Vision sensors is that the ODD signal > just indicates an ODD frame with respect to the pattern of the sync pulses > on the video output, which are different for Even and Odd frames. The sensor > I used, didn't even have an interlaced mode, it just output the same frame > two times and shifted the sync pulses in the video to be compatible with a > CCIR signal. The ODD signal indicated whether a new frame was output or just > a copy of the previous one. > > Meindert
Wow...thanks so far for the flurry of answers....here is the scoop.

I am actually clocking the OV6620 image sensor with the same clock
driving the AVR. Thus, I am guaranteed to be synchronous and don't need
to sample the pixel clock to determine where I should be sampling the
data busses where pixel info is output. This will be used eventually,
when I'm running at full speed. For now....yes, I have slowed the camera
down, and am only running at 1 or 2 frames per second while I'm in this
"debug" mode. Going slower helps diagnosing the problem.

The green lines seem to be consistent from the top of the image to the
bottom, again making me think that somehow I am not understanding the
format of the data and am thus mistaking the red line for green data.

Meindert: very interesting thought, that the odd/even may not
necessarily relate to the fields but just the frame type (even though
the documentation seems to indicate it relates to fields within a single
frame). There is a slight timing difference between the two frames
(i.e., the VSYNC pulse that starts off each frame is longer on the odd
frames, compared to the even). The image stream produced is CCIR
compliant, so I will do some tests tonight to see if each line of data
is repeated (to "fake" compliance with this standard).

Another reason this idea has me curious is that it may explain some
other things....it seems as though when running the sensor at its
fastest frame rate (no prescaler on the pixel clock), there are 60 VSYNC
signals per second, and I was always under the impression that the
sensor did 30 frames/sec nominally. Even more interesting is that there
does seem to be a full "suite" of horizontal pulses (i.e., HREF pulses)
per frame, not half the total number as would be expected if it were
truly interlaced.
For example, if the total number of rows in the image were 144, I would
assume the "odd" field would have 72 horizontal rows, and the "even"
field would have 72 horizontal rows...but not so...there seem to be 144
rows per frame...hmmmm....either this thing is doing interlaced video,
or it wants to be compliant with a signalling standard (such as CCIR)
and thus MUST supply the odd/even signalling, but is REALLY progressive.

Any other ideas out there?

Closing in on a solution (I hope),
John
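The measurement described above (counting HREF pulses between VSYNCs) really does discriminate the two cases: a truly interlaced output delivers roughly half the image rows per field, while a progressive sensor that only toggles FODD for CCIR compatibility delivers all of them. As a trivial sketch of that check (the 3/4 threshold is an arbitrary choice to absorb blanking-line miscounts):

```c
#include <assert.h>

/* Classify a sensor from the measured HREF-per-VSYNC count versus the
 * known image height. Truly interlaced: ~image_rows/2 per field.
 * Progressive-with-field-signalling: ~image_rows per VSYNC.
 * The 3/4 threshold is an arbitrary margin, not from any datasheet. */
int looks_progressive(int href_per_vsync, int image_rows)
{
    return href_per_vsync > (3 * image_rows) / 4;
}
```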
"john orlando" <john@jrobot.net> wrote in message
news:cc4b5d44.0405111259.16056abb@posting.google.com...
> I am actually clocking the OV6620 image sensor with the same clock
> driving the AVR. Thus, I am guaranteed to be synchronous and don't
> need to sample the pixel clock to determine where I should be sampling
> the data busses where pixel info is output.
I wouldn't be too sure about that. Can you read from the datasheet that
the pixel clock is continuous? I know for sure that the sensors I used
(VVL) had no pixel clocks during the sync pulses. So IMO the pixel clock
MUST be used to sample valid pixels. Even within one image line, the
pixel clock was only present during visible pixel time and not during
the HSYNC pulse.

Meindert
john@jrobot.net (john orlando) wrote in message news:<cc4b5d44.0405110517.4271d58b@posting.google.com>...

> One possible culprit is that I am sampling a line of pixels (actually,
> two lines of pixels), sending them up over serial, waiting for the
> next frame, sampling the next two lines of pixels, etc...which is why
> I am wondering if there is something I'm not understanding regarding
> the possibility of the images being interlaced.
John,

I haven't worked with this specific sensor, so can't offer you any
solutions. It sounds like you have to wait for 30 frames (or whatever
the max fps this sensor can do) before you can get a whole image. If you
have SRAM in your configuration, maybe you can arrange to capture the
whole frame in SRAM before uploading it to the PC. Just a thought.

Good luck.
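As a rough budget for that suggestion: a full frame at the 144-row (176x144) window mentioned earlier in the thread, in 16-bit mode, needs about 50 KB, which is more than AVR-class parts have on chip, so the SRAM would have to be external. The window size used here is taken from the thread, not confirmed; plug in the actual capture window.

```c
#include <assert.h>

/* Frame-buffer budget: width x height x bytes-per-pixel. At 176x144 in
 * 16-bit (2 bytes/pixel) mode this comes to 50688 bytes (~50 KB). */
unsigned frame_bytes(unsigned w, unsigned h, unsigned bytes_per_px)
{
    return w * h * bytes_per_px;
}
```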
"Meindert Sprang" <mhsprang@NOcustomSPAMware.nl> wrote in message news:<10a2i5uliafm823@corp.supernews.com>...
> "john orlando" <john@jrobot.net> wrote in message > news:cc4b5d44.0405111259.16056abb@posting.google.com... > > I am actually clocking the OV6620 image sensor with the same clock > > driving the AVR. Thus, I am guaranteed to be synchronous and don't > > need to sample the pixel clock to determine where I should be sampling > > the data busses where pixel info is output. > > I wouldn't be too sure about that. Can you read from the datasheet that the > pixel clock is continuous? I know for sure that the sensors I used (VVL) had > no pixel clocks during the sync pulses. So IMO the pixel clock MUST be used > to sample valid pixels. Even within one image line, the pixel clock was ony > present during visible pixel time and not during the HSYNC pulse. > > Meindert
Well, I guess I can't be positive, but I think my approach is valid.

The plan here is to synch up with the frame when VSYNC transitions,
followed by synching with each line based on the pixel clock going high
for the first pixel in the line (I have the pixel clock set so that it
is gated with HREF). I have the pixel clock tied to an interrupt that is
used to determine when the first pixel in each line is generated (again,
after VSYNC transitions). Once the interrupt occurs, the pixel clock
interrupt is turned off. Then, for the rest of the line, I know that 16
clock cycles pass on the microcontroller between valid data on the pixel
busses...I sit in a tight loop doing things with the pixels as they
stream in, and believe that this shouldn't cause any problems.

The pixel clock is also an input to a hardware counter, so I get an
overflow at the end of the line, indicating that the line is complete. I
then process the line and, when complete, turn the pixel clock interrupt
back on; since the pixel clock is gated with HREF, I won't get another
pixel until the first pixel in the next line comes in....and repeat.

Incidentally, while I have all of this coded up, the code I am using to
extract color images right now does not implement it, since I am only
trying to determine the exact format of the data coming out. Like I
said...I'm still in debug mode, running at 1 or 2 frames per second,
manually sampling the pixel clock to determine when I should be sampling
the data busses.

Does this make sense? I'm interested to hear your take on this....

TIA,
John
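The end-of-line detection described above (PCLK gated by HREF feeding a hardware counter that overflows when the line is done) can be modelled off-target like this. The 8-bit preload follows the common AVR timer idiom of loading 256 minus the count you want; all type and function names here are illustrative, not actual AVR registers.

```c
#include <assert.h>
#include <stdint.h>

/* Model of a gated-clock line counter: preload an 8-bit counter with
 * (256 - pixels_per_line) so it overflows on exactly the last gated
 * PCLK edge of the line. On a real AVR this would be a Timer/Counter
 * clocked from an external pin; here it is simulated for illustration. */
typedef struct { uint8_t tcnt; int overflowed; } line_timer_t;

void line_timer_arm(line_timer_t *t, int pixels_per_line)
{
    t->tcnt = (uint8_t)(256 - pixels_per_line); /* overflow after N edges */
    t->overflowed = 0;
}

void line_timer_clock(line_timer_t *t)          /* one gated PCLK edge */
{
    if (++t->tcnt == 0) t->overflowed = 1;      /* wrap = end of line */
}
```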