Reply by ":: aH[sIM] ::" January 13, 20082008-01-13
Rolf,

Can I have their emails?

Regards,

Sim

----- Original Message ----
From: Rolf Viehmann
To: f...
Cc: s...@yahoo.com; n...@imap.cc
Sent: Wednesday, January 9, 2008 5:28:57 PM
Subject: Re: [fpga-cpu] Re: Frame Grabber using FPGA thru webcam

> On Tue, 8 Jan 2008 00:14:01 -0800 (PST), ":: aH[sIM] ::"
> said:
>>
>> I've abandoned the plan to use a USB webcam.
>>
>> [...]
>>
>> So, are there any other image sensor companies that you guys would
>> suggest?
>>
>> The device has to be able to downsample its resolution to at least
>> 50x50 and the output should be in YCbCr or BW/grayscale digital format.
>> What is your advice on selecting other alternatives?
>
> How about using the element out of an optical mouse? As far as I know,
> these are all made by Agilent and are effectively a fast, low-resolution
> greyscale camera with a DSP attached. With some models, more or less
> direct access to the image capture data can be had via an I2C bus
> interface.
>
> Here's an example datasheet:
> http://literature.agilent.com/litweb/pdf/5988-4289EN.pdf
>
> I'd suggest the easiest way to procure one would be to simply salvage
> from a working mouse. That way you get a decent lens too.

That's a possible way to go; some fellows at my university are doing some
experiments with mouse sensors at the moment, and if I've got everything
right, it's not very hard to get a picture out of the camera chip, but
there are some drawbacks:
- The resolution is not very high; the highest resolution I've heard of so
far is 32x32 pixels, and the Agilent chip with the datasheet above is only
16x16.
- The bit depth is not very high; most of the time it is only 6 bits
(64 grayscale levels).
- The sample rate is very high if the DSP does the processing, but the
debug mode of the DSP can only provide a handful of frames/sec (the debug
mode can be used to dump the picture from the sensor, but only a few
times/sec). If the sensor chip is read directly (eliminating the DSP
completely), the frame rate should not be a problem at all.

If you'd like to ask them some questions, I can ask them for their email
addresses and give them to you.

> Neil

Rolf

Reply by John Kent January 12, 2008
Hi Richard,

I have not got to chasing up the book yet.
We had days of 40+C heat (that's over 100F) here in Melbourne, Australia,
last week, and I don't have air conditioning, so I'm sweating it out
and don't feel like doing much.

I think I had the book Robot Vision, or something like it, but it's in
a bookshelf in the lounge room, which is off limits to me at the moment
because my little friend has her junk strewn over the floor.
I use a walking frame, so it's a little difficult to navigate through
the mess.

I'm not a great one for mathematics, which is why I'm buying the book.
I would like to undertake a PhD this year, finances willing, so I need to
brush up on my maths. I'm particularly interested in the maths for graphics
rendering of 3D models.

Years ago, I bought a Sublogic wire frame graphics package for the 6800.
We used it for various primitive flight simulators on the 6800 and 6809
using a "chunky graphics" memory mapped display and Bressenham line
drawing routines. It was not real fast but at 1 MHz the system was still
fast enough to play a rudimentary 3D graphics game.
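
For anyone who hasn't seen it, the line routine itself is tiny, which is
why it was workable even at 1 MHz. A standard integer-only Bresenham
sketch in C would look something like this (plot_pixel() is assumed to
come from whatever display driver you have):

/* Standard integer-only Bresenham line drawing, the kind of routine
 * that ran acceptably even on a 1 MHz 6800/6809. plot_pixel() is
 * assumed to be provided by the display driver. */

#include <stdlib.h>

extern void plot_pixel(int x, int y);

void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
    int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
    int err = dx + dy;                      /* error term */

    for (;;) {
        plot_pixel(x0, y0);
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}

It's all additions, comparisons and shifts; there are no multiplies or
divides in the inner loop.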

I guess the biggest speed up would be in implementing the line drawing
routines in hardware, but implementing a 3D perspective engine in an
FPGA would be pretty neat too. Having a mouse or trackball that
allowed you to rotate a wire frame mesh in real time would be pretty cool.
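
The per-vertex maths for that is small too. Here is a rough C sketch of
rotating a vertex about the Y axis and then doing the perspective divide
(the screen size, focal length and viewing distance are arbitrary
placeholder values, just for illustration):

/* Illustrative sketch: spin a wireframe vertex about the Y axis and
 * project it to screen coordinates with a perspective divide.
 * SCREEN_W/H, FOCAL and VIEW_Z are arbitrary example values. */

#include <math.h>

#define SCREEN_W 640
#define SCREEN_H 480
#define FOCAL    256.0f    /* focal length in pixel units            */
#define VIEW_Z   4.0f      /* push the model out in front of the eye */

typedef struct { float x, y, z; } vec3;

static vec3 rotate_y(vec3 v, float angle)
{
    float c = cosf(angle), s = sinf(angle);
    vec3 r = { c * v.x + s * v.z, v.y, -s * v.x + c * v.z };
    return r;
}

static void project(vec3 v, int *sx, int *sy)
{
    float z = v.z + VIEW_Z;                /* keep z positive */
    *sx = (int)(SCREEN_W / 2 + FOCAL * v.x / z);
    *sy = (int)(SCREEN_H / 2 - FOCAL * v.y / z);
}

Rotate every vertex of the mesh by the current angle from the mouse or
trackball, project, then run the projected endpoints through the line
drawer.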

The next step would be to have hardware to warp and texture map surfaces
onto the wire frame. You'd need a Z buffer to resolve occlusions, so that
you only render the front-most surfaces.
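
The Z-buffer part is conceptually just a per-pixel depth compare; a tiny
sketch (buffer sizes and layout are only illustrative):

/* Illustrative Z-buffer test: write a pixel only if its depth is
 * nearer than whatever has already been drawn there. */

#include <stdint.h>

#define SCREEN_W 640
#define SCREEN_H 480

static uint16_t zbuf[SCREEN_H][SCREEN_W];  /* smaller value = nearer  */
static uint8_t  fb[SCREEN_H][SCREEN_W];    /* grayscale frame buffer  */

static void clear_zbuf(void)               /* call at start of frame  */
{
    for (int y = 0; y < SCREEN_H; y++)
        for (int x = 0; x < SCREEN_W; x++)
            zbuf[y][x] = 0xFFFF;           /* "infinitely far away"   */
}

static void plot_depth(int x, int y, uint16_t depth, uint8_t color)
{
    if (x < 0 || x >= SCREEN_W || y < 0 || y >= SCREEN_H)
        return;
    if (depth < zbuf[y][x]) {              /* new fragment is in front */
        zbuf[y][x] = depth;
        fb[y][x]   = color;
    }
}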

There was an article in IEEE Computer, I think last year, on GPUs.
Many of the perspective and warping transformations would apply
to both the video input and video output devices.

John.
rtstofer wrote:
>
> John,
>
> So, how's it going with the book?
>
> I think, perhaps, I skipped too many math courses in favor of hardware
> design. I didn't even do that well with divergence when it was used
> for Maxwell's Equations.
>
> Maybe I would be better served by something at a more introductory
> level like "Robot Vision" by Berthold.
>
> Richard
>

--
http://www.johnkent.com.au
http://members.optushome.com.au/jekent

Reply by rtstofer January 11, 2008
--- In f..., John Kent wrote:

> I took a very quick look at the roborealm web site and saw the blob
> tracker.
> I've ordered a book on mathematical algorithms for machine vision that
> I saw on wikipedia, so I hope I'll be able to brush up on my geometry
> and cross products and so on.


John,

So, how's it going with the book?

I think, perhaps, I skipped too many math courses in favor of hardware
design. I didn't even do that well with divergence when it was used
for Maxwell's Equations.

Maybe I would be better served by something at a more introductory
level like "Robot Vision" by Berthold.

Richard

Reply by rtstofer January 10, 2008
> A question on the sensor: can it take an image of an object, say,
> 10-20 cm away? Because this kind of sensor is supposed to capture an
> image of a mouse pad, which is very near to the sensor.

Without a focusing lens, I don't see how it could produce much of an
image. Besides, the pixel count is too low.

Richard

Reply by ":: aH[sIM] ::" January 10, 20082008-01-10
I followed your suggestion and just gave my two old mice from the garage
an operation. One uses a PixArt sensor; the datasheet can be found here:
http://www.datasheets.org.uk/pdf/pan101-datasheet/pan101-datasheet.html
The other mouse, from Microsoft, uses an S2083, a DIP-8 IC. From this
website http://www.ic2ic.com/search.jsp?sSearchWord=S2083&prefix=S it
seems to come from Agilent, but I can't find the datasheet for it.

A question on the sensor: can it take an image of an object, say, 10-20 cm
away? Because this kind of sensor is supposed to capture an image of a
mouse pad, which is very near to the sensor.

----- Original Message ----
From: Rolf Viehmann
To: f...
Cc: s...@yahoo.com; n...@imap.cc
Sent: Wednesday, January 9, 2008 5:28:57 PM
Subject: Re: [fpga-cpu] Re: Frame Grabber using FPGA thru webcam

> On Tue, 8 Jan 2008 00:14:01 -0800 (PST), ":: aH[sIM] ::"
> said:
>>
>> I've abandoned the plan to use a USB webcam.
>>
>> [...]
>>
>> So, are there any other image sensor companies that you guys would
>> suggest?
>>
>> The device has to be able to downsample its resolution to at least
>> 50x50 and the output should be in YCbCr or BW/grayscale digital format.
>> What is your advice on selecting other alternatives?
>
> How about using the element out of an optical mouse? As far as I know,
> these are all made by Agilent and are effectively a fast, low-resolution
> greyscale camera with a DSP attached. With some models, more or less
> direct access to the image capture data can be had via an I2C bus
> interface.
>
> Here's an example datasheet:
> http://literature.agilent.com/litweb/pdf/5988-4289EN.pdf
>
> I'd suggest the easiest way to procure one would be to simply salvage
> from a working mouse. That way you get a decent lens too.

That's a possible way to go; some fellows at my university are doing some
experiments with mouse sensors at the moment, and if I've got everything
right, it's not very hard to get a picture out of the camera chip, but
there are some drawbacks:
- The resolution is not very high; the highest resolution I've heard of so
far is 32x32 pixels, and the Agilent chip with the datasheet above is only
16x16.
- The bit depth is not very high; most of the time it is only 6 bits
(64 grayscale levels).
- The sample rate is very high if the DSP does the processing, but the
debug mode of the DSP can only provide a handful of frames/sec (the debug
mode can be used to dump the picture from the sensor, but only a few
times/sec). If the sensor chip is read directly (eliminating the DSP
completely), the frame rate should not be a problem at all.

If you'd like to ask them some questions, I can ask them for their email
addresses and give them to you.

> Neil

Rolf

Reply by Rolf Viehmann January 10, 2008
> On Tue, 8 Jan 2008 00:14:01 -0800 (PST), ":: aH[sIM] ::"
> said:
>>
>> I've abandoned the plan to use a USB webcam.
>>
>> [...]
>>
>> So, are there any other image sensor companies that you guys would
>> suggest?
>>
>> The device has to be able to downsample its resolution to at least
>> 50x50 and the output should be in YCbCr or BW/grayscale digital format.
>> What is your advice on selecting other alternatives?
>
> How about using the element out of an optical mouse? As far as I know,
> these are all made by Agilent and are effectively a fast, low-resolution
> greyscale camera with a DSP attached. With some models more or less
> direct access can be had to the image capture data via an I2C bus
> interface.
>
> Here's an example datasheet:
> http://literature.agilent.com/litweb/pdf/5988-4289EN.pdf
>
> I'd suggest the easiest way to procure one would be to simply salvage
> from a working mouse. That way you get a decent lens too.

That's a possible way to go; some fellows at my university are doing some
experiments with mouse sensors at the moment, and if I've got everything
right, it's not very hard to get a picture out of the camera chip, but
there are some drawbacks:
- The resolution is not very high; the highest resolution I've heard of so
far is 32x32 pixels, and the Agilent chip with the datasheet above is only
16x16.
- The bit depth is not very high; most of the time it is only 6 bits
(64 grayscale levels).
- The sample rate is very high if the DSP does the processing, but the
debug mode of the DSP can only provide a handful of frames/sec (the debug
mode can be used to dump the picture from the sensor, but only a few
times/sec). If the sensor chip is read directly (eliminating the DSP
completely), the frame rate should not be a problem at all.

If you'd like to ask them some questions, I can ask them for their email
addresses and give them to you.

> Neil

Rolf

Reply by Neil Stainton January 9, 2008
On Tue, 8 Jan 2008 00:14:01 -0800 (PST), ":: aH[sIM] ::"
said:
>
> I've abandoned the plan to use a USB webcam. I've
> decided to use image sensors with documentation. Due to the lack of time,
> it's not worth the fuss to reverse engineer the webcam. But the thing
> is, I've contacted OmniVision, whose sensors are used by the AVRCAM and
> CMUCAM, and requested a sample image sensor from
> them, but they are slow in responding. I'm worried that I might lose
> precious time while waiting for them.
>
> So, are there any other image sensor companies that you guys would suggest?
>
> The device has to be able to downsample its resolution to at least
> 50x50 and the output should be in YCbCr or BW/grayscale digital format.
> What is your advice on selecting other alternatives?
>

How about using the element out of an optical mouse? As far as I know,
these are all made by Agilent and are effectively a fast, low-resolution
greyscale camera with a DSP attached. With some models, more or less
direct access to the image capture data can be had via an I2C bus
interface.

Here's an example datasheet:
http://literature.agilent.com/litweb/pdf/5988-4289EN.pdf

I'd suggest the easiest way to procure one would be simply to salvage it
from a working mouse. That way you get a decent lens too.
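
Just as an illustration of the sort of access involved (the register
address, frame size and GPIO helpers below are placeholders, not taken
from any particular datasheet), dumping a frame over a bit-banged
two-wire register interface might look roughly like:

/* Hypothetical sketch: dumping one frame from an optical-mouse sensor
 * over a bit-banged two-wire register interface (clock plus a
 * bidirectional data line). Register address, frame size and the GPIO
 * helpers are placeholders; the actual part's datasheet governs. */

#include <stdint.h>

#define PIXEL_DATA_REG  0x08   /* placeholder register address       */
#define FRAME_W         16     /* e.g. 16x16 for a low-end sensor    */
#define FRAME_H         16

/* Board-specific register access helpers, assumed to exist elsewhere. */
extern void    sensor_write_reg(uint8_t addr, uint8_t value);
extern uint8_t sensor_read_reg(uint8_t addr);

/* Read one full frame into buf. Many mouse sensors reset an internal
 * pixel pointer when the pixel-data register is written, then return
 * one pixel per subsequent read of that register. */
void grab_frame(uint8_t buf[FRAME_H][FRAME_W])
{
    sensor_write_reg(PIXEL_DATA_REG, 0x00);   /* reset pixel pointer  */

    for (int y = 0; y < FRAME_H; y++)
        for (int x = 0; x < FRAME_W; x++)
            buf[y][x] = sensor_read_reg(PIXEL_DATA_REG) & 0x3F;  /* 6-bit pixel */
}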

Neil
Reply by rtstofer January 8, 2008
--- In f..., ":: aH[sIM] ::"
wrote:
>
> Over the last couple of weeks, I've found a simple way of implementing
> the algorithm using very simple mathematical calculations, so that
> wouldn't be a big problem anymore.
>
> I've abandoned the plan to use a USB webcam. I've
> decided to use image sensors with documentation. Due to the lack of time,
> it's not worth the fuss to reverse engineer the webcam. But the thing
> is, I've contacted OmniVision, whose sensors are used by the AVRCAM and
> CMUCAM, and requested a sample image sensor from
> them, but they are slow in responding. I'm worried that I might lose
> precious time while waiting for them.
>
> So, are there any other image sensor companies that you guys would
> suggest?
>
> The device has to be able to downsample its resolution to at least
> 50x50 and the output should be in YCbCr or BW/grayscale digital format.
> What is your advice on selecting other alternatives?

If I were in a hurry, I would remove the image sensor from a CMUCam 2.
It just plugs into a header on the uC board. The parallel
interface for that image sensor should be easily available. I'm not
able to get through to www.cs.cmu.edu at the moment; their site may
be down.

Anyway, it uses the OV6620 and I found the datasheet here:
http://www.cmucam.org/attachment/wiki/Documentation/OV6620.PDF

You'll have to review the datasheet but it would seem to me that
downsampling would be done outside the sensor. It does seem to
handle YCbCr.
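
As a rough idea of what that external downsampling could look like
(purely a sketch: it assumes an 8-bit luma plane already captured to
memory, and the source dimensions are placeholders), box-averaging down
to a 50x50 grid is straightforward:

/* Illustrative sketch: box-average an 8-bit luma plane down to a
 * 50x50 grid. SRC_W/SRC_H are placeholders for whatever resolution
 * the sensor actually delivers. */

#include <stdint.h>

#define SRC_W  352    /* placeholder source resolution */
#define SRC_H  288
#define DST_W  50
#define DST_H  50

void downsample_50x50(const uint8_t src[SRC_H][SRC_W],
                      uint8_t dst[DST_H][DST_W])
{
    for (int dy = 0; dy < DST_H; dy++) {
        for (int dx = 0; dx < DST_W; dx++) {
            /* Source rectangle covered by this destination pixel. */
            int y0 = dy * SRC_H / DST_H, y1 = (dy + 1) * SRC_H / DST_H;
            int x0 = dx * SRC_W / DST_W, x1 = (dx + 1) * SRC_W / DST_W;

            uint32_t sum = 0, count = 0;
            for (int y = y0; y < y1; y++)
                for (int x = x0; x < x1; x++) {
                    sum += src[y][x];
                    count++;
                }
            dst[dy][dx] = (uint8_t)(sum / count);
        }
    }
}

In an FPGA you would more likely accumulate the block sums on the fly as
lines stream in rather than buffer a whole frame, but the arithmetic is
the same.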

Another advantage of using the sensor from a CMUCam 2 is that you can
always plug it back into the uC board and see if it is still working!

Richard

Reply by ":: aH[sIM] ::" January 8, 20082008-01-08
Over the last couple of weeks, I've found a simple way of implementing the
algorithm using very simple mathematical calculations, so that wouldn't be
a big problem anymore.

I've abandoned the plan to use a USB webcam. I've
decided to use image sensors with documentation. Due to the lack of time,
it's not worth the fuss to reverse engineer the webcam. But the thing
is, I've contacted OmniVision, whose sensors are used by the AVRCAM and
CMUCAM, and requested a sample image sensor from them, but they are slow
in responding. I'm worried that I might lose precious time while waiting
for them.

So, are there any other image sensor companies that you guys would suggest?

The device has to be able to downsample its resolution to at least
50x50 and the output should be in YCbCr or BW/grayscale digital format.
What is your advice on selecting other alternatives?
----- Original Message ----
From: John Kent
To: f...
Sent: Thursday, December 27, 2007 10:59:43 PM
Subject: Re: [fpga-cpu] Re: Frame Grabber using FPGA thru webcam

Hi Richard,

I'm sorry I did not look at your first email closely enough.

The OV9655 looks very similar to the web cam chip I referred to.

It's a pity the camera is not sold separately.

The Surveyor SRV1 looks pretty groovy.

I took a very quick look at the roborealm web site and saw the blob tracker.
I've ordered a book on mathematical algorithms for machine vision that
I saw on wikipedia, so I hope I'll be able to brush up on my geometry
and cross products and so on.

There are a few projects I would like to work on with image warping
and so on. I have the VDEC-1 digitizer for the Spartan 3E starter board
that's been sitting here for the last year gathering dust. And I have the
XESS XST-3.0 board with digitizer on it doing nothing.

I'm ordering a couple of analog cameras for an image processing project
but I need them to be synchronized, so the OV9655 would probably
be easier to do that with.

John.

rtstofer wrote:

> The hardware interface between the FPGA and the sensor certainly looks
> simple. I haven't played with video (yet!) but there is some
> discussion of the transformations in the datasheet. In particular,
> they discuss setting the color gains to get a particular gray scale.
>
> I may start playing with video next year with the Surveyor. See
> www.surveyor.com
>
> There is a user project linked to the page where the objective is to
> track small orange squares of tape on a reflective black background
> (floor tile) under adverse lighting (reflections, etc). Several
> processing steps are taken to remove the reflections, locate and fill
> 'blobs' and then do edge detection. See
> http://www.roborealm.com/tutorial/Surveyor_SRV1b_Trail/slide010.php
>
> Richard

--
http://www.johnkent.com.au
http://members.optushome.com.au/jekent

Reply by lee ainscough January 2, 2008
unsubscribe