jansendup
Posts: 10
Joined: Mon Aug 19, 2013 9:02 am
Location: Stellenbosch, South Africa

Re: Frame synchronisation

Wed Sep 18, 2013 9:56 pm

I'm also trying to synchronize multiple Pi cams. My first approach was to control the offset between the cams by varying the frame rate while streaming. I tried it, but it seems that once the port is enabled, subsequent calls to set the port format or the fps port parameter in the MMAL API get ignored. I noted that once the port is enabled, the pixel clock going out to the camera starts. This must mean that the camera is constantly sending frames even though no buffer headers have been sent to the ports.

By writing a kernel module that hooks the vchiq_queue_message call that enables the camera port and reschedules it to execute on an hrtimer event, I'm able to finely control the time at which the camera starts capturing. I still need to measure the jitter on the time it takes for the GPU to enable the camera, but I think it will be low enough to control the start of capture to within 100 us. There might be some jitter on the ARM side as well. If it is significant, one could lock all the mutexes used by vchiq_queue_message before rescheduling the call; then, when it's time to execute the call, disable preemption, unlock the mutexes, call the postponed vchiq_queue_message, and re-enable preemption (haven't tested this yet).
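jansendup's first idea (varying one camera's frame rate until the phase offset closes) is easy to sanity-check on paper, even though, as noted above, the firmware appears to ignore fps changes once the port is enabled. A rough Python sketch of the arithmetic, with made-up numbers:

```python
def slew_time(nominal_fps, slew_fps, phase_offset_s):
    """Seconds one camera must run at slew_fps (instead of nominal_fps)
    for its frame phase to shift by phase_offset_s relative to a camera
    still running at nominal_fps."""
    t_nominal = 1.0 / nominal_fps
    t_slew = 1.0 / slew_fps
    gain_per_frame = t_nominal - t_slew   # phase gained on each slewed frame
    frames_needed = phase_offset_s / gain_per_frame
    return frames_needed * t_slew

# Closing a 10 ms offset at a nominal 30 fps by briefly running at 30.1 fps:
# the offset shrinks by ~111 us per frame, so roughly 90 frames, i.e. ~3 s.
```

If the fps parameter really is latched at port-enable time, slewing like this would need either the hrtimer trick above or a port disable/re-enable cycle.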

Danara
Posts: 13
Joined: Mon Dec 31, 2012 8:12 pm
Location: Ottawa Canada

Re: Frame synchronisation

Tue Oct 22, 2013 4:17 pm

jbeale wrote: ...[snip]...
If you're willing to wait for a while until the cameras sync, you could also pull the R-Pi board's 19 MHz crystal X2 (or use an external VCO clock signal) and a PLL setup to gradually slow down or speed up the second R-Pi's system clock, until the accumulated phase lag causes the frame start times of the two cameras to match. Note, that clock signal is multiplied up internally many times to get the 700 MHz, etc. system clock so it likely has stringent duty cycle and jitter requirements.
I would like to run many PiCams (perhaps 5 or 6, each with its own RPi) and have their video synchronised, in order to create a very large hi-res video by stitching all of the streams together. I suspect clock drift will cause issues, especially over several hours. Would it make sense to mod the camera boards so they all use one master clock? I can likely use a GPIO signal to start all the captures at exactly the same time, but I think something will need to keep the cameras in sync relative to each other. If the 19 MHz crystal is disabled on each PiCam, and an external clock is used, would this make sense? The camera boards will all be located next to each other, just pointed at different angles, so the clock signal path will be less than a few inches.
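For a feel of how fast independent crystals drift apart: a quick Python estimate, assuming a typical +/-25 ppm crystal tolerance (a guess; check the actual oscillator's datasheet):

```python
def drift_frames(ppm_mismatch, hours, fps):
    """Worst-case frame slip accumulated between two free-running
    cameras whose clocks differ by ppm_mismatch parts per million."""
    drift_seconds = hours * 3600.0 * ppm_mismatch * 1e-6
    return drift_seconds * fps

# Two +/-25 ppm oscillators can disagree by up to 50 ppm; over a
# 3-hour shoot at 30 fps that is 0.54 s, i.e. about 16 frames of slip.
```

So a one-shot GPIO start pulse alone won't hold sync over hours, which is why a shared master clock is attractive.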

jbeale
Posts: 3500
Joined: Tue Nov 22, 2011 11:51 pm

Re: Frame synchronisation

Tue Oct 22, 2013 4:53 pm

Danara wrote:If the 19 MHz crystal is disabled on each PiCam, and an external clock is used, would this make sense? The camera boards will all be located next to each other, just pointed at different angles, so the clock signal path will be less than a few inches.
Note that 19 MHz is the R-Pi crystal, but the camera board uses 25 MHz. "Y1" on the camera PCB is a 25 MHz oscillator package which generates a 3V square-wave output. There is even a little round test point next to it carrying the 25 MHz signal, which you could provide externally if you remove Y1. There are "clock distribution chips" which can provide 4 or more outputs from a master clock. I think what you suggest could work, but I certainly have not tried such a thing. I believe the original R-Pi camera used a clock from the main SoC, but that design was abandoned because too much signal leaked off the unshielded cable to pass EMI testing.

As it stands, the R-Pi powers the camera up and down when not in use. Not sure what would happen if you provide the 25 MHz clock while the chip is powered down, maybe you should either gate your external clock, or hardwire each RPi camera to "on".

Kozuch
Posts: 64
Joined: Sun Aug 10, 2014 10:35 am

Re: Frame synchronisation

Sun Sep 14, 2014 12:54 pm

Stereoscopic camera capture now seems to be implemented. Could anyone please confirm whether frame synchronization is being done for this "stereoscopic feature"? I asked in the thread to get some info on the topic.

EDIT: Silly me, that "stereoscopic" feature is SW only, still no genlock.
Last edited by Kozuch on Mon Sep 22, 2014 7:38 am, edited 1 time in total.

xpeace
Posts: 8
Joined: Fri Sep 05, 2014 11:19 am

Re: Frame synchronisation

Sat Sep 20, 2014 7:00 pm

Still no success enabling VSYNC output?
This would be useful for an eye-tracker application as well,
which requires that an IR LED flash be synced to the camera shutter.
Otherwise the IR light could damage the eyes...

Kozuch
Posts: 64
Joined: Sun Aug 10, 2014 10:35 am

Re: Frame synchronisation

Mon Sep 22, 2014 7:35 am

How do I feed an external clock signal to the camera? Do I physically have to remove Y1, or is it enough to cut the power supply line on the camera's flat cable (i.e. remove the supply from Y1) and bring the new clock to the little round test point next to Y1?

tingiant
Posts: 1
Joined: Thu Sep 25, 2014 9:11 pm

Re: Frame synchronisation

Thu Sep 25, 2014 9:45 pm

Has anybody had any success modifying the registers to enable VSYNC, to see if it even works? It probably has to be somebody who has access to the driver, or somebody willing to hook up a second I2C driver to the SCCB pins and write it themselves.

From a previous project I am not 100% sure that pin works when in MIPI mode; I can't remember, or maybe never tried. It's interesting that it's connected on the camera side but not on the Pi side. I have never used this specific sensor, but I have used other OV sensors within the same family in both DVP and MIPI mode. I just can't recall whether it's valid in MIPI mode, as all the MIPI sync info is within the bit stream itself.

Xuth
Posts: 2
Joined: Mon Dec 22, 2014 9:29 pm

Re: Frame synchronisation

Mon Dec 22, 2014 9:49 pm

Since the rationale was that almost nobody cared about frame or exposure sync, I thought I'd weigh in that this is something I've been looking for, for many years, in a camera module I can afford to play with. I have zero experience with board design or surface-mount soldering and wouldn't be able to do this on my own. There is a litany of applications I've wanted to play with but can't, because at best I can buy used cameras on eBay and get _some_ of the functionality that I want.

Some of the applications that I play with or want to play with:
* Stereo photography,
* 3d modelling based on simultaneous images from cameras surrounding a subject,
* inverting the previous bullet point to make a hemispherical or spherical panorama,
* synchronizing photos to triggers,
* "bullet time" and many variations on this.

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 23976
Joined: Sat Jul 30, 2011 7:41 pm

Re: Frame synchronisation

Tue Dec 23, 2014 9:23 am

Xuth wrote:Since the rationale was that almost nobody cared about frame or exposure sync, I thought I'd weigh in that this is something I've been looking for, for many years, in a camera module I can afford to play with. I have zero experience with board design or surface-mount soldering and wouldn't be able to do this on my own. There is a litany of applications I've wanted to play with but can't, because at best I can buy used cameras on eBay and get _some_ of the functionality that I want.

Some of the applications that I play with or want to play with:
* Stereo photography,
* 3d modelling based on simultaneous images from cameras surrounding a subject,
* inverting the previous bullet point to make a hemispherical or spherical panorama,
* synchronizing photos to triggers,
* "bullet time" and many variations on this.
Stereo is possible on the Compute Module, as you can attach two cameras and the software is present to use them in stereo mode (stills and video?). Bullet time has also been done with multiple Pis using Ethernet to signal captures - it's generally fast enough to give a decent result, although syncing via GPIO is faster with less jitter. It's certainly quite feasible to have captures started using GPIOs; I'm sure many people have already done that.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Contrary to popular belief, humorous signatures are allowed. Here's an example...
“I think it’s wrong that only one company makes the game Monopoly.” – Steven Wright

Xuth
Posts: 2
Joined: Mon Dec 22, 2014 9:29 pm

Re: Frame synchronisation

Tue Dec 23, 2014 7:23 pm

I've seen the "bullet time" rig that was posted widely but it really only works for static subjects. Without millisecond timing these things aren't worthless but they're very limiting. Some of my favorite subjects include moving water, fireworks, and people who are juggling, hooping or playing with fire.

If it's possible to get millisecond timing then people would be well served to put a link to how to do that here since this thread comes up as one of the top searches for quite a few different sets of search terms regarding synchronizing camera modules.

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 23976
Joined: Sat Jul 30, 2011 7:41 pm

Re: Frame synchronisation

Tue Dec 23, 2014 7:49 pm

Not sure you can get to ms timing tbh. You might be able to pulse GPIOs, and the Pi should be able to react quickly to that, but the time taken to take the picture itself once the request has been received may be too jittery.

youthreewire
Posts: 14
Joined: Tue Jan 13, 2015 4:20 am

Re: Frame synchronisation

Tue Jan 13, 2015 8:07 am

We wanted to do synchronous image grabbing, so we opened the camera sensor up and soldered a wire to the strobe pin (on a camera which had already gone bad, just to try), but when we tried it on a working camera the board got damaged. We are planning to buy another camera and machine-solder a wire to the strobe pin (with a pick-and-place-style robot). We want to sync IR LEDs to the frame being captured. We tried to modify the dequeue code, but we were not getting sync, although the LEDs were turning on and off at 30 fps.

youthreewire
Posts: 14
Joined: Tue Jan 13, 2015 4:20 am

Re: Frame synchronisation

Tue Jan 13, 2015 8:14 am

Soldered a wire to pad 10 on the camera sensor.
Attachments
IMG_20150113_115206.jpg

youthreewire
Posts: 14
Joined: Tue Jan 13, 2015 4:20 am

Re: Frame synchronisation

Tue Jan 13, 2015 8:21 am

Can we sync with VSYNC? Is VSYNC working on the 24-pin connector?

xpeace
Posts: 8
Joined: Fri Sep 05, 2014 11:19 am

Re: Frame synchronisation

Tue Jan 13, 2015 12:04 pm

renambot wrote: Both cameras driven with a modified version of raspivid blocking before the capture command:
MMAL_PARAMETER_CAPTURE
The cameras are triggered using a physical switch wired to a GPIO pin on each Pi board.
The cameras are set to the same white balance (fluorescent if I remember) and auto-exposure.
Hey Luc!
As there are many people looking for further VSYNC-related info here,
can you make your code available or give some tips?

youthreewire
Posts: 14
Joined: Tue Jan 13, 2015 4:20 am

Re: Frame synchronisation

Fri Jan 16, 2015 9:06 am

Any info on how to get VSYNC working?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7458
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Fri Jan 16, 2015 11:47 am

We've got several new users posting on this thread. EXACTLY what are you trying to achieve?

I've reread the datasheet from http://www.seeedstudio.com/wiki/images/ ... 7_full.pdf (not the final and official datasheet, but that is under NDA so I can't discuss it).

Yes, the headline spec includes "support for internal and external frame synchronisation for frame exposure mode". That is then expanded in section 4.10:
In FREX mode, whole frame pixels start integration at the same time, rather than integrating row by row. After the user-defined exposure time, the shutter closes, preventing further integration and the image begins to read out. After the readout finishes, the shutter opens again and the sensor resumes normal mode, waiting for the next FREX request
ie you MUST have an external shutter mechanism. The Pi camera does not have one, so this mechanism is not applicable.

Other things that have been mentioned:
- VSYNC. This is primarily intended for the parallel output mode; we're using the CSI-2 bus, so it isn't used. It also only fires when the READOUT gets to the end of the frame: due to the use of the rolling shutter, it does NOT reflect when the exposure of any particular line starts or stops. I can probably enable the relevant registers in the firmware, but I don't think it will do what you want.

- External shutter trigger to synchronise multiple cameras - other than the FREX mode listed above which isn't applicable to Pi, there isn't one. The best you can do is run them off the same pixel clock (24.8 or 25MHz oscillator) and command them all to start/stop at roughly the same moment. At least then they won't drift. That is the architecture that was originally intended for stereoscopic on BCM2835, with 2835 providing the reference. However the Pi Camera ended up with an independent oscillator for EMC reasons, so won't remain perfectly locked over an extended duration.

So those are the restrictions. If your requirement fits within those, then please state exactly what you are wanting and I can have a look into it.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

xpeace
Posts: 8
Joined: Fri Sep 05, 2014 11:19 am

Re: Frame synchronisation

Fri Jan 16, 2015 10:01 pm

Is there any Info on how to get FREX running ?

Seems like the FREX pin(11) on the OV5647 is not exposed through the flat camera cable that goes from the camera to the pi. However the SDA line may eventually be used. The datasheet mentions that FREX can also be controlled using the SCCB register and that exposure time can be set using 0x3B01, 0x3B04 and 0x3B05.

Could this be successful ?

I want to get frames from the camera with fixed exposure,
having full control of the process (as fast as possible).
E.g.:
enable light by setting a GPIO -> FREX capture a frame from the camera -> disable illumination ->
process the current frame, then start again (illumination, FREX capture, disable illumination, process frame, ...)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7458
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Sat Jan 17, 2015 7:45 am

xpeace wrote:Is there any Info on how to get FREX running ?

Seems like the FREX pin(11) on the OV5647 is not exposed through the flat camera cable that goes from the camera to the pi. However the SDA line may eventually be used. The datasheet mentions that FREX can also be controlled using the SCCB register and that exposure time can be set using 0x3B01, 0x3B04 and 0x3B05.

Could this be successful ?

I want to get frames from the camera with fixed exposure,
having full control of the process (as fast as possible).
E.g.:
enable light by setting a GPIO -> FREX capture a frame from the camera -> disable illumination ->
process the current frame, then start again (illumination, FREX capture, disable illumination, process frame, ...)
Feel free to read the datasheet yourself, but my reading says FREX REQUIRES a shutter on the sensor. The Pi Camera doesn't have one, therefore it isn't feasible. Using the strobe line to trigger scene illumination won't really work either, as the bottom lines will be exposed for the exposure time + the readout time, so the bottom of the image will be brighter than the top.
So whilst it would be possible to spend the time adding code to trigger FREX over SCCB, in my view it's not sensible to do so, as there is no shutter.

xpeace
Posts: 8
Joined: Fri Sep 05, 2014 11:19 am

Re: Frame synchronisation

Thu Jan 22, 2015 10:39 pm

I think I've read that FREX doesn't use a rolling shutter but uses a global readout instead.
When enabling FREX and e.g. IR light on a microsecond scale, I don't see the need for an external shutter mechanism.
But I have to admit there have been some implications I did not understand.
I thought exposure time could be adjusted in FREX mode (no auto-exposure available).
The bottom lines would only be brighter if a rolling shutter were used
and the illumination were kept on longer than the exposure time.
6by9 wrote:Using the strobe line to trigger scene illumination won't really work as the bottom lines will be exposed for the exposure time + the readout time, so the bottom of the image will be brighter than the top.
So whilst it would be possible to spend the time adding code to trigger FREX over SCCB, in my view it's not sensible to do it as there is no shutter.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7458
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Wed Jan 28, 2015 8:06 am

xpeace wrote:I think I've read that FREX doesn't use a rolling shutter but uses a global readout instead.
When enabling FREX and e.g. IR light on a microsecond scale, I don't see the need for an external shutter mechanism.
But I have to admit there have been some implications I did not understand.
I thought exposure time could be adjusted in FREX mode (no auto-exposure available).
The bottom lines would only be brighter if a rolling shutter were used
and the illumination were kept on longer than the exposure time.
You have to get the data converted from the CMOS sensor using an A/D for each pixel to produce the output data stream. There is no internal memory on the sensor.
The only difference between rolling shutter and Frex is when the pixels get "reset" and exposure commences. The readout mechanism is identical in both.

Frex mode will reset all pixels, wait for the exposure period, and then start reading out starting from one corner at about 12ns per pixel. There is no way for it to read out every single pixel immediately at the end of the exposure period. There is also no electrical way to stop those pixels responding to incoming photons, therefore if you don't have a physical way to stop light coming into the sensor (eg external shutter), the bottom lines of the image will be brighter.

For some cases you may be able to argue that the readout time for the frame is significantly less than the exposure time being used, or that external illumination is tightly enough controlled, with an almost perfect blackout after the shot, but I'd say that is unlikely for most users, therefore I'm not going to look at implementing FREX mode on the Pi camera.
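6by9's figures put a number on that brightness gradient. A back-of-the-envelope Python sketch (assuming the full 2592x1944 OV5647 frame and the ~12 ns/pixel readout quoted above; real readout is pipelined line by line, so take it as an order-of-magnitude estimate):

```python
def frex_readout_skew(width, height, ns_per_pixel=12.0):
    """Extra integration time (seconds) the last-read line receives
    relative to the first when FREX fires with no mechanical shutter:
    essentially the whole frame readout time."""
    return width * height * ns_per_pixel * 1e-9

# Full 5 MP frame: frex_readout_skew(2592, 1944) is ~0.060 s, so the
# bottom of the image integrates roughly 60 ms longer than the top.
```

That ~60 ms dwarfs any short flash exposure, which is the quantitative reason FREX without a shutter gives a top-to-bottom brightness ramp.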

n8ers
Posts: 4
Joined: Tue Feb 10, 2015 3:42 am

Re: Frame synchronisation

Tue Feb 10, 2015 5:33 am

6by9 wrote:
xpeace wrote: I think I've read that FREX doesn't use a rolling shutter but uses a global readout instead.
When enabling FREX and e.g. IR light on a microsecond scale, I don't see the need for an external shutter mechanism.
But I have to admit there have been some implications I did not understand.
I thought exposure time could be adjusted in FREX mode (no auto-exposure available).
The bottom lines would only be brighter if a rolling shutter were used
and the illumination were kept on longer than the exposure time.
You have to get the data converted from the CMOS sensor using an A/D for each pixel to produce the output data stream. There is no internal memory on the sensor.
The only difference between rolling shutter and Frex is when the pixels get "reset" and exposure commences. The readout mechanism is identical in both.

Frex mode will reset all pixels, wait for the exposure period, and then start reading out starting from one corner at about 12ns per pixel. There is no way for it to read out every single pixel immediately at the end of the exposure period. There is also no electrical way to stop those pixels responding to incoming photons, therefore if you don't have a physical way to stop light coming into the sensor (eg external shutter), the bottom lines of the image will be brighter.

For some cases you may be able to argue that the readout time for the frame is significantly less than the exposure time being used, or that external illumination is tightly enough controlled, with an almost perfect blackout after the shot, but I'd say that is unlikely for most users, therefore I'm not going to look at implementing FREX mode on the Pi camera.
"There is also no electrical way to stop those pixels responding to incoming photons..."

Hmm, are you sure about this? My understanding is that this is the whole point of electronic global shutters, that they *do* block out most of the light. Here is an article discussing this technology. I quote: "Global Shutter...requires an in-pixel memory element that stores the signal after capture by the photodiode."

This technology is used in most industrial machine vision cameras and works extremely well. It means the difference between being able to do sophisticated SLAM algorithms, industrial process automation, tracking objects while moving, etc. and not.

Given that many, if not most, Raspberry Pi users are doing machine vision, utilizing the global shutter on the OV5647 will be a boon to the community. I think this feature is well worth the implementation time, will be greatly appreciated by the community, and will give the Raspberry Pi an additional advantage over other SOC platforms. In other words: pleeeease! ;-)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7458
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Tue Feb 10, 2015 2:33 pm

I'm fairly certain that there is no per-pixel sample and hold circuit on OV5647, so no point in investing time into it.
I am however exchanging emails with an Omnivision Apps Engineer on a different subject, so I will ask him for confirmation. Even if he says it does support it, it is not a trivial task to plumb in.

wsutton17
Posts: 1
Joined: Tue Feb 10, 2015 4:47 pm

Re: Frame synchronisation

Tue Feb 10, 2015 5:32 pm

I'm interested in tracking the ball in ping pong. Stereovision should help.

If this can be done in real-time and at a high enough fps, it would be really cool to use a quadcopter to volley the ping pong ball !!

The idea I'm working with right now is to decrease the processing time for comparing the two images, thus speeding up the analysed frame rate.

This could be done by only applying the stereovision-compare computation to a certain segment of each image pair - namely the window where the ball "should be" based on the path it is flying, which has been computed from earlier frames.

Another thing which should help in this domain: blob detection could have a high signal-to-noise ratio, as I can paint the ball any colour that stands out from the background.
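The "window where the ball should be" can start as plain linear extrapolation from the last two detected centroids. A minimal sketch (the coordinates and window size are invented; a real tracker would clamp the window to the image bounds and model gravity):

```python
def predict_roi(prev_xy, curr_xy, half_size=40):
    """Extrapolate the ball's next position from its last two centroids
    and return an (x0, y0, x1, y1) search window centred on it."""
    next_x = 2 * curr_xy[0] - prev_xy[0]   # constant-velocity prediction
    next_y = 2 * curr_xy[1] - prev_xy[1]
    return (next_x - half_size, next_y - half_size,
            next_x + half_size, next_y + half_size)

# Ball at (100, 200) then (110, 195): search near (120, 190) next frame.
```

Restricting the stereo comparison to this window cuts the per-frame matching cost from the full image down to the window area.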

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7458
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Thu Feb 12, 2015 4:37 pm

6by9 wrote:I am however exchanging emails with an Omnivision Apps Engineer on a different subject, so I will ask him for confirmation. Even if he says it does support it, it is not a trivial task to plumb in.
I've just had a chat with the Apps Engineer from Omnivision, as he happened to come into our office today to talk about a different subject. This is therefore pretty much straight from the horse's mouth.

The FREX pin is a global reset of all pixels. Line 1 will then start reading out at the end of the configured exposure time, but readout is the same as in rolling shutter mode. If you don't have an external shutter, then the lower lines will have a steadily increasing exposure time as the frame is read out.
Therefore FREX is not a feasible feature to use with the standard Pi Camera.

Some Omnivision sensors have a FSIN pin. That is "frame sync in" and would allow exact synchronisation of the frame starts between multiple sensors. OV5647 does not have this pin, therefore it is not an option.
