shusheer
Posts: 18
Joined: Wed Oct 28, 2015 3:36 pm

Re: Frame synchronisation

Thu Feb 18, 2016 1:17 am

Hi All,

This rather long thread seems to have died down, but I can't seem to find anyone reporting a nice way to hardware-synchronise multiple cameras for video without drift (e.g. 3D video using 2 cameras).

I'm wondering if something along the following lines might work?
* Have a network of Pis all exchange timestamps for some kind of MMAL-derived frame synchronisation (as VSYNC hardware pin doesn't work)
* Have the camera crystal replaced with the filtered output of a Numerically Controlled Oscillator (NCO) running in e.g. a PIC16F18313, so we can adjust the output frequency smoothly
* Have each Pi talk to its PIC/NCO to tell it how much to adjust the oscillator in a feedback loop, bringing all Pis as close as possible

A bit more detail:
* Run PTP on a network of Pis so that local clocks are synchronised to ~usec level; if a Compute Module with 2 cameras were used rather than a network of Pis, the need for PTP and local time synchronisation between Pis disappears
* On each Pi/Camera, set the desired video framerate, which presumably instructs the OV5647 to change the PLL from the local oscillator on the camera board to an appropriate setting. Crucially, we replace the crystal with our NCO.
* Measure some regular feature of frames using a convenient MMAL function (what would that be? frame complete or similar? From this thread it looks like the VSYNC pin from the OV5647 isn't functioning, hence the reliance on MMAL to get something good enough)
* Have the various Pis/Cameras report to each other via e.g. UDP the local time of this frame synchronisation measurement for each frame
* Have each Pi/Camera then decide how much to adjust its PIC/NCO in order to bring its observed time offset closer to the average value for the recent history of offsets measured by this and the other Pi(s) in the network.

I think this set-up would require just a PIC16F18313 and low-quality 100MHz resonator for the PIC's clock, plus a reasonably high-Q filter on the output of the NCO from the PIC - so a couple of dollars max cost to achieve effectively hardware-synchronised video.
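The per-Pi feedback step in the scheme above might look something like the following sketch. Everything here is hypothetical (function names, the proportional gain, and the offset values are illustrative, not measured); it just shows the idea of averaging each camera's sync-offset history and steering towards the network average:

```python
# Sketch of the per-Pi feedback step: nudge the NCO so our measured
# frame-sync offset converges towards the network-wide average.
# All names, gains, and values here are hypothetical placeholders.

def nco_adjustment(local_offsets, peer_offsets, gain=0.5):
    """Return a fractional frequency correction to send to the PIC/NCO.

    local_offsets: recent frame-sync timestamp offsets (seconds) we measured
    peer_offsets:  the same quantity reported by the other Pi(s) via UDP
    """
    # Average over history to suppress scheduling jitter in the timestamps
    local_avg = sum(local_offsets) / len(local_offsets)
    all_offsets = local_offsets + peer_offsets
    network_avg = sum(all_offsets) / len(all_offsets)
    error = local_avg - network_avg   # seconds we are ahead of the group
    return -gain * error              # proportional correction

# Example: we are consistently 1 ms ahead of the group average,
# so we should slow our oscillator down slightly.
print(nco_adjustment([0.002, 0.002], [0.0, 0.0]))
```

A real loop would accumulate these corrections and bound the step size, but the averaging is what lets large scheduling jitter wash out over many frames.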

This critically relies on two things:
1) The crystal being replaceable by the output of an NCO
2) Some consistent frame synchronisation signal available from MMAL or other mechanism

Can anyone confirm that either, or preferably both, of these conditions can be met?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7403
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Thu Feb 18, 2016 8:21 am

shusheer wrote:This critically relies on two things:
1) The crystal being replaceable by the output of an NCO
Depends on how good your SMD rework skills are.
shusheer wrote:2) Some consistent frame synchronisation signal available from MMAL or other mechanism
Not an accurate signal that I can think of. MMAL only sees the frames after they've been through the ISP to convert from Bayer to YUV, and there will be jitter in the timing of that. What you really want is a trigger off the frame start interrupt, but that's not exposed. Linux task scheduling latencies are likely to be moderately significant even if I added a MMAL message sent by the GPU on frame start interrupt. (The GPU is running an RTOS, so scheduling is controllable).

Your NCO sounds like overkill. Framerate is controlled by adding/removing padding lines to the frame readout (consider it a variable vertical blanking period).
As described in viewtopic.php?f=43&t=48238&start=50#p763533, MMAL_PARAMETER_VIDEO_FRAME_RATE offers tweaks to the framerate in steps of 1/256th fps (ignoring rounding within the driver). If my maths is correct that should give you down to 4nsec accuracy control at 30fps - jitter will be greater than that. How you sort out a control loop to adjust that is up to you.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

zenmsav
Posts: 4
Joined: Mon Jul 13, 2015 7:42 pm

Re: Frame synchronisation

Thu Feb 18, 2016 8:57 am

I've abandoned any hope of frame synchronisation, as it's not already built into the hardware. It appears a solution on R-Pi hardware will require external hardware running a modified protocol. I've had more success with the PlayStation Eye camera, which has the added benefit of a potentially higher frame rate at lower resolution, and has more relevant hardware to complete the task.

shusheer
Posts: 18
Joined: Wed Oct 28, 2015 3:36 pm

Re: Frame synchronisation

Thu Feb 18, 2016 3:44 pm

6by9 wrote:
Not an accurate signal that I can think of. MMAL only sees the frames after they've been through the ISP to convert from Bayer to YUV, and there will be jitter in the timing of that. What you really want is a trigger off the frame start interrupt, but that's not exposed. Linux task scheduling latencies are likely to be moderately significant even if I added a MMAL message sent by the GPU on frame start interrupt.
Not too much of a problem over a long period of time - at least in my case I'm after a solution to long-term drift between cameras that may amount to substantial portions of a frame between cameras. I believe the xtal uncertainty will be something like 100ppm between cameras, so 1 frame error in 333sec = 5.5 minutes. However if we were to say we wanted 0.1 frame drift max, that suggests we have 30 sec = 900 frames to collect statistics over for our feedback loop, which sounds entirely capable of dealing with large amounts of scheduling jitter.
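The arithmetic above is easy to check directly (assuming the stated 100 ppm relative crystal error and 30 fps):

```python
# Check the drift arithmetic: 100 ppm relative crystal error at 30 fps.
rel_err = 100e-6          # 100 ppm frequency difference between cameras
frame_time = 1 / 30.0     # seconds per frame

# Time until two free-running cameras drift apart by one whole frame
one_frame_drift = frame_time / rel_err
print(one_frame_drift)    # ~333 s, i.e. about 5.5 minutes

# Budget of 0.1 frame max drift: frames available for feedback statistics
frames_available = 0.1 * frame_time / rel_err * 30
print(frames_available)   # ~1000 frames (the post rounds to 30 s = 900 frames)
```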
6by9 wrote:
MMAL_PARAMETER_VIDEO_FRAME_RATE offers tweaks to the framerate in steps of 1/256th fps (ignoring rounding within the driver). If my maths is correct that should give you down to 4nsec accuracy control at 30fps - jitter will be greater than that.
Brilliant - that does remove the need for the NCO and soldering skills entirely, and again for my purposes easily fine enough control (however I calculate 1/256th of a frame @ 30Hz as being 130usec adjustment, not 4ns, but either way well within my requirements).
Is there any example code using MMAL_PARAMETER_VIDEO_FRAME_RATE to adjust frame-rate on-the-fly that anyone could point me to? Google seems to be remarkably unhelpful...

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7403
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Frame synchronisation

Thu Feb 18, 2016 4:34 pm

shusheer wrote:
6by9 wrote:
MMAL_PARAMETER_VIDEO_FRAME_RATE offers tweaks to the framerate in steps of 1/256th fps (ignoring rounding within the driver). If my maths is correct that should give you down to 4nsec accuracy control at 30fps - jitter will be greater than that.
Brilliant - that does remove the need for the NCO and soldering skills entirely, and again for my purposes easily fine enough control (however I calculate 1/256th of a frame @ 30Hz as being 130usec adjustment, not 4ns, but either way well within my requirements).
It's 1/256th of a frame/second, not of a frame time.
It was early - I got my decimal point in the wrong place.
I'd worked on 30+(1/256) = 30.00390625 fps. The reciprocal is 33.32899ms/frame. 30.0fps would be 33.3333ms. Difference is -0.004339ms, or 4.34usecs. That will vary depending on what your base framerate is (e.g. at 5fps, I reckon it's 156usecs difference).
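Both figures are straightforward to verify - pure arithmetic, no camera needed:

```python
# Frame-time change produced by a 1/256 fps step, at two base framerates.
def frame_time_delta(base_fps):
    """Difference in seconds-per-frame when fps increases by 1/256."""
    return 1 / base_fps - 1 / (base_fps + 1 / 256)

print(frame_time_delta(30))   # ~4.34e-6 s, i.e. ~4.34 microseconds
print(frame_time_delta(5))    # ~1.56e-4 s, i.e. ~156 microseconds
```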
shusheer wrote:Is there any example code using MMAL_PARAMETER_VIDEO_FRAME_RATE to adjust frame-rate on-the-fly that anyone could point me to? Google seems to be remarkably unhelpful...
It seems not, but it should be as trivial as

Code: Select all

MMAL_PARAMETER_FRAME_RATE_T fps = {{MMAL_PARAMETER_VIDEO_FRAME_RATE, sizeof(fps)},
         {desired_fps_8p8, 1}};
mmal_port_parameter_set(camera->output[0], &fps.hdr);
mmal_port_parameter_set(camera->output[1], &fps.hdr);
If you are using output[1] to encode, then that is taken as the master value for fps, otherwise output[0] is used.

If you want to allow AE to vary the frame rate within limits, then you can set the framerate in the port format or via MMAL_PARAMETER_VIDEO_FRAME_RATE to 0, and use MMAL_PARAMETER_FPS_RANGE to specify a minimum and maximum. That's the approach the V4L2 driver takes.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

BerryPicker
Posts: 177
Joined: Tue Oct 16, 2012 3:03 pm
Location: The East of England

Re: Frame synchronisation

Mon Apr 25, 2016 9:14 pm

shusheer wrote: Is there any example code using MMAL_PARAMETER_VIDEO_FRAME_RATE to adjust frame-rate on-the-fly that anyone could point me to?
ivannaz has now made available his C++ proof of concept here
viewtopic.php?p=957382#p957382

For those wishing to use python I have found this example code works

Code: Select all

def _set_new_framerate(self, Change):
    fps = mmal.MMAL_PARAMETER_FRAME_RATE_T(
        mmal.MMAL_PARAMETER_HEADER_T(
            mmal.MMAL_PARAMETER_VIDEO_FRAME_RATE,
            ct.sizeof(mmal.MMAL_PARAMETER_FRAME_RATE_T)
        ),
        (long(30*256 - Change), 256)
    )
    mmal_check(
        mmal.mmal_port_parameter_set(self._camera[0].output[1], fps.hdr),
        prefix="New framerate couldn't be set")
It does lock in on the fly, making a change of rate every frame, but for me the results look better if the error signal is smoothed by averaging over 8 frames.

Edit: You should also add to your code

Code: Select all

import picamera
import ctypes as ct
import picamera.mmal as mmal
from picamera.exc import (mmal_check)
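The 8-frame smoothing mentioned above could be a simple moving average fed with the per-frame sync error before it goes into `_set_new_framerate`. This is only a sketch (the class name and the sample values are illustrative):

```python
from collections import deque

# Smooth the per-frame sync error over the last 8 frames before
# using it to adjust the framerate, as suggested above.
class ErrorSmoother:
    def __init__(self, window=8):
        self.history = deque(maxlen=window)

    def update(self, raw_error):
        """Add one raw per-frame error sample; return the smoothed value."""
        self.history.append(raw_error)
        return sum(self.history) / len(self.history)

smoother = ErrorSmoother()
for raw_error in [10, -2, 6, 2, 4, 0, 8, -4]:   # jittery per-frame errors
    smoothed = smoother.update(raw_error)
print(smoothed)   # mean of the full 8-sample window: 3.0
```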

User avatar
jbeale
Posts: 3498
Joined: Tue Nov 22, 2011 11:51 pm
Contact: Website

Re: Frame synchronisation

Mon Apr 25, 2016 11:29 pm

or even more averaging. The actual correction needed should be very predictable and slowly changing, because it is the relative drift of the SoC clock and the camera clock, which are both crystal oscillators. Unless you're suddenly heating one of them up, hitting it with a hammer, or running an unstable power supply, the frequency difference should be very stable. The software timestamp signal, by contrast, will be very noisy due to unpredictable latency from Linux process scheduling, so lots of filtering and outlier rejection is in order.
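One simple way to combine the filtering and outlier rejection suggested here (a sketch, not anything from the camera stack) is to drop samples far from the median before averaging, so a single scheduling hiccup can't skew the estimate:

```python
# Outlier-rejecting average for noisy software timestamps: discard samples
# more than `tol` from the median, then average the survivors.
def robust_mean(samples, tol):
    s = sorted(samples)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    kept = [x for x in samples if abs(x - median) <= tol]
    return sum(kept) / len(kept)

# A scheduling hiccup produced one wild timestamp offset (values in ms):
print(robust_mean([1.0, 1.2, 0.9, 1.1, 25.0], tol=2.0))   # the 25.0 spike is ignored
```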

Kozuch
Posts: 64
Joined: Sun Aug 10, 2014 10:35 am

Re: Frame synchronisation

Thu Jun 28, 2018 2:31 pm

Looks like this topic has been idle for some time. However, since last summer it has been possible to get HW camera sync pulses onto a GPIO, so one can precisely measure the level of sync.

It has been said in another thread that on a stereo camera it may be possible to achieve a "nearest line level" sync of the left and right frames by adding/removing a blanking interval line from one of the cameras. Has anyone tried playing with this, by any chance? Or wants to try?
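For a rough feel of what "nearest line level" sync would mean, here is the granularity as back-of-envelope arithmetic. The total-lines figure is a hypothetical round number for illustration, not an OV5647 datasheet value:

```python
# Granularity of nearest-line sync: one line period out of the frame.
# Assume ~1000 total line periods per frame at 30 fps (hypothetical figure;
# the real number depends on the sensor mode and blanking settings).
fps = 30
lines_per_frame = 1000
line_time = 1 / (fps * lines_per_frame)   # seconds per line
print(line_time * 1e6)                    # ~33 microseconds of residual offset
```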

Return to “Camera board”