rownyr
Posts: 41
Joined: Wed Jul 11, 2012 1:25 am

how to get clock running

Thu Jul 04, 2013 11:39 pm

I've been scratching my head over this for the past few days...

I have the following setup (arrows mean tunnels):

Code: Select all

    ______________            _________________           ______________
130] video_decode [131---->10] video_scheduler [11---->90] video_render |
   |______________|    +-->12]                 |         |______________|
                       |     |_________________|
                       |
   ______________      |
  |    clock     [80---+     __
  |              [81-------X   \
  |              [82-------X    \
  |              [83-------X     > DISABLED PORTS
  |              [84-------X    /
  |              [85-------X __/
  |______________|
Playback works, but it always runs at full speed. There's no framerate synchronization (maybe I need to configure it somewhere?).

When I set the clock state to OMX_TIME_ClockStateWaitingForStartTime and read it back, I always get OMX_TIME_ClockStateRunning:

Code: Select all

auto cstate = getHeader<OMX_TIME_CONFIG_CLOCKSTATETYPE>();
cstate.eState = OMX_TIME_ClockStateWaitingForStartTime;
cstate.nWaitMask = OMX_CLOCKPORT0;
auto err = OMX_SetParameter(clock->handle(), OMX_IndexConfigTimeClockState, &cstate);
assert(err == OMX_ErrorNone);
err = OMX_GetParameter(clock->handle(), OMX_IndexConfigTimeClockState, &cstate);
assert(err == OMX_ErrorNone);

// here cstate.eState == OMX_TIME_ClockStateRunning
Also, the current media time always reads -1097652560 for the whole playback duration. The code I use:

Code: Select all

auto tstamp = getHeader<OMX_TIME_CONFIG_TIMESTAMPTYPE>();
tstamp.nPortIndex = OMX_ALL;
auto err = OMX_GetConfig(clock->handle(), OMX_IndexConfigTimeCurrentMediaTime, &tstamp);
assert(err == OMX_ErrorNone);
printf("current media time: %lld\n", (long long)tstamp.nTimestamp); // nTimestamp is OMX_TICKS (64-bit)
What am I doing wrong? Please help.

linuxstb
Posts: 77
Joined: Sat Jul 07, 2012 11:07 pm

Re: how to get clock running

Fri Jul 05, 2013 7:35 am

What does your code look like that sends the video packets to video_decode ? Are you setting the timestamps there?

rownyr
Posts: 41
Joined: Wed Jul 11, 2012 1:25 am

Re: how to get clock running

Fri Jul 05, 2013 9:49 am

linuxstb wrote:What does your code look like that sends the video packets to video_decode ? Are you setting the timestamps there?
It looks similar to the hello_video code, but I'm using my own helper classes. I'm not setting any timestamps:

Code: Select all

	for (int i = 0; i < decodeInput->actualBufferCount(); i++)
	{
		auto len = fread(inputBufs[i]->pBuffer, 1, decodeInput->bufferSize(), file);
		if (len == 0) {
			if (i == 0)
				return 1;
			else
				break;
		}
		inputBufs[i]->nOffset = 0;
		inputBufs[i]->nFilledLen = len;
		inputBufs[i]->nFlags = i == 0 ? OMX_BUFFERFLAG_STARTTIME : OMX_BUFFERFLAG_TIME_UNKNOWN;
		videoDecode->EmptyThisBuffer(inputBufs[i]);
	}

	while (true)
	{
		auto tstamp = getHeader<OMX_TIME_CONFIG_TIMESTAMPTYPE>();
		tstamp.nPortIndex = OMX_ALL;
		auto err = OMX_GetConfig(clock->handle(), OMX_IndexConfigTimeCurrentMediaTime, &tstamp);
		assert(err == OMX_ErrorNone);
		printf("current media time: %lld\n", (long long)tstamp.nTimestamp); // nTimestamp is OMX_TICKS (64-bit)

		auto cb = videoDecode->waitForCallback();
		if (cb.isEmptyBufferDone()) {
			auto len = fread(cb.pBuffer->pBuffer, 1, decodeInput->bufferSize(), file);
			if (len == 0)
				break;
			cb.pBuffer->nOffset = 0;
			cb.pBuffer->nFilledLen = len;
			cb.pBuffer->nFlags = OMX_BUFFERFLAG_TIME_UNKNOWN;
			videoDecode->EmptyThisBuffer(cb.pBuffer);
		}
		else if (cb.isEventPortSettingsChanged()) {
			decodeOutput->refresh();
			auto v = decodeOutput->definition.format.video;
			float fps = (float)v.xFramerate / (float)65536;
			printf("video stream details: %dx%d %f fps, bitrate: %d\n",
					v.nFrameWidth, v.nFrameHeight, fps, v.nBitrate);
			printf("output buffer size: %d\n", decodeOutput->bufferSize());

			decodeOutput->tunnelTo(schedulerInput);
			schedulerOutput->tunnelTo(renderInput);
			clockOutput->enable();
			schedulerInput->enable(true);
			schedulerOutput->enable(true);
			schedulerClock->enable(true);
			decodeOutput->enable(true);

			videoScheduler->setState(OMX_StateIdle);
			clockOutput->waitForEnable();
			videoRender->setState(OMX_StateIdle);
			videoScheduler->waitForState(OMX_StateIdle);
			videoRender->waitForState(OMX_StateIdle);

			videoScheduler->setState(OMX_StateExecuting);
			videoRender->setState(OMX_StateExecuting);
			videoScheduler->waitForState(OMX_StateExecuting);
			videoRender->waitForState(OMX_StateExecuting);
		} else {
			printf("error: unknown event\n");
			return 1;
		}
	}

rownyr
Posts: 41
Joined: Wed Jul 11, 2012 1:25 am

Re: how to get clock running

Fri Jul 05, 2013 8:32 pm

This is weird. I think the problem is with setting the clock state to OMX_TIME_ClockStateWaitingForStartTime.
I added some test code to the hello_video example:

Code: Select all

   memset(&cstate, 0, sizeof(cstate));
   cstate.nSize = sizeof(cstate);
   cstate.nVersion.nVersion = OMX_VERSION;
   cstate.eState = OMX_TIME_ClockStateWaitingForStartTime;
   cstate.nWaitMask = 1;
   if(clock != NULL && OMX_SetParameter(ILC_GET_HANDLE(clock), OMX_IndexConfigTimeClockState, &cstate) != OMX_ErrorNone)
      status = -13;

   // ADDED CODE BEGIN:
   if (OMX_GetParameter(ILC_GET_HANDLE(clock), OMX_IndexConfigTimeClockState, &cstate) == OMX_ErrorNone)
      printf("clock in waiting state: %d\n", cstate.eState == OMX_TIME_ClockStateWaitingForStartTime);
   else
      printf("can't query clock state\n");
   // ADDED CODE END
And guess what? hello_video happily prints "clock in waiting state: 1"! I do everything the same way in my program, but I always get "clock in waiting state: 0" (the clock jumps straight to OMX_TIME_ClockStateRunning)! I really don't know what's going on.

To summarize what I'm doing:
  1. OMX_GetHandle() with the "OMX.broadcom.clock" name
  2. disable all ports (I tried both OMX_ALL and one by one)
  3. call the code from the 1st post (starting with "auto cstate")
The hello_video example does it in exactly the same way, yet I get different results in my program! Any idea where to look for the problem?

rownyr
Posts: 41
Joined: Wed Jul 11, 2012 1:25 am

Re: how to get clock running

Sat Jul 06, 2013 7:41 pm

OMG! After 4 days I found it...

The problem was the size of the OMX_TIME_CONFIG_CLOCKSTATETYPE structure. In the hello_video example, sizeof(cstate) == 32; in my program it was 40 bytes... #pragma pack(1) solved the issue, and the video now plays smoothly...

Why, oh why does Linaro's cross compiler for Raspbian use packing that differs from the "onboard" gcc? :/

davidohyer
Posts: 1
Joined: Tue Jul 16, 2013 8:16 pm

Re: how to get clock running

Tue Jul 16, 2013 8:21 pm

I'm trying to get the current playback timestamp, as well as set it. This is the best thread I could find on what I'm doing. I try the following, but it just returns a timestamp value of 0:

OMX_TIME_CONFIG_TIMESTAMPTYPE tt;
tt.nPortIndex = OMX_ALL;
if (clock != NULL && OMX_GetConfig(ILC_GET_HANDLE(clock), OMX_IndexConfigTimeCurrentMediaTime, &tt) == OMX_ErrorNone)
   printf("timestamp: %lld", (long long)tt.nTimestamp);

I do this in the while loop of hello_video. Any ideas? Thanks.
