jldeon
Posts: 18
Joined: Thu Apr 18, 2013 2:45 pm

OMX_IndexParamPortDefinition on video_decode port 130

Thu Apr 18, 2013 4:33 pm

I'm working on a video decode pipeline using the Raspberry Pi and OMX/ilclient.

I'm hitting a bottleneck: I need to be able to accept a large number of small buffers and feed them into the video_decode component. I noticed that the default allocation is 20 buffers of 80k, i.e. fewer, larger buffers than I think would be ideal for my use case.

I was going to try to adjust this value with OMX_SetParameter using OMX_IndexParamPortDefinition on the decoder's input port (130).

However, it seems that even if I just GetParameter on this port, and then turn right around and SetParameter on the port without changing the values, the decoder stops working properly. Data goes in, but the port settings don't change, and there are never any output frames.
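For reference, this is roughly the call sequence I mean (a simplified sketch rather than my exact code; I'm assuming the OMX_HANDLETYPE comes from ilclient's ILC_GET_HANDLE on the decoder component, and error handling is trimmed):

#include <string.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

/* Sketch: read the input port definition on port 130, optionally adjust the
   buffer count/size, and write it back. "handle" is assumed to be
   ILC_GET_HANDLE(video_decode) from ilclient. */
static OMX_ERRORTYPE adjust_input_buffers(OMX_HANDLETYPE handle)
{
   OMX_PARAM_PORTDEFINITIONTYPE portdef;
   OMX_ERRORTYPE err;

   memset(&portdef, 0, sizeof(portdef));
   portdef.nSize = sizeof(OMX_PARAM_PORTDEFINITIONTYPE);
   portdef.nVersion.s.nVersionMajor = 1;
   portdef.nVersion.s.nVersionMinor = 1;
   portdef.nPortIndex = 130;              /* video_decode input port */

   err = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &portdef);
   if (err != OMX_ErrorNone)
      return err;

   /* Even without touching these, the SetParameter below seems to break
      decoding for me. What I actually want is more, smaller buffers, e.g.: */
   /* portdef.nBufferCountActual = 60;        */
   /* portdef.nBufferSize        = 32 * 1024; */

   err = OMX_SetParameter(handle, OMX_IndexParamPortDefinition, &portdef);
   return err;
}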

Does this call work? If so, is it even possible to adjust the number and size of the buffers on this port?

dom
Raspberry Pi Engineer & Forum Moderator
Posts: 5340
Joined: Wed Aug 17, 2011 7:41 pm
Location: Cambridge

Re: OMX_IndexParamPortDefinition on video_decode port 130

Thu Apr 18, 2013 4:38 pm


jldeon
Posts: 18
Joined: Thu Apr 18, 2013 2:45 pm

Re: OMX_IndexParamPortDefinition on video_decode port 130

Thu Apr 18, 2013 5:57 pm

Hmm, I must be doing something differently then. I'll take a look at the way they're calling it and compare against my code.

mpr
Posts: 22
Joined: Wed Mar 27, 2013 6:56 pm

Re: OMX_IndexParamPortDefinition on video_decode port 130

Thu Apr 18, 2013 7:11 pm

Changing the parameters and configs is very order-sensitive. Components have to be in certain states, events have to be sent at certain times, etc.

I find this document gives me some clues as to how things should be ordered:

http://www.khronos.org/registry/omxil/s ... cation.pdf

in there:
When an OMX_EventPortSettingsChanged event is emitted from a component's output port, the component shall cease transferring data on that port until the IL client takes an action to commence it again. Ceasing the transfer of buffers is required in order to coordinate the updated port settings with any downstream component and possibly facilitate the re-allocation of new port resources (e.g. port buffers). In order to commence the emission of data again on the output port, the IL client shall disable and re-enable the port – this action will also give the component port the opportunity to reallocate any new buffer requirements associated with the port settings change between the IL client or its tunneled port.
So possibly you just need to disable/re-enable the port, or modify the port definition earlier in your program. Or it could be something else. I haven't been able to reliably change these settings myself.
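In raw IL terms, my reading of that passage is a sequence roughly like the sketch below (event waiting and buffer management omitted; I'm assuming 131 is video_decode's output port and that portdef is an OMX_PARAM_PORTDEFINITIONTYPE already set up for that port):

/* Sketch of the disable/re-enable sequence after OMX_EventPortSettingsChanged.
   Waiting for command-complete events and freeing/allocating buffers are
   elided. */
OMX_SendCommand(handle, OMX_CommandPortDisable, 131, NULL);
/* ...wait for the OMX_CommandPortDisable completion event and free any
   buffers previously allocated on the port... */

/* Re-read the port definition: frame size / nBufferSize may have changed. */
OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &portdef);

OMX_SendCommand(handle, OMX_CommandPortEnable, 131, NULL);
/* ...allocate buffers that match the new settings (OMX_AllocateBuffer or
   OMX_UseBuffer), then wait for the OMX_CommandPortEnable completion event... */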

Also, are you checking the return code when you set the config? It may be silently failing.

jldeon
Posts: 18
Joined: Thu Apr 18, 2013 2:45 pm

Re: OMX_IndexParamPortDefinition on video_decode port 130

Thu May 02, 2013 8:05 pm

mpr wrote: Changing the parameters and configs is very order-sensitive. Components have to be in certain states, events have to be sent at certain times, etc.

I find this document gives me some clues as to how things should be ordered:

http://www.khronos.org/registry/omxil/s ... cation.pdf

So possibly you just need to disable/re-enable the port, or modify the port definition earlier in your program. Or it could be something else. I haven't been able to reliably change these settings myself.

Also, are you checking the return code when you set the config? It may be silently failing.
Thanks for the reminder about the spec, I found it helpful for another issue I was having (although I had to go off-spec anyhow...).

I'm not 100% sure that the component saves/honors the buffer counts if you set them differently. If you call GetParameter later, it seems like the defaults have sprung back again.

To close the loop on the original issue: as it turned out, input buffers were not my bottleneck. I was trying to play a relatively high bitrate 1080p30 file and was only getting around 15-20 fps. I thought perhaps I was blocking while waiting for more input buffers to become free, but it turns out I was blocking inside the OMX_EmptyThisBuffer call itself. I have yet to get to the bottom of this particular issue.
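For what it's worth, my feed loop is patterned on the hello_video example, roughly like this (simplified sketch; ilclient_get_input_buffer and ILC_GET_HANDLE are from the Pi's ilclient helpers, and read_next_chunk is just a stand-in for my demuxer):

while (more_data) {
   /* This is where I expected to block, waiting for a free input buffer... */
   OMX_BUFFERHEADERTYPE *buf = ilclient_get_input_buffer(video_decode, 130, 1);

   buf->nFilledLen = read_next_chunk(buf->pBuffer, buf->nAllocLen);
   buf->nOffset = 0;

   /* ...but in practice it's this call that stalls for long stretches. */
   OMX_EmptyThisBuffer(ILC_GET_HANDLE(video_decode), buf);
}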
