OpenMAX experimenting
/*
2020 OPENMAX TEST V2
====================
LOG:
2020-03-21:
I started to clean up this code a bit; it's still work in
progress and not yet working. Currently my call order is
incorrect; [this][13] reference might be a good one. I created
a couple of `openmax_*()` helper functions. Still need to set up
a lot.
2020-03-18:
I created this file as a start of my research into OpenMAX to
create a video decoder that can decode an h264 video stream
from a Logitech C920 webcam. My goal is to create a simple
video/audio streaming client that I can use with my parents
and parents-in-law to set up a video stream so Gijs can enjoy
his grandpas/grandmas during the corona virus crisis.
Currently I'm initialising, getting/printing some information
and deinitialising. Tomorrow I hope to find some time to
configure the decoder in such a way that it can handle h264
nals.
GENERAL INFO:
I started looking into OpenMAX again during the corona crisis
in 2020. I was thinking that it would be very helpful to
have a super simple video/audio application that could be
used to stream video from/to my parents and parents-in-law.
As it has been about 2.5 years since I last worked on OpenMAX I
basically had to restart my research, with the huge
difference that I now have a working version that can decode
h264. See `test-openmax-v4` which decodes a short
video; to recompile it, make sure that the cmake file
includes the necessary .cpp files.
Also see LOGBOOK.org where I've written some notes; also see
my iPad Pro notes, March 2020, which contain some
handwritten notes.
While restarting this project I used the [OpenMAX][0] 1.1.2
specification.
CALL FLOW
- Initialize:
bcm_host_init()
OMX_Init()
vcos_init()
vchi_initialise()
- Configure
- Get decoder handle: `OMX_GetHandle()`
- Find input/output video ports: OMX_GetParameter(..., OMX_IndexParamVideoInit, ..)
- Iterate over `OMX_PORT_PARAM_TYPE::nPorts`
- Get the `OMX_PARAM_PORTDEFINITIONTYPE` for each port.
- Check `eDir` for the input/output ports.
- Set video format:
- Create `OMX_VIDEO_PARAM_PORTFORMATTYPE` and set the `eCompressionFormat` to `OMX_VIDEO_CodingAVC`
- Call `OMX_SetParameter()` with `OMX_IndexParamVideoPortFormat`
- Create buffers
- At some point we have to allocate buffers; I'm not yet
sure when and how to do this.
CALL FLOW V1 SUMMARY:
- Normal init of bcm_host_init() etc.
- OMX_GetHandle() of video decode
- OMX_GetHandle() of video render
- Configure video render
- Create tunnel
- Allocate buffers
- Switch decoder to idle
- Use buffers
- Switch decoder to executing
- Switch render to idle
- Switch render to executing
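A rough sketch of the call order above, using the `openmax_*()`
helpers declared further down in this file. `dec`, `ren`, `dec_in`,
etc. are shorthand for the handles/ports used in `main()`; the normal
`bcm_host_init()` / `OMX_Init()` init comes first and error checking
is omitted. This is how I currently think the calls line up, not
verified yet:
  OMX_GetHandle(&dec, (OMX_STRING)"OMX.broadcom.video_decode", nullptr, &dec_cb);
  OMX_GetHandle(&ren, (OMX_STRING)"OMX.broadcom.video_render", nullptr, &ren_cb);
  openmax_port_get_input(dec, dec_in);
  openmax_port_get_output(dec, dec_out);
  openmax_port_get_input(ren, ren_in);
  openmax_component_video_render_configure(ren, ren_in);
  OMX_SetupTunnel(dec, dec_out, ren, ren_in);
  openmax_port_buffers_allocate(dec, dec_in, &bufs, num_bufs);
  openmax_component_state_change_to_idle(dec, MODE_BLOCKING);
  openmax_port_buffers_use(dec, dec_in, &hdrs, bufs, num_bufs);
  openmax_component_state_change_to_executing(dec, MODE_BLOCKING);
  openmax_component_state_change_to_idle(ren, MODE_BLOCKING);
  openmax_component_state_change_to_executing(ren, MODE_BLOCKING);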
GETTING STARTED NOTES:
- While diving into this again, [this][0] reference is a good
start. Start by reading the first chapters, which give a
good overview.
- [This][2] document (filename:
lm80-p0436-16_omx_video_decoder.pdf) contains a high-level
overview and programming guide on how to use the video
decoder.
- [This][1] website describes a high level overview of
the video decoding system on Raspberry Pi.
- See [the OpenMAX Integration Layer Application Note][7],
which is really recommended, especially when you have to
read up on how to use buffers.
BUFFERS
- The most complex aspect of OpenMax seems to be how one
should use/create/deallocate buffers. How and when
you allocate buffers may depend on the state of the
component.
- A great high-level description [can be found here][8]; I'm
pasting some relevant notes from that page here:
11.3 Buffers
-------------
Data is transferred into and out of components using
buffers, which are just byte arrays. Each component has
ports and the ports each have a number of buffers. The
program portinfo of the Components chapter can be used to
list how many buffers a port has, how many it must
minimally have and what the recommended buffer size is.
When a component is created it has an array of pointers
to buffer headers (a place holder for buffers and
information about them) but doesn't actually have any
memory allocated either for the headers or for the
buffers. These have to be allocated. In the next section
we look at the OpenMAX calls to do this, and then in the
following section how the IL Client library manages this.
11.4 OpenMAX Buffer Allocation
-------------------------------
There are two OpenMAX calls to get buffers for a
port. The first is OMX_AllocateBuffer where the IL client
asks the component to do all the work for them (Section
3.2.2.15 of the specification). Typical code is
for (i = 0; i < pClient->nBufferCount; i++) {
    OMX_AllocateBuffer(hComp,
                       &pClient->pBufferHdr[i],
                       pClient->nPortIndex,
                       pClient,
                       pClient->nBufferSize);
}
The component must be in the Loaded state or the port
must be disabled. Typically, an application will disable
the port and then allocate the buffers.
The other call is OMX_UseBuffer (Section 3.2.2.14). This
takes a buffer either allocated by the client or by
another component and uses that as its own buffer.
Once buffers have been allocated, the port can be moved
to enabled state.
11.6 Writing to and reading from buffers
----------------------------------------
The whole point of using OpenMAX on the RPi is to get
data into and out of the Broadcom GPU using the OpenMAX
components. You do this by loading up the input buffers
of a component with data and asking the component to
empty them by a call OMX_EmptyThisBuffer. When the
component has processed the data it signals a
OMX_FillBufferDone event for an output buffer, at which
point your application will take the processed data and
do something with it.
When a component has finished processing its input buffer
data it signals back to the client with an
OMX_EmptyBufferDone event. At this point, the client can
put more data into the buffer for processing by the
component.
Similarly, when a client has finished doing something
with the output data, it returns the buffer back to the
component by calling a function OMX_FillThisBuffer.
One input buffer may produce none or more buffers of
output data. An output buffer can consume data from one
or more input buffers. There is no direct correlation
between emptying an input buffer and filling an output
buffer. Essentially these processes should be done
concurrently by the client.
Also see [this reference/article][9], which, similarly to
[this post][8], tells us that we have to disable the ports first.
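To make the buffer ownership flow above concrete, here is a minimal
sketch of the input side as I currently understand it. `nal_data`
and `nal_size` are placeholders, and picking a free header instead
of always using [0] still has to be implemented (see the @todo in
`main()`):
  OMX_BUFFERHEADERTYPE* hdr = decoder_buffer_headers[0]; // @todo pick a header that is not in flight
  memcpy(hdr->pBuffer, nal_data, nal_size);              // copy one h264 nal into the registered buffer
  hdr->nFilledLen = nal_size;
  hdr->nOffset = 0;
  OMX_EmptyThisBuffer(decoder_handle, hdr);              // hand it to the decoder
  // The decoder later fires our `EmptyBufferDone` callback; only
  // then may we refill this header with the next nal.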
ACRONYMS
- VideoCore, VC: This is the GPU on a Raspberry Pi
- VCHIQ: VideoCore Host Interface Queue
- VCOS: VideoCore OS Abstraction
OPENGL INTEROP:
- I haven't looked into this at the time of writing, but
there has been [some discussion][3] about this. From
quickly scanning that post, it seems possible using the
`OMX.broadcom.egl_render` component and attaching it to the
decoder output.
- See the [Raspberry Pi documentation][4].
- See [this][11] article for some info.
RASPBERRY PI QUIRKS
- See [this][1] website which describes some of the quirks
that are necessary on the Raspberry Pi before you can use
the OpenMAX API.
REFERENCES:
[0]: https://www.khronos.org/registry/OpenMAX-IL/specs/OpenMAX_IL_1_1_2_Specification.pdf
[1]: https://elinux.org/Raspberry_Pi_VideoCore_APIs "Raspberry Pi VideoCore APIs"
[2]: https://developer.qualcomm.com/download/db410c/openmax-omx-video-decoder.pdf "OpenMAX (OMX) Video Decoder"
[3]: https://www.raspberrypi.org/forums/viewtopic.php?t=6577 "Forum post about EGL interop. "
[4]: https://github.com/raspberrypi/firmware/tree/master/documentation/ilcomponents "Raspberry Pi Firmware info."
[5]: https://github.com/rjsh/notes/blob/master/media/omx.md "Great step by step information"
[6]: http://www.jvcref.com/files/PI/documentation/ilcomponents/ "Nice port info reference; see what ports are used."
[7]: https://www.khronos.org/registry/OpenMAX-IL/specs/OpenMAX_IL_1_1_2_Application_Note_318.pdf "OpenMAX Integration Layer Application Note"
[8]: https://jan.newmarch.name/RPi/OpenMAX/Buffers/ "Great resource on how to use buffers."
[9]: https://kwasi-ich.de/blog/2017/11/26/omx/ "Introduction to OpenMAX on the Raspberry Pi"
[10]: https://github.com/kwasmich/OMXPlayground/blob/master/omxJPEGDec.c "OpenMAX Playground - has nice abstraction."
[11]: http://thebugfreeblog.blogspot.com/2012/12/decoding-and-rendering-to-texture-h264.html "Decoding and Rendering to Texture H264 with OpenMax on Raspberry Pi"
[12]: https://github.com/tjormola/rpi-openmax-demos/blob/master/rpi-camera-playback.c "Nice debug print functions."
[13]: https://github.com/jean343/RPI-GPU-rdpClient/blob/master/RPI-Client/client.cpp "Possibly good reference that shows the call order."
*/
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h> /* For sleep() */
#include <bcm_host.h> /* For `bcm_host_init()`, [see this website][1]. */
#include <IL/OMX_Broadcom.h>
#include <interface/vchiq_arm/vchiq_if.h> /* For `vchi_initialise()`. */
#include <string>
#include <vector>
#include <fstream>
extern "C" {
# include <libavformat/avformat.h>
}
/* ------------------------------------------------------------------------- */
static std::string openmax_error_to_string(uint32_t err);
static std::string openmax_videocodingtype_to_string(uint32_t type);
static std::string openmax_colorformattype_to_string(uint32_t type);
static std::string openmax_naluformattype_to_string(uint32_t fmt);
static std::string openmax_dirtype_to_string(uint32_t type);
static std::string openmax_eventtype_to_string(uint32_t type);
static void openmax_print_components();
/* ------------------------------------------------------------------------- */
#define MODE_BLOCKING true
#define MODE_NON_BLOCKING false
static int openmax_port_get_input(OMX_HANDLETYPE handle, uint32_t& port);
static int openmax_port_get_output(OMX_HANDLETYPE handle, uint32_t& port);
static int openmax_port_get(OMX_HANDLETYPE handle, uint32_t direction, uint32_t& port);
static int openmax_port_disable(OMX_HANDLETYPE handle, uint32_t port, bool blocking);
static int openmax_port_enable(OMX_HANDLETYPE handle, uint32_t port, bool blocking);
static int openmax_port_change_state(OMX_HANDLETYPE handle, uint32_t port, uint8_t state, bool blocking);
static int openmax_port_buffers_allocate(OMX_HANDLETYPE handle, uint32_t port, uint8_t*** buffers, uint32_t& numAllocated);
static int openmax_port_buffers_use(OMX_HANDLETYPE handle, uint32_t port, OMX_BUFFERHEADERTYPE*** headers, uint8_t** buffers, uint32_t numBuffers); /* This will call `OMX_UseBuffer` to instruct the component to use the given data. */
static int openmax_component_state_change_to_idle(OMX_HANDLETYPE handle, bool blocking);
static int openmax_component_state_change_to_executing(OMX_HANDLETYPE handle, bool blocking);
static int openmax_component_state_change(OMX_HANDLETYPE handle, uint32_t state, bool blocking);
static int openmax_component_video_decoder_configure(OMX_HANDLETYPE handle, uint32_t inputPort);
static int openmax_component_video_render_configure(OMX_HANDLETYPE handle, uint32_t inputPort);
/* ------------------------------------------------------------------------- */
static OMX_ERRORTYPE callback_video_event(OMX_HANDLETYPE component, OMX_PTR user, OMX_EVENTTYPE event, OMX_U32 data1, OMX_U32 data2, OMX_PTR eventData);
static OMX_ERRORTYPE callback_video_empty_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer);
static OMX_ERRORTYPE callback_video_fill_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer);
static OMX_ERRORTYPE callback_render_event(OMX_HANDLETYPE component, OMX_PTR user, OMX_EVENTTYPE event, OMX_U32 data1, OMX_U32 data2, OMX_PTR eventData);
static OMX_ERRORTYPE callback_render_empty_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer);
static OMX_ERRORTYPE callback_render_fill_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer);
/* ------------------------------------------------------------------------- */
static void print_port_definition(OMX_HANDLETYPE handle, uint32_t portIndex);
/* ------------------------------------------------------------------------- */
int main(int argc, char* argv[]) {
printf("\n");
printf("2020 - OpenMax - V1\n");
printf("--------------------\n");
printf("---------------------------------------------------------------\n");
printf("---------------------------------------------------------------\n");
printf("---------------------------------------------------------------\n");
printf("@todo I just wrote some code that creates the renderer and which switches \n"
"the components into the executing state. It's still not working, but making \n"
"some progress.\n");
printf("---------------------------------------------------------------\n");
printf("---------------------------------------------------------------\n");
printf("---------------------------------------------------------------\n");
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
VCOS_STATUS_T vcos_res = VCOS_SUCCESS;
int32_t vchi_res = VCHIQ_SUCCESS;
VCHI_INSTANCE_T vchi_ctx = {};
int r = 0;
int32_t k = 0;
uint32_t i = 0;
uint32_t decoder_input_port = UINT32_MAX;
uint32_t decoder_output_port = UINT32_MAX;
OMX_PORT_PARAM_TYPE param = {};
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
OMX_HANDLETYPE decoder_handle = 0; /* Handle to our `OMX.broadcom.video_decode` component. */
OMX_CALLBACKTYPE decoder_cb = {}; /* Callback functions that we provide when opening the decoder. */
OMX_VIDEO_PARAM_PORTFORMATTYPE decoder_video_format = {};
OMX_PARAM_PORTDEFINITIONTYPE decoder_port_def = {};
OMX_VIDEO_PORTDEFINITIONTYPE decoder_video_def = {};
OMX_NALSTREAMFORMATTYPE decoder_stream_format = {};
OMX_BUFFERHEADERTYPE** decoder_buffer_headers = nullptr;
uint8_t** decoder_buffer_data = nullptr;
uint32_t decoder_buffer_count = 0;
OMX_HANDLETYPE render_handle = 0;
OMX_CALLBACKTYPE render_cb = {};
OMX_CONFIG_DISPLAYREGIONTYPE render_region = {};
uint32_t render_input_port = UINT32_MAX; /* It seems that the `video_render` component has no output port. */
AVFormatContext* av_ctx = nullptr;
AVPacket av_pkt = {};
bool is_first_packet = true;
/* ------------------------------------------------------ */
/* INITIALIZATION */
/* ------------------------------------------------------ */
/*
See the `Quirks` section of [this][1] website, which describes
that on a RPi we have to call `bcm_host_init()` before we
can call `OMX_Init()`.
*/
bcm_host_init();
/*
`OMX_Init()` initializes the OpenMAX framework and must
be called before any other OMX function is used. When you're
done using the OpenMAX functions, you call its counterpart
`OMX_Deinit()`.
*/
omx_res = OMX_Init();
if (OMX_ErrorNone != omx_res){
printf("Failed to init OMX. (exiting).\n");
exit(EXIT_FAILURE);
}
/*
Initialize the VideoCore OS layer. This is necessary before
using any of the other `vcos_*()` functions. See `vcos_init.h`
for more information. We deinitialize this using `vcos_deinit()`.
*/
vcos_res = vcos_init();
if (VCOS_SUCCESS != vcos_res) {
printf("Failed to initialized the VideoCore OS layer.\n");
exit(EXIT_FAILURE);
}
/*
See the Quirks section at [this page][1] for some background
info about why we have to call this; in short, before we
can use any of the `vc_*` functions we have to initialise
them. Also, when we don't call this function we cannot get
the video decoder handle.
@todo I'm not sure how/when I have to deinitialise the VCHI.
Can't find any good documentation about this.
*/
vchi_res = vchi_initialise(&vchi_ctx);
if (VCHIQ_SUCCESS != vchi_res) {
printf("Failed to initialise the VideoCore Host Interface Queue.\n");
exit(EXIT_FAILURE);
}
/* For FFMPEG < 4.x we still need to make this call. */
av_register_all();
/* ------------------------------------------------------ */
/* INFORMATION */
/* ------------------------------------------------------ */
openmax_print_components();
/* Create decoder component */
decoder_cb.EventHandler = callback_video_event;
decoder_cb.EmptyBufferDone = callback_video_empty_buffer_done;
decoder_cb.FillBufferDone = callback_video_fill_buffer_done;
omx_res = OMX_GetHandle(&decoder_handle, (OMX_STRING)"OMX.broadcom.video_decode", nullptr, &decoder_cb);
if (OMX_ErrorNone != omx_res) {
printf("Failed to get the video decoder handle.\n");
exit(EXIT_FAILURE);
}
/* Create render component. */
render_cb.EventHandler = callback_render_event;
render_cb.EmptyBufferDone = callback_render_empty_buffer_done;
render_cb.FillBufferDone = callback_render_fill_buffer_done;
omx_res = OMX_GetHandle(&render_handle, (OMX_STRING)"OMX.broadcom.video_render", nullptr, &render_cb);
if (OMX_ErrorNone != omx_res) {
printf("Failed to create the renderer handle.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_get_input(decoder_handle, decoder_input_port)) {
printf("Failed to get the decoder input port.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_get_output(decoder_handle, decoder_output_port)) {
printf("Failed to get the decoder output port.\n");
exit(EXIT_FAILURE);
}
/* Find the input and output port indices of the renderer */
if (0 != openmax_port_get_input(render_handle, render_input_port)) {
printf("Failed to get the input port for the renderer.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_disable(decoder_handle, decoder_input_port, MODE_NON_BLOCKING)) {
printf("Failed to disable the decoder input port.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_disable(decoder_handle, decoder_output_port, MODE_NON_BLOCKING)) {
printf("Failed to disabled the decoder output port.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_disable(render_handle, render_input_port, MODE_BLOCKING)) {
printf("Failed to disable the input port of the renderer.\n");
exit(EXIT_FAILURE);
}
omx_res = OMX_SetupTunnel(decoder_handle, decoder_output_port, render_handle, render_input_port);
if (OMX_ErrorNone != omx_res) {
printf("Failed to create the tunnel between the decoder output port and renderer input port.\n");
exit(EXIT_FAILURE);
}
printf("video_decoder.input_port: %u\n", decoder_input_port);
printf("video_decoder.output_port: %u\n", decoder_output_port);
printf("video_render.input_port: %u\n", render_input_port);
if (0 != openmax_component_video_decoder_configure(decoder_handle, decoder_input_port)) {
printf("Failed to configure the video decoder.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_port_buffers_allocate(decoder_handle, decoder_input_port, &decoder_buffer_data, decoder_buffer_count)) {
printf("Failed to allocate the buffers.\n");
exit(EXIT_FAILURE);
}
if (0 != openmax_component_state_change_to_idle(decoder_handle, MODE_NON_BLOCKING)) {
printf("Failed to switch decoder into idle state.\n");
exit(EXIT_FAILURE);
}
/*
CALL ORDER NOTE:
When you disable the decoder_input_port and try to use
`OMX_UseBuffer()` you'll get an
`OMX_ErrorIncorrectStateOperation` error. Therefore we have
to enable the port again.
*/
openmax_port_enable(decoder_handle, decoder_input_port, MODE_BLOCKING);
printf("video_decoder.buffer_count: %u\n", decoder_buffer_count);
if (0 != openmax_port_buffers_use(decoder_handle, decoder_input_port, &decoder_buffer_headers, decoder_buffer_data, decoder_buffer_count)) {
printf("Failed to `use` the buffers.\n");
exit(EXIT_FAILURE);
}
/*
CALL ORDER NOTE:
Even though we have allocated the buffers we cannot "use"
them at this point. It seems that we first have to create
the render component and set up a tunnel, after which we -can-
use the allocated buffers.
*/
/*
if (0 != openmax_port_buffers_use(decoder_handle, decoder_input_port, &decoder_buffer_headers, decoder_buffer_data, decoder_buffer_count)) {
printf("Failed to `use` the buffers.\n");
exit(EXIT_FAILURE);
}
*/
/* ---------------------------------------------------------------------- */
/*
*/
/*
if (0 != openmax_component_state_change_to_idle(render_handle, MODE_BLOCKING)) {
printf("Failed to change render component to idle mode.\n");
exit(EXIT_FAILURE);
}
*/
/* ---------------------------------------------------------------------- */
/*
Here we initialize avformat, which we use to read nals from a
.h264 file.
*/
r = avformat_open_input(&av_ctx, "./test-4.h264", nullptr, nullptr);
if (r < 0) {
printf("Failed to open the h264 input file.\n");
exit(EXIT_FAILURE);
}
r = avformat_find_stream_info(av_ctx, nullptr);
if (r < 0) {
printf("Failed to find stream info from the video file.\n");
exit(EXIT_FAILURE);
}
/* ---------------------------------------------------------------------- */
/* ------------------------------------------------------ */
/* CONFIGURE */
/* ------------------------------------------------------ */
printf("\n[render input port]\n");
print_port_definition(render_handle, render_input_port);
printf("\n[decoder input port]\n");
print_port_definition(decoder_handle, decoder_input_port);
printf("\n[decoder output port]\n");
print_port_definition(decoder_handle, decoder_output_port);
/*
Read the nals from the source file and start feeding them
into the decoder.
*/
#if 0
while (true) {
r = av_read_frame(av_ctx, &av_pkt);
if (r < 0) {
printf("Failed to read a packet.\n");
break;
}
printf("AVPacket.size: %d.\n", av_pkt.size);
/* @todo we have to retrieve a free buffer here. */
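/* NOTE: assigning `av_pkt.data` to `pBuffer` below replaces the buffer we
registered with `OMX_UseBuffer()`; copying the packet into that registered
buffer is probably what the spec expects, but I haven't verified this. */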
OMX_BUFFERHEADERTYPE* buffer = decoder_buffer_headers[0];
buffer->pBuffer = av_pkt.data;
buffer->nFilledLen = av_pkt.size;
buffer->nOffset = 0;
buffer->nFlags = 0; // OMX_BUFFERFLAG_TIME_UNKNOWN;
//buffer->nFlags = OMX_BUFFERFLAG_STARTTIME;
//buffer->hMarkTargetComponent = decoder_handle;
if (true == is_first_packet) {
buffer->nFlags = OMX_BUFFERFLAG_STARTTIME;
is_first_packet = false;
}
else {
buffer->nFlags = OMX_BUFFERFLAG_TIME_UNKNOWN;
}
printf("%02X %02X %02X %02X %02X\n",
av_pkt.data[0],
av_pkt.data[1],
av_pkt.data[2],
av_pkt.data[3],
av_pkt.data[4]
);
omx_res = OMX_EmptyThisBuffer(decoder_handle, buffer);
if (OMX_ErrorNone != omx_res) {
printf("Failed to empty the buffer.\n");
break;
}
printf("Handed over a h264 nal.\n");
}
#endif
/* ------------------------------------------------------ */
/* SHUTDOWN */
/* ------------------------------------------------------ */
if (nullptr != av_ctx) {
avformat_close_input(&av_ctx);
av_ctx = nullptr;
}
/*
`vcos_deinit()` deinitializes the VideoCore OS layer and
is the counterpart of `vcos_init()`.
*/
vcos_deinit();
/* `OMX_Deinit()` deinitializes the OpenMAX framework. */
omx_res = OMX_Deinit();
if (OMX_ErrorNone != omx_res) {
printf("Failed to cleanly deinit OMX. (exiting)\n");
exit(EXIT_FAILURE);
}
/* And cleanup the Raspberry Pi quirk. */
bcm_host_deinit();
return 0;
}
/* ------------------------------------------------------------------------- */
/*
These functions allow you to retrieve the input and output
port for a component. We follow the solution that is also used
in [this][12] example and described in the specification,
section `3.1.2.8 OMX_PORT_PARAM_TYPE`.
*/
static int openmax_port_get_input(OMX_HANDLETYPE handle, uint32_t& port) {
return openmax_port_get(handle, OMX_DirInput, port);
}
static int openmax_port_get_output(OMX_HANDLETYPE handle, uint32_t& port) {
return openmax_port_get(handle, OMX_DirOutput, port);
}
static int openmax_port_get(OMX_HANDLETYPE handle, uint32_t direction, uint32_t& port) {
uint32_t i = 0;
uint32_t init_dx = UINT32_MAX;
uint32_t init_count = 4;
uint32_t init_types[] = {
OMX_IndexParamVideoInit,
OMX_IndexParamAudioInit,
OMX_IndexParamImageInit,
OMX_IndexParamOtherInit
};
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_PORT_PARAM_TYPE param = {};
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
for (i = 0; i < init_count; ++i) {
param.nSize = sizeof(param);
param.nVersion.nVersion = OMX_VERSION;
omx_res = OMX_GetParameter(handle, (OMX_INDEXTYPE)init_types[i], &param);
if (OMX_ErrorNone == omx_res) {
init_dx = i;
break;
}
}
if (UINT32_MAX == init_dx) {
printf("Failed to get the `OMX_IndexParam{...}Init` parameter.\n");
return -1;
}
for (i = 0; i < param.nPorts; ++i) {
memset((char*)&port_def, 0x00, sizeof(port_def));
port_def.nSize = sizeof(port_def);
port_def.nPortIndex = param.nStartPortNumber + i;
port_def.nVersion.nVersion = OMX_VERSION;
omx_res = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &port_def);
if (OMX_ErrorNone != omx_res) {
printf("Failed to get port definition.\n");
return -2;
}
if (direction == port_def.eDir) {
port = port_def.nPortIndex;
return 0;
}
}
printf("No port found.\n");
return -3;
}
static int openmax_port_enable(OMX_HANDLETYPE handle, uint32_t port, bool blocking) {
return openmax_port_change_state(handle, port, 1, blocking);
}
static int openmax_port_disable(OMX_HANDLETYPE handle, uint32_t port, bool blocking) {
return openmax_port_change_state(handle, port, 0, blocking);
}
static int openmax_port_change_state(
OMX_HANDLETYPE handle,
uint32_t port,
uint8_t state,
bool blocking)
{
OMX_ERRORTYPE res = OMX_ErrorNone;
uint32_t states[] = { OMX_CommandPortDisable, OMX_CommandPortEnable } ;
if (0 == handle) {
printf("Invalid handle, cannot disable the port.\n");
return -1;
}
/* Send the enable/disable command; this call itself is non-blocking. */
res = OMX_SendCommand(handle, (OMX_COMMANDTYPE)states[state], port, nullptr);
if (OMX_ErrorNone != res) {
printf("Failed to disable the port %u.\n", port);
return -2;
}
if (true == blocking) {
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
port_def.nSize = sizeof(port_def);
port_def.nVersion.nVersion = OMX_VERSION;
port_def.nPortIndex = port;
while (true) {
res = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &port_def);
if (OMX_ErrorNone != res) {
printf("Failed to get the port definition.\n");
return -3;
}
if (state == port_def.bEnabled) {
return 0;
}
usleep(1);
}
}
return 0;
}
/* ------------------------------------------------------------------------- */
/*
This function will allocate buffers for the given port. This
function is supposed to be called by the user of the OpenMAX
API. See the specification, `3.2.2.14 OMX_UseBuffer`, for some
background info on when we can use these buffers.
*/
static int openmax_port_buffers_allocate(
OMX_HANDLETYPE handle,
uint32_t port,
uint8_t*** buffers,
uint32_t& numAllocated
)
{
int r = 0;
uint32_t i = 0;
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
uint8_t** data = nullptr;
if (0 == handle) {
printf("Given handle is invalid, cannot allocate port buffers.\n");
r = -1;
goto error;
}
if (nullptr == buffers) {
printf("Given output `buffers` is nullptr.\n");
r = -2;
goto error;
}
/* Get the necessary buffer information for the given port. */
port_def.nSize = sizeof(port_def);
port_def.nPortIndex = port;
port_def.nVersion.nVersion = OMX_VERSION;
omx_res = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &port_def);
if (OMX_ErrorNone != omx_res) {
printf("Failed to get the port def.\n");
r = -4;
goto error;
}
if (0 == port_def.nBufferCountActual) {
printf("The `nBufferCountActual` is 0, which means we don't have to / can't allocate buffers.\n");
r = -5;
goto error;
}
/* Preallocate the buffers before setting. */
data = (uint8_t**)(malloc(port_def.nBufferCountActual * sizeof(uint8_t*)));
if (nullptr == data) {
printf("Failed to allocate our array to hold some data.\n");
r = -6;
goto error;
}
for (i = 0; i < port_def.nBufferCountActual; ++i) {
data[i] = (uint8_t*) vcos_malloc_aligned(
port_def.nBufferSize,
port_def.nBufferAlignment,
"port-buffer"
);
if (nullptr == data[i]) {
printf("Failed to allocate a buffer (%u).\n", i);
r = -7;
goto error;
}
}
numAllocated = port_def.nBufferCountActual;
*buffers = data;
error:
if (r < 0) {
printf("Need to deallocate!");
if (nullptr != buffers) {
printf("@todo we should deallocate the buffers.\n");
}
numAllocated = 0;
}
return r;
}
/*
This function assumes that you allocated the given `buffers`
using the `openmax_port_buffers_allocate()` function. We
retrieve the port definition info and use the `nBufferSize`; so
when the given buffers have a different size you'll run into
unexpected behavior.
CALL ORDER NOTE:
It seems that you cannot use `OMX_UseBuffer()` when you
haven't created/connected a `video_decoder` component to
another component using e.g. the `OMX_SetupTunnel()`
function. When you try to use this function w/o setting up a
tunnel you'll get an `OMX_ErrorIncorrectStateOperation` when
you call `OMX_UseBuffer()`.
*/
static int openmax_port_buffers_use(
OMX_HANDLETYPE handle,
uint32_t port,
OMX_BUFFERHEADERTYPE*** headers, /* We will allocate this. */
uint8_t** buffers, /* The buffers that you should have allocated with `openmax_port_buffers_allocate()`. */
uint32_t numBuffers /* The number of buffers in `buffers`. */
)
{
int r = 0;
int i = 0;
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_BUFFERHEADERTYPE** hdrs = nullptr;
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
/* Validate input. */
if (0 == handle) {
printf("Given handle is 0.\n");
r = -1;
goto error;
}
if (nullptr == headers) {
printf("Given `headers` is nullptr.\n");
r = -2;
goto error;
}
if (nullptr == buffers) {
printf("Given `buffers` is nullptr.\n");
r = -3;
goto error;
}
if (0 == numBuffers) {
printf("Given `numBuffers` is 0; should be set to the number of allocated data buffers with the call `openmax_port_buffers_allocate()`. ");
r = -4;
goto error;
}
/* Get the port definition and check if `numBuffers` is correct. */
port_def.nSize = sizeof(port_def);
port_def.nVersion.nVersion = OMX_VERSION;
port_def.nPortIndex = port;
omx_res = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &port_def);
if (OMX_ErrorNone != omx_res) {
printf("Failed to get the port definition for port: %u.\n", port);
r = -5;
goto error;
}
if (numBuffers != port_def.nBufferCountActual) {
printf("The given `numBuffers` is not the same as the port definition `nBufferCountActual`. Current we enforce that these numbers are the same.\n");
r = -6;
goto error;
}
hdrs = (OMX_BUFFERHEADERTYPE**) malloc(numBuffers * sizeof(OMX_BUFFERHEADERTYPE*));
if (nullptr == hdrs) {
printf("Failed to allocate our buffer header array.\n");
r = -7;
goto error;
}
for (i = 0; i < port_def.nBufferCountActual; ++i) {
omx_res = OMX_UseBuffer(
handle,
&hdrs[i],
port,
nullptr,
port_def.nBufferSize,
buffers[i]
);
if (OMX_ErrorNone != omx_res) {
printf("Failed to provide a buffer to the decoder: %s.\n", openmax_error_to_string(omx_res).c_str());
r = -8;
goto error;
}
}
*headers = hdrs;
error:
if (r < 0) {
printf("@todo we should deallocate the allocated buffers.\n");
}
return r;
}
/* ------------------------------------------------------------------------- */
static int openmax_component_state_change_to_idle(
OMX_HANDLETYPE handle,
bool blocking)
{
return openmax_component_state_change(handle, OMX_StateIdle, blocking);
}
static int openmax_component_state_change_to_executing(
OMX_HANDLETYPE handle,
bool blocking)
{
return openmax_component_state_change(handle, OMX_StateExecuting, blocking);
}
static int openmax_component_state_change(
OMX_HANDLETYPE handle,
uint32_t state,
bool blocking)
{
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
if (0 == handle) {
printf("Cannot change state because the given handle is 0.\n");
return -1;
}
/* blocking variant. */
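/* NOTE: this "blocking" path only retries `OMX_SendCommand()` until it is
   accepted; it does not wait for the `OMX_EventCmdComplete` event, so the
   component may still be transitioning when we return. */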
if (true == blocking) {
do {
omx_res = OMX_SendCommand(handle, OMX_CommandStateSet, (OMX_COMMANDTYPE)state, nullptr);
if (OMX_ErrorNone == omx_res) {
return 0;
}
} while (true);
}
/* non-blocking. */
omx_res = OMX_SendCommand(handle, OMX_CommandStateSet, (OMX_COMMANDTYPE)state, nullptr);
if (OMX_ErrorNone != omx_res) {
printf("Failed to change the state.\n");
return -2;
}
return 0;
}
/* ------------------------------------------------------------------------- */
/*
Here we configure the given video decoder component in such a
way that it's set up to be used for AVC decoding.
*/
static int openmax_component_video_decoder_configure(
OMX_HANDLETYPE handle,
uint32_t inputPort)
{
if (0 == handle) {
printf("Cannot configure the video decoder, given handle is invalid.\n");
return -1;
}
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_VIDEO_PARAM_PORTFORMATTYPE video_param = {};
video_param.nSize = sizeof(video_param);
video_param.nPortIndex = inputPort;
video_param.nVersion.nVersion = OMX_VERSION;
video_param.eCompressionFormat = OMX_VIDEO_CodingAVC;
omx_res = OMX_SetParameter(handle, OMX_IndexParamVideoPortFormat, &video_param);
if (OMX_ErrorNone != omx_res) {
printf("Failed to configure the video param.\n");
return -2;
}
return 0;
}
/*
This function configures the video render component. We
configure it specifically for this use-case and this test. We
will most likely change this for another test.
*/
static int openmax_component_video_render_configure(
OMX_HANDLETYPE handle,
uint32_t inputPort)
{
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_CONFIG_DISPLAYREGIONTYPE render_region = {};
if (0 == handle) {
printf("Given handle is invalid. Cannot configure the video render component.\n");
return -1;
}
memset((char*)&render_region, 0x00, sizeof(render_region));
render_region.nSize = sizeof(render_region);
render_region.nVersion.nVersion = OMX_VERSION;
render_region.nPortIndex = inputPort;
render_region.num = 0; /* Display number; 0 selects the default display. */
render_region.mode = OMX_DISPLAY_MODE_FILL; /* Display mode; there are a couple of other types, see `OMX_Broadcom.h` */
render_region.fullscreen = OMX_TRUE;
render_region.noaspect = OMX_TRUE;
render_region.dest_rect.x_offset = 0;
render_region.dest_rect.y_offset = 0;
render_region.dest_rect.width = 1920;
render_region.dest_rect.height = 1080;
render_region.set = (OMX_DISPLAYSETTYPE)(OMX_DISPLAY_SET_NUM | OMX_DISPLAY_SET_DEST_RECT | OMX_DISPLAY_SET_SRC_RECT | OMX_DISPLAY_SET_FULLSCREEN | OMX_DISPLAY_SET_NOASPECT | OMX_DISPLAY_SET_MODE );
omx_res = OMX_SetConfig(handle, OMX_IndexConfigDisplayRegion, &render_region);
if (OMX_ErrorNone != omx_res) {
printf("Failed to set the render region.\n");
return -2;
}
return 0;
}
/* ------------------------------------------------------------------------- */
static OMX_ERRORTYPE callback_video_event(OMX_HANDLETYPE component, OMX_PTR user, OMX_EVENTTYPE event, OMX_U32 data1, OMX_U32 data2, OMX_PTR eventData) {
printf("->>>>> callback_video_event()\n");
printf(">> callback_event: %s.\n", openmax_eventtype_to_string(event).c_str());
//return OMX_ErrorNotReady;
return OMX_ErrorNone;
}
static OMX_ERRORTYPE callback_video_empty_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer) {
printf("->>>>> callback_video_empty_buffer_done()\n");
return OMX_ErrorNone;
}
static OMX_ERRORTYPE callback_video_fill_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer) {
printf("->>>>> callback_video_fill_buffer_done()\n");
return OMX_ErrorNone;
}
/* ------------------------------------------------------------------------- */
static OMX_ERRORTYPE callback_render_event(OMX_HANDLETYPE component, OMX_PTR user, OMX_EVENTTYPE event, OMX_U32 data1, OMX_U32 data2, OMX_PTR eventData) {
printf("-##### callback_render_event: %s.\n", openmax_eventtype_to_string(event).c_str());
return OMX_ErrorNone;
}
static OMX_ERRORTYPE callback_render_empty_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer) {
printf("-##### callback_render_empty_buffer_done().\n");
return OMX_ErrorNone;
}
static OMX_ERRORTYPE callback_render_fill_buffer_done(OMX_HANDLETYPE component, OMX_PTR user, OMX_BUFFERHEADERTYPE* buffer) {
printf("-##### callback_render_fill_buffer_done().\n");
return OMX_ErrorNone;
}
/* ------------------------------------------------------------------------- */
static std::string openmax_videocodingtype_to_string(uint32_t type) {
uint32_t types[] = {
OMX_VIDEO_CodingUnused,
OMX_VIDEO_CodingAutoDetect,
OMX_VIDEO_CodingMPEG2,
OMX_VIDEO_CodingH263,
OMX_VIDEO_CodingMPEG4,
OMX_VIDEO_CodingWMV,
OMX_VIDEO_CodingRV,
OMX_VIDEO_CodingAVC,
OMX_VIDEO_CodingMJPEG,
OMX_VIDEO_CodingKhronosExtensions,
OMX_VIDEO_CodingVendorStartUnused,
OMX_VIDEO_CodingVP6,
OMX_VIDEO_CodingVP7,
OMX_VIDEO_CodingVP8,
OMX_VIDEO_CodingYUV,
OMX_VIDEO_CodingSorenson,
OMX_VIDEO_CodingTheora,
OMX_VIDEO_CodingMVC
};
const char* names[] = {
"OMX_VIDEO_CodingUnused",
"OMX_VIDEO_CodingAutoDetect",
"OMX_VIDEO_CodingMPEG2",
"OMX_VIDEO_CodingH263",
"OMX_VIDEO_CodingMPEG4",
"OMX_VIDEO_CodingWMV",
"OMX_VIDEO_CodingRV",
"OMX_VIDEO_CodingAVC",
"OMX_VIDEO_CodingMJPEG",
"OMX_VIDEO_CodingKhronosExtensions",
"OMX_VIDEO_CodingVendorStartUnused",
"OMX_VIDEO_CodingVP6",
"OMX_VIDEO_CodingVP7",
"OMX_VIDEO_CodingVP8",
"OMX_VIDEO_CodingYUV",
"OMX_VIDEO_CodingSorenson",
"OMX_VIDEO_CodingTheora",
"OMX_VIDEO_CodingMVC",
nullptr
};
for (int i = 0; nullptr != names[i]; ++i) {
if (type == types[i]) {
return names[i];
}
}
return "UNKNOWN";
}
static std::string openmax_colorformattype_to_string(uint32_t type) {
uint32_t types[] = {
OMX_COLOR_FormatUnused,
OMX_COLOR_FormatMonochrome,
OMX_COLOR_Format8bitRGB332,
OMX_COLOR_Format12bitRGB444,
OMX_COLOR_Format16bitARGB4444,
OMX_COLOR_Format16bitARGB1555,
OMX_COLOR_Format16bitRGB565,
OMX_COLOR_Format16bitBGR565,
OMX_COLOR_Format18bitRGB666,
OMX_COLOR_Format18bitARGB1665,
OMX_COLOR_Format19bitARGB1666,
OMX_COLOR_Format24bitRGB888,
OMX_COLOR_Format24bitBGR888,
OMX_COLOR_Format24bitARGB1887,
OMX_COLOR_Format25bitARGB1888,
OMX_COLOR_Format32bitBGRA8888,
OMX_COLOR_Format32bitARGB8888,
OMX_COLOR_FormatYUV411Planar,
OMX_COLOR_FormatYUV411PackedPlanar,
OMX_COLOR_FormatYUV420Planar,
OMX_COLOR_FormatYUV420PackedPlanar,
OMX_COLOR_FormatYUV420SemiPlanar,
OMX_COLOR_FormatYUV422Planar,
OMX_COLOR_FormatYUV422PackedPlanar,
OMX_COLOR_FormatYUV422SemiPlanar,
OMX_COLOR_FormatYCbYCr,
OMX_COLOR_FormatYCrYCb,
OMX_COLOR_FormatCbYCrY,
OMX_COLOR_FormatCrYCbY,
OMX_COLOR_FormatYUV444Interleaved,
OMX_COLOR_FormatRawBayer8bit,
OMX_COLOR_FormatRawBayer10bit,
OMX_COLOR_FormatRawBayer8bitcompressed,
OMX_COLOR_FormatL2,
OMX_COLOR_FormatL4,
OMX_COLOR_FormatL8,
OMX_COLOR_FormatL16,
OMX_COLOR_FormatL24,
OMX_COLOR_FormatL32,
OMX_COLOR_FormatYUV420PackedSemiPlanar,
OMX_COLOR_FormatYUV422PackedSemiPlanar,
OMX_COLOR_Format18BitBGR666,
OMX_COLOR_Format24BitARGB6666,
OMX_COLOR_Format24BitABGR6666,
OMX_COLOR_FormatKhronosExtensions,
OMX_COLOR_FormatVendorStartUnused,
OMX_COLOR_Format32bitABGR8888,
OMX_COLOR_Format8bitPalette,
OMX_COLOR_FormatYUVUV128,
OMX_COLOR_FormatRawBayer12bit,
OMX_COLOR_FormatBRCMEGL,
OMX_COLOR_FormatBRCMOpaque,
OMX_COLOR_FormatYVU420PackedPlanar,
OMX_COLOR_FormatYVU420PackedSemiPlanar,
OMX_COLOR_FormatRawBayer16bit,
OMX_COLOR_FormatYUV420_16PackedPlanar,
OMX_COLOR_FormatYUVUV64_16,
OMX_COLOR_FormatYUV420_10PackedPlanar,
OMX_COLOR_FormatYUVUV64_10,
OMX_COLOR_FormatYUV420_UVSideBySide
};
const char* names[] = {
"OMX_COLOR_FormatUnused",
"OMX_COLOR_FormatMonochrome",
"OMX_COLOR_Format8bitRGB332",
"OMX_COLOR_Format12bitRGB444",
"OMX_COLOR_Format16bitARGB4444",
"OMX_COLOR_Format16bitARGB1555",
"OMX_COLOR_Format16bitRGB565",
"OMX_COLOR_Format16bitBGR565",
"OMX_COLOR_Format18bitRGB666",
"OMX_COLOR_Format18bitARGB1665",
"OMX_COLOR_Format19bitARGB1666,"
"OMX_COLOR_Format24bitRGB888",
"OMX_COLOR_Format24bitBGR888",
"OMX_COLOR_Format24bitARGB1887",
"OMX_COLOR_Format25bitARGB1888",
"OMX_COLOR_Format32bitBGRA8888",
"OMX_COLOR_Format32bitARGB8888",
"OMX_COLOR_FormatYUV411Planar",
"OMX_COLOR_FormatYUV411PackedPlanar",
"OMX_COLOR_FormatYUV420Planar",
"OMX_COLOR_FormatYUV420PackedPlanar",
"OMX_COLOR_FormatYUV420SemiPlanar",
"OMX_COLOR_FormatYUV422Planar",
"OMX_COLOR_FormatYUV422PackedPlanar",
"OMX_COLOR_FormatYUV422SemiPlanar",
"OMX_COLOR_FormatYCbYCr",
"OMX_COLOR_FormatYCrYCb",
"OMX_COLOR_FormatCbYCrY",
"OMX_COLOR_FormatCrYCbY",
"OMX_COLOR_FormatYUV444Interleaved",
"OMX_COLOR_FormatRawBayer8bit",
"OMX_COLOR_FormatRawBayer10bit",
"OMX_COLOR_FormatRawBayer8bitcompressed",
"OMX_COLOR_FormatL2,"
"OMX_COLOR_FormatL4,"
"OMX_COLOR_FormatL8,"
"OMX_COLOR_FormatL16,"
"OMX_COLOR_FormatL24,"
"OMX_COLOR_FormatL32",
"OMX_COLOR_FormatYUV420PackedSemiPlanar",
"OMX_COLOR_FormatYUV422PackedSemiPlanar",
"OMX_COLOR_Format18BitBGR666",
"OMX_COLOR_Format24BitARGB6666",
"OMX_COLOR_Format24BitABGR6666",
"OMX_COLOR_FormatKhronosExtensions",
"OMX_COLOR_FormatVendorStartUnused",
"OMX_COLOR_Format32bitABGR8888",
"OMX_COLOR_Format8bitPalette",
"OMX_COLOR_FormatYUVUV128",
"OMX_COLOR_FormatRawBayer12bit",
"OMX_COLOR_FormatBRCMEGL",
"OMX_COLOR_FormatBRCMOpaque",
"OMX_COLOR_FormatYVU420PackedPlanar",
"OMX_COLOR_FormatYVU420PackedSemiPlanar",
"OMX_COLOR_FormatRawBayer16bit",
"OMX_COLOR_FormatYUV420_16PackedPlanar",
"OMX_COLOR_FormatYUVUV64_16",
"OMX_COLOR_FormatYUV420_10PackedPlanar",
"OMX_COLOR_FormatYUVUV64_10",
"OMX_COLOR_FormatYUV420_UVSideBySide",
nullptr
};
for (int i = 0; nullptr != names[i]; ++i) {
if (type == types[i]) {
return names[i];
}
}
return "UNKNOWN";
}
static std::string openmax_naluformattype_to_string(uint32_t fmt) {
uint32_t formats[] = {
OMX_NaluFormatStartCodes,
OMX_NaluFormatOneNaluPerBuffer,
OMX_NaluFormatOneByteInterleaveLength,
OMX_NaluFormatTwoByteInterleaveLength,
OMX_NaluFormatFourByteInterleaveLength
};
const char* names[] = {
"OMX_NaluFormatStartCodes",
"OMX_NaluFormatOneNaluPerBuffer",
"OMX_NaluFormatOneByteInterleaveLength",
"OMX_NaluFormatTwoByteInterleaveLength",
"OMX_NaluFormatFourByteInterleaveLength",
nullptr
};
std::string result;
int dx = 0;
while (nullptr != names[dx]) {
if (!(fmt & formats[dx])) {
dx++;
continue;
}
if (result.size() > 0) {
result +=", ";
}
result += names[dx];
dx++;
}
return result;
}
/* ------------------------------------------------------------------------- */
static std::string openmax_error_to_string(uint32_t err) {
switch (err) {
case OMX_ErrorNone: { return "OMX_ErrorNone"; }
case OMX_ErrorInsufficientResources: { return "OMX_ErrorInsufficientResources"; }
case OMX_ErrorUndefined: { return "OMX_ErrorUndefined"; }
case OMX_ErrorInvalidComponentName: { return "OMX_ErrorInvalidComponentName"; }
case OMX_ErrorComponentNotFound: { return "OMX_ErrorComponentNotFound"; }
case OMX_ErrorInvalidComponent: { return "OMX_ErrorInvalidComponent"; }
case OMX_ErrorBadParameter: { return "OMX_ErrorBadParameter"; }
case OMX_ErrorNotImplemented: { return "OMX_ErrorNotImplemented"; }
case OMX_ErrorUnderflow: { return "OMX_ErrorUnderflow"; }
case OMX_ErrorOverflow: { return "OMX_ErrorOverflow"; }
case OMX_ErrorHardware: { return "OMX_ErrorHardware"; }
case OMX_ErrorInvalidState: { return "OMX_ErrorInvalidState"; }
case OMX_ErrorStreamCorrupt: { return "OMX_ErrorStreamCorrupt"; }
case OMX_ErrorPortsNotCompatible: { return "OMX_ErrorPortsNotCompatible"; }
case OMX_ErrorResourcesLost: { return "OMX_ErrorResourcesLost"; }
case OMX_ErrorNoMore: { return "OMX_ErrorNoMore"; }
case OMX_ErrorVersionMismatch: { return "OMX_ErrorVersionMismatch"; }
case OMX_ErrorNotReady: { return "OMX_ErrorNotReady"; }
case OMX_ErrorTimeout: { return "OMX_ErrorTimeout"; }
case OMX_ErrorSameState: { return "OMX_ErrorSameState"; }
case OMX_ErrorResourcesPreempted: { return "OMX_ErrorResourcesPreempted"; }
case OMX_ErrorPortUnresponsiveDuringAllocation: { return "OMX_ErrorPortUnresponsiveDuringAllocation"; }
case OMX_ErrorPortUnresponsiveDuringDeallocation: { return "OMX_ErrorPortUnresponsiveDuringDeallocation"; }
case OMX_ErrorPortUnresponsiveDuringStop: { return "OMX_ErrorPortUnresponsiveDuringStop"; }
case OMX_ErrorIncorrectStateTransition: { return "OMX_ErrorIncorrectStateTransition"; }
case OMX_ErrorIncorrectStateOperation: { return "OMX_ErrorIncorrectStateOperation"; }
case OMX_ErrorUnsupportedSetting: { return "OMX_ErrorUnsupportedSetting"; }
case OMX_ErrorUnsupportedIndex: { return "OMX_ErrorUnsupportedIndex"; }
case OMX_ErrorBadPortIndex: { return "OMX_ErrorBadPortIndex"; }
case OMX_ErrorPortUnpopulated: { return "OMX_ErrorPortUnpopulated"; }
case OMX_ErrorComponentSuspended: { return "OMX_ErrorComponentSuspended"; }
case OMX_ErrorDynamicResourcesUnavailable: { return "OMX_ErrorDynamicResourcesUnavailable"; }
case OMX_ErrorMbErrorsInFrame: { return "OMX_ErrorMbErrorsInFrame"; }
case OMX_ErrorFormatNotDetected: { return "OMX_ErrorFormatNotDetected"; }
case OMX_ErrorContentPipeOpenFailed: { return "OMX_ErrorContentPipeOpenFailed"; }
case OMX_ErrorContentPipeCreationFailed: { return "OMX_ErrorContentPipeCreationFailed"; }
case OMX_ErrorSeperateTablesUsed: { return "OMX_ErrorSeperateTablesUsed"; }
case OMX_ErrorTunnelingUnsupported: { return "OMX_ErrorTunnelingUnsupported"; }
case OMX_ErrorMax: { return "OMX_ErrorMax"; }
default: { return "UNKNOWN"; }
}
}
/* ------------------------------------------------------------------------- */
static std::string openmax_dirtype_to_string(uint32_t type) {
uint32_t types[] = {
OMX_DirInput,
OMX_DirOutput
};
const char* names[] = {
"OMX_DirInput",
"OMX_DirOutput",
nullptr
};
for (int i = 0; nullptr != names[i]; ++i) {
if (type == types[i]) {
return names[i];
}
}
return "UNKNOWN";
}
/* ------------------------------------------------------------------------- */
static std::string openmax_eventtype_to_string(uint32_t type) {
switch (type) {
case OMX_EventCmdComplete: { return "OMX_EventCmdComplete"; }
case OMX_EventError: { return "OMX_EventError"; }
case OMX_EventMark: { return "OMX_EventMark"; }
case OMX_EventPortSettingsChanged: { return "OMX_EventPortSettingsChanged"; }
case OMX_EventBufferFlag: { return "OMX_EventBufferFlag"; }
case OMX_EventResourcesAcquired: { return "OMX_EventResourcesAcquired"; }
case OMX_EventComponentResumed: { return "OMX_EventComponentResumed"; }
case OMX_EventDynamicResourcesAvailable: { return "OMX_EventDynamicResourcesAvailable"; }
default: { return "UNKNOWN"; }
};
}
/* ------------------------------------------------------------------------- */
/*
This function will query OpenMAX for the available components
and prints their name to stdout.
*/
static void openmax_print_components() {
char comp_name[256] = {};
int dx = 0;
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
while (true) {
omx_res = OMX_ComponentNameEnum(comp_name, sizeof(comp_name), dx);
if (OMX_ErrorNone != omx_res) {
break;
}
printf("> %s\n", comp_name);
dx++;
}
}
/* ------------------------------------------------------------------------- */
static void print_port_definition(OMX_HANDLETYPE handle, uint32_t portIndex) {
OMX_ERRORTYPE omx_res = OMX_ErrorNone;
OMX_PARAM_PORTDEFINITIONTYPE port_def = {};
port_def.nSize = sizeof(port_def);
port_def.nPortIndex = portIndex;
port_def.nVersion.nVersion = OMX_VERSION;
omx_res = OMX_GetParameter(handle, OMX_IndexParamPortDefinition, &port_def);
if (OMX_ErrorNone != omx_res) {
printf("Failed to get port definition for index %u.\n", portIndex);
return;
}
printf("OMX_PARAM_PORTDEFINITIONTYPE.eDir: %s.\n", openmax_dirtype_to_string(port_def.eDir).c_str());
printf("OMX_PARAM_PORTDEFINITIONTYPE.nBufferCountActual: %u\n", port_def.nBufferCountActual);
printf("OMX_PARAM_PORTDEFINITIONTYPE.nBufferCountMin: %u\n", port_def.nBufferCountMin);
printf("OMX_PARAM_PORTDEFINITIONTYPE.nBufferSize: %u\n", port_def.nBufferSize);
printf("OMX_PARAM_PORTDEFINITIONTYPE.bEnabled: %u\n", port_def.bEnabled);
printf("OMX_PARAM_PORTDEFINITIONTYPE.bPopulated: %u\n", port_def.bPopulated);
printf("OMX_PARAM_PORTDEFINITIONTYPE.bBuffersContiguous: %u\n", port_def.bBuffersContiguous);
printf("OMX_PARAM_PORTDEFINITIONTYPE.nBufferAlignment: %u\n", port_def.nBufferAlignment);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.nFrameWidth: %u\n", port_def.format.video.nFrameWidth);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.nFrameHeight: %u\n", port_def.format.video.nFrameHeight);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.nStride: %u\n", port_def.format.video.nStride);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.nSliceHeight: %u\n", port_def.format.video.nSliceHeight);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.xFramerate: %u\n", port_def.format.video.xFramerate);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.bFlagErrorConcealment: %u\n", port_def.format.video.bFlagErrorConcealment);
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.eCompressionFormat: %s\n", openmax_videocodingtype_to_string(port_def.format.video.eCompressionFormat).c_str());
printf("OMX_PARAM_PORTDEFINITIONTYPE.format.video.eColorFormat: %s\n", openmax_colorformattype_to_string(port_def.format.video.eColorFormat).c_str());
printf("\n");
}
/* ------------------------------------------------------------------------- */