Android: how to integrate a decoder into the multimedia framework

Recently, I successfully ported a video decoder to Android. I have also rendered the decoded output to the surface and verified it using our own test APIs. The next task is to implement playback, pause, streaming, etc., that is, the other functions of a media player. That would amount to reinventing the wheel, since all these functions are already provided by the Android multimedia framework. I have heard that we can build our decoder as a plugin and integrate it into the Android multimedia framework. Although I searched along those lines, I could hardly find any information about it. So I kindly ask readers to share any relevant links or solutions to the above problem. Thanks in advance; awaiting your reply.

1 answer

In the Android Stagefright (SF) framework, codecs are registered through media_codecs.xml. In the standard Android distribution, for example, media_codecs.xml can be found here. All audio/video components are registered as OMX components.

1. Registering the codec

To register your video decoder, you need to add a new entry to the <Decoders> list. To ensure that your codec is always picked up, make sure it is listed as the first entry for the specific MIME type. An example entry for an H.264 decoder could look like this.

 <Decoders>
     <MediaCodec name="OMX.ABC.XYZ.H264.DECODER" type="video/avc" >
         <Quirk name="requires-allocate-on-input-ports" />
         <Quirk name="requires-allocate-on-output-ports" />
     </MediaCodec>
     <MediaCodec name="OMX.google.h264.decoder" type="video/avc" />
 </Decoders>

Where

a. OMX.ABC.XYZ.H264.DECODER is the name of your component.

b. video/avc is the MIME type of your component. In this example, it denotes an AVC/H.264 decoder.

c. The next two statements describe the quirks, i.e. special requirements, of your component. In this example, requires-allocate-on-input-ports informs the Stagefright framework that the component prefers to allocate the buffers on all of its input ports. Similarly, the second quirk informs the framework that the component also prefers to allocate the buffers on its output ports. For the list of quirks supported by the system, refer to OMXCodec::getComponentQuirks in OMXCodec.cpp. These quirks are translated into flags, which the framework then reads while creating and initializing the components, as sketched below.
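For illustration only, here is a simplified, hypothetical sketch (not the actual AOSP code) of how quirk names from media_codecs.xml could be translated into flag bits. The flag names mirror constants used by OMXCodec; the values and the helper function are placeholders.

 // Hypothetical sketch: mapping quirk strings from media_codecs.xml to flags.
 // The flag names mirror constants declared in OMXCodec.h; the values and the
 // helper itself are placeholders, not the real AOSP implementation.
 #include <cstring>
 #include <cstdint>

 enum {
     kRequiresAllocateBufferOnInputPorts  = 1 << 0,  // placeholder value
     kRequiresAllocateBufferOnOutputPorts = 1 << 1,  // placeholder value
 };

 static uint32_t quirkFlagForName(const char *name) {
     if (strcmp(name, "requires-allocate-on-input-ports") == 0) {
         return kRequiresAllocateBufferOnInputPorts;
     }
     if (strcmp(name, "requires-allocate-on-output-ports") == 0) {
         return kRequiresAllocateBufferOnOutputPorts;
     }
     return 0;  // unrecognized quirks are simply ignored
 }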

The example above shows that your OMX component is registered ahead of the default Google video decoder, so it will be selected first for that MIME type.

NOTE: If you are trying this on a target device, you need to make sure that this entry is reflected in the final media_codecs.xml on that device.

2. Registering the OMX core

To make sure that your component is created through the correct factory method, you need to register your OMX core with the Stagefright framework.

To register a new core, you need to create a new library named libstagefrighthw.so, which will be located in /system/lib on your target system. This library needs to export the createOMXPlugin symbol, which the framework looks up with dlsym.
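A minimal sketch of what such a library could look like is shown below, assuming the OMXPluginBase interface from the Stagefright headers (the header path varies between Android versions). MyOMXPlugin and its stubbed methods are placeholders for your own implementation.

 // Sketch of a vendor OMX plugin. MyOMXPlugin is a placeholder; a real
 // implementation would create and destroy instances of your OMX component.
 #include <media/hardware/OMXPluginBase.h>  // header path may differ per Android version

 namespace android {

 struct MyOMXPlugin : public OMXPluginBase {
     virtual OMX_ERRORTYPE makeComponentInstance(
             const char *name,
             const OMX_CALLBACKTYPE *callbacks,
             OMX_PTR appData,
             OMX_COMPONENTTYPE **component) {
         // Instantiate OMX.ABC.XYZ.H264.DECODER here when 'name' matches.
         return OMX_ErrorComponentNotFound;
     }

     virtual OMX_ERRORTYPE destroyComponentInstance(
             OMX_COMPONENTTYPE *component) {
         return OMX_ErrorNone;
     }

     virtual OMX_ERRORTYPE enumerateComponents(
             OMX_STRING name, size_t size, OMX_U32 index) {
         return OMX_ErrorNoMore;  // report your component names here
     }

     virtual OMX_ERRORTYPE getRolesOfComponent(
             const char *name, Vector<String8> *roles) {
         return OMX_ErrorComponentNotFound;
     }
 };

 }  // namespace android

 // Exported factory symbol that the framework resolves via dlsym().
 extern "C" android::OMXPluginBase *createOMXPlugin() {
     return new android::MyOMXPlugin;
 }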

OMX core registration works as follows: OMXMaster calls addVendorPlugin, which internally calls addPlugin("libstagefrighthw.so"). addPlugin looks up createOMXPlugin, through which the other function pointers for makeComponentInstance, destroyComponentInstance, etc., are resolved.
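In simplified form, the framework-side resolution looks roughly like the following sketch; it is modeled on OMXMaster::addPlugin, not copied from it, and the function name here is hypothetical.

 // Simplified sketch of how the framework resolves the plugin entry point;
 // the real logic lives in OMXMaster::addPlugin() inside Stagefright.
 #include <dlfcn.h>
 #include <media/hardware/OMXPluginBase.h>  // header path may differ per Android version

 static void addPluginSketch() {
     void *handle = dlopen("libstagefrighthw.so", RTLD_NOW);
     if (handle == NULL) {
         return;  // no vendor plugin installed on this device
     }

     typedef android::OMXPluginBase *(*CreateOMXPluginFunc)();
     CreateOMXPluginFunc createOMXPlugin =
             (CreateOMXPluginFunc)dlsym(handle, "createOMXPlugin");

     if (createOMXPlugin != NULL) {
         android::OMXPluginBase *plugin = (*createOMXPlugin)();
         // The plugin is then used through makeComponentInstance(),
         // enumerateComponents(), etc., whenever codecs are created.
         (void)plugin;
     }
 }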

Once the OMX core is initialized, you are ready to run your own component within the Android framework. The OMXMaster source can be found here.

With these changes, your video decoder is integrated into the Android Stagefright framework.
