Android provides a media playback engine at the native level called Stagefright that comes built in with software-based codecs for several popular media formats. Stagefright features for audio and video playback include integration with OpenMAX codecs, session management, time-synchronized rendering, transport control, and DRM. In addition, Stagefright supports integration with custom hardware codecs that you provide. There is no HAL to implement for custom codecs; instead, to provide a hardware path for encoding and decoding media, you implement your hardware-based codec as an OpenMAX IL (Integration Layer) component.
Overview
The following diagram shows how media applications interact with the Android native multimedia framework.
- Application Framework
- At the application framework level is the app's code, which utilizes the android.media APIs to interact with the multimedia hardware.
- Binder IPC
- The Binder IPC proxies facilitate communication over process boundaries. They are located in the frameworks/av/media/libmedia directory and begin with the letter "I".
- Native Multimedia Framework
- At the native level, Android provides a multimedia framework that utilizes the Stagefright engine for audio and video recording and playback. Stagefright comes with a default list of supported software codecs, and you can implement your own hardware codec by using the OpenMAX integration layer standard. For more implementation details, see the various MediaPlayer and Stagefright components located in frameworks/av/media.
- OpenMAX Integration Layer (IL)
- The OpenMAX IL provides a standardized way for Stagefright to recognize and use custom hardware-based multimedia codecs called components. You must provide an OpenMAX plugin in the form of a shared library named libstagefrighthw.so. This plugin links your custom codec components to Stagefright. Your custom codecs must be implemented according to the OpenMAX IL component standard.
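Concretely, Stagefright obtains the plugin by loading libstagefrighthw.so and resolving a createOMXPlugin() factory function that returns an object implementing the OMXPluginBase interface (declared in frameworks/native/include/media/hardware/OMXPluginBase.h). The following is a minimal sketch of that entry point; VendorOMXPlugin is a hypothetical class standing in for your implementation of the interface's methods (enumerateComponents(), makeComponentInstance(), destroyComponentInstance(), and getRolesOfComponent()).

// Minimal sketch of the libstagefrighthw.so entry point. VendorOMXPlugin
// is hypothetical; it must implement the pure virtual methods declared in
// frameworks/native/include/media/hardware/OMXPluginBase.h.
#include <media/hardware/OMXPluginBase.h>

#include "VendorOMXPlugin.h"  // hypothetical vendor header for the subclass

namespace android {

// Stagefright loads libstagefrighthw.so and resolves this symbol to obtain
// the vendor plugin, which it then queries for available hardware codecs.
extern "C" OMXPluginBase *createOMXPlugin() {
    return new VendorOMXPlugin;
}

}  // namespace android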
Implementing Custom Codecs
Stagefright comes with built-in software codecs for common media formats, but you can also add your own custom hardware codecs as OpenMAX components. To do this, you need to create OMX components and an OMX plugin that hooks your custom codecs into the Stagefright framework. See hardware/ti/omap4xxx/domx/ for example components and hardware/ti/omap4xxx/libstagefrighthw for an example plugin for the Galaxy Nexus.
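Each OMX component is ultimately an OMX_COMPONENTTYPE structure whose function pointers Stagefright invokes for state changes, parameter negotiation, and buffer exchange; the component fills in that table when it is instantiated. The sketch below illustrates the idea with hypothetical names (everything prefixed with Vendor/vendor is an assumption, not part of the framework); the authoritative function-pointer signatures are in OMX_Component.h.

// Sketch of a component entry point that fills in the standard
// OMX_COMPONENTTYPE function table. All Vendor/vendor names are hypothetical.
#include <OMX_Component.h>
#include <OMX_Core.h>

// Hypothetical per-instance state for a hardware AVC decoder.
struct VendorAvcDecoder {
    // hardware handles, port definitions, buffer queues, ...
};

// Representative entry points; each must match the corresponding function
// pointer signature in OMX_COMPONENTTYPE.
static OMX_ERRORTYPE vendorSendCommand(
        OMX_HANDLETYPE component, OMX_COMMANDTYPE cmd,
        OMX_U32 param, OMX_PTR cmdData) {
    // Queue the state or port command to the hardware codec here.
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE vendorEmptyThisBuffer(
        OMX_HANDLETYPE component, OMX_BUFFERHEADERTYPE *header) {
    // Hand the input (encoded) buffer to the hardware for decoding.
    return OMX_ErrorNone;
}

// Called when the plugin instantiates the component on Stagefright's behalf.
OMX_ERRORTYPE VendorAvcDecoder_Init(OMX_COMPONENTTYPE *component) {
    component->nSize = sizeof(OMX_COMPONENTTYPE);
    component->pComponentPrivate = new VendorAvcDecoder;
    component->SendCommand = vendorSendCommand;
    component->EmptyThisBuffer = vendorEmptyThisBuffer;
    // ... GetParameter, SetParameter, FillThisBuffer, SetCallbacks, UseBuffer,
    // AllocateBuffer, FreeBuffer, GetState, ComponentDeInit, and the remaining
    // entry points are assigned in the same way ...
    return OMX_ErrorNone;
}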
To add your own codecs:
- Create your components according to the OpenMAX IL component standard. The component interface is located in the frameworks/native/include/media/OpenMAX/OMX_Component.h file. To learn more about the OpenMAX IL specification, see the OpenMAX website.
- Create an OpenMAX plugin that links your components with the Stagefright service. See the frameworks/native/include/media/hardware/OMXPluginBase.h and HardwareAPI.h header files for the interfaces to create the plugin. A sketch of such a plugin appears after this list.
- Build your plugin as a shared library with the name libstagefrighthw.so in your product Makefile. For example:

LOCAL_MODULE := libstagefrighthw

In your device's Makefile, ensure that you declare the module as a product package:

PRODUCT_PACKAGES += \
  libstagefrighthw \
  ...
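The plugin referenced in step 2 is small: it enumerates your component names, instantiates a component on request, and reports the OpenMAX roles each component supports. The sketch below assumes a single hypothetical component named OMX.vendor.video.decoder.avc and paraphrases the method signatures from OMXPluginBase.h; treat that header, not this sketch, as the authoritative interface.

// Sketch of an OMXPluginBase implementation for libstagefrighthw.so.
// VendorOMXPlugin, OMX.vendor.video.decoder.avc, and VendorAvcDecoder_Init()
// are hypothetical names.
#include <string.h>

#include <media/hardware/OMXPluginBase.h>
#include <utils/String8.h>
#include <utils/Vector.h>

namespace android {

// Hypothetical component entry point (see the component sketch above).
OMX_ERRORTYPE VendorAvcDecoder_Init(OMX_COMPONENTTYPE *component);

struct VendorOMXPlugin : public OMXPluginBase {
    // Stagefright calls this with an increasing index until OMX_ErrorNoMore
    // is returned, collecting the names of the available components.
    virtual OMX_ERRORTYPE enumerateComponents(
            OMX_STRING name, size_t size, OMX_U32 index) {
        if (index > 0) {
            return OMX_ErrorNoMore;
        }
        strncpy(name, "OMX.vendor.video.decoder.avc", size);
        return OMX_ErrorNone;
    }

    // Instantiate the named component and hand its handle back to Stagefright.
    virtual OMX_ERRORTYPE makeComponentInstance(
            const char *name, const OMX_CALLBACKTYPE *callbacks,
            OMX_PTR appData, OMX_COMPONENTTYPE **component) {
        if (strcmp(name, "OMX.vendor.video.decoder.avc") != 0) {
            return OMX_ErrorInvalidComponentName;
        }
        *component = new OMX_COMPONENTTYPE;
        OMX_ERRORTYPE err = VendorAvcDecoder_Init(*component);
        if (err != OMX_ErrorNone) {
            delete *component;
            *component = NULL;
            return err;
        }
        // Register the callbacks Stagefright wants invoked for events and
        // returned buffers.
        return (*component)->SetCallbacks(
                *component, const_cast<OMX_CALLBACKTYPE *>(callbacks), appData);
    }

    virtual OMX_ERRORTYPE destroyComponentInstance(
            OMX_COMPONENTTYPE *component) {
        OMX_ERRORTYPE err = component->ComponentDeInit(component);
        delete component;
        return err;
    }

    // Advertise the OpenMAX roles the component implements, for example
    // "video_decoder.avc".
    virtual OMX_ERRORTYPE getRolesOfComponent(
            const char *name, Vector<String8> *roles) {
        roles->clear();
        roles->push(String8("video_decoder.avc"));
        return OMX_ErrorNone;
    }
};

}  // namespace android

The createOMXPlugin() entry point shown in the Overview section simply returns a new instance of a class like this.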
Exposing Codecs to the Framework
The Stagefright service parses the system/etc/media_codecs.xml and system/etc/media_profiles.xml files to expose the supported codecs and profiles on the device to app developers via the android.media.MediaCodecList and android.media.CamcorderProfile classes. You need to create both files in the device/<company_name>/<device_name>/ directory and copy them over to the system image's system/etc directory in your device's Makefile.
For example:
PRODUCT_COPY_FILES += \
device/samsung/tuna/media_profiles.xml:system/etc/media_profiles.xml \
device/samsung/tuna/media_codecs.xml:system/etc/media_codecs.xml \
See the device/samsung/tuna/media_codecs.xml and device/samsung/tuna/media_profiles.xml files for complete examples.
Note: The <Quirk> element for media codecs is no longer supported by Android starting in Jelly Bean.