
Android framework: Media

 

Media

Android includes Stagefright, a media playback engine at the native level that has built-in software-based codecs for popular media formats.

Stagefright audio and video playback features include integration with OpenMAX codecs, session management, time-synchronized rendering, transport control, and DRM.

Stagefright also supports integration with custom hardware codecs provided by you. To set a hardware path to encode and decode media, you must implement a hardware-based codec as an OpenMax IL (Integration Layer) component.

Note: Stagefright updates can occur through the Android monthly security update process and as part of an Android OS release.

 

Architecture


Media applications interact with the Android native multimedia framework according to the following architecture.



 Figure 1. Media architecture

Application Framework
At the application framework level is application code that utilizes android.media APIs to interact with the multimedia hardware.
Binder IPC
The Binder IPC proxies facilitate communication over process boundaries. They are located in the frameworks/av/media/libmedia directory and begin with the letter "I".
Native Multimedia Framework
At the native level, Android provides a multimedia framework that utilizes the Stagefright engine for audio and video recording and playback. Stagefright comes with a default list of supported software codecs and you can implement your own hardware codec by using the OpenMax integration layer standard. For more implementation details, see the MediaPlayer and Stagefright components located in frameworks/av/media.
OpenMAX Integration Layer (IL)
The OpenMAX IL provides a standardized way for Stagefright to recognize and use custom hardware-based multimedia codecs called components. You must provide an OpenMAX plugin in the form of a shared library named libstagefrighthw.so. This plugin links Stagefright with your custom codec components, which must be implemented according to the OpenMAX IL component standard.
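
For orientation, the sketch below paraphrases how Stagefright locates such a plugin (based on the OMXMaster code in frameworks/av/media/libstagefright/omx; the exact code varies by Android release): the library is opened with dlopen() and a createOMXPlugin factory symbol is resolved from it.

// Paraphrased sketch of Stagefright's plugin discovery; see OMXMaster.cpp
// in your release for the authoritative version.
#include <dlfcn.h>
#include <media/hardware/OMXPluginBase.h>

static android::OMXPluginBase *loadVendorPlugin() {
    // The plugin must be installed on the device as libstagefrighthw.so.
    void *handle = dlopen("libstagefrighthw.so", RTLD_NOW);
    if (handle == NULL) {
        return NULL;  // No vendor plugin: only the built-in software codecs.
    }

    typedef android::OMXPluginBase *(*CreateOMXPluginFunc)();
    CreateOMXPluginFunc createOMXPlugin =
            (CreateOMXPluginFunc)dlsym(handle, "createOMXPlugin");
    if (createOMXPlugin == NULL) {
        dlclose(handle);
        return NULL;  // Library found, but it does not export the factory.
    }

    // The returned object enumerates and instantiates the vendor's
    // OpenMAX IL components on Stagefright's behalf.
    return (*createOMXPlugin)();
}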

Implementing custom codecs


Stagefright comes with built-in software codecs for common media formats, but you can also add your own custom hardware codecs as OpenMAX components. To do this, you must create the OMX components and an OMX plugin that hooks your custom codecs up to the Stagefright framework. For example components, see the hardware/ti/omap4xxx/domx/ directory; for an example plugin for the Galaxy Nexus, see hardware/ti/omap4xxx/libstagefrighthw.

To add your own codecs:

  1. Create your components according to the OpenMAX IL component standard. The component interface is located in the frameworks/native/include/media/OpenMAX/OMX_Component.h file. To learn more about the OpenMAX IL specification, refer to the OpenMAX website.
  2. Create an OpenMAX plugin that links your components with the Stagefright service. For the interfaces to create the plugin, see the frameworks/native/include/media/hardware/OMXPluginBase.h and HardwareAPI.h header files; a minimal plugin skeleton appears after this list.
  3. Build your plugin as a shared library with the name libstagefrighthw.so in your product Makefile. For example:

     

    LOCAL_MODULE := libstagefrighthw

     

    In your device's Makefile, ensure you declare the module as a product package:

    PRODUCT_PACKAGES +=\
      libstagefrighthw \
      ...
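
The skeleton below shows the general shape such a plugin can take. It is a hedged sketch, assuming the OMXPluginBase interface declared in frameworks/native/include/media/hardware/OMXPluginBase.h; the MyVendorOMXPlugin class and its stub behavior are hypothetical placeholders, not AOSP code.

// Hypothetical plugin skeleton; class name and method bodies are placeholders.
#include <media/hardware/OMXPluginBase.h>

namespace android {

struct MyVendorOMXPlugin : public OMXPluginBase {
    // Instantiate one of the vendor's OpenMAX IL components by name.
    virtual OMX_ERRORTYPE makeComponentInstance(
            const char *name, const OMX_CALLBACKTYPE *callbacks,
            OMX_PTR appData, OMX_COMPONENTTYPE **component) {
        return OMX_ErrorComponentNotFound;  // Real code constructs the codec here.
    }

    virtual OMX_ERRORTYPE destroyComponentInstance(
            OMX_COMPONENTTYPE *component) {
        return OMX_ErrorNone;
    }

    // Report the component names this plugin provides, one per index.
    virtual OMX_ERRORTYPE enumerateComponents(
            OMX_STRING name, size_t size, OMX_U32 index) {
        return OMX_ErrorNoMore;  // Real code copies the index-th name into name.
    }

    // Report the OpenMAX roles (for example "video_decoder.avc") of a component.
    virtual OMX_ERRORTYPE getRolesOfComponent(
            const char *name, Vector<String8> *roles) {
        roles->clear();
        return OMX_ErrorNone;
    }
};

}  // namespace android

// Factory entry point that Stagefright resolves from libstagefrighthw.so.
extern "C" android::OMXPluginBase *createOMXPlugin() {
    return new android::MyVendorOMXPlugin;
}

Some newer releases also look up a matching destroyOMXPlugin entry point, so check the OMXMaster code in your source tree for the exact set of symbols it expects.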

Exposing codecs to the framework


The Stagefright service parses the system/etc/media_codecs.xml and system/etc/media_profiles.xml files to expose the supported codecs and profiles on the device to app developers via the android.media.MediaCodecList and android.media.CamcorderProfile classes. You must create both files in the device/<company>/<device>/ directory and copy them over to the system image's system/etc directory in your device's Makefile. For example:

PRODUCT_COPY_FILES +=\
  device/samsung/tuna/media_profiles.xml:system/etc/media_profiles.xml \
  device/samsung/tuna/media_codecs.xml:system/etc/media_codecs.xml \

For complete examples, see device/samsung/tuna/media_codecs.xml and device/samsung/tuna/media_profiles.xml.
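
For orientation, media_codecs.xml declares each OMX component the device offers and the MIME type it handles, split into decoder and encoder sections. Below is a minimal hedged skeleton; the component names are hypothetical placeholders, so consult the tuna files above for real entries.

<MediaCodecs>
    <Decoders>
        <!-- Hypothetical hardware decoder exposed by the OMX plugin. -->
        <MediaCodec name="OMX.vendor.video.decoder.avc" type="video/avc" />
    </Decoders>
    <Encoders>
        <!-- Hypothetical hardware encoder exposed by the OMX plugin. -->
        <MediaCodec name="OMX.vendor.video.encoder.avc" type="video/avc" />
    </Encoders>
</MediaCodecs>

These entries are what android.media.MediaCodecList reports to applications at run time.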

Note: As of Android 4.1, the <Quirk> element for media codecs is no longer supported.

 
