This article's example implements reading Android camera data and encoding it in real time to H.264, saved as an FLV file. The example covers:
1. Building the FFmpeg libraries for the Android platform
2. Calling FFmpeg from Java through JNI
3. The basic flow of grabbing Android camera frames and encoding them with FFmpeg on a background thread
which makes it a fairly comprehensive exercise.
Building the FFmpeg libraries for Android
Normally we build the FFmpeg libraries on x86, but Android phones are ARM devices, so we first have to cross-compile on x86 to produce FFmpeg libraries usable on ARM. Google provides a cross-compilation toolchain for exactly this task, the NDK, which can run under Linux; the download page offers both a Windows and a Linux version, and the former can be used on Windows through Cygwin (roughly, a tool that emulates a Linux environment on Windows).
I use android-ndk-r10e under Cygwin here. Downloading and installing the NDK is largely point-and-click, but note that Linux distinguishes strictly between x86 and x64, unlike Windows. Also, when installing Cygwin you only need to select the binutils, gcc, gcc-mingw, gdb and make packages. After installation, launch cygwin.bat in the Cygwin install directory to open a command window and run make -version to verify the setup.

With these preparations in place, we can start building the FFmpeg sources.
First, modify the configure script so that the version number does not end up after the .so suffix in the shared library file names; Android cannot load a library named that way. With the change below, for example, libavcodec.so.56 becomes libavcodec-56.so, matching the names used in Android.mk later.
Find the following lines:
- SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
- LIB_INSTALL_EXTRA_CMD='$$(RANLIB)"$(LIBDIR)/$(LIBNAME)"'
- SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
- SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'
and change them to
- SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
- LIB_INSTALL_EXTRA_CMD='$$(RANLIB)"$(LIBDIR)/$(LIBNAME)"'
- SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
- SLIB_INSTALL_LINKS='$(SLIBNAME)'
之后進行常規(guī)的configure配置,make, make install步驟即可,下面給出一個常規(guī)的腳本示例
- make clean
-
- export NDK=xxxx/android-ndk-r10e
- export PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt
- export PLATFORM=$NDK/platforms/android-8/arch-arm
- export PREFIX=../android_ffmpeglib
-
- ./configure --target-os=linux --prefix=$PREFIX \
- --enable-cross-compile \
- --enable-runtime-cpudetect \
- --disable-asm \
- --arch=arm \
- --cc=$PREBUILT/windows/bin/arm-linux-androideabi-gcc \
- --cross-prefix=$PREBUILT/windows/bin/arm-linux-androideabi- \
- --disable-stripping \
- --nm=$PREBUILT/windows/bin/arm-linux-androideabi-nm \
- --sysroot=$PLATFORM \
- --enable-gpl --enable-shared --disable-static --enable-small \
- --disable-ffprobe --disable-ffplay --disable-ffmpeg --disable-ffserver --disable-debug \
- --extra-cflags="-fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -marm -march=armv7-a"
-
- make
- make install
After a successful build, the generated libraries (libavcodec-56.so, libavformat-56.so, libavutil-54.so, libswscale-3.so, and so on) appear under the prefix directory:

Using FFmpeg from Java through JNI
JNI, the Java Native Interface, is a protocol for communication between Java code and external native C/C++ code: through it, Java code can call external native code, and the external C/C++ code can call back into Java. Put simply, it maps a C function onto a Java method.
Some JNI concepts:
-- native : the Java modifier for declaring a native method; a method carrying it has no body;
-- native method : a Java method declared with the native keyword;
-- JNI layer : the Java side where native methods are declared;
-- JNI functions : the functions provided through JNIEnv, defined in jni.h;
-- JNI method : the C/C++ implementation of a native method, i.e. the C code placed in the jni directory;
The hello-jni project under the NDK's samples directory walks through the whole workflow, which boils down to the following steps:
1. Create a regular Android project
2. Declare the native method, e.g. public native String stringFromJNI();
3. Create the C file
Create a jni directory in the project root, then create a C source file in it and add #include <jni.h>. The C function declaration looks like
- jstring
- Java_com_example_hellojni_HelloJni_stringFromJNI( JNIEnv* env,
- jobject thiz )
jstring is the counterpart of Java's String type, and the function name follows the pattern Java_fullPackageName_className_methodName();
-- the JNIEnv parameter represents the Java environment; through it the native code can call back into Java;
-- the jobject parameter is the object on which the native method was invoked; thiz denotes the current object, i.e. an instance of the class declaring the JNI method (a complete example follows below);
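A minimal complete body for this function, modeled on the NDK's hello-jni sample, looks like this:
- #include <jni.h>
-
- //Builds a Java String from a C string via the JNI environment
- //and returns it to the Java caller.
- jstring
- Java_com_example_hellojni_HelloJni_stringFromJNI( JNIEnv* env,
- jobject thiz )
- {
- return (*env)->NewStringUTF(env, "Hello from JNI !");
- }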
4. Write the Android.mk file
This plays the role of the usual makefile, e.g.
- LOCAL_PATH := $(call my-dir)
-
- include $(CLEAR_VARS)
-
- LOCAL_MODULE := hello-jni
- LOCAL_SRC_FILES := hello-jni.c
-
- include $(BUILD_SHARED_LIBRARY)
-- LOCAL_PATH : the directory containing the mk file;
-- include $(CLEAR_VARS) : a build-system helper that resets the LOCAL_* variables;
-- LOCAL_MODULE : the name of the resulting .so library;
-- LOCAL_SRC_FILES : the source files to compile;
-- include $(BUILD_SHARED_LIBRARY) : tells the build system to produce a shared library;
5. Build the shared library with the NDK
With the .mk file above in place, go through /cygdrive to the corresponding project directory on the Windows side and run the build; when it finishes, the .so file is generated automatically under the libs directory, and the C functions can then be called from Java.
6. Load the shared library in Java
In a static initializer block of the Java class, load the compiled .so library with System.loadLibrary(), e.g.
- static {
- System.loadLibrary("hello-jni");
- }
One caveat: debugging JNI code is quite painful, since you cannot single-step through the C file; you are left printing messages and inspecting them with logcat, so it is worth running the C code once on the desktop (e.g. in VS) as a quick sanity check first.
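As an illustration, a minimal sketch of that print-and-logcat approach (the "MyJni" tag and the helper function are arbitrary examples, not part of the project code):
- #include <android/log.h>
-
- //Writes to the Android system log; view with `adb logcat -s MyJni`.
- #define LOGI(format, ...) __android_log_print(ANDROID_LOG_INFO, "MyJni", format, ##__VA_ARGS__)
-
- static void report_frame_size(int width, int height)
- {
- LOGI("frame size: %dx%d", width, height);
- }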
Applying the outline above to encoding Android camera data with FFmpeg:
Step 1: declare the following four native methods
- //JNI
- //initialize the encoder with the width and height of the frames to encode
- public native int initial(int width,int height);
- //encode one frame of YUV data
- public native int encode(byte[] yuvimage);
- //flush the frames buffered inside the encoder
- public native int flush();
- //clean up
- public native int close();
Step 2: the corresponding C file is given below. It is essentially the standard flow of encoding YUV data into an H.264 FLV file; the only real catch is that Android camera frames arrive in NV21 pixel format and must be converted to YUV420P before they can be encoded.
- /**
- * FFmpeg Android Camera Encoder
- *
- * 張暉 Hui Zhang
- * zhanghuicuc@gmail.com
- * Communication University of China / Digital TV Technology
- */
-
- #include <stdio.h>
- #include <time.h>
-
- #include "libavcodec/avcodec.h"
- #include "libavformat/avformat.h"
- #include "libswscale/swscale.h"
- #include "libavutil/log.h"
-
- #ifdef ANDROID
- #include <jni.h>
- #include <android/log.h>
- #define LOGE(format, ...) __android_log_print(ANDROID_LOG_ERROR, "(>_<)", format, ##__VA_ARGS__)
- #define LOGI(format, ...) __android_log_print(ANDROID_LOG_INFO, "(=_=)", format, ##__VA_ARGS__)
- #else
- #define LOGE(format, ...) printf("(>_<) " format "\n", ##__VA_ARGS__)
- #define LOGI(format, ...) printf("(^_^) " format "\n", ##__VA_ARGS__)
- #endif
-
- AVFormatContext *ofmt_ctx;
- AVStream* video_st;
- AVCodecContext* pCodecCtx;
- AVCodec* pCodec;
- AVPacket enc_pkt;
- AVFrame *pFrameYUV;
-
- int framecnt = 0;
- int yuv_width;
- int yuv_height;
- int y_length;
- int uv_length;
- int64_t start_time;
-
- //Output FFmpeg's av_log()
- void custom_log(void *ptr, int level, const char* fmt, va_list vl){
- FILE *fp=fopen("/storage/emulated/0/av_log.txt","a+");
- if(fp){
- vfprintf(fp,fmt,vl);
- fflush(fp);
- fclose(fp);
- }
- }
-
- JNIEXPORT jint JNICALL Java_com_zhanghui_test_MainActivity_initial
- (JNIEnv *env, jobject obj,jint width,jint height)
- {
- const char* out_path = "/sdcard/zhanghui/testffmpeg.flv";
- yuv_width=width;
- yuv_height=height;
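- //for YUV420P, the Y plane holds width*height samples and each
- //chroma plane (U and V) holds width*height/4 samples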
- y_length=width*height;
- uv_length=width*height/4;
-
- //FFmpeg av_log() callback
- av_log_set_callback(custom_log);
-
- av_register_all();
-
- //output initialize
- avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", out_path);
- //output encoder initialize
- pCodec = avcodec_find_encoder(AV_CODEC_ID_H264);
- if (!pCodec){
- LOGE("Can not find encoder!\n");
- return -1;
- }
- pCodecCtx = avcodec_alloc_context3(pCodec);
- pCodecCtx->pix_fmt = AV_PIX_FMT_YUV420P;
- pCodecCtx->width = width;
- pCodecCtx->height = height;
- pCodecCtx->time_base.num = 1;
- pCodecCtx->time_base.den = 30;
- pCodecCtx->bit_rate = 800000;
- pCodecCtx->gop_size = 300;
- /* Some formats want stream headers to be separate. */
- if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
- pCodecCtx->flags |= CODEC_FLAG_GLOBAL_HEADER;
-
- //H264 codec param
- //pCodecCtx->me_range = 16;
- //pCodecCtx->max_qdiff = 4;
- //pCodecCtx->qcompress = 0.6;
- pCodecCtx->qmin = 10;
- pCodecCtx->qmax = 51;
- //Optional Param
- pCodecCtx->max_b_frames = 3;
- // Set H264 preset and tune
- AVDictionary *param = 0;
- av_dict_set(&param, "preset", "ultrafast", 0);
- av_dict_set(&param, "tune", "zerolatency", 0);
-
- if (avcodec_open2(pCodecCtx, pCodec, &param) < 0){
- LOGE("Failed to open encoder!\n");
- return -1;
- }
-
- //Add a new stream to the output; must be called before avformat_write_header() for muxing
- video_st = avformat_new_stream(ofmt_ctx, pCodec);
- if (video_st == NULL){
- return -1;
- }
- video_st->time_base.num = 1;
- video_st->time_base.den = 30;
- video_st->codec = pCodecCtx;
-
- //Open the output URL; set up before avformat_write_header() for muxing
- if (avio_open(&ofmt_ctx->pb, out_path, AVIO_FLAG_READ_WRITE) < 0){
- LOGE("Failed to open output file!\n");
- return -1;
- }
-
- //Write File Header
- avformat_write_header(ofmt_ctx, NULL);
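- //note: for FLV, avformat_write_header() resets the stream time_base
- //to 1/1000 (FLV timestamps are in milliseconds), hence the
- //{ 1, 1000 } remarks in the functions below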
-
- start_time = av_gettime();
- return 0;
- }
-
- JNIEXPORT jint JNICALL Java_com_zhanghui_test_MainActivity_encode
- (JNIEnv *env, jobject obj, jbyteArray yuv)
- {
- int ret;
- int enc_got_frame=0;
- int i=0;
-
- pFrameYUV = avcodec_alloc_frame();
- uint8_t *out_buffer = (uint8_t *)av_malloc(avpicture_get_size(AV_PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height));
- avpicture_fill((AVPicture *)pFrameYUV, out_buffer, AV_PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
-
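- //NV21 (Android's default preview format) stores the full Y plane first,
- //followed by interleaved V/U samples: Y..Y V U V U ...
- //YUV420P instead uses three separate planes: Y..Y U..U V..V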
- //Android camera frames arrive as NV21; convert them to YUV420P here
- jbyte* in= (jbyte*)(*env)->GetByteArrayElements(env,yuv,0);
- memcpy(pFrameYUV->data[0],in,y_length);
- for(i=0;i<uv_length;i++)
- {
- *(pFrameYUV->data[2]+i)=*(in+y_length+i*2); //V plane
- *(pFrameYUV->data[1]+i)=*(in+y_length+i*2+1); //U plane
- }
- (*env)->ReleaseByteArrayElements(env,yuv,in,0); //release the pinned Java array
-
- pFrameYUV->format = AV_PIX_FMT_YUV420P;
- pFrameYUV->width = yuv_width;
- pFrameYUV->height = yuv_height;
-
- enc_pkt.data = NULL;
- enc_pkt.size = 0;
- av_init_packet(&enc_pkt);
- ret = avcodec_encode_video2(pCodecCtx, &enc_pkt, pFrameYUV, &enc_got_frame);
- av_frame_free(&pFrameYUV);
- av_free(out_buffer); //not owned by the frame, so free it here to avoid leaking per call
-
- if (enc_got_frame == 1){
- LOGI("Succeed to encode frame: %5d\tsize:%5d\n", framecnt, enc_pkt.size);
- framecnt++;
- enc_pkt.stream_index = video_st->index;
-
- //Write PTS
- AVRational time_base = ofmt_ctx->streams[0]->time_base;//{ 1, 1000 };
- AVRational r_framerate1 = {60, 2 };//{ 50, 2 };
- AVRational time_base_q = { 1, AV_TIME_BASE };
- //Duration between two frames, in AV_TIME_BASE units (microseconds)
- int64_t calc_duration = (double)(AV_TIME_BASE)*(1 / av_q2d(r_framerate1));
- //Parameters
- //enc_pkt.pts = (double)(framecnt*calc_duration)*(double)(av_q2d(time_base_q)) / (double)(av_q2d(time_base));
- enc_pkt.pts = av_rescale_q(framecnt*calc_duration, time_base_q, time_base);
- enc_pkt.dts = enc_pkt.pts;
- enc_pkt.duration = av_rescale_q(calc_duration, time_base_q, time_base); //(double)(calc_duration)*(double)(av_q2d(time_base_q)) / (double)(av_q2d(time_base));
- enc_pkt.pos = -1;
-
- //Delay
- int64_t pts_time = av_rescale_q(enc_pkt.dts, time_base, time_base_q);
- int64_t now_time = av_gettime() - start_time;
- if (pts_time > now_time)
- av_usleep(pts_time - now_time);
-
- ret = av_interleaved_write_frame(ofmt_ctx, &enc_pkt);
- av_free_packet(&enc_pkt);
- }
-
- return 0;
- }
-
- JNIEXPORT jint JNICALL Java_com_zhanghui_test_MainActivity_flush
- (JNIEnv *env, jobject obj)
- {
- int ret;
- int got_frame;
- AVPacket enc_pkt;
- if (!(ofmt_ctx->streams[0]->codec->codec->capabilities &
- CODEC_CAP_DELAY))
- return 0;
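- //encoders with CODEC_CAP_DELAY buffer frames internally (e.g. for B-frames);
- //feeding NULL as the input frame below drains the remaining packets
- //until got_frame comes back 0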
- while (1) {
- enc_pkt.data = NULL;
- enc_pkt.size = 0;
- av_init_packet(&enc_pkt);
- ret = avcodec_encode_video2(ofmt_ctx->streams[0]->codec, &enc_pkt,
- NULL, &got_frame);
- if (ret < 0)
- break;
- if (!got_frame){
- ret = 0;
- break;
- }
- LOGI("Flush Encoder: Succeed to encode 1 frame!\tsize:%5d\n", enc_pkt.size);
-
- //Write PTS
- AVRational time_base = ofmt_ctx->streams[0]->time_base;//{ 1, 1000 };
- AVRational r_framerate1 = { 60, 2 };
- AVRational time_base_q = { 1, AV_TIME_BASE };
- //Duration between two frames, in AV_TIME_BASE units (microseconds)
- int64_t calc_duration = (double)(AV_TIME_BASE)*(1 / av_q2d(r_framerate1));
- //Parameters
- enc_pkt.pts = av_rescale_q(framecnt*calc_duration, time_base_q, time_base);
- enc_pkt.dts = enc_pkt.pts;
- enc_pkt.duration = av_rescale_q(calc_duration, time_base_q, time_base);
-
- //Convert PTS/DTS
- enc_pkt.pos = -1;
- framecnt++;
- ofmt_ctx->duration = enc_pkt.duration * framecnt;
-
- /* mux encoded frame */
- ret = av_interleaved_write_frame(ofmt_ctx, &enc_pkt);
- if (ret < 0)
- break;
- }
- //Write file trailer
- av_write_trailer(ofmt_ctx);
- return 0;
- }
-
- JNIEXPORT jint JNICALL Java_com_zhanghui_test_MainActivity_close
- (JNIEnv *env, jobject obj)
- {
- if (video_st)
- avcodec_close(video_st->codec);
- avio_close(ofmt_ctx->pb);
- avformat_free_context(ofmt_ctx);
- return 0;
- }
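To make the timestamp arithmetic concrete, here is a small standalone sketch (only illustrative; it assumes the 30 fps rate configured above and FLV's 1/1000 stream time base):
- #include <stdio.h>
- #include "libavutil/avutil.h" //AV_TIME_BASE
- #include "libavutil/mathematics.h" //av_rescale_q
-
- int main(void)
- {
- AVRational time_base_q = { 1, AV_TIME_BASE }; //internal time base, microseconds
- AVRational flv_tb = { 1, 1000 }; //FLV timestamps are in milliseconds
- AVRational r_framerate1 = { 60, 2 }; //i.e. 30 fps
-
- //one frame lasts AV_TIME_BASE/30 = 33333 microseconds
- int64_t calc_duration = (double)(AV_TIME_BASE)*(1 / av_q2d(r_framerate1));
-
- //frame 90, i.e. 3 seconds in: 90*33333 us rescales to 3000 ms
- int64_t pts = av_rescale_q(90 * calc_duration, time_base_q, flv_tb);
- printf("pts = %lld ms\n", (long long)pts); //prints: pts = 3000 ms
- return 0;
- }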
Step 3: write Android.mk as follows. FFmpeg's header files need to be placed under the jni directory, exactly as in ordinary use of the library.
- LOCAL_PATH := $(call my-dir)
-
- # FFmpeg library
- include $(CLEAR_VARS)
- LOCAL_MODULE := avcodec
- LOCAL_SRC_FILES := libavcodec-56.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := avdevice
- LOCAL_SRC_FILES := libavdevice-56.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := avfilter
- LOCAL_SRC_FILES := libavfilter-5.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := avformat
- LOCAL_SRC_FILES := libavformat-56.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := avutil
- LOCAL_SRC_FILES := libavutil-54.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := postproc
- LOCAL_SRC_FILES := libpostproc-53.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := swresample
- LOCAL_SRC_FILES := libswresample-1.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- include $(CLEAR_VARS)
- LOCAL_MODULE := swscale
- LOCAL_SRC_FILES := libswscale-3.so
- include $(PREBUILT_SHARED_LIBRARY)
-
- # Program
- include $(CLEAR_VARS)
- LOCAL_MODULE := encode
- LOCAL_SRC_FILES := encode.c
- LOCAL_C_INCLUDES += $(LOCAL_PATH)/include
- LOCAL_LDLIBS := -llog -lz
- LOCAL_SHARED_LIBRARIES := avcodec avdevice avfilter avformat avutil postproc swresample swscale
- include $(BUILD_SHARED_LIBRARY)
Step 4: in Cygwin, switch to the current project's jni directory and run ndk-build to build everything:

The project's libs directory now contains all of FFmpeg's libraries together with our own, in my case libencode.so:

Step 5: load the shared libraries in Java
- static{
- System.loadLibrary("avutil-54");
- System.loadLibrary("swresample-1");
- System.loadLibrary("avcodec-56");
- System.loadLibrary("avformat-56");
- System.loadLibrary("swscale-3");
- System.loadLibrary("postproc-53");
- System.loadLibrary("avfilter-5");
- System.loadLibrary("avdevice-56");
- System.loadLibrary("encode");
- }
At this point the four native methods declared above can be called from Java. Note that the libraries are loaded dependencies first, with our own encode library last.
Capturing and encoding camera data on the Java side
In short, the Camera's PreviewCallback delivers each preview frame, and an AsyncTask encodes it on a background thread. To keep the encoding from hogging hardware resources, the video width and height are set to 640 and 480.
There are two pitfalls here. First, the arg0 parameter of onPreviewFrame already contains the raw YUV data; do not convert it into a YuvImage object, which would prepend extra header information and break the encoding. Second, the flush() method that empties the buffered frames must not be called from the main thread in onCreate, or the program crashes; I have not yet found the reason, so here it is called together with close() from onPause(). I am still new to Android development, and would welcome pointers from anyone who understands this issue.
- package com.zhanghui.test;
-
- import java.io.IOException;
- import android.annotation.TargetApi;
- import android.app.Activity;
- import android.content.pm.PackageManager;
- import android.hardware.Camera;
- import android.os.AsyncTask;
- import android.os.Build;
- import android.os.Bundle;
- import android.util.Log;
- import android.view.Menu;
- import android.view.MenuItem;
- import android.view.SurfaceHolder;
- import android.view.SurfaceView;
- import android.view.View;
- import android.widget.Button;
- import android.widget.Toast;
-
- @SuppressWarnings("deprecation")
- public class MainActivity extends Activity {
-     private static final String TAG = "MainActivity";
-     private Button mTakeButton;
-     private Camera mCamera;
-     private SurfaceView mSurfaceView;
-     private SurfaceHolder mSurfaceHolder;
-     private boolean isRecording = false;
-
-     //encodes one camera frame on a background thread
-     private class StreamTask extends AsyncTask<Void, Void, Void> {
-
-         private byte[] mData;
-
-         //constructor
-         StreamTask(byte[] data) {
-             this.mData = data;
-         }
-
-         @Override
-         protected Void doInBackground(Void... params) {
-             if (mData != null) {
-                 Log.i(TAG, "fps: " + mCamera.getParameters().getPreviewFrameRate());
-                 encode(mData);
-             }
-             return null;
-         }
-     }
-     private StreamTask mStreamTask;
-
-     @Override
-     protected void onCreate(Bundle savedInstanceState) {
-         super.onCreate(savedInstanceState);
-         setContentView(R.layout.activity_main);
-
-         final Camera.PreviewCallback mPreviewCallbacx = new Camera.PreviewCallback() {
-             @Override
-             public void onPreviewFrame(byte[] arg0, Camera arg1) {
-                 //arg0 is already raw NV21 data; hand it straight to the encoder
-                 if (null != mStreamTask) {
-                     switch (mStreamTask.getStatus()) {
-                     case RUNNING:
-                         return;
-                     case PENDING:
-                         mStreamTask.cancel(false);
-                         break;
-                     }
-                 }
-                 mStreamTask = new StreamTask(arg0);
-                 mStreamTask.execute((Void) null);
-             }
-         };
-
-         mTakeButton = (Button) findViewById(R.id.take_button);
-
-         PackageManager pm = this.getPackageManager();
-         boolean hasCamera = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA) ||
-                 pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FRONT) ||
-                 Build.VERSION.SDK_INT < Build.VERSION_CODES.GINGERBREAD;
-         if (!hasCamera)
-             mTakeButton.setEnabled(false);
-
-         mTakeButton.setOnClickListener(new View.OnClickListener() {
-             @Override
-             public void onClick(View arg0) {
-                 if (mCamera != null) {
-                     if (isRecording) {
-                         mTakeButton.setText("Start");
-                         mCamera.setPreviewCallback(null);
-                         Toast.makeText(MainActivity.this, "encode done", Toast.LENGTH_SHORT).show();
-                         isRecording = false;
-                     } else {
-                         mTakeButton.setText("Stop");
-                         initial(mCamera.getParameters().getPreviewSize().width, mCamera.getParameters().getPreviewSize().height);
-                         mCamera.setPreviewCallback(mPreviewCallbacx);
-                         isRecording = true;
-                     }
-                 }
-             }
-         });
-
-         mSurfaceView = (SurfaceView) findViewById(R.id.surfaceView1);
-         SurfaceHolder holder = mSurfaceView.getHolder();
-         holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
-
-         holder.addCallback(new SurfaceHolder.Callback() {
-
-             @Override
-             public void surfaceDestroyed(SurfaceHolder arg0) {
-                 if (mCamera != null) {
-                     mCamera.stopPreview();
-                     mSurfaceView = null;
-                     mSurfaceHolder = null;
-                 }
-             }
-
-             @Override
-             public void surfaceCreated(SurfaceHolder arg0) {
-                 try {
-                     if (mCamera != null) {
-                         mCamera.setPreviewDisplay(arg0);
-                         mSurfaceHolder = arg0;
-                     }
-                 } catch (IOException exception) {
-                     Log.e(TAG, "Error setting up preview display", exception);
-                 }
-             }
-
-             @Override
-             public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
-                 if (mCamera == null) return;
-                 Camera.Parameters parameters = mCamera.getParameters();
-                 parameters.setPreviewSize(640, 480);
-                 parameters.setPictureSize(640, 480);
-                 mCamera.setParameters(parameters);
-                 try {
-                     mCamera.startPreview();
-                     mSurfaceHolder = arg0;
-                 } catch (Exception e) {
-                     Log.e(TAG, "could not start preview", e);
-                     mCamera.release();
-                     mCamera = null;
-                 }
-             }
-         });
-     }
-
-     @TargetApi(9)
-     @Override
-     protected void onResume() {
-         super.onResume();
-         if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
-             mCamera = Camera.open(0);
-         } else {
-             mCamera = Camera.open();
-         }
-     }
-
-     @Override
-     protected void onPause() {
-         super.onPause();
-         //flush() crashes when called from onCreate's main flow; calling it
-         //together with close() here in onPause works (see the note above)
-         flush();
-         close();
-         if (mCamera != null) {
-             mCamera.release();
-             mCamera = null;
-         }
-     }
-
-     @Override
-     public boolean onCreateOptionsMenu(Menu menu) {
-         // Inflate the menu; this adds items to the action bar if it is present.
-         getMenuInflater().inflate(R.menu.main, menu);
-         return true;
-     }
-
-     @Override
-     public boolean onOptionsItemSelected(MenuItem item) {
-         int id = item.getItemId();
-         if (id == R.id.action_settings) {
-             return true;
-         }
-         return super.onOptionsItemSelected(item);
-     }
-
-     //JNI
-     public native int initial(int width, int height);
-     public native int encode(byte[] yuvimage);
-     public native int flush();
-     public native int close();
-
-     static {
-         System.loadLibrary("avutil-54");
-         System.loadLibrary("swresample-1");
-         System.loadLibrary("avcodec-56");
-         System.loadLibrary("avformat-56");
-         System.loadLibrary("swscale-3");
-         System.loadLibrary("postproc-53");
-         System.loadLibrary("avfilter-5");
-         System.loadLibrary("avdevice-56");
-         System.loadLibrary("encode");
-     }
- }
This completes the implementation. Screenshots of the running program follow.
The user first presses the take button to start capturing:

The button then changes to stop; pressing it stops the capture:

Once stop is pressed, capturing ends and an "encode done" toast appears; after exiting the program, the encoded FLV video can be found in the output directory:
