This post shares a fairly practical program I wrote earlier: it captures video in real time on Android and displays the captured stream on a PC. Both the PC side and the Android side are included. It was built against Android 1.5 and tested on an HTC G3. The code follows after the divider.
I searched the web for a long time without finding a way to intercept the Android video stream. It turns out the frame data can be grabbed during the camera's preview: a callback is invoked once for every preview frame.
My development platform is Android 1.5. The first program below captures the preview stream and, as a simple demonstration, writes the 20th frame to a file, which can then be copied to a PC for analysis.
The code:
package com.sunshine;
import java.io.File;
import java.io.RandomAccessFile;
import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;
public class AndroidVideo extends Activity implements Callback,
        Camera.PictureCallback {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    // Note: this callback is only invoked by Camera.takePicture(), which this
    // demo never calls; it is kept here for completeness.
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        try {
            Log.v("System.out", "get it!");
            File file = new File("/sdcard/camera.jpg");
            RandomAccessFile raf = new RandomAccessFile(file, "rw");
            raf.write(data);
            raf.close();
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(width, height);
        mCamera.setPreviewCallback(new StreamIt());
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            Log.v("System.out", ex.toString());
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.stopPreview();
        mPreviewRunning = false;
        // Detach the preview callback before releasing the camera, otherwise
        // a late frame may be delivered to a dead callback.
        mCamera.setPreviewCallback(null);
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        // Orientation-specific handling could go here; nothing is needed
        // for this demo.
    }
}
class StreamIt implements Camera.PreviewCallback {
    private int tick = 1;

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Write exactly one frame (the 20th) to a file for offline analysis.
        if (tick == 20) {
            System.out.println("data len: " + data.length);
            try {
                File file = new File("/sdcard/pal.pal");
                if (!file.exists())
                    file.createNewFile();
                RandomAccessFile raf = new RandomAccessFile(file, "rw");
                raf.write(data);
                raf.close();
                // Incrementing here as well moves tick past 20, so the
                // frame is only written once.
                tick++;
            } catch (Exception ex) {
                Log.v("System.out", ex.toString());
            }
        }
        tick++;
    }
}
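For reference, the length printed by onPreviewFrame can be predicted from the preview size: on devices of that era the default preview format is NV21 (YUV420SP), one byte per pixel of luma plus an interleaved VU plane at quarter resolution. A minimal sketch of that arithmetic (the class name is mine, not part of the project):

```java
public class FrameSize {
    // NV21 (YUV420SP): full-resolution Y plane followed by an interleaved
    // VU plane with one V and one U byte per 2x2 block of pixels.
    static int nv21Bytes(int width, int height) {
        return width * height + 2 * ((width + 1) / 2) * ((height + 1) / 2);
    }

    public static void main(String[] args) {
        System.out.println(nv21Bytes(320, 240)); // 115200 for a 320x240 preview
        System.out.println(nv21Bytes(240, 160)); // 57600, the size used later in this post
    }
}
```

In other words, the frame is width * height * 3 / 2 bytes for even dimensions, which is why a YUV buffer is always sized that way below.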
The XML layout file:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical">
    <SurfaceView
        android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />
</LinearLayout>
Note that the camera permission must also be declared in the project's manifest file:
<uses-permission android:name="android.permission.CAMERA" />
A bit of research shows that each preview frame Android delivers is in YUV420 format (the semi-planar YUV420SP/NV21 layout on most devices).
Below is a function that converts a YUV420SP frame to RGB:
static public void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    if (rgbBuf == null)
        throw new NullPointerException("buffer 'rgbBuf' is null");
    if (rgbBuf.length < frameSize * 3)
        throw new IllegalArgumentException("buffer 'rgbBuf' size "
                + rgbBuf.length + " < minimum " + frameSize * 3);
    if (yuv420sp == null)
        throw new NullPointerException("buffer 'yuv420sp' is null");
    if (yuv420sp.length < frameSize * 3 / 2)
        throw new IllegalArgumentException("buffer 'yuv420sp' size " + yuv420sp.length
                + " < minimum " + frameSize * 3 / 2);

    int i = 0, y = 0;
    int uvp = 0, u = 0, v = 0;
    int y1192 = 0, r = 0, g = 0, b = 0;

    for (int j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            y1192 = 1192 * y;
            r = (y1192 + 1634 * v);
            g = (y1192 - 833 * v - 400 * u);
            b = (y1192 + 2066 * u);

            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;

            rgbBuf[yp * 3] = (byte)(r >> 10);
            rgbBuf[yp * 3 + 1] = (byte)(g >> 10);
            rgbBuf[yp * 3 + 2] = (byte)(b >> 10);
        }
    }
}
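The fixed-point coefficients above (1192, 1634, 833, 400, 2066 — roughly 1024 times the BT.601 conversion weights) can be sanity-checked on a single pixel: zero chroma (U = V = 128) must produce equal R, G, and B. A small standalone sketch using the same per-pixel math as decodeYUV420SP (class and method names are mine):

```java
public class YuvCheck {
    // Same fixed-point YUV -> RGB math as decodeYUV420SP, for one pixel.
    static int[] yuvToRgb(int yVal, int uVal, int vVal) {
        int y = (0xff & yVal) - 16;
        if (y < 0) y = 0;
        int u = (0xff & uVal) - 128;
        int v = (0xff & vVal) - 128;
        int y1192 = 1192 * y;
        int r = y1192 + 1634 * v;
        int g = y1192 - 833 * v - 400 * u;
        int b = y1192 + 2066 * u;
        // Clamp to the 18-bit intermediate range, then drop the 10 fraction bits.
        r = Math.max(0, Math.min(262143, r));
        g = Math.max(0, Math.min(262143, g));
        b = Math.max(0, Math.min(262143, b));
        return new int[] { r >> 10, g >> 10, b >> 10 };
    }

    public static void main(String[] args) {
        // Mid-grey input: Y=128 with zero chroma.
        int[] grey = yuvToRgb(128, 128, 128);
        System.out.println(grey[0] + "," + grey[1] + "," + grey[2]); // prints 130,130,130
    }
}
```

Y = 128 maps to (128 - 16) * 1192 >> 10 = 130 on all three channels, and Y = 16 (video black) maps to 0,0,0, which is what the clamp on y guarantees.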
The conversion code comes from http://chenweihuacwh.javaeye.com/blog/571223 — thanks to cwh643.
-----------------------------Divider-------------------------------------------
-----------------------------Updated 2010-10-13-------------------------------
Android side
package com.sunshine;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;
import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.view.SurfaceHolder.Callback;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
public class AndroidVideo extends Activity implements Callback, OnClickListener {
    private SurfaceView mSurfaceView = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private boolean mPreviewRunning = false;
    // connection state
    private EditText remoteIP = null;
    private Button connect = null;
    private String remoteIPStr = null;
    // video data
    private StreamIt streamIt = null;
    public static Kit kit = null;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) this.findViewById(R.id.surface_camera);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        remoteIP = (EditText) this.findViewById(R.id.remoteIP);
        connect = (Button) this.findViewById(R.id.connect);
        connect.setOnClickListener(this);
    }
    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }
        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(width, height);
        streamIt = new StreamIt();
        kit = new Kit();
        mCamera.setPreviewCallback(streamIt);
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }
    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.stopPreview();
        mPreviewRunning = false;
        // Detach the preview callback before releasing the camera.
        mCamera.setPreviewCallback(null);
        mCamera.release();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        // Nothing orientation-specific is needed for this demo.
    }
    class Kit implements Runnable {
        private boolean run = true;
        // One 240x160 YUV420SP frame is 57600 bytes; it is sent in two
        // 28800-byte halves. These sizes must match the PC side.
        private final int tt = 28800;

        public void run() {
            try {
                Socket socket = new Socket(remoteIPStr, 8899);
                DataOutputStream dos = new DataOutputStream(socket
                        .getOutputStream());
                DataInputStream dis = new DataInputStream(socket
                        .getInputStream());
                while (run) {
                    // streamIt.yuv420sp is null until the first preview frame
                    // arrives, so connect only after the preview has started.
                    dos.write(streamIt.yuv420sp, 0, tt);
                    dos.write(streamIt.yuv420sp, tt, tt);
                    dis.readBoolean(); // wait for the PC's acknowledgement
                    Thread.sleep(155);
                }
            } catch (Exception ex) {
                run = false;
                ex.printStackTrace();
            }
        }
    }
    @Override
    public void onClick(View view) {
        if (view == connect) { // connect button handler
            remoteIPStr = remoteIP.getText().toString();
            new Thread(AndroidVideo.kit).start();
        }
    }
}
class StreamIt implements Camera.PreviewCallback {
    public byte[] yuv420sp = null;

    public void onPreviewFrame(byte[] data, Camera camera) {
        // Just keep a reference to the latest frame; the sender thread
        // reads it asynchronously.
        yuv420sp = data;
    }
}
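The wire format between the two ends is as simple as it gets: the phone pushes one raw 57600-byte YUV420SP frame as two 28800-byte writes on a plain TCP socket (port 8899), then blocks on a one-byte boolean acknowledgement from the PC before sleeping and sending the next frame. A desktop-only loopback sketch of that exchange (class and method names are mine; the sizes are the hard-coded ones from the code above):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class ProtocolSketch {
    static final int FRAME = 240 * 160 * 3 / 2; // 57600: one YUV420SP frame
    static final int CHUNK = FRAME / 2;         // 28800: sent in two halves

    // PC role: receive exactly one frame on an already-bound socket, then ack.
    static byte[] receiveFrame(ServerSocket server) throws Exception {
        try (Socket s = server.accept();
             DataInputStream dis = new DataInputStream(s.getInputStream());
             DataOutputStream dos = new DataOutputStream(s.getOutputStream())) {
            byte[] frame = new byte[FRAME];
            dis.readFully(frame);   // unlike read(), guarantees a full frame
            dos.writeBoolean(true); // the ack that paces the sender
            return frame;
        }
    }

    // Phone role: send one frame in two chunks, then wait for the ack.
    static void sendFrame(String host, int port, byte[] frame) throws Exception {
        try (Socket s = new Socket(host, port);
             DataOutputStream dos = new DataOutputStream(s.getOutputStream());
             DataInputStream dis = new DataInputStream(s.getInputStream())) {
            dos.write(frame, 0, CHUNK);
            dos.write(frame, CHUNK, CHUNK);
            dis.readBoolean();
        }
    }

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0); // any free port for the test
        int port = server.getLocalPort();
        byte[] frame = new byte[FRAME];
        frame[0] = 42;
        Thread sender = new Thread(() -> {
            try {
                sendFrame("127.0.0.1", port, frame);
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        sender.start();
        byte[] got = receiveFrame(server);
        sender.join();
        server.close();
        System.out.println(got.length + " bytes, first byte " + got[0]);
        // prints: 57600 bytes, first byte 42
    }
}
```

Note the receiver uses readFully rather than read: on a real network a single read can return fewer bytes than requested, which would silently corrupt the frame.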
PC side
import java.awt.Frame;
import java.awt.Graphics;
import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.PixelInterleavedSampleModel;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;
public class FlushMe extends Frame {
    private static final long serialVersionUID = 1L;
    private BufferedImage im;
    // image parameters
    // private final int width = 480;
    // private final int height = 320;
    private static final int width = 240;
    private static final int height = 160;
    private static final int numBands = 3;
    private static final int dataLen = 57600; // one 240x160 YUV420SP frame
    private static final int tt = 28800;      // half a frame, as sent by the phone
    // image buffers
    private byte[] byteArray = new byte[width * height * numBands]; // RGB data
    private byte[] yuv420sp = new byte[dataLen];                    // YUV data
    private static final int[] bandOffsets = new int[] { 0, 1, 2 };
    private static final SampleModel sampleModel = new PixelInterleavedSampleModel(
            DataBuffer.TYPE_BYTE, width, height, 3, width * 3,
            bandOffsets);
    // ColorModel
    private static final ColorSpace cs = ColorSpace.getInstance(ColorSpace.CS_sRGB);
    private static final ComponentColorModel cm = new ComponentColorModel(cs, false, false,
            Transparency.OPAQUE, DataBuffer.TYPE_BYTE);

    public FlushMe() {
        super("Flushing");
        updateIM();
        setSize(480, 320);
        // close the window cleanly
        this.addWindowListener(new java.awt.event.WindowAdapter() {
            public void windowClosing(java.awt.event.WindowEvent e) {
                System.exit(0);
            }
        });
        // center the window
        this.setLocationRelativeTo(null);
        this.setResizable(false);
        this.setVisible(true);
        this.getData();
    }
    public void update(Graphics g) {
        paint(g);
    }

    public void paint(Graphics g) {
        g.drawImage(im, 0, 0, 480, 320, this);
    }

    public void getData() {
        try {
            ServerSocket server = new ServerSocket(8899);
            Socket socket = server.accept();
            DataInputStream dis = new DataInputStream(socket.getInputStream());
            DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
            while (true) {
                for (int i = 0; i < dataLen / tt; i++) {
                    // readFully blocks until the whole chunk has arrived;
                    // a plain read() may return a partial buffer.
                    dis.readFully(yuv420sp, i * tt, tt);
                }
                // update the display as soon as a full frame is in
                updateIM();
                im.flush();
                repaint();
                dos.writeBoolean(true);
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
    private void updateIM() {
        try {
            // convert the YUV frame to RGB
            decodeYUV420SP(byteArray, yuv420sp, width, height);
            // The second argument is the buffer size: use the full array
            // length, not numBands.
            DataBuffer dataBuffer = new DataBufferByte(byteArray, byteArray.length);
            WritableRaster wr = Raster.createWritableRaster(sampleModel,
                    dataBuffer, new Point(0, 0));
            im = new BufferedImage(cm, wr, false, null);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
    private static void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp,
            int width, int height) {
        final int frameSize = width * height;
        if (rgbBuf == null)
            throw new NullPointerException("buffer 'rgbBuf' is null");
        if (rgbBuf.length < frameSize * 3)
            throw new IllegalArgumentException("buffer 'rgbBuf' size "
                    + rgbBuf.length + " < minimum " + frameSize * 3);
        if (yuv420sp == null)
            throw new NullPointerException("buffer 'yuv420sp' is null");
        if (yuv420sp.length < frameSize * 3 / 2)
            throw new IllegalArgumentException("buffer 'yuv420sp' size "
                    + yuv420sp.length + " < minimum " + frameSize * 3 / 2);
        int i = 0, y = 0;
        int uvp = 0, u = 0, v = 0;
        int y1192 = 0, r = 0, g = 0, b = 0;
        for (int j = 0, yp = 0; j < height; j++) {
            uvp = frameSize + (j >> 1) * width;
            u = 0;
            v = 0;
            for (i = 0; i < width; i++, yp++) {
                y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0)
                    y = 0;
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                y1192 = 1192 * y;
                r = (y1192 + 1634 * v);
                g = (y1192 - 833 * v - 400 * u);
                b = (y1192 + 2066 * u);
                if (r < 0) r = 0; else if (r > 262143) r = 262143;
                if (g < 0) g = 0; else if (g > 262143) g = 262143;
                if (b < 0) b = 0; else if (b > 262143) b = 262143;
                rgbBuf[yp * 3] = (byte) (r >> 10);
                rgbBuf[yp * 3 + 1] = (byte) (g >> 10);
                rgbBuf[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
    }

    public static void main(String[] args) {
        Frame f = new FlushMe();
    }
}
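The Raster plumbing in updateIM() can be exercised on its own: wrap an interleaved RGB byte array in a BufferedImage without copying, then read a pixel back through getRGB(). A standalone sketch under the same sRGB/byte-component assumptions as FlushMe (the class name is mine):

```java
import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.PixelInterleavedSampleModel;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;

public class RasterSketch {
    // Wraps an interleaved R,G,B byte array in a BufferedImage without
    // copying, the same construction updateIM() uses.
    static BufferedImage wrapRgb(byte[] rgb, int width, int height) {
        SampleModel sm = new PixelInterleavedSampleModel(
                DataBuffer.TYPE_BYTE, width, height, 3, width * 3,
                new int[] { 0, 1, 2 });
        DataBuffer db = new DataBufferByte(rgb, rgb.length);
        WritableRaster wr = Raster.createWritableRaster(sm, db, new Point(0, 0));
        ColorSpace cs = ColorSpace.getInstance(ColorSpace.CS_sRGB);
        ComponentColorModel cm = new ComponentColorModel(
                cs, false, false, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
        return new BufferedImage(cm, wr, false, null);
    }

    public static void main(String[] args) {
        int w = 4, h = 2;
        byte[] rgb = new byte[w * h * 3];
        rgb[0] = (byte) 255; // top-left pixel pure red
        BufferedImage im = wrapRgb(rgb, w, h);
        System.out.printf("%08x%n", im.getRGB(0, 0)); // prints ffff0000
    }
}
```

Because the BufferedImage shares the byte array, overwriting the array and calling repaint() is enough to show the next frame — no per-frame allocation is needed beyond the BufferedImage wrapper itself.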
Here is a screenshot of the result.