RESClient is the overall entry point; it holds RESVideoClient, RESAudioClient and RESRtmpSender.

RESVideoClient captures image data and passes it to RESVideoCore.

RESVideoCore comes in two flavors. RESSoftVideoCore is the soft-filter mode: it obtains the raw image array via setPreviewCallbackWithBuffer, applies filters by processing the NV21 byte array on the CPU, hardware-encodes with MediaCodec in buffer2buffer fashion, has Packager wrap the encoded data into RTMP format, and hands it to RESRtmpSender. RESHardVideoCore is the hard-filter mode: it obtains the camera texture via setPreviewTexture, applies filters with OpenGL ES by drawing the camera texture onto a surface, hardware-encodes with MediaCodec in surface2surface fashion, and likewise has Packager package the encoded data before handing it to RESRtmpSender.

RESAudioClient captures audio data and passes it to RESAudioCore, which hardware-encodes it with MediaCodec in buffer2buffer fashion and passes the result to RESRtmpSender.

Finally, RESRtmpSender calls librtmp through a native-layer interface to actually send the RTMP packets. In short: the Clients capture data, the Cores process data, and RESRtmpSender sends it.
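A hypothetical sketch of that division of labor (only the RESFlvDataCollecter interface, whose collect(RESFlvData, int) method appears later in this walkthrough, is taken from the source; the rest is illustrative, not the library's actual signatures):

// Illustrative only: how Client / Core / Sender fit together.
interface RESFlvDataCollecter {
    void collect(RESFlvData flvData, int type);    // every packaged FLV tag ends up here
}

// Role played by RESSoftVideoCore / RESHardVideoCore (method shape assumed):
//   start(RESFlvDataCollecter collecter, SurfaceTexture camTex)
//       -> start filtering + encoding; each encoded frame is packaged and pushed into collecter
//
// Data flow, soft mode: Camera NV21 buffer -> CPU filter -> MediaCodec (buffer2buffer)
// Data flow, hard mode: SurfaceTexture -> OpenGL ES filter -> MediaCodec input Surface (surface2surface)
// Both modes:           encoded frame -> Packager -> RESFlvData -> RESRtmpSender -> librtmp (native)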
1. Everything starts from the following call:
resClient.start();
When resClient is created, an initialization step builds the RTMP sender and the data collecter:

rtmpSender = new RESRtmpSender();
rtmpSender.prepare(coreParameters);

prepare() just spins up a dedicated HandlerThread for sending:

public void prepare(RESCoreParameters coreParameters) {
    workHandlerThread = new HandlerThread("RESRtmpSender,workHandlerThread");
    workHandlerThread.start();
    workHandler = new WorkHandler(coreParameters.senderQueueLength,
            new FLvMetaData(coreParameters),
            workHandlerThread.getLooper());
}

Then a dataCollecter (data collector) is created; it simply forwards every FLV tag it receives to rtmpSender:

dataCollecter = new RESFlvDataCollecter() {
    @Override
    public void collect(RESFlvData flvData, int type) {
        rtmpSender.feed(flvData, type);
    }
};
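So the sender runs on its own HandlerThread. A minimal sketch of how feed() might hand data over to that thread (the message constant and the handler internals are assumptions; only the feed(RESFlvData, int) call and the WorkHandler construction above come from the source):

public void feed(RESFlvData flvData, int type) {
    synchronized (syncOp) {
        if (workHandler != null) {
            // post the packaged tag to the sender thread; a real implementation would let the
            // handler drop frames once more than senderQueueLength items are pending
            workHandler.sendMessage(
                    workHandler.obtainMessage(MSG_SEND /* assumed constant */, type, 0, flvData));
        }
    }
}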
2. resClient's start() method connects to the server, then starts the video and audio clients:

public void start() {
    synchronized (SyncOp) {
        rtmpSender.start(coreParameters.rtmpAddr);   // connect to the RTMP server
        videoClient.start(dataCollecter);
        audioClient.start(dataCollecter);
        LogTools.d("RESClient,start()");
    }
}

3. Next, look at RESVideoClient's start() method:
public boolean start(RESFlvDataCollecter flvDataCollecter) {
    if (!startVideo()) {
        resCoreParameters.dump();
        LogTools.e("RESVideoClient,start(),failed");
        return false;
    }
    videoCore.start(flvDataCollecter, camTexture);
    return true;
}

startVideo() starts the camera preview; in soft-filter mode the preview callback (registered with setPreviewCallbackWithBuffer, as described in the overview) then hands each raw NV21 frame to the core via ((RESSoftVideoCore) videoCore).queueVideo(data), as sketched next.
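A minimal sketch of that wiring, assuming the standard android.hardware.Camera buffer-callback setup the overview describes (previewBufferSize and everything not named above is illustrative):

// Sketch of the camera wiring inside startVideo(), soft mode:
camera.addCallbackBuffer(new byte[previewBufferSize]);        // reusable NV21 buffer (size assumed)
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        ((RESSoftVideoCore) videoCore).queueVideo(data);      // hand the raw frame to the core
        camera.addCallbackBuffer(data);                       // return the buffer for the next frame
    }
});
camera.startPreview();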
queueVideo() itself looks like this:
public void queueVideo(byte[] rawVideoFrame) {
    synchronized (syncOp) {
        if (runState != STATE.RUNING) {
            return;
        }
        int targetIndex = (lastVideoQueueBuffIndex + 1) % orignVideoBuffs.length;
        if (orignVideoBuffs[targetIndex].isReadyToFill) {
            LogTools.d("queueVideo,accept ,targetIndex" + targetIndex);
            // frames from the Android camera come out sideways, and the front camera is also
            // left-right mirrored; acceptVideo rotates rawVideoFrame and copies the result
            // into orignVideoBuffs
            acceptVideo(rawVideoFrame, orignVideoBuffs[targetIndex].buff);
            orignVideoBuffs[targetIndex].isReadyToFill = false;
            lastVideoQueueBuffIndex = targetIndex;
            videoFilterHandler.sendMessage(videoFilterHandler.obtainMessage(VideoFilterHandler.WHAT_INCOMING_BUFF, targetIndex, 0));
        } else {
            LogTools.d("queueVideo,abandon,targetIndex" + targetIndex);
        }
    }
}

Then, by executing
videoFilterHandler.sendMessage(videoFilterHandler.obtainMessage(VideoFilterHandler.WHAT_INCOMING_BUFF, targetIndex, 0));
the buffer index is handed to the worker handler (videoFilterHandler), which applies the filter and then feeds the data to MediaCodec.
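A minimal sketch of what that handler might do for WHAT_INCOMING_BUFF, following the buffer2buffer MediaCodec path the overview describes (the timeout value, the color-conversion note, and any field not shown above are assumptions):

// Inside VideoFilterHandler (sketch):
@Override
public void handleMessage(Message msg) {
    if (msg.what != WHAT_INCOMING_BUFF) {
        return;
    }
    int index = msg.arg1;
    byte[] frame = orignVideoBuffs[index].buff;
    // a soft filter would transform the NV21 data here; a real implementation would also
    // convert it to the encoder's color format (e.g. NV12) before encoding
    int inputIndex = dstVideoEncoder.dequeueInputBuffer(5000);           // wait up to 5 ms
    if (inputIndex >= 0) {
        ByteBuffer inputBuffer = dstVideoEncoder.getInputBuffers()[inputIndex];
        inputBuffer.position(0);
        inputBuffer.put(frame, 0, frame.length);
        dstVideoEncoder.queueInputBuffer(inputIndex, 0, frame.length,
                System.nanoTime() / 1000, 0);                            // timestamp in microseconds
    }
    orignVideoBuffs[index].isReadyToFill = true;                         // release the slot to the camera callback
}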
4. Back in RESVideoClient.start(): after startVideo() succeeds, it calls videoCore.start():
public boolean start(RESFlvDataCollecter flvDataCollecter, SurfaceTexture camTex) {
synchronized (syncOp) {
if (runState != STATE.STOPPED && runState != STATE.PREPARED) {
throw new IllegalStateException("start restreaming without prepared or destroyed");
}
try {
for (RESVideoBuff buff : orignVideoBuffs) {
buff.isReadyToFill = true;
}
if (dstVideoEncoder == null) {
dstVideoEncoder = MediaCodec.createEncoderByType(dstVideoFormat.getString(MediaFormat.KEY_MIME));
}
dstVideoEncoder.configure(dstVideoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
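        // soft mode configures the encoder for byte-buffer input (buffer2buffer); in hard mode the
        // core would instead call createInputSurface() after configure() and let OpenGL ES draw
        // into that Surface (surface2surface)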
dstVideoEncoder.start();
lastVideoQueueBuffIndex = 0;
videoFilterHandlerThread = new HandlerThread("videoFilterHandlerThread");
videoFilterHandlerThread.start();
videoFilterHandler = new VideoFilterHandler(videoFilterHandlerThread.getLooper());
            // videoSenderThread pulls the encoded data out of the encoder, packages it, and passes it to
            // RESRtmpSender; this part is identical in soft and hard mode, and audio works the same way
videoSenderThread = new VideoSenderThread("VideoSenderThread", dstVideoEncoder, flvDataCollecter);
videoSenderThread.start();
} catch (Exception e) {
LogTools.trace("RESVideoClient.start()failed", e);
return false;
}
runState = STATE.RUNING;
return true;
}
}

5. VideoSenderThread then feeds the packaged data to the dataCollecter:
dataCollecter.collect(resFlvData, RESRtmpSender.FROM_VIDEO);
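A minimal sketch of that sender loop (the loop structure and the packageToFlv helper are hypothetical; only dataCollecter.collect(...), RESRtmpSender.FROM_VIDEO and dstVideoEncoder come from the source):

// Inside VideoSenderThread.run() (sketch): drain MediaCodec and hand FLV tags to the collecter.
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!shouldQuit) {
    int outIndex = dstVideoEncoder.dequeueOutputBuffer(bufferInfo, 5000);   // wait up to 5 ms
    if (outIndex >= 0) {
        ByteBuffer output = dstVideoEncoder.getOutputBuffers()[outIndex];
        output.position(bufferInfo.offset);
        output.limit(bufferInfo.offset + bufferInfo.size);
        // Packager turns the encoder output (codec config or an encoded frame) into an FLV video tag;
        // packageToFlv is a hypothetical stand-in for that step
        RESFlvData resFlvData = packageToFlv(output, bufferInfo);
        dataCollecter.collect(resFlvData, RESRtmpSender.FROM_VIDEO);
        dstVideoEncoder.releaseOutputBuffer(outIndex, false);
    }
}

From there the anonymous collecter from section 1 calls rtmpSender.feed(...), and the sender pushes the resulting RTMP packets out through the native librtmp interface, which closes the loop described in the overview.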