iOS RTMP camera/screen-recording live streaming and playback

A previous post covered how to set up an RTMP server with nginx on CentOS (link). This post describes how to live-stream the camera or the screen from iOS over RTMP, and how to watch the stream. The complete project is at https://github.com/zxm006/Rtmp_iOS; this post mainly walks through the modules of that project, so download it if you need it. Questions are welcome via QQ 592979271.
1. Camera video capture
On iOS, camera video is captured through AVFoundation; there are plenty of tutorials online, but here is a brief recap.
1) First enumerate the cameras (AVCaptureDevice) to get the front or back device, then create an AVCaptureDeviceInput object (videoInput) from that device.
2) Create an AVCaptureSession, set the resolution, and add videoInput to the session. After configuring an AVCaptureVideoDataOutput and starting the session, the captured frames arrive in the captureOutput delegate callback as CMSampleBufferRef objects. See the code below (and the CameraHelp class in the project above).

```objectivec
- (void)startVideoCapture
{
    NSLog(@"startVideoCapture");
    // Keep the device from auto-locking while capturing
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];

    if (_mCaptureDevice || _mCaptureSession)
    {
        NSLog(@"Already capturing");
        return;
    }

    if ((_mCaptureDevice = [CameraHelp cameraAtPosition:AVCaptureDevicePositionFront]) == nil)
    {
        NSLog(@"Failed to get valid capture device");
        return;
    }

    NSError *error = nil;
    _videoInput = [AVCaptureDeviceInput deviceInputWithDevice:_mCaptureDevice error:&error];
    if (!_videoInput)
    {
        NSLog(@"Failed to get video input");
        self.mCaptureDevice = nil;
        return;
    }

    _mCaptureSession = [[AVCaptureSession alloc] init];
    if (_mresType == 0) {
        _mCaptureSession.sessionPreset = AVCaptureSessionPreset352x288;
    }
    else if (_mresType == 1) {
        _mCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
    }
    else if (_mresType == 2) {
        _mCaptureSession.sessionPreset = AVCaptureSessionPresetHigh;
    }
    else {
        _mCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;
    }

    [_mCaptureSession addInput:_videoInput];
    AVCaptureVideoDataOutput *avCaptureVideoDataOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];

    avCaptureVideoDataOutput.videoSettings = [[[NSDictionary alloc] initWithObjectsAndKeys:
                                               [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], kCVPixelBufferPixelFormatTypeKey, nil] autorelease];

    avCaptureVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    [_mCaptureSession beginConfiguration];
    [_mCaptureDevice lockForConfiguration:&error];

    // Limit capture to 12-18 fps. Note min duration = 1/(highest fps) and
    // max duration = 1/(lowest fps), and min must not exceed max.
    [_mCaptureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, 18)];
    [_mCaptureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, 12)];

    [_mCaptureDevice unlockForConfiguration];
    [_mCaptureSession commitConfiguration];

    dispatch_queue_t queue = dispatch_queue_create("videoSession--ouput", NULL);
    [avCaptureVideoDataOutput setSampleBufferDelegate:self queue:queue];
    [_mCaptureSession addOutput:avCaptureVideoDataOutput];

    dispatch_release(queue);

    mStarted = YES;
    doing = NO;

    AVCaptureConnection *videoConnection = [avCaptureVideoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    // Set the orientation here
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    if (![_mCaptureSession isRunning]) {
        [_mCaptureSession startRunning];
    }
    [self startPreview];
}
```

2. Screen recording
Apple's ReplayKit (introduced in iOS 9 and much expanded in iOS 10 and later) can record the screen to a file or deliver the raw stream, and it has grown more convenient with each release. This section first covers the simpler screenshot-based approach: capture the screen roughly 30 times per second. The main code is below (see the CapScreen class in the project above).

```objectivec
- (UIImage *)capWindow: (UIWindow *) window
{
    CGSize capsize = window.bounds.size;
    if (m_uiWidth > m_uiHeight && capsize.width < capsize.height)
    {
        CGFloat width = capsize.width;
        capsize.width = capsize.height;
        capsize.height = width;
    }
    else
    {
        [window snapshotViewAfterScreenUpdates:NO];
    }

    UIGraphicsBeginImageContextWithOptions(capsize, YES, 0);
    NSInvocation *invocation = [NSInvocation invocationWithMethodSignature:
                                [window methodSignatureForSelector:
                                 @selector(drawViewHierarchyInRect:afterScreenUpdates:)]];
    [invocation setTarget:window];
    [invocation setSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)];
    CGRect arg2 = window.bounds;
    BOOL arg3 = NO;
    [invocation setArgument:&arg2 atIndex:2];
    [invocation setArgument:&arg3 atIndex:3];
    [invocation invoke];
    
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenshot;
}
```
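
Each screenshot comes back as a UIImage (RGBA pixels), but the 420f/NV12 path the rest of the pipeline uses wants YUV data, so the bitmap has to be converted before it reaches the encoder (unless you hand VideoToolbox BGRA buffers directly). A minimal C sketch of that conversion, assuming full-range BT.601 coefficients; `bgra_to_nv12` is a hypothetical helper, not a function from the project:

```c
#include <stdint.h>
#include <stddef.h>

/* Convert a BGRA image to NV12 (Y plane followed by interleaved UV plane),
 * using full-range BT.601 coefficients. width and height must be even. */
static void bgra_to_nv12(const uint8_t *bgra, int width, int height,
                         uint8_t *y_plane, uint8_t *uv_plane)
{
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            const uint8_t *p = bgra + (row * width + col) * 4;
            int b = p[0], g = p[1], r = p[2];
            int y = (int)(0.299 * r + 0.587 * g + 0.114 * b + 0.5);
            y_plane[row * width + col] = (uint8_t)(y > 255 ? 255 : y);
            /* Chroma is subsampled 2x2: sample the top-left pixel of each block.
             * col is even here, so the interleaved UV offset is (row/2)*width + col. */
            if ((row % 2 == 0) && (col % 2 == 0)) {
                int u = (int)(-0.169 * r - 0.331 * g + 0.5   * b + 128.5);
                int v = (int)( 0.5   * r - 0.419 * g - 0.081 * b + 128.5);
                uint8_t *uv = uv_plane + (row / 2) * width + col;
                uv[0] = (uint8_t)(u < 0 ? 0 : u > 255 ? 255 : u);
                uv[1] = (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
            }
        }
    }
}
```

In practice you would get the RGBA bytes from the CGImage backing the screenshot and write the two planes into a CVPixelBuffer before submitting it to the encoder.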

3. H.264 hardware video encoding
First create a VTCompressionSessionRef and configure it (output callback, resolution, GOP size, bitrate, and so on); see the H264HwEncoderImpl class in the project above.

```objectivec
// Create the compression session
OSStatus status = VTCompressionSessionCreate(NULL, width, height, kCMVideoCodecType_H264,
                                             NULL, NULL, NULL, didCompressH264,
                                             (__bridge void *)(self), &EncodingSession);
NSLog(@"H264: VTCompressionSessionCreate %d", (int)status);

if (status != 0)
{
    NSLog(@"H264: Unable to create a H264 session");
    _error = @"H264: Unable to create a H264 session";
    return;
}

// Real-time encoding, to lower encoder latency
status = VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
NSLog(@"set realtime return: %d", (int)status);

// H.264 profile. Live streaming often uses baseline to avoid B-frame latency;
// here High is used, with frame reordering disabled explicitly below.
status = VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_ProfileLevel, kVTProfileLevel_H264_High_AutoLevel);
NSLog(@"set profile return: %d", (int)status);

// Average bitrate. Without this the encoder defaults to a very low bitrate
// and the output is badly blurred.
status  = VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_AverageBitRate, (__bridge CFTypeRef)@(bt)); // bits per second
status += VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_DataRateLimits, (__bridge CFArrayRef)@[@(bt * 2 / 8), @1]); // bytes per second
NSLog(@"set bitrate return: %d", (int)status);

// Keyframe interval, i.e. GOP size
status = VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_MaxKeyFrameInterval, (__bridge CFTypeRef)@(fps));
NSLog(@"set MaxKeyFrame return: %d", (int)status);

// Expected frame rate: a hint for session setup, not the actual FPS
status = VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_ExpectedFrameRate, (__bridge CFTypeRef)@(fps));
NSLog(@"set framerate return: %d", (int)status);

// No B-frames
VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_AllowFrameReordering, kCFBooleanFalse);

// Start encoding
status = VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
NSLog(@"VTCompressionSessionPrepareToEncodeFrames return: %d", (int)status);
```
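
The didCompressH264 callback receives each encoded frame in AVCC layout: every NALU is preceded by a 4-byte big-endian length rather than an Annex-B start code. FLV/RTMP keeps the length prefixes, but a decoder fed raw H.264 (such as FFmpeg's parser on the playback side) expects start codes, so the prefixes have to be rewritten. A minimal C sketch; the helper name is mine, not from the project:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Rewrite an AVCC buffer (4-byte big-endian length before each NALU, as a
 * VTCompressionSession emits) into Annex-B (00 00 00 01 start codes).
 * Returns the number of bytes written to `out`, or 0 on malformed input.
 * `out` must be at least `len` bytes; since the start code and the length
 * prefix are both 4 bytes, the total size does not change. */
static size_t avcc_to_annexb(const uint8_t *in, size_t len, uint8_t *out)
{
    static const uint8_t start_code[4] = {0, 0, 0, 1};
    size_t rd = 0, wr = 0;
    while (rd + 4 <= len) {
        uint32_t nalu_len = ((uint32_t)in[rd]     << 24) |
                            ((uint32_t)in[rd + 1] << 16) |
                            ((uint32_t)in[rd + 2] << 8)  | in[rd + 3];
        rd += 4;
        if (nalu_len == 0 || rd + nalu_len > len)
            return 0; /* truncated or corrupt buffer */
        memcpy(out + wr, start_code, 4);
        memcpy(out + wr + 4, in + rd, nalu_len);
        wr += 4 + nalu_len;
        rd += nalu_len;
    }
    return wr;
}
```

The SPS and PPS are not part of this buffer; they come from the sample buffer's format description and are typically sent once per keyframe (or in the AVC sequence header when publishing over RTMP).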

4. Audio capture and playback
Audio is captured with AudioQueue; iOS can output AAC from the queue directly. See the RecordAndSend class. First configure AVAudioSession with the PlayAndRecord category, and make sure the session is activated first ([[AVAudioSession sharedInstance] setActive:YES error:nil];).
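If the raw AAC frames from the AudioQueue are ever written to a bare .aac file or handed to a consumer that expects ADTS framing, each frame needs a 7-byte ADTS header carrying the profile, sample rate, and channel count. A sketch of building that header (a hypothetical helper, not part of RecordAndSend):

```c
#include <stdint.h>

/* Build the 7-byte ADTS header that precedes one raw AAC frame.
 * profile:  audioObjectType - 1, so 1 = AAC-LC.
 * freq_idx: sampling-frequency index (4 = 44100 Hz, 8 = 16000 Hz, ...).
 * channels: channel configuration (1 = mono, 2 = stereo).
 * frame_len: length of the raw AAC frame in bytes (the header adds 7). */
static void make_adts_header(uint8_t h[7], int profile, int freq_idx,
                             int channels, int frame_len)
{
    int full_len = frame_len + 7;                    /* 13-bit length includes header */
    h[0] = 0xFF;                                     /* syncword 0xFFF, high 8 bits */
    h[1] = 0xF1;                                     /* sync low 4 bits, MPEG-4, no CRC */
    h[2] = (uint8_t)((profile << 6) | (freq_idx << 2) | ((channels >> 2) & 1));
    h[3] = (uint8_t)(((channels & 3) << 6) | ((full_len >> 11) & 3));
    h[4] = (uint8_t)((full_len >> 3) & 0xFF);
    h[5] = (uint8_t)(((full_len & 7) << 5) | 0x1F);  /* buffer fullness = 0x7FF (VBR) */
    h[6] = 0xFC;                                     /* fullness low bits, 1 AAC frame */
}
```

When publishing over RTMP the header is not needed; FLV audio tags carry the raw AAC frame after an AudioSpecificConfig sequence header instead.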

For playback, pushing data into the queue inside every AudioQueue callback can introduce audio latency, so OpenAL is used instead; see the AudioPlay class.
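One common way to decouple the decoder from the OpenAL player is a small jitter buffer: the network/decoder side writes PCM in, and the playback thread drains it at its own pace as OpenAL buffers come free. A minimal single-threaded sketch of such a byte ring buffer (illustrative only; the project's AudioPlay class may be organized quite differently):

```c
#include <stdint.h>
#include <stddef.h>

/* Fixed-capacity byte ring buffer: writes stop at capacity, reads stop when empty. */
typedef struct {
    uint8_t buf[4096];
    size_t head, tail, count;
} RingBuf;

/* Append up to len bytes; returns how many were actually stored. */
static size_t ring_write(RingBuf *r, const uint8_t *data, size_t len)
{
    size_t n = 0;
    while (n < len && r->count < sizeof(r->buf)) {
        r->buf[r->tail] = data[n++];
        r->tail = (r->tail + 1) % sizeof(r->buf);
        r->count++;
    }
    return n;
}

/* Remove up to len bytes into out; returns how many were actually read. */
static size_t ring_read(RingBuf *r, uint8_t *out, size_t len)
{
    size_t n = 0;
    while (n < len && r->count > 0) {
        out[n++] = r->buf[r->head];
        r->head = (r->head + 1) % sizeof(r->buf);
        r->count--;
    }
    return n;
}
```

A real player would add a mutex (or a lock-free index scheme) around the two ends, since the decoder and the playback thread run concurrently.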

5. RTMP publishing and receiving
RTMP publishing and playback use the open-source rtmpdump library, wrapped in the uuRtmpClient class (see that class in the project above). It covers the various rtmpdump calls: connecting to the server, sending and receiving data, and packaging Metadata. It can be reused directly if needed.
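When publishing H.264 over RTMP, the body of each video message starts with a 5-byte FLV VideoTagHeader before the AVC payload: a frame-type/codec-ID byte, an AVCPacketType, and a 24-bit composition-time offset. A sketch of building it (the helper name is mine; uuRtmpClient may structure this differently):

```c
#include <stdint.h>

/* Build the 5-byte FLV VideoTagHeader that precedes the H.264 payload in an
 * RTMP video message body.
 * keyframe: 1 for an IDR frame, 0 for an inter frame.
 * avc_type: 0 = AVC sequence header (SPS/PPS), 1 = AVC NALU.
 * cts: composition-time offset in ms (0 when B-frames are disabled). */
static void make_flv_video_header(uint8_t h[5], int keyframe, int avc_type, int32_t cts)
{
    h[0] = (uint8_t)(((keyframe ? 1 : 2) << 4) | 7); /* frame type | codec ID 7 = AVC */
    h[1] = (uint8_t)avc_type;
    h[2] = (uint8_t)((cts >> 16) & 0xFF);            /* 24-bit big-endian offset */
    h[3] = (uint8_t)((cts >> 8) & 0xFF);
    h[4] = (uint8_t)(cts & 0xFF);
}
```

Since the encoder above disables frame reordering, the composition-time offset can simply stay 0 for every frame.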

6. FFmpeg decoding
Build FFmpeg for iOS: save the script below, adjust it to your needs, and run it to produce the iOS FFmpeg libraries, then import them into the project.

#!/bin/sh
# directories
SOURCE="ffmpeg-4.0"
FAT="FFmpeg-iOS"

SCRATCH="scratch"
# must be an absolute path
THIN=`pwd`/"thin"
CONFIGURE_FLAGS=" --disable-avdevice --disable-avfilter --disable-network  --disable-programs   --disable-ffmpeg    --disable-debug  --disable-ffplay    --disable-iconv --disable-ffprobe  --disable-encoders --disable-decoders \
   --disable-filters --disable-swscale --disable-armv6 --disable-armv6t2 --disable-protocols \
--disable-muxers --disable-demuxers --disable-parsers --disable-bsfs \
  --disable-sdl2 --disable-armv5te --disable-vfp  --disable-swresample --disable-everything  \
  --enable-cross-compile   --enable-pic   --enable-small --enable-optimizations \
  --enable-decoder=h264   \
   --enable-nonfree --enable-gpl"

if [ "$X264" ]
then
	CONFIGURE_FLAGS="$CONFIGURE_FLAGS --enable-gpl --enable-encoder=libx264 --enable-libx264"
fi

if [ "$FDK_AAC" ]
then
echo 'enable-libfdk-aac'
	CONFIGURE_FLAGS="$CONFIGURE_FLAGS --enable-libfdk-aac"
fi

# avresample
#CONFIGURE_FLAGS="$CONFIGURE_FLAGS --enable-avresample"
 #x86_64
ARCHS="arm64 armv7 x86_64"

COMPILE="y"
LIPO="y"

DEPLOYMENT_TARGET="8.0"

if [ "$*" ]
then
	if [ "$*" = "lipo" ]
	then
		# skip compile
		COMPILE=
	else
		ARCHS="$*"
		if [ $# -eq 1 ]
		then
			# skip lipo
			LIPO=
		fi
	fi
fi

if [ "$COMPILE" ]
then
	if [ ! `which yasm` ]
	then
		echo 'Yasm not found'
		if [ ! `which brew` ]
		then
			echo 'Homebrew not found. Trying to install...'
                        ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" \
				|| exit 1
		fi
		echo 'Trying to install Yasm...'
		brew install yasm || exit 1
	fi
	if [ ! `which gas-preprocessor.pl` ]
	then
		echo 'gas-preprocessor.pl not found. Trying to install...'
		(curl -L https://github.com/libav/gas-preprocessor/raw/master/gas-preprocessor.pl \
			-o /usr/local/bin/gas-preprocessor.pl \
			&& chmod +x /usr/local/bin/gas-preprocessor.pl) \
			|| exit 1
	fi

	if [ ! -r $SOURCE ]
	then
		echo 'FFmpeg source not found. Trying to download...'
		curl http://www.ffmpeg.org/releases/$SOURCE.tar.bz2 | tar xj \
			|| exit 1
	fi

	CWD=`pwd`
	for ARCH in $ARCHS
	do
		echo "building $ARCH..."
		mkdir -p "$SCRATCH/$ARCH"
		cd "$SCRATCH/$ARCH"

		CFLAGS="-arch $ARCH"
		if [ "$ARCH" = "i386" -o "$ARCH" = "x86_64" ]
		then
		    PLATFORM="iPhoneSimulator"
		    CFLAGS="$CFLAGS -mios-simulator-version-min=$DEPLOYMENT_TARGET"
		else
		    PLATFORM="iPhoneOS"
		       CFLAGS="$CFLAGS -mios-version-min=$DEPLOYMENT_TARGET "
                      
		    if [ "$ARCH" = "arm64" ]
		    then
		        EXPORT="GASPP_FIX_XCODE5=1"
		    fi
		fi

		XCRUN_SDK=`echo $PLATFORM | tr '[:upper:]' '[:lower:]'`
		CC="xcrun -sdk $XCRUN_SDK clang"
		CXXFLAGS="$CFLAGS"
		LDFLAGS="$CFLAGS"
		if [ "$X264" ]
		then
			CFLAGS="$CFLAGS -I$X264/include"
			LDFLAGS="$LDFLAGS -L$X264/lib"
		fi
		if [ "$FDK_AAC" ]
		then
echo 'enable-libfdk-aac -- lib'
			CFLAGS="$CFLAGS -I$FDK_AAC/include"
			LDFLAGS="$LDFLAGS -L$FDK_AAC/lib"
		fi

		TMPDIR=${TMPDIR/%\/} $CWD/$SOURCE/configure \
		    --target-os=darwin \
		    --arch=$ARCH \
		    --cc="$CC" \
		    $CONFIGURE_FLAGS \
		    --extra-cflags="$CFLAGS" \
		    --extra-ldflags="$LDFLAGS" \
		    --prefix="$THIN/$ARCH" \
		|| exit 1

		make -j3 install $EXPORT || exit 1
		cd $CWD
	done
fi

if [ "$LIPO" ]
then
	echo "building fat binaries..."
	mkdir -p $FAT/lib
	set - $ARCHS
	CWD=`pwd`
	cd $THIN/$1/lib
	for LIB in *.a
	do
		cd $CWD
		echo lipo -create `find $THIN -name $LIB` -output $FAT/lib/$LIB 1>&2
		lipo -create `find $THIN -name $LIB` -output $FAT/lib/$LIB || exit 1
	done

	cd $CWD
	cp -rf $THIN/$1/include $FAT
fi

echo Done

For how FFmpeg is used, see the liveFFmpegdecode class in the project above. The main steps:
First initialize the library and the decoder:

```c
av_register_all();
AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
if (!codec) {
    return -1;
}

m_CodecContext = avcodec_alloc_context3(codec);
if (!m_CodecContext) {
    return -1;
}
if (codec->capabilities & AV_CODEC_CAP_TRUNCATED)
    m_CodecContext->flags |= AV_CODEC_FLAG_TRUNCATED;
if (avcodec_open2(m_CodecContext, codec, NULL) < 0) {
    avcodec_close(m_CodecContext);
    avcodec_free_context(&m_CodecContext);
    return -1;
}
```
   

Then decode; the flow is roughly:

```c
AVFrame *pictureFrame = av_frame_alloc();
int got_picture = -1;
if (avcodec_send_packet(m_CodecContext, &avpkt) == 0)
{
    // returns 0 when a decoded frame is available
    got_picture = avcodec_receive_frame(m_CodecContext, pictureFrame);
}
else
{
    if (pictureFrame) {
        av_frame_free(&pictureFrame);
        pictureFrame = NULL;
    }
    av_packet_unref(&avpkt);
    return 0;
}
if (got_picture == 0)
{
    ...........
}
```

The decoded output is a YUV image.
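One practical detail when consuming the decoded frame: AVFrame planes are padded, so frame->linesize[i] is usually larger than the visible width, and each plane must be copied row by row to get a packed I420 buffer (Y, then U, then V). A minimal sketch; for a real frame you would call it once per plane, e.g. with frame->data[0]/frame->linesize[0] for Y and the half-size U and V planes:

```c
#include <stdint.h>
#include <string.h>

/* Copy one padded image plane into a tightly packed destination buffer.
 * src_stride is the source row pitch in bytes (>= width). */
static void copy_plane(uint8_t *dst, const uint8_t *src, int src_stride,
                       int width, int height)
{
    for (int row = 0; row < height; row++)
        memcpy(dst + row * width, src + row * src_stride, (size_t)width);
}
```

Skipping this and memcpy-ing width*height bytes straight from data[0] produces the familiar diagonally sheared green-tinted image when the stride does not equal the width.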

7. Displaying YUV data
Converting YUV to RGB and displaying it with UIImage is CPU-heavy; it is better to draw the YUV directly with OpenGL, with no conversion step. See the iOSGLView class in the project above. It is used much like an ordinary view; readers familiar with OpenGL can dig into the implementation details, and everyone else can simply use it as-is.
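What the fragment shader in such an OpenGL view computes per pixel is a YUV-to-RGB conversion; "drawing YUV directly" just moves this arithmetic onto the GPU. Here is the video-range BT.601 form of it in C, using common fixed-point coefficients, purely to make the math explicit (iOSGLView's shader may use slightly different constants or full-range input):

```c
#include <stdint.h>

/* Video-range BT.601 YUV -> RGB for one pixel (Y in [16,235], U/V centered at 128),
 * using 8-bit fixed-point coefficients. */
static void yuv_to_rgb(uint8_t Y, uint8_t U, uint8_t V,
                       uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = (int)Y - 16, d = (int)U - 128, e = (int)V - 128;
    int R = (298 * c + 409 * e + 128) >> 8;
    int G = (298 * c - 100 * d - 208 * e + 128) >> 8;
    int B = (298 * c + 516 * d + 128) >> 8;
    *r = (uint8_t)(R < 0 ? 0 : R > 255 ? 255 : R);
    *g = (uint8_t)(G < 0 ? 0 : G > 255 ? 255 : G);
    *b = (uint8_t)(B < 0 ? 0 : B > 255 ? 255 : B);
}
```

Doing this per pixel on the CPU for every frame is exactly the cost the OpenGL path avoids.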

8. Screen recording with ReplayKit
See the MovieClipHandler and SampleHandler files in the project; SampleHandler is the class that receives the sample data. A broadcast extension has to be created, among other setup steps; see https://www.jianshu.com/p/401b5b632d5b for the details. For uploading a live stream after ReplayKit capture, you can also use an upload library I wrote (download link); it wraps the hardware encoding and upload of the audio/video data that ReplayKit outputs and can be imported into a project directly. The video format is H.264 and the audio format is AAC.

That is an overview of RTMP live upload and playback on iOS. Download the project for the full code, and feel free to reach out via QQ 592979271 with any questions.
