[Translation] Capturing Video on iOS

Original: https://www.objc.io/issues/23-video/capturing-video/

[Created on 2019/03/03]

 

With every new generation of processors and camera hardware, capturing video on an iPhone becomes more and more interesting. iPhones are small and light, the quality gap to professional video cameras has narrowed considerably, and in some cases an iPhone can absolutely serve as a backup camera. This article covers the various parameters that can be used to configure the video capture pipeline in order to get the most out of the hardware. A simple app demonstrating the different pipeline setups is available on GitHub.

UIImagePickerController

By far the easiest way to integrate video capture in your app is to use UIImagePickerController. It is a view controller that wraps a complete video capture pipeline together with a camera UI.

Before bringing up the camera, first check whether the current device supports video recording:

 

// kUTTypeMovie requires #import <MobileCoreServices/MobileCoreServices.h>
if ([UIImagePickerController
       isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    NSArray *availableMediaTypes = [UIImagePickerController
      availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    if ([availableMediaTypes containsObject:(NSString *)kUTTypeMovie]) {
        // Video recording is supported.
    }
}

 

Then create a UIImagePickerController object, and define a delegate to further process recorded videos (e.g. to save them to the camera roll) and to respond to the user dismissing the camera:

UIImagePickerController *camera = [UIImagePickerController new];
camera.sourceType = UIImagePickerControllerSourceTypeCamera;
camera.mediaTypes = @[(NSString *)kUTTypeMovie];
camera.delegate = self;

That's all the code you need to bring up a fully functional camera.
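
For illustration, a minimal sketch of the delegate side might look like the following. It assumes the hosting view controller conforms to UIImagePickerControllerDelegate and UINavigationControllerDelegate, saves the recorded movie to the camera roll, and then dismisses the picker:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // URL of the recorded movie in the app's temporary directory.
    NSURL *movieURL = info[UIImagePickerControllerMediaURL];
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(movieURL.path)) {
        // Save the movie to the camera roll (no completion callback here).
        UISaveVideoAtPathToSavedPhotosAlbum(movieURL.path, nil, NULL, NULL);
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissViewControllerAnimated:YES completion:nil];
}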

Configuring the Camera

UIImagePickerController offers a few additional configuration options.
You can select a specific camera by setting the cameraDevice property, which takes a UIImagePickerControllerCameraDevice enum value. By default it is set to UIImagePickerControllerCameraDeviceRear (the back camera), but it can also be set to UIImagePickerControllerCameraDeviceFront (the front camera). Remember to check first that the camera you want to use is actually available:

UIImagePickerController *camera = …
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    [camera setCameraDevice:UIImagePickerControllerCameraDeviceFront];
}

The videoQuality property controls the quality of the recorded video. It lets you change the encoding preset, which affects both the bit rate and the resolution of the video. There are six presets:

enum {
   UIImagePickerControllerQualityTypeHigh             = 0,
   UIImagePickerControllerQualityTypeMedium           = 1,  // default  value
   UIImagePickerControllerQualityTypeLow              = 2,
   UIImagePickerControllerQualityType640x480          = 3,
   UIImagePickerControllerQualityTypeIFrame1280x720   = 4,
   UIImagePickerControllerQualityTypeIFrame960x540    = 5
};
typedef NSUInteger  UIImagePickerControllerQualityType;

The first three are relative presets (low, medium, and high). The encoder configuration these presets produce can differ between devices; high will give you the highest quality available for the selected camera. The other three are presets for specific resolutions (640x480 VGA, 960x540 iFrame, and 1280x720 iFrame).
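
For example, to record 720p iFrame video with the camera instance created above, you could set:

camera.videoQuality = UIImagePickerControllerQualityTypeIFrame1280x720;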

Customizing the UI

As mentioned above, UIImagePickerController ships with a complete camera UI out of the box. However, you can also customize the camera interface: hide the default controls and provide your own overlay view with custom controls. The overlay view is displayed on top of the camera preview view.

UIView *cameraOverlay = …
camera.showsCameraControls = NO;
camera.cameraOverlayView = cameraOverlay;

You then need to hook your custom controls up to the control methods of UIImagePickerController (e.g. startVideoCapture and stopVideoCapture).
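
A possible sketch of such a hookup, assuming a hypothetical recordButton in the overlay view and self.camera / self.recording properties on the hosting view controller:

// Somewhere during setup, wire the custom button to an action:
// [recordButton addTarget:self action:@selector(toggleRecording:)
//        forControlEvents:UIControlEventTouchUpInside];

- (void)toggleRecording:(UIButton *)sender
{
    if (!self.recording) {
        // startVideoCapture returns NO if recording could not be started.
        self.recording = [self.camera startVideoCapture];
    } else {
        [self.camera stopVideoCapture];
        self.recording = NO;
    }
}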

AVFoundation

If you want more control over the video recording process, you need to use the AVFoundation framework.

The central class in AVFoundation for handling video recording is AVCaptureSession. It coordinates the flow of data between audio and video inputs and outputs:

[Figure: an AVCaptureSession coordinating the data flow between capture inputs and outputs]

To use a capture session, you instantiate it, add inputs and outputs, and then start the session:

AVCaptureSession *captureSession = [AVCaptureSession new];
AVCaptureDeviceInput *cameraDeviceInput = …
AVCaptureDeviceInput *micDeviceInput = …
AVCaptureMovieFileOutput *movieFileOutput = …
if ([captureSession canAddInput:cameraDeviceInput]) {
    [captureSession addInput:cameraDeviceInput];
}
if ([captureSession canAddInput:micDeviceInput]) {
    [captureSession addInput:micDeviceInput];
}
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}

[captureSession startRunning];

(For brevity, the dispatch queue code has been omitted above. Because all calls on a capture session are blocking, it is recommended to dispatch them to a background queue.)
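
One way to do this, sketched with a dedicated serial queue (the queue label is just an example):

// A serial queue for all capture session work.
dispatch_queue_t sessionQueue = dispatch_queue_create("com.example.capture-session", DISPATCH_QUEUE_SERIAL);

dispatch_async(sessionQueue, ^{
    [captureSession startRunning];
});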

The quality of the output can be configured through the sessionPreset property of AVCaptureSession. There are 11 different options:

NSString *const  AVCaptureSessionPresetPhoto;
NSString *const  AVCaptureSessionPresetHigh;
NSString *const  AVCaptureSessionPresetMedium;
NSString *const  AVCaptureSessionPresetLow;
NSString *const  AVCaptureSessionPreset352x288;
NSString *const  AVCaptureSessionPreset640x480;
NSString *const  AVCaptureSessionPreset1280x720;
NSString *const  AVCaptureSessionPreset1920x1080;
NSString *const  AVCaptureSessionPresetiFrame960x540;
NSString *const  AVCaptureSessionPresetiFrame1280x720;
NSString *const  AVCaptureSessionPresetInputPriority;

The first one is for high-resolution photo output. The next nine are very similar to the UIImagePickerControllerQualityType options we saw for the videoQuality setting of UIImagePickerController, with the exception that there are a few additional presets available for a capture session. The last one (AVCaptureSessionPresetInputPriority) indicates that the capture session does not control the audio and video output settings. Instead, the activeFormat of the connected capture device dictates the quality level at the outputs of the capture session.
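
For example, to request 720p output (only if the current inputs support it), you could write:

if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}
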
In the next section, we will look at devices and device formats in more detail.

Inputs
The inputs for an AVCaptureSession are one or more AVCaptureDevice objects connected to the capture session through an AVCaptureDeviceInput.

We can use [AVCaptureDevice devices] to find the available capture devices. For an iPhone 6, they are:

(
    "<AVCaptureFigVideoDevice: 0x136514db0 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
    "<AVCaptureFigVideoDevice: 0x13660be80 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
    "<AVCaptureFigAudioDevice: 0x174265e80 [iPhone Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
)
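
To feed one of these devices into the session, you wrap it in an AVCaptureDeviceInput. A minimal sketch (with error handling kept to the bare minimum) might look like this:

// Pick the default video device (usually the back camera) and the default microphone.
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *micDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

NSError *error = nil;
AVCaptureDeviceInput *cameraDeviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&error];
AVCaptureDeviceInput *micDeviceInput =
    [AVCaptureDeviceInput deviceInputWithDevice:micDevice error:&error];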