Reposted from: https://www.jianshu.com/p/58790d70b08f
Building a custom camera involves quite a few classes. Capture is one of the core topics when learning AVFoundation, and audio/video capture in particular is essential. The main classes involved are AVCaptureSession, AVCaptureConnection, AVCaptureDevice, AVCaptureDeviceInput, AVCapturePhotoOutput, AVCaptureMovieFileOutput, and so on. Let's go through them one by one.
AVCaptureSession
This class plays the role of the session coordinator. You can think of it as a power strip: it has inputs and outputs. AVCaptureDeviceInput is an input; AVCapturePhotoOutput, AVCaptureMovieFileOutput, and the like are outputs. The relationship between them is shown in the figure below.
(Figure: the relationship between inputs, outputs, and the session)
AVCaptureStillImageOutput and AVCapturePhotoOutput here play the same role: before iOS 10 you use AVCaptureStillImageOutput, and from iOS 10 onward you use AVCapturePhotoOutput.
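As a rough sketch of the iOS 10+ path (assuming a configured session with a `photoOutput` property; the delegate method shown is the iOS 11+ variant):

```objective-c
// Trigger a still capture on AVCapturePhotoOutput (iOS 10+).
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
settings.flashMode = AVCaptureFlashModeAuto;
[self.photoOutput capturePhotoWithSettings:settings delegate:self];

// AVCapturePhotoCaptureDelegate callback (iOS 11+ form).
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error
{
    if (error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    NSData *imageData = [photo fileDataRepresentation];
    // Hand imageData to PHPhotoLibrary, a UIImage, etc.
}
```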
The input sources are: AVCaptureDeviceInput (video) and AVCaptureDeviceInput (audio).
The output sources are: AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, AVCaptureMovieFileOutput, AVCaptureStillImageOutput (AVCapturePhotoOutput), and AVCaptureMetadataOutput.
Creating a simple session:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// add inputs and outputs here
[session startRunning];
An AVCaptureSession can have one or more input sources. To capture just an image, the video input AVCaptureDeviceInput (video) is enough. To capture video with sound, you need both inputs: AVCaptureDeviceInput (video) and AVCaptureDeviceInput (audio). And to capture audio alone, AVCaptureDeviceInput (audio) by itself will do.
When creating an AVCaptureSession we usually set a preset to get output of a particular quality:
NSString *const AVCaptureSessionPresetPhoto; // typically used for still photo capture
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480; // VGA.
NSString *const AVCaptureSessionPreset1280x720; // 720p HD.
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority; // the session's output quality is instead governed by the connected capture device's activeFormat
When configuring an AVCaptureSession, note the following:
1. Adding inputs and outputs (and configuration in general) blocks the calling thread, so it is best to run these calls asynchronously on a dedicated serial queue.
2. When changing or configuring inputs, outputs, or the sessionPreset, call - (void)beginConfiguration; on the session's queue before making changes, and call - (void)commitConfiguration; once the changes succeed (or fail). This keeps playback smooth and avoids unnecessary bugs.
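The two points above can be sketched like this (a minimal example; `sessionQueue` and `session` are assumed to be properties set up elsewhere):

```objective-c
// A private serial queue keeps -startRunning and reconfiguration
// off the main thread.
self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);

dispatch_async(self.sessionQueue, ^{
    [self.session beginConfiguration];
    self.session.sessionPreset = AVCaptureSessionPresetHigh;
    // ... add or remove inputs/outputs here ...
    [self.session commitConfiguration]; // also call this on failure paths
    [self.session startRunning];
});
```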
AVCaptureConnection
Within an AVCaptureSession there may be many AVCaptureConnection objects. An AVCaptureConnection represents one connection between an input and an output. For example, when capturing a photo the input is AVCaptureDeviceInput (video) and the output is AVCaptureStillImageOutput (AVCapturePhotoOutput); an AVCaptureConnection object manages the flow from that input to that output. A connection can involve multiple inputs and outputs: when capturing video there are both video and audio inputs, and the output can be split into video data and audio data, or even still images. These objects are normally created for you automatically when you add inputs and outputs to the session.
(Figure: the relationship between AVCaptureConnection and inputs/outputs)
So we usually just look one up when we need it:
AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeAudio];
We generally use it to manage the orientation and format of video or photo output, the volume and channels of audio output, and to enable or disable particular inputs and outputs.
// photo output orientation
AVCaptureConnection *photoOutputConnection = [self.photoOutput connectionWithMediaType:AVMediaTypeVideo];
photoOutputConnection.videoOrientation = videoPreviewLayerVideoOrientation;
On the subject of orientation, here is a comparison to keep the different enums straight. The status bar orientation ([UIApplication sharedApplication].statusBarOrientation) matches the video orientation; both can be compared against the device orientation:
UIInterfaceOrientationUnknown ↔ UIDeviceOrientationUnknown
UIInterfaceOrientationPortrait ↔ AVCaptureVideoOrientationPortrait ↔ UIDeviceOrientationPortrait // home button on the bottom
UIInterfaceOrientationPortraitUpsideDown ↔ AVCaptureVideoOrientationPortraitUpsideDown ↔ UIDeviceOrientationPortraitUpsideDown // home button on the top
UIInterfaceOrientationLandscapeLeft ↔ AVCaptureVideoOrientationLandscapeLeft ↔ UIDeviceOrientationLandscapeRight // home button on the left
UIInterfaceOrientationLandscapeRight ↔ AVCaptureVideoOrientationLandscapeRight ↔ UIDeviceOrientationLandscapeLeft // home button on the right
(UIInterfaceOrientation and AVCaptureVideoOrientation share raw values for the four concrete orientations, which is why casting one to the other is safe.)
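Because the raw values line up as shown, the interface orientation can be mapped to a video orientation with a small helper (an illustrative sketch, not from the original post):

```objective-c
// Map an interface orientation to the matching video orientation.
// The raw values of the two enums coincide for the four concrete cases,
// so this is equivalent to a direct cast, made explicit for readability.
static AVCaptureVideoOrientation
VideoOrientationForInterfaceOrientation(UIInterfaceOrientation orientation)
{
    switch (orientation) {
        case UIInterfaceOrientationPortrait:           return AVCaptureVideoOrientationPortrait;
        case UIInterfaceOrientationPortraitUpsideDown: return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIInterfaceOrientationLandscapeLeft:      return AVCaptureVideoOrientationLandscapeLeft;
        case UIInterfaceOrientationLandscapeRight:     return AVCaptureVideoOrientationLandscapeRight;
        default:                                       return AVCaptureVideoOrientationPortrait;
    }
}
```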
AVCaptureDeviceInput
This class represents an input to an AVCaptureSession. It differs depending on whether the underlying AVCaptureDevice is a video device (a camera) or an audio device (a microphone); once created, it is added to the AVCaptureSession as an input.
// When configuring the session's inputs, outputs, their properties, or the sessionPreset, wrap the changes in -beginConfiguration / -commitConfiguration
[self.session beginConfiguration];
self.session.sessionPreset = AVCaptureSessionPresetPhoto;

// Add the video input
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionUnspecified];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (!videoDeviceInput) {
    NSLog(@"Could not create video device input: %@", error);
    self.setupResult = AVCamManualSetupResultSessionConfigurationFailed;
    [self.session commitConfiguration];
    return;
}
if ([self.session canAddInput:videoDeviceInput]) {
    [self.session addInput:videoDeviceInput];
    self.videoDeviceInput = videoDeviceInput;
    self.videoDevice = videoDevice;
    dispatch_async(dispatch_get_main_queue(), ^{ // adjust the initial video orientation
        UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
        AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
        if (statusBarOrientation != UIInterfaceOrientationUnknown) {
            initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
        }
        AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.previewView.layer;
        previewLayer.connection.videoOrientation = initialVideoOrientation;
    });
}
else {
    NSLog(@"Could not add video device input to the session");
    self.setupResult = AVCamManualSetupResultSessionConfigurationFailed;
    [self.session commitConfiguration];
    return;
}

// Add the audio input
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (!audioDeviceInput) {
    NSLog(@"Could not create audio device input: %@", error);
}
if ([self.session canAddInput:audioDeviceInput]) {
    [self.session addInput:audioDeviceInput];
}
else {
    NSLog(@"Could not add audio device input to the session");
}
Note:
1. Camera parameters such as focus (AVCaptureFocusMode), exposure (AVCaptureExposureMode), white balance (AVCaptureWhiteBalanceMode), and sensitivity (ISO) are all configured on the video AVCaptureDevice itself.
2. Before configuring or changing any AVCaptureDevice parameter, first call - (BOOL)lockForConfiguration:(NSError **)outError; and once configuration is done, call - (void)unlockForConfiguration;.
Below are some examples of configuring these parameters:
// Change the focus mode
- (IBAction)changeFocusMode:(id)sender
{
    UISegmentedControl *control = sender;
    AVCaptureFocusMode mode = (AVCaptureFocusMode)[self.focusModes[control.selectedSegmentIndex] intValue];
    NSError *error = nil;
    if ([self.videoDevice lockForConfiguration:&error]) {
        if ([self.videoDevice isFocusModeSupported:mode]) {
            self.videoDevice.focusMode = mode;
        }
        else {
            NSLog( @"Focus mode %@ is not supported. Focus mode is %@.", [self stringFromFocusMode:mode], [self stringFromFocusMode:self.videoDevice.focusMode] );
            self.focusModeControl.selectedSegmentIndex = [self.focusModes indexOfObject:@(self.videoDevice.focusMode)];
        }
        [self.videoDevice unlockForConfiguration]; // unlock on both paths, not just on failure
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
// In manual mode, change the lens focus position
- (IBAction)changeLensPosition:(id)sender
{
    UISlider *control = sender;
    NSError *error = nil;
    if ([self.videoDevice lockForConfiguration:&error]) {
        [self.videoDevice setFocusModeLockedWithLensPosition:control.value completionHandler:nil]; // manual focus
        [self.videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
// Set the focus and exposure modes and points of interest on the input device
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange
{
    dispatch_async(self.sessionQueue, ^{
        AVCaptureDevice *device = self.videoDevice;
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            if (focusMode != AVCaptureFocusModeLocked && device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
                device.focusPointOfInterest = point;
                device.focusMode = focusMode;
            }
            if (exposureMode != AVCaptureExposureModeCustom && device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
                device.exposurePointOfInterest = point;
                device.exposureMode = exposureMode;
            }
            device.subjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange;
            [device unlockForConfiguration];
        }
        else {
            NSLog( @"Could not lock device for configuration: %@", error );
        }
    });
}
// Focus and expose at the tapped point
- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)self.previewView.layer captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
    [self focusWithMode:self.videoDevice.focusMode exposeWithMode:self.videoDevice.exposureMode atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
}
// Change the exposure mode
- (IBAction)changeExposureMode:(id)sender
{
    UISegmentedControl *control = sender;
    AVCaptureExposureMode mode = (AVCaptureExposureMode)[self.exposureModes[control.selectedSegmentIndex] intValue];
    NSError *error = nil;
    if ([self.videoDevice lockForConfiguration:&error]) {
        if ([self.videoDevice isExposureModeSupported:mode]) {
            self.videoDevice.exposureMode = mode;
        }
        else {
            NSLog( @"Exposure mode %@ is not supported. Exposure mode is %@.", [self stringFromExposureMode:mode], [self stringFromExposureMode:self.videoDevice.exposureMode] );
            self.exposureModeControl.selectedSegmentIndex = [self.exposureModes indexOfObject:@(self.videoDevice.exposureMode)];
        }
        [self.videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
// In manual (custom) mode, set the exposure duration
- (IBAction)changeExposureDuration:(id)sender
{
    UISlider *control = sender;
    NSError *error = nil;
    double p = pow( control.value, kExposureDurationPower ); // Apply power function to expand slider's low-end range
    double minDurationSeconds = MAX( CMTimeGetSeconds( self.videoDevice.activeFormat.minExposureDuration ), kExposureMinimumDuration );
    double maxDurationSeconds = CMTimeGetSeconds( self.videoDevice.activeFormat.maxExposureDuration );
    double newDurationSeconds = p * ( maxDurationSeconds - minDurationSeconds ) + minDurationSeconds; // Scale from 0-1 slider range to actual duration
    if ([self.videoDevice lockForConfiguration:&error]) {
        [self.videoDevice setExposureModeCustomWithDuration:CMTimeMakeWithSeconds(newDurationSeconds, 1000*1000*1000) ISO:AVCaptureISOCurrent completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
AVCaptureOutput
In practice we use its subclasses: AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, AVCaptureMovieFileOutput, AVCaptureStillImageOutput (AVCapturePhotoOutput), and AVCaptureMetadataOutput. Each has a corresponding delegate protocol, and data is delivered through those delegates.
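For instance, a delegate-based AVCaptureVideoDataOutput can be wired up like this (a hedged sketch; the queue name, pixel format, and error handling are illustrative choices):

```objective-c
// Deliver raw video frames to a delegate on a dedicated serial queue.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoDataOutput.alwaysDiscardsLateVideoFrames = YES; // drop late frames rather than stall
dispatch_queue_t videoDataQueue = dispatch_queue_create("video data queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataQueue];
if ([self.session canAddOutput:videoDataOutput]) {
    [self.session addOutput:videoDataOutput];
}

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per frame.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process each frame here (e.g. run it through Core Image or Metal).
}
```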
The following shows recording video to a file and saving it:
- (IBAction)toggleMovieRecording:(id)sender
{
    self.recordButton.enabled = NO;
    self.cameraButton.enabled = NO;
    self.captureModeControl.enabled = NO;
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.previewView.layer;
    AVCaptureVideoOrientation previewLayerVideoOrientation = previewLayer.connection.videoOrientation;
    dispatch_async(self.sessionQueue, ^{
        if (!self.movieFileOutput.isRecording) {
            if ([UIDevice currentDevice].isMultitaskingSupported) {
                self.backgroundRecordingID = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
            }
            AVCaptureConnection *movieConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            movieConnection.videoOrientation = previewLayerVideoOrientation;
            // Record to a temporary file; it is saved to the photo library in the delegate callback
            NSString *outputFileName = [NSProcessInfo processInfo].globallyUniqueString;
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[outputFileName stringByAppendingPathExtension:@"mov"]];
            [self.movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
        else {
            [self.movieFileOutput stopRecording];
        }
    });
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    dispatch_async( dispatch_get_main_queue(), ^{
        self.recordButton.enabled = YES;
        [self.recordButton setTitle:NSLocalizedString( @"Stop", @"Recording button stop title" ) forState:UIControlStateNormal];
    });
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    UIBackgroundTaskIdentifier currentBackgroundRecordingID = self.backgroundRecordingID;
    self.backgroundRecordingID = UIBackgroundTaskInvalid;
    dispatch_block_t cleanup = ^{
        if ([[NSFileManager defaultManager] fileExistsAtPath:outputFileURL.path]) {
            [[NSFileManager defaultManager] removeItemAtPath:outputFileURL.path error:nil];
        }
        if (currentBackgroundRecordingID != UIBackgroundTaskInvalid) {
            [[UIApplication sharedApplication] endBackgroundTask:currentBackgroundRecordingID];
        }
    };
    BOOL success = YES;
    if (error) {
        NSLog( @"Error occurred while capturing movie: %@", error );
        success = [error.userInfo[AVErrorRecordingSuccessfullyFinishedKey] boolValue];
    }
    if (success) {
        [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
            if (status == PHAuthorizationStatusAuthorized) {
                [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                    PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
                    options.shouldMoveFile = YES;
                    PHAssetCreationRequest *changeRequest = [PHAssetCreationRequest creationRequestForAsset];
                    [changeRequest addResourceWithType:PHAssetResourceTypeVideo fileURL:outputFileURL options:options];
                } completionHandler:^(BOOL success, NSError * _Nullable error) {
                    if (!success) {
                        NSLog( @"Could not save movie to photo library: %@", error);
                    }
                    cleanup();
                }];
            }
            else {
                cleanup();
            }
        }];
    }
    else {
        cleanup();
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        self.cameraButton.enabled = (self.videoDeviceDiscoverySession.devices.count > 1);
        self.recordButton.enabled = self.captureModeControl.selectedSegmentIndex == AVCamManualCaptureModeMovie;
        [self.recordButton setTitle:NSLocalizedString( @"Record", @"Recording button record title" ) forState:UIControlStateNormal];
        self.captureModeControl.enabled = YES;
    });
}
References
The code in this article comes from Apple's official demo.
Another good article on the topic: iOS-AVFoundation自定義相機詳解.
Author: 牀前明月_光
Link: https://www.jianshu.com/p/58790d70b08f
Source: Jianshu (簡書)
Copyright belongs to the author. For commercial reuse, please contact the author for authorization; for non-commercial reuse, please credit the source.