iOS 7 What's New in AV Foundation: QR Code Scanning (Part 1)

iOS 7 brings even more improvements to AV Foundation, such as:

Barcode reading support

Speech synthesis

Improved zoom functionality 



Getting Started

Open Xcode and create a new project with the iOS\Application\Single View Application template. Click Next, enter any name you like, choose iPhone for Devices, click Next again, and pick a folder to save the project in.

With the setup out of the way, open Main.storyboard, select the view controller in the scene, and select Editor\Embed In\Navigation Controller. Finally, select the navigation bar that appears and set its title to ColloQR.

The base project is all set up; it's time to get cracking on the camera work!

Working with the camera

First you need to import the AV Foundation framework in order to work with its juicy new features. Open ViewController.m and add the following import to the top of the file:

@import AVFoundation;

Next, add the following instance variables to the implementation declaration:


@implementation ViewController {
    AVCaptureSession *_captureSession;
    AVCaptureDevice *_videoDevice;
    AVCaptureDeviceInput *_videoInput;
    AVCaptureVideoPreviewLayer *_previewLayer;
    BOOL _running;
}

Here’s a quick rundown of these instance variables:

1. _captureSession – AVCaptureSession is the core media handling class in AV Foundation. It talks to the hardware to retrieve, process, and output video. A capture session wires together inputs and outputs, and controls the format and resolution of the output frames.

2. _videoDevice – AVCaptureDevice encapsulates the physical camera on a device. Modern iPhones have both front and rear cameras, while other devices may only have a single camera.

3. _videoInput – To add an AVCaptureDevice to a session, wrap it in an AVCaptureDeviceInput. A capture session can have multiple inputs and multiple outputs.

4. _previewLayer – AVCaptureVideoPreviewLayer provides a mechanism for displaying the current frames flowing through a capture session; it allows you to display the camera output in your UI.

5. _running – This holds the state of the session: either the session is running or it's not.

Your instance variables are declared; now you need to initialize them. Add the following method to ViewController.m:

- (void)setupCaptureSession {
    // 1
    if (_captureSession) return;

    // 2
    _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (!_videoDevice) {
        NSLog(@"No video camera on this device!");
        return;
    }

    // 3
    _captureSession = [[AVCaptureSession alloc] init];

    // 4
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:nil];

    // 5
    if ([_captureSession canAddInput:_videoInput]) {
        [_captureSession addInput:_videoInput];
    }

    // 6
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
}


The above method sets up the capture session. The following points explain the code comment by comment:

1. If the session has already been created, then exit early as there's no need to set things up again.

2. Initialize the video device by obtaining the default device for video media. This returns the most relevant device available; in practice, that's generally the device's rear camera. If there's no camera available, the method returns nil, so log a message and exit.

3. Initialize the capture session so you're prepared to receive input.

4. Create the capture input from the device obtained in comment 2.

5. Query the session with canAddInput: to determine if it will accept an input. If so, call addInput: to add the input to the session.

6. Finally, create and initialize a preview layer and indicate which capture session to preview. Set the gravity to "resize aspect fill" so that frames will scale to fit the layer, clipping them if required to maintain the aspect ratio.
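One aside, going slightly beyond the chapter: step 4 passes nil for the error parameter. If you'd rather know why input creation failed (for example, the user denied camera access), a minimal variation on that step might look like this:

```objc
// Sketch of step 4 with error handling instead of error:nil.
NSError *inputError = nil;
_videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice
                                                     error:&inputError];
if (!_videoInput) {
    // inputError now describes why the input couldn't be created.
    NSLog(@"Couldn't create video input: %@", inputError);
    return;
}
```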



Creating the preview view

Open Main.storyboard, drag a UIView onto the view controller, and make it fill the entire view. Next, add an outlet for the new view, name it previewView, and wire it up. This serves as a container for the preview layer.

Back in ViewController.m, modify viewDidLoad as shown below:

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupCaptureSession];
    _previewLayer.frame = _previewView.bounds;
    [_previewView.layer addSublayer:_previewLayer];
}

The code above creates the capture session, sets up the preview layer to fill the container view, and adds it as a sublayer.
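A caveat, and an assumption beyond the original text: the preview layer's frame is set only once, in viewDidLoad, so it won't track the container view through rotation or other layout changes. If you need it to, one option is to refresh the frame whenever layout happens:

```objc
// Hypothetical addition: keep the preview layer sized to its container
// after rotation or other layout passes.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    _previewLayer.frame = _previewView.bounds;
}
```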

Next, add the following two methods to ViewController.m:

- (void)startRunning {
    if (_running) return;
    [_captureSession startRunning];
    _running = YES;
}

- (void)stopRunning {
    if (!_running) return;
    [_captureSession stopRunning];
    _running = NO;
}

These methods start and stop the session as required. The _running instance variable guards against unnecessary actions, like starting a session that's already running or stopping one that's already stopped.

The app should be a good citizen and start and stop the session as necessary. In your app, sessions run only when the view controller is on screen.

Add the following two methods to ViewController.m:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self startRunning];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self stopRunning];
}

The above methods hook into the standard UIViewController methods to ensure the session only runs when the view controller is visible. However, that's only half of the solution; you also need to stop the session when the app is put into the background and restart it when the app comes back to the foreground.

Add the following notification registrations to viewDidLoad in ViewController.m:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(applicationWillEnterForeground:)
                                             name:UIApplicationWillEnterForegroundNotification
                                           object:nil];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(applicationDidEnterBackground:)
                                             name:UIApplicationDidEnterBackgroundNotification
                                           object:nil];

The code above registers for the foreground and background notifications, so the app can start and stop the session as it moves between the two states.

Next, add the following implementations of the registered selectors to ViewController.m:

- (void)applicationWillEnterForeground:(NSNotification *)note {
    [self startRunning];
}

- (void)applicationDidEnterBackground:(NSNotification *)note {
    [self stopRunning];
}
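The chapter doesn't show it, but since you registered self as an observer, it's good practice to unregister when the view controller goes away; on the iOS 7-era targets this tutorial assumes, failing to do so can leave a dangling observer. A minimal sketch:

```objc
// Unregister from all notifications when the controller is deallocated.
- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
```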

Now the session will start and stop as required.

Build and run your project; as noted at the beginning of this chapter, you will need to run your app on a physical device that has at least one camera. The simulator is a pretty useful tool, but it can't simulate video capture devices.

Once your app is running, you'll see the camera's images displayed on-screen, similar to the image below:





If you see the exact same image as above, then you have your camera pointed at the author's laptop, and that's just plain creepy!

The video capture is working well; it's time to do something with that video input.



Continued in the next post.


