What’s new in AVFoundation in iOS 10
In iOS 10 we are getting some interesting new features in AVFoundation that can improve camera usage in our apps.
I'll briefly talk about three of them:
- Live Photos
- RAW
- Capturing Preview Images
But before that, let's start with the new AVCaptureOutput subclass: AVCapturePhotoOutput. We are finally getting an easy-to-use capture function called capturePhoto.
This function takes two parameters: an AVCapturePhotoSettings object and an AVCapturePhotoCaptureDelegate.
That AVCapturePhotoSettings object is key here.
We can easily specify whether our app wants to use the flash, image stabilization, or high-resolution photo capture.
The AVCapturePhotoCaptureDelegate informs us about the state of the capture. We can implement five different methods to know where we currently are on the capture timeline.
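To put those pieces together, here's a minimal sketch of a capture setup. The CameraController class, its setupSession and takePhoto methods, and the omitted error handling are my own illustrative assumptions; the session and output properties match the ones used in the snippets below.

```swift
import AVFoundation

// A minimal sketch, not production code: a controller owning the session and photo output.
class CameraController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()

    func setupSession() {
        session.sessionPreset = AVCaptureSessionPresetPhoto
        // Opt in to high-resolution capture on the output before requesting it per photo.
        output.isHighResolutionCaptureEnabled = true
        guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(output) else { return }
        session.addInput(input)
        session.addOutput(output)
        session.startRunning()
    }

    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.flashMode = .auto
        settings.isAutoStillImageStabilizationEnabled = true
        settings.isHighResolutionPhotoEnabled = true
        output.capturePhoto(with: settings, delegate: self)
    }
}
```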
Live Photos
Live Photos were introduced with the iPhone 6s as a new way of capturing the moments of our lives. Technically speaking, a Live Photo is a composition of a 12 MP image and a 3-second movie (including audio, at screen resolution).
To take Live Photos using AVFoundation we have to check that Live Photo capture is supported, set the session preset to AVCaptureSessionPresetPhoto, and set isLivePhotoCaptureEnabled to true.
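As a short sketch (assuming the same session and output properties as above), that configuration could look like this:

```swift
session.sessionPreset = AVCaptureSessionPresetPhoto
// Live Photo capture must be enabled on the output before capturing,
// and only when the current configuration supports it.
if output.isLivePhotoCaptureSupported {
    output.isLivePhotoCaptureEnabled = true
}
```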
Below is an example of the easiest way of taking a Live Photo. We have to provide a path where the movie file created during the Live Photo capture will be stored. I'm using uniqueID as the name of the file (that ID is unique for every capture we do).
```swift
func takeLivePhoto() {
    let settings = AVCapturePhotoSettings()
    settings.isHighResolutionPhotoEnabled = true
    settings.livePhotoMovieFileURL = URL(fileURLWithPath: "pathToLivePhotoVideo\(settings.uniqueID)")
    output.capturePhoto(with: settings, delegate: self)
}
```
OK, once we've taken it, we have to handle the captured output in the delegate.
```swift
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    guard let photoSampleBuffer = photoSampleBuffer else { return }
    let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer,
                                                                previewPhotoSampleBuffer: previewPhotoSampleBuffer)
    do {
        // filePath is the file URL where we want to store the JPEG.
        try data?.write(to: filePath, options: .atomic)
    } catch {
        print("error")
    }
}
```
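The movie part of the Live Photo arrives through a separate delegate callback once the file has been written to the URL we passed in livePhotoMovieFileURL. A minimal sketch of handling it:

```swift
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingLivePhotoToMovieFileAt outputFileURL: URL,
             duration: CMTime,
             photoDisplayTime: CMTime,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             error: Error?) {
    guard error == nil else { return }
    // outputFileURL is the movie file we requested via livePhotoMovieFileURL;
    // pair it with the still image, for example when saving to the photo library.
    print("Live Photo movie saved at \(outputFileURL)")
}
```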
RAW
Another feature worth mentioning is capturing RAW images. RAW means we get an image with 14-16 bits per pixel instead of the 8 bits we get with the JPEG format.
To capture that kind of image we have to set the session preset to AVCaptureSessionPresetPhoto and use the rear camera. We also have to specify a RAW photo pixel format type in AVCapturePhotoSettings.
```swift
func takeRawPhoto() {
    let rawFormat = output.availableRawPhotoPixelFormatTypes.first!.uint32Value
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    output.capturePhoto(with: settings, delegate: self)
}
```
We also have to implement the capture output's didFinishProcessingRawPhotoSampleBuffer delegate method.
```swift
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingRawPhotoSampleBuffer rawSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    guard let rawSampleBuffer = rawSampleBuffer else { return }
    let data = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: rawSampleBuffer,
                                                               previewPhotoSampleBuffer: previewPhotoSampleBuffer)
    // stringFilePath is the destination path; DNG files use the "dng" extension.
    let filePath = stringFilePath + ".dng"
    do {
        try data?.write(to: URL(fileURLWithPath: filePath), options: .atomic)
    } catch {
        print("error")
    }
}
```
As you may have noticed, we are using the DNG file format. It's one of the most popular RAW file formats, created by Adobe.
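If you want a processed JPEG alongside the RAW file, AVCapturePhotoSettings also has an initializer that takes both a RAW pixel format type and a processed format. The takeRawPlusJpegPhoto name below is mine; this is just a sketch:

```swift
func takeRawPlusJpegPhoto() {
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first?.uint32Value else { return }
    // Request a JPEG as the processed counterpart of the RAW capture.
    let processedFormat: [String: Any] = [AVVideoCodecKey: AVVideoCodecJPEG]
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat,
                                          processedFormat: processedFormat)
    output.capturePhoto(with: settings, delegate: self)
}
```

The delegate then receives both the didFinishProcessingPhotoSampleBuffer and didFinishProcessingRawPhotoSampleBuffer callbacks, one per representation.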
Capturing Preview Images
We often need a thumbnail of the captured image, for example to use as a cell icon in a collection view or table view.
Instead of decompressing and downscaling the original image, we can capture a preview at the same time as the big one.
The preview image is uncompressed; we can specify its dimensions or allow our output to pick them automatically.
Below you can see an example of how to set up the preview parameters.
```swift
if let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first {
    let previewFormat: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                        kCVPixelBufferWidthKey as String: 200,
                                        kCVPixelBufferHeightKey as String: 200]
    settings.previewPhotoFormat = previewFormat
}
```
As you may have noticed in the implementation of captureOutput above, we are already passing previewPhotoSampleBuffer when creating our data object.
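If you need that preview as an actual UIImage thumbnail, one possible approach (a sketch, not the only way) is to pull the pixel buffer out of the preview sample buffer and wrap it in a CIImage:

```swift
import UIKit
import AVFoundation

func thumbnail(from previewPhotoSampleBuffer: CMSampleBuffer?) -> UIImage? {
    guard let sampleBuffer = previewPhotoSampleBuffer,
          let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    // The preview arrives as an uncompressed pixel buffer; CIImage can wrap it directly.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    return UIImage(ciImage: ciImage)
}
```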
In the case of Live Photos and the RAW format, we can only use them on devices with a 12 MP camera. That means you can capture them on the iPhone 6s, 6s Plus, SE, 7 and 7 Plus, but also on the 9.7-inch iPad Pro.
Things look better in the case of previews: there is no such limitation, and we can use them on all devices and cameras.
That was just a brief introduction to what's new in AVFoundation. If the topic seems interesting to you, I highly recommend visiting Apple's developer portal and checking out the newest WWDC sessions about photography.