How to capture a still image with AVFoundation and display the image in another view controller using Swift
Goal: I am creating a custom camera with AVFoundation that behaves like the camera image-capture flow in Facebook, Instagram, and Snapchat.
Here are my controllers and the ideal user experience:
- The user presses the plus button
- The app segues/transitions to a custom camera view controller that uses AVCaptureSession and AVCaptureVideoPreviewLayer, shown as the purple area
- The user presses the capture button to take the image
- The app segues to a still/image view controller showing the photo they just took, so the user can edit it or do whatever else needs to happen
- The user presses the Use button
- The app saves the image and pops back to the root ViewController (a rough sketch of this step follows below)
Here is my Swift Storyboard of the above
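That last save-and-pop step is not shown in any of the code below. Purely as a rough, hypothetical sketch (the action name and saving to the photo library are assumptions; the question does not say where the image should go), the Use button on the still/image view controller might be wired up like this:

import UIKit

// Reduced shell of CaptureSessionDetailViewController (defined in full further down),
// showing only the hypothetical Use-button action.
class CaptureSessionDetailViewController: UIViewController {
    @IBOutlet var capturedImage: UIImageView!

    @IBAction func useBtn(sender: AnyObject) {
        // Save the captured image to the photo library (assumed destination),
        // then pop back to the root view controller.
        if let image = capturedImage.image {
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
        self.navigationController?.popToRootViewControllerAnimated(true)
    }
}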
Problem: I can get the live video feed with the AVCaptureVideoPreviewLayer, but once I capture my photo I cannot get the captured UIImage over to the second ViewController. I am using a segue that fires at the end of the captureStillImageAsynchronouslyFromConnection completion handler.
Here is the master ViewController:
class AddPhotoViewController: UIViewController {

    @IBOutlet var previewLayerView: UIView!

    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var stillImageOutput: AVCaptureStillImageOutput?
    var imageDetail: UIImage?

    @IBAction func cancelCameraBtn(sender: AnyObject) {
        self.navigationController?.popToRootViewControllerAnimated(true)
    }

    @IBAction func takePhotoBtn(sender: AnyObject) {
        if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
            videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) in
                if (sampleBuffer != nil) {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProviderCreateWithCFData(imageData)
                    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                    self.imageDetail = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                    self.performSegueWithIdentifier("captureSessionDetailSegue", sender: self)
                }
            })
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func viewWillDisappear(animated: Bool) {
        captureSession!.stopRunning()
        self.navigationController?.setNavigationBarHidden(false, animated: false)
    }

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        // display properties
        self.navigationController?.setNavigationBarHidden(true, animated: false)

        captureSession = AVCaptureSession()
        captureSession!.sessionPreset = AVCaptureSessionPresetPhoto

        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

        var error: NSError?
        var input: AVCaptureDeviceInput!
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
        } catch let error1 as NSError {
            error = error1
            input = nil
        }

        if error == nil && captureSession!.canAddInput(input) {
            captureSession!.addInput(input)

            stillImageOutput = AVCaptureStillImageOutput()
            stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
            if captureSession!.canAddOutput(stillImageOutput) {
                captureSession!.addOutput(stillImageOutput)

                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
                previewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.Portrait
                previewLayerView.layer.addSublayer(previewLayer!)
                //previewLayerView.layer.removeAllAnimations()

                captureSession!.startRunning()
            }
        }
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        previewLayer!.frame = previewLayerView.bounds
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    // MARK: - Navigation

    // In a storyboard-based application, you will often want to do a little preparation before navigation
    override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
        // Get the new view controller using segue.destinationViewController.
        // Pass the selected object to the new view controller.
        //if segue.identifier == "captureSessionDetailSegue" {
            let destination = segue.destinationViewController as! CaptureSessionDetailViewController
            destination.capturedImage.image = self.imageDetail
            // returns nil property from here
            //destination.navigationController!.setNavigationBarHidden(true, animated: false)
        //}
    }
}
Here is the detail ViewController:
class CaptureSessionDetailViewController: UIViewController {

    @IBOutlet var capturedImage: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    /*
    // MARK: - Navigation

    // In a storyboard-based application, you will often want to do a little preparation before navigation
    override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
        // Get the new view controller using segue.destinationViewController.
        // Pass the selected object to the new view controller.
    }
    */
}
My current code produces fatal error: unexpectedly found nil while unwrapping an Optional value. I believe this is because my prepareForSegue method sets a property that does not exist yet, but I do not know how else to get the image into the destination DetailViewController.
How can I achieve my desired result?
Do not assign the image directly like this:
destination.capturedImage.image = self.imageDetail
Instead, declare another property in CaptureSessionDetailViewController that will hold your image, like this:
var capturedImageRef = UIImage()
Now you can assign the image from AddPhotoViewController to CaptureSessionDetailViewController in your segue method, like this:
destination.capturedImageRef = self.imageDetail
Then, in viewDidLoad of CaptureSessionDetailViewController, you can assign that image to the imageView:
capturedImage.image = capturedImageRef
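Putting the pieces together, here is a minimal sketch of this pattern (names follow the snippets above; the nil-coalescing is only there because imageDetail is declared as an Optional in the question's code):

import UIKit

// CaptureSessionDetailViewController holds the image in a plain property,
// then moves it into the outlet once the view (and the outlet) has loaded.
class CaptureSessionDetailViewController: UIViewController {
    var capturedImageRef = UIImage()
    @IBOutlet var capturedImage: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        capturedImage.image = capturedImageRef
    }
}

// AddPhotoViewController hands the image to that property, not to the outlet,
// because the destination's outlets are still nil during prepareForSegue.
class AddPhotoViewController: UIViewController {
    var imageDetail: UIImage?

    override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
        if segue.identifier == "captureSessionDetailSegue" {
            let destination = segue.destinationViewController as! CaptureSessionDetailViewController
            destination.capturedImageRef = self.imageDetail ?? UIImage()
        }
    }
}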
My solution used the design pattern from the user above (Dharmesh Kheni) and the DBCamera custom camera GitHub project.
In AddPhotoViewController:
@IBAction func takePhotoBtn(sender: AnyObject) {
    if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
        videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) in
            if (sampleBuffer != nil) {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                // Setup class variable --> imgMetaData: NSData!
                // Assign and transport to destination ViewController
                self.imgMetaData = imageData
                self.performSegueWithIdentifier("captureSessionDetailSegue", sender: self)
            }
        })
    }
}

override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == "captureSessionDetailSegue" {
        let destination = segue.destinationViewController as! CaptureSessionDetailViewController
        destination.capturedImageMetaData = self.imgMetaData
    }
}
In CaptureSessionDetailViewController:
class CaptureSessionDetailViewController: UIViewController {

    var capturedImageMetaData: NSData!
    @IBOutlet var capturedImage: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let dataProvider = CGDataProviderCreateWithCFData(capturedImageMetaData)
        let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
        let img = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
        capturedImage.image = img
        // Do any additional setup after loading the view.
    }
}
The image data from AVCaptureStillImageOutput is assigned to the class variable imgMetaData: NSData! in AddPhotoViewController. That data is passed to the destination view controller, CaptureSessionDetailViewController, via prepareForSegue and stored in capturedImageMetaData: NSData!. The data is then converted into a UIImage in the viewDidLoad method.
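For completeness, the class variable mentioned in the comment inside takePhotoBtn would be declared on AddPhotoViewController itself; a minimal sketch, showing only the relevant property:

import UIKit

class AddPhotoViewController: UIViewController {
    // Holds the JPEG NSData produced in captureStillImageAsynchronouslyFromConnection
    // until prepareForSegue hands it to CaptureSessionDetailViewController.
    var imgMetaData: NSData!

    // ... remaining outlets, properties, and methods as posted above ...
}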