Get the exposure duration and ISO values used for a capture from AVCapturePhotoOutput after the capture is complete
Background
I am using an AVCaptureSession with an AVCapturePhotoOutput to save captures as JPEG images.
let captureSession = AVCaptureSession()
let stillImageOutput = AVCapturePhotoOutput()
var captureDevice: AVCaptureDevice?
...
func setupCamera() {
    captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
    if let captureDevice = captureDevice {
        do {
            // The throwing initializer needs do/catch (or try?/try!) to compile here.
            let input = try AVCaptureDeviceInput(device: captureDevice)
            captureSession.addInput(input)
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
            }
        } catch {
            // Errors handled here...
        }
    }
}
The AVCaptureDevice is set to adjust the exposure settings automatically and continuously:
func configureCamera() {
    do {
        try captureDevice?.lockForConfiguration()
        captureDevice?.exposureMode = AVCaptureDevice.ExposureMode.continuousAutoExposure
        captureDevice?.unlockForConfiguration()
    } catch let error as NSError {
        // Errors handled here...
    }
}
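For context, here is a minimal sketch of how the two methods above might be wired together; the view controller, preview layer, and dispatch queue are assumptions and not part of the original code.

// Assumes `import UIKit` and `import AVFoundation`, inside a hypothetical view controller.
override func viewDidLoad() {
    super.viewDidLoad()
    setupCamera()
    configureCamera()

    // Optional live preview of the session.
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    // startRunning() blocks, so keep it off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        self.captureSession.startRunning()
    }
}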
The capture is started with:
func capture() {
    // Get an instance of the AVCapturePhotoSettings class
    let photoSettings = AVCapturePhotoSettings()
    // Set photo settings
    photoSettings.isAutoStillImageStabilizationEnabled = true
    photoSettings.flashMode = .off
    // Call capturePhoto by passing the photo settings and a
    // delegate implementing AVCapturePhotoCaptureDelegate
    stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
}
The parent class is set as the AVCapturePhotoCaptureDelegate, and photoOutput is handled there:
// Delegate
func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                 previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
    // Make sure there is a photo sample buffer
    guard error == nil,
        let photoSampleBuffer = photoSampleBuffer else {
            // Errors handled here
            return
    }
    // Convert the photo sample buffer to JPEG image data by using
    // AVCapturePhotoOutput
    guard let imageData =
        AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
            return
    }
    let capturedImage = UIImage(data: imageData, scale: 1.0)
    if let image = capturedImage {
        // Save photo ...
    }
}
Everything works fine, but...
Problem
I need to know the exposure duration and ISO values used for each capture. These values vary because the camera is set to adjust exposure automatically, and it has to stay that way.
I know the capture metadata contains these values, but I don't know how to access them.
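For reference, here is a sketch of how those values might be read out of the EXIF attachments of photoSampleBuffer inside the delegate. I have not verified it in this setup; it assumes a Swift 4.2+ SDK (older SDKs use the unlabeled CMCopyDictionaryOfAttachments(_:_:_:) form) and `import ImageIO` for the kCGImageProperty* keys.

// Inside photoOutput(_:didFinishProcessingPhoto:...), after photoSampleBuffer is unwrapped.
if let attachments = CMCopyDictionaryOfAttachments(allocator: kCFAllocatorDefault,
                                                   target: photoSampleBuffer,
                                                   attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
    let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any] {
    // EXIF stores the shutter time in seconds and the ISO as an array of numbers.
    let exposureSeconds = exif[kCGImagePropertyExifExposureTime as String] as? Double
    let isoRatings = exif[kCGImagePropertyExifISOSpeedRatings as String] as? [NSNumber]
    print("EXIF exposure time:", exposureSeconds ?? -1, "ISO:", isoRatings ?? [])
}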
The exposure duration and ISO values are needed to fine-tune the exposure for an optimal result. After fine-tuning, the capture is started with these manual exposure values:
captureDevice?.setExposureModeCustom(duration: customTime, iso: customISO, completionHandler: nil)
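For completeness, my understanding is that setExposureModeCustom also has to be wrapped in lockForConfiguration, and that the values must lie within the active format's supported range. A sketch, where applyCustomExposure is a hypothetical helper and customTime/customISO come from the fine-tuning step:

func applyCustomExposure(duration customTime: CMTime, iso customISO: Float) {
    guard let device = captureDevice else { return }
    do {
        try device.lockForConfiguration()
        // Clamp the requested values to what the current format supports.
        let iso = max(device.activeFormat.minISO, min(customISO, device.activeFormat.maxISO))
        let duration = max(device.activeFormat.minExposureDuration,
                           min(customTime, device.activeFormat.maxExposureDuration))
        device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        // Errors handled here...
    }
}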
Instead of getting the used ISO and exposure duration from the capture metadata, I read these values just before taking the photo. When doing this, it is important to check that the exposure has finished adjusting.
Right before calling capture:
Check that the auto exposure is not adjusting:
while (captureDevice?.isAdjustingExposure)! {
    usleep(100000) // wait 100 msec
}
Read the current exposure parameters:
let current_exposure_duration : CMTime = (captureDevice?.exposureDuration)!
let current_exposure_ISO : Float = (captureDevice?.iso)!
Then take the photo:
stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
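Put together, the pre-capture read can sit at the top of capture(), with the values stashed so they can later be saved alongside the JPEG. A sketch; lastExposureDuration and lastISO are hypothetical properties, not from the original code:

var lastExposureDuration: CMTime?   // hypothetical storage for the pre-capture values
var lastISO: Float?

func capture() {
    guard let device = captureDevice else { return }

    // Wait until continuous auto exposure has settled.
    while device.isAdjustingExposure {
        usleep(100000) // wait 100 msec
    }

    // Record the values the auto exposure converged on.
    lastExposureDuration = device.exposureDuration
    lastISO = device.iso

    let photoSettings = AVCapturePhotoSettings()
    photoSettings.isAutoStillImageStabilizationEnabled = true
    photoSettings.flashMode = .off
    stillImageOutput.capturePhoto(with: photoSettings, delegate: self)
}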