Swift: Get the TrueDepth camera parameters for face tracking in ARKit

My goal:

I am trying to get the TrueDepth camera parameters (such as the intrinsics, extrinsics, lens distortion, etc.) while doing face tracking. I have read that there are examples of this with OpenCV. I am just wondering whether I can achieve something similar in Swift.

What I have read and tried:

I have read Apple's documentation on ARCamera: intrinsics, and on AVCameraCalibrationData: extrinsicMatrix and intrinsicMatrix.

However, all I found were the declarations of AVCameraCalibrationData and ARCamera:


For AVCameraCalibrationData:


For the intrinsic matrix:

var intrinsicMatrix: matrix_float3x3 { get }

For the extrinsic matrix:

var extrinsicMatrix: matrix_float4x3 { get }

I also read this post and tried Bourne's suggestion:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let ex = photo.depthData?.cameraCalibrationData?.extrinsicMatrix
    //let ex = photo.cameraCalibrationData?.extrinsicMatrix
    let int = photo.cameraCalibrationData?.intrinsicMatrix
    photo.depthData?.cameraCalibrationData?.lensDistortionCenter
    print("ExtrinsicM: \(String(describing: ex))")
    print("isCameraCalibrationDataDeliverySupported: \(output.isCameraCalibrationDataDeliverySupported)")
}

But it does not print the matrix at all.


For ARCamera, I read Andy Fedoroff's post:

var intrinsics: simd_float3x3 { get }
func inst() {
    sceneView.pointOfView?.camera?.focalLength
    DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
        print(" Focal Length: \(String(describing: self.sceneView.pointOfView?.camera?.focalLength))")
        print("Sensor Height: \(String(describing: self.sceneView.pointOfView?.camera?.sensorHeight))")
        // SENSOR HEIGHT IN mm
        let frame = self.sceneView.session.currentFrame
        // INTRINSICS MATRIX
        print("Intrinsics fx: \(String(describing: frame?.camera.intrinsics.columns.0.x))")
        print("Intrinsics fy: \(String(describing: frame?.camera.intrinsics.columns.1.y))")
        print("Intrinsics ox: \(String(describing: frame?.camera.intrinsics.columns.2.x))")
        print("Intrinsics oy: \(String(describing: frame?.camera.intrinsics.columns.2.y))")
    }
}

It shows the rendering camera parameters:

Focal Length: Optional(20.784610748291016)
Sensor Height: Optional(24.0)
Intrinsics fx: Optional(1277.3052)
Intrinsics fy: Optional(1277.3052)
Intrinsics ox: Optional(720.29443)
Intrinsics oy: Optional(539.8974)

However, this only shows the rendering camera, not the TrueDepth camera that I am using for face tracking.


So could anyone help me get started with obtaining the TrueDepth camera parameters? The documentation does not really show any examples beyond the declarations.

Thanks a lot!

The reason you cannot print the intrinsics is probably that you are getting nil from the optional chaining. You should take a look at Apple's remarks here and here.

Camera calibration data is present only if you specified the isCameraCalibrationDataDeliveryEnabled and isDualCameraDualPhotoDeliveryEnabled settings when requesting capture. For camera calibration data in a capture that includes depth data, see the AVDepthData cameraCalibrationData property.

To request capture of depth data alongside a photo (on supported devices), set the isDepthDataDeliveryEnabled property of your photo settings object to true when requesting photo capture. If you did not request depth data delivery, this property's value is nil.

So if you want to get the intrinsicMatrix and extrinsicMatrix of the TrueDepth camera, you should use builtInTrueDepthCamera as the input device, set isDepthDataDeliveryEnabled to true on the pipeline's photo output, and set isDepthDataDeliveryEnabled to true on the photo settings when capturing the photo. Then you can access the intrinsic matrix in the photoOutput(_:didFinishProcessingPhoto:error:) callback through the depthData.cameraCalibrationData property of the photo parameter.

Here is a code sample for setting up such a pipeline.
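A minimal sketch of the steps above, assuming it runs on a TrueDepth-equipped device with camera permission granted (the class and method names here are illustrative, not from any official sample):

```swift
import AVFoundation

final class CalibrationCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    // Configure a session that uses the front TrueDepth camera as input
    // and enables depth data delivery on the photo output.
    func setupCaptureSession() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else {
            fatalError("TrueDepth camera is not available on this device")
        }
        session.addInput(input)

        guard session.canAddOutput(photoOutput) else { return }
        session.addOutput(photoOutput)

        // Depth delivery must be enabled on the output before capturing;
        // otherwise depthData (and its cameraCalibrationData) will be nil.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

        session.commitConfiguration()
        session.startRunning()
    }

    // Request a photo with depth data (and hence calibration data) attached.
    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // With depth delivery enabled, the calibration data should be populated.
        guard let calibration = photo.depthData?.cameraCalibrationData else { return }
        print("Intrinsic matrix: \(calibration.intrinsicMatrix)")
        print("Extrinsic matrix: \(calibration.extrinsicMatrix)")
        print("Lens distortion center: \(calibration.lensDistortionCenter)")
    }
}
```

Note that the guard in the callback also explains the nil prints in the question: without depth delivery enabled on both the output and the settings, `photo.depthData` is nil and the optional chain silently yields nothing.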