How to add a CIFilter to MTLTexture Using ARMatteGenerator?
I am working with Apple's sample project concerning the use of ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in their people occlusion technology.
I would like to determine how to run the generated matte through a CIFilter. In my code, I am "filtering" the matte like so;
func updateMatteTextures(commandBuffer: MTLCommandBuffer) {
    guard let currentFrame = session.currentFrame else {
        return
    }
    var targetImage: CIImage?
    alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer)
    dilatedDepthTexture = matteGenerator.generateDilatedDepth(from: currentFrame, commandBuffer: commandBuffer)
    targetImage = CIImage(mtlTexture: alphaTexture!, options: nil)
    monoAlphaCIFilter?.setValue(targetImage!, forKey: kCIInputImageKey)
    monoAlphaCIFilter?.setValue(CIColor.red, forKey: kCIInputColorKey)
    targetImage = (monoAlphaCIFilter?.outputImage)!
    let drawingBounds = CGRect(origin: .zero, size: CGSize(width: alphaTexture!.width, height: alphaTexture!.height))
    context.render(targetImage!, to: alphaTexture!, commandBuffer: commandBuffer, bounds: drawingBounds, colorSpace: CGColorSpaceCreateDeviceRGB())
}
When I composite the matte texture and the background, there is no filtering effect applied to the matte. This is how the textures are being composited;
func compositeImagesWithEncoder(renderEncoder: MTLRenderCommandEncoder) {
    guard let textureY = capturedImageTextureY, let textureCbCr = capturedImageTextureCbCr else {
        return
    }

    // Push a debug group allowing us to identify render commands in the GPU Frame Capture tool
    renderEncoder.pushDebugGroup("CompositePass")

    // Set render command encoder state
    renderEncoder.setCullMode(.none)
    renderEncoder.setRenderPipelineState(compositePipelineState)
    renderEncoder.setDepthStencilState(compositeDepthState)

    // Setup plane vertex buffers
    renderEncoder.setVertexBuffer(imagePlaneVertexBuffer, offset: 0, index: 0)
    renderEncoder.setVertexBuffer(scenePlaneVertexBuffer, offset: 0, index: 1)

    // Setup textures for the composite fragment shader
    renderEncoder.setFragmentBuffer(sharedUniformBuffer, offset: sharedUniformBufferOffset, index: Int(kBufferIndexSharedUniforms.rawValue))
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureY), index: 0)
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureCbCr), index: 1)
    renderEncoder.setFragmentTexture(sceneColorTexture, index: 2)
    renderEncoder.setFragmentTexture(sceneDepthTexture, index: 3)
    renderEncoder.setFragmentTexture(alphaTexture, index: 4)
    renderEncoder.setFragmentTexture(dilatedDepthTexture, index: 5)

    // Draw final quad to display
    renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
    renderEncoder.popDebugGroup()
}
How do I apply a CIFilter to only the alphaTexture generated by the ARMatteGenerator?
I don't think you want to apply a CIFilter to the alphaTexture. I assume you're using Apple's Effecting People Occlusion in Custom Renderers sample code. If you watch this year's Bringing People into AR WWDC session, they talk about generating a segmentation matte using ARMatteGenerator, which is what alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer) is doing. alphaTexture is an MTLTexture that is essentially an alpha mask over where humans have been detected in the camera frame (i.e. fully opaque where a human is and fully transparent where a human is not).
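If it helps to see that setup in isolation, here is a minimal sketch (the explicit device creation and force unwrap are illustrative assumptions, not the sample's exact code):

import ARKit
import Metal

// Create the matte generator once, alongside the rest of your Metal setup.
// .half trades matte resolution for speed; .full is also available.
let device = MTLCreateSystemDefaultDevice()!
let matteGenerator = ARMatteGenerator(device: device, matteResolution: .half)

// Then, once per frame, with a command buffer from your MTLCommandQueue:
// alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer)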
Adding a filter to the alpha texture won't filter the final rendered image; it only affects the mask that is used in the compositing. If you're trying to achieve the effect from the video linked in your previous question, I'd recommend adjusting the Metal shader where the compositing occurs. In the session, they point out that they compare the dilatedDepth and the renderedDepth to see whether they should draw virtual content or pixels from the camera:
fragment half4 customComposition(...) {
    half4 camera = cameraTexture.sample(s, in.uv);
    half4 rendered = renderedTexture.sample(s, in.uv);
    float renderedDepth = renderedDepthTexture.sample(s, in.uv);
    half4 scene = mix(rendered, camera, rendered.a);
    half matte = matteTexture.sample(s, in.uv);
    float dilatedDepth = dilatedDepthTexture.sample(s, in.uv);

    if (dilatedDepth < renderedDepth) { // People in front of rendered
        // mix together the virtual content and camera feed based on the alpha provided by the matte
        return mix(scene, camera, matte);
    } else {
        // People are not in front, so just return the scene
        return scene;
    }
}
Unfortunately, the sample code does this slightly differently, but it's still fairly easy to modify. Open Shaders.metal and find the compositeImageFragmentShader function. Toward the end of the function you'll see half4 occluderResult = mix(sceneColor, cameraColor, alpha); This is essentially the same operation as the mix(scene, camera, matte); we saw above: we're deciding whether to use a pixel from the scene or a pixel from the camera feed, based on the segmentation matte. By replacing cameraColor with a half4 that represents a color, we can easily swap the camera image pixels for an arbitrary rgba value. For example, we could use half4(float4(0.0, 0.0, 1.0, 1.0)) to paint all of the pixels inside the segmentation matte blue:
…
// Replacing camera color with blue
half4 occluderResult = mix(sceneColor, half4(float4(0.0, 0.0, 1.0, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;
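You don't have to replace the camera pixels outright, either. As a hypothetical variation (reusing the sceneColor, cameraColor, alpha, and showOccluder variables from the sample shader), you could blend the camera pixel partway toward the color, so the person stays visible but tinted:

// Tint the camera pixel 50% toward blue rather than replacing it entirely
half4 tintedCamera = mix(cameraColor, half4(float4(0.0, 0.0, 1.0, 1.0)), half(0.5));
half4 occluderResult = mix(sceneColor, tintedCamera, alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;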
Of course, you can apply other effects as well. Dynamic grayscale static is fairly easy to achieve. Above compositeImageFragmentShader, add:
float random(float offset, float2 tex_coord, float time) {
    // pick two numbers that are unlikely to repeat
    float2 non_repeating = float2(12.9898 * time, 78.233 * time);

    // multiply our texture coordinates by the non-repeating numbers, then add them together
    float sum = dot(tex_coord, non_repeating);

    // calculate the sine of our sum to get a range between -1 and 1
    float sine = sin(sum);

    // multiply the sine by a big, non-repeating number so that even a small change will result in a big color jump
    float huge_number = sine * 43758.5453 * offset;

    // get just the numbers after the decimal point
    float fraction = fract(huge_number);

    // send the result back to the caller
    return fraction;
}
(Taken from @twostraws' ShaderKit)
Then modify the end of compositeImageFragmentShader to the following (rgb here is the camera pixel the shader has already decoded from the Y and CbCr textures, so feeding its red channel in as the time parameter makes the noise vary along with the camera feed):
…
float randFloat = random(1.0, cameraTexCoord, rgb[0]);
half4 occluderResult = mix(sceneColor, half4(float4(randFloat, randFloat, randFloat, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;
You should end up with animated grayscale static wherever a person is detected.
One final note: the debugger seems to have a hard time keeping up with the app. For me, the app would freeze shortly after launch when running with Xcode attached, but it typically ran smoothly when launched on its own.