Convert a line from Swift to Objective-C
I'm trying to convert an extension from Swift to Objective-C. I need to determine the average color of a UIImage.
I have almost everything working, but I'm stuck on one line:
var bitmap = [UInt8](repeating: 0, count: 4)
The extension comes from https://www.hackingwithswift.com:
extension UIImage {
    var averageColor: UIColor? {
        guard let inputImage = CIImage(image: self) else { return nil }
        let extentVector = CIVector(x: inputImage.extent.origin.x, y: inputImage.extent.origin.y, z: inputImage.extent.size.width, w: inputImage.extent.size.height)
        guard let filter = CIFilter(name: "CIAreaAverage", parameters: [kCIInputImageKey: inputImage, kCIInputExtentKey: extentVector]) else { return nil }
        guard let outputImage = filter.outputImage else { return nil }
        var bitmap = [UInt8](repeating: 0, count: 4)
        let context = CIContext(options: [.workingColorSpace: kCFNull])
        context.render(outputImage, toBitmap: &bitmap, rowBytes: 4, bounds: CGRect(x: 0, y: 0, width: 1, height: 1), format: .RGBA8, colorSpace: nil)
        return UIColor(red: CGFloat(bitmap[0]) / 255, green: CGFloat(bitmap[1]) / 255, blue: CGFloat(bitmap[2]) / 255, alpha: CGFloat(bitmap[3]) / 255)
    }
}
Here's what I've got in Objective-C:
- (UIColor *)averageColorWithUIImage:(UIImage *)image {
    CIImage *inputImage = [CIImage imageWithCGImage:(__bridge CGImageRef _Nonnull)(image)];
    if (!inputImage) {
        return nil;
    }
    CIVector *extentVector = [CIVector vectorWithX:inputImage.extent.origin.x Y:inputImage.extent.origin.y Z:inputImage.extent.size.width W:inputImage.extent.size.height];
    CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:kCIInputImageKey, inputImage, kCIInputExtentKey, extentVector, nil];
    if (!filter) {
        return nil;
    }
    CIImage *outputImage = filter.outputImage;
    if (!outputImage) {
        return nil;
    }
    NSSet *bitmap = [NSSet setWithObjects:0 count:4];
    CIContext *context = [[CIContext alloc] initWithOptions:@{kCIContextWorkingColorSpace : NSNull.null}];
    [context render:outputImage toBitmap:&bitmap rowBytes:4 bounds:CGRectMake(0, 0, 1, 1) format:kCIFormatRGBA8 colorSpace:nil];
    return [UIColor colorWithRed:(CGFloat)bitmap[0]/255.0
                           green:((CGFloat)bitmap[1])/255.0
                            blue:((CGFloat)bitmap[2])/255.0
                           alpha:((CGFloat)bitmap[3])/255.0];
}
I'm getting it wrong here:
return [UIColor colorWithRed:(CGFloat)bitmap[0]/255.0
green:((CGFloat)bitmap[1])/255.0
blue:((CGFloat)bitmap[2])/255.0
alpha:((CGFloat)bitmap[3])/255.0];
The error is:
Expected method to read array element not found on object of type 'NSSet *'
Here's how to translate the bitmap variable into Objective-C:
uint8_t bitmap[4] = {};
And this is how to pass it to the render method:
[context render:outputImage toBitmap:bitmap
rowBytes:4 bounds:CGRectMake(0, 0, 1, 1)
format:kCIFormatRGBA8 colorSpace:nil];
Note that you do not take its address with the & operator!
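For completeness, here's a rough sketch of the whole method with that fix applied (untested; note that it also creates the CIImage with initWithImage:, the Objective-C counterpart of Swift's CIImage(image:), rather than bridging the UIImage to a CGImageRef, which was a separate problem in your attempt):

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

- (UIColor *)averageColorWithUIImage:(UIImage *)image {
    // Counterpart of Swift's CIImage(image: self)
    CIImage *inputImage = [[CIImage alloc] initWithImage:image];
    if (!inputImage) {
        return nil;
    }
    CIVector *extentVector = [CIVector vectorWithX:inputImage.extent.origin.x
                                                 Y:inputImage.extent.origin.y
                                                 Z:inputImage.extent.size.width
                                                 W:inputImage.extent.size.height];
    CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"
                                  keysAndValues:kCIInputImageKey, inputImage,
                                                kCIInputExtentKey, extentVector, nil];
    if (!filter) {
        return nil;
    }
    CIImage *outputImage = filter.outputImage;
    if (!outputImage) {
        return nil;
    }
    // Plain C byte buffer: the Objective-C counterpart of [UInt8](repeating: 0, count: 4)
    uint8_t bitmap[4] = {};
    CIContext *context = [[CIContext alloc] initWithOptions:@{kCIContextWorkingColorSpace : [NSNull null]}];
    // The array decays to a pointer, so no & is needed here
    [context render:outputImage
           toBitmap:bitmap
           rowBytes:4
             bounds:CGRectMake(0, 0, 1, 1)
             format:kCIFormatRGBA8
         colorSpace:nil];
    // Indexing now works, because bitmap is a C array rather than an NSSet
    return [UIColor colorWithRed:bitmap[0] / 255.0
                           green:bitmap[1] / 255.0
                            blue:bitmap[2] / 255.0
                           alpha:bitmap[3] / 255.0];
}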