
I’m trying to add some color correction to images in my Mac app using Core Image filters. First, I’m looking at allowing a custom white balance to remove a cast from an image. CIWhitePointAdjust looked like exactly what I was looking for, but after trying it out, I’m not sure whether I’m using it wrong or whether it just doesn’t do what I thought.

Starting with this image, yellowed with age, and with a thin strip of white at its right edge:

Original image

I apply the filter, like so:

NSColor *whiteSample = // Chosen from the speech bubble's background
CIColor *whiteInputColor = [[CIColor alloc] initWithColor:whiteSample];

layer.filters = @[
    [CIFilter filterWithName:@"CIWhitePointAdjust"
         withInputParameters:@{kCIInputColorKey: whiteInputColor}]
];

and get this image back:

Image after applying CIFilter

Note that it looks darker and yellower than the original (the opposite of my desired effect). I had been expecting an effect more like doing an Auto Color in Photoshop, like so:

Photoshopped color balance

Am I using CIWhitePointAdjust incorrectly, or is it the wrong tool for this job? If another filter or combination of filters would work better, I’d love to know.

Since I’m manipulating images that are already in CALayer objects, Core Image filters definitely seem like the right choice, but if this can only feasibly be done by other means, I’m open to that.

Update

A helpful answer on the Signal Processing site gave me the name of what I’m trying to implement: in broad terms, it’s called Histogram Equalization. I’m trying to figure out whether there’s a way to perform that process using Core Image filters, and so far it’s not looking hopeful (short of writing my own).
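For reference, histogram equalization remaps each pixel through the image's scaled cumulative histogram, so that the used intensity levels spread out over the full range. A minimal sketch in plain C for a single 8-bit channel (not Core Image, just to illustrate the algorithm):

```c
#include <stddef.h>
#include <stdint.h>

// Minimal histogram equalization for an 8-bit, single-channel image.
// Remaps each pixel through the scaled cumulative histogram (CDF).
void equalize8(uint8_t *pixels, size_t count)
{
    size_t hist[256] = {0};
    for (size_t i = 0; i < count; i++)
        hist[pixels[i]]++;

    // Cumulative distribution function
    size_t cdf[256];
    size_t running = 0;
    for (int v = 0; v < 256; v++) {
        running += hist[v];
        cdf[v] = running;
    }

    // Smallest nonzero CDF value, so the darkest used level maps to 0
    size_t cdfMin = 0;
    for (int v = 0; v < 256; v++) {
        if (cdf[v] > 0) { cdfMin = cdf[v]; break; }
    }
    if (count == cdfMin)
        return; // flat image: nothing to stretch

    for (size_t i = 0; i < count; i++)
        pixels[i] = (uint8_t)(((cdf[pixels[i]] - cdfMin) * 255) / (count - cdfMin));
}
```

For an RGB image you would run this per channel, which is exactly the structure the vImage-based answers below end up with.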

3 Answers


  1. Try getting a list of possible filters with autoAdjustmentFilters or autoAdjustmentFiltersWithOptions:… probably the latter, so that you can exclude the filters that require face detection, if you plan to auto-adjust cartoon scans.
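
    A minimal sketch of that approach (untested; assumes a CIImage called ciImage, and turns off the red-eye filter so no face detection is involved):

    ```objc
    NSArray<CIFilter *> *filters =
        [ciImage autoAdjustmentFiltersWithOptions:@{kCIImageAutoAdjustRedEye: @NO}];

    // Chain the suggested filters together, feeding each one's output
    // into the next one's input
    CIImage *output = ciImage;
    for (CIFilter *filter in filters) {
        [filter setValue:output forKey:kCIInputImageKey];
        output = filter.outputImage;
    }
    ```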

  2. OK, so I think I’ve nailed it! Here’s what I’ve cobbled together from https://stackoverflow.com/a/30447041/734860 and a couple of glue bits:

    @import Accelerate;
    
    // I haven't yet found a way to do it within Core Image
    
    // use this when you need to go CGImageRef -> CIImage*
    CIImage *ciImage = [CIImage imageWithCGImage:cgImage];
    
    ...
    
    // use this when you need to go CIImage* -> CGImageRef
    CIContext* context = [[CIContext alloc] init];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    
    ...
    
    // the algorithm itself, which uses vImage and has to convert to/from it via CGImage
    CGImageRef CreateEqualisedCGImageFromCGImage(CGImageRef original)
    {
        vImage_Error err;
        vImage_Buffer _img;
        vImage_CGImageFormat format = {
            .bitsPerComponent = 8,
            .bitsPerPixel = 32,
            .colorSpace = NULL,
            .bitmapInfo = (CGBitmapInfo)kCGImageAlphaFirst,
            .version = 0,
            .decode = NULL,
            .renderingIntent = kCGRenderingIntentDefault,
        };
    
        size_t width = CGImageGetWidth(original);
        size_t height = CGImageGetHeight(original);
    
        vImage_Buffer _dstA, _dstR, _dstG, _dstB;
    
        err = vImageBuffer_InitWithCGImage(&_img, &format, NULL, original, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageBuffer_InitWithCGImage error: %ld", err);
    
        err = vImageBuffer_Init( &_dstA, height, width, 8 * sizeof( uint8_t ), kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageBuffer_Init (alpha) error: %ld", err);
    
        err = vImageBuffer_Init( &_dstR, height, width, 8 * sizeof( uint8_t ), kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageBuffer_Init (red) error: %ld", err);
    
        err = vImageBuffer_Init( &_dstG, height, width, 8 * sizeof( uint8_t ), kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageBuffer_Init (green) error: %ld", err);
    
        err = vImageBuffer_Init( &_dstB, height, width, 8 * sizeof( uint8_t ), kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageBuffer_Init (blue) error: %ld", err);
    
        err = vImageConvert_ARGB8888toPlanar8(&_img, &_dstA, &_dstR, &_dstG, &_dstB, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageConvert_ARGB8888toPlanar8 error: %ld", err);
    
        err = vImageEqualization_Planar8(&_dstR, &_dstR, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageEqualization_Planar8 (red) error: %ld", err);
    
        err = vImageEqualization_Planar8(&_dstG, &_dstG, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageEqualization_Planar8 (green) error: %ld", err);
    
        err = vImageEqualization_Planar8(&_dstB, &_dstB, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageEqualization_Planar8 (blue) error: %ld", err);
    
        err = vImageConvert_Planar8toARGB8888(&_dstA, &_dstR, &_dstG, &_dstB, &_img, kvImageNoFlags);
        if (err != kvImageNoError)
            NSLog(@"vImageConvert_Planar8toARGB8888 error: %ld", err);
    
        err = vImageContrastStretch_ARGB8888( &_img, &_img, kvImageNoFlags );
        if (err != kvImageNoError)
            NSLog(@"vImageContrastStretch_ARGB8888 error: %ld", err);
    
        free(_dstA.data);
        free(_dstR.data);
        free(_dstG.data);
        free(_dstB.data);
    
        CGImageRef result = vImageCreateCGImageFromBuffer(&_img, &format, NULL, NULL, kvImageNoFlags, &err);
        if (err != kvImageNoError)
            NSLog(@"vImageCreateCGImageFromBuffer error: %ld", err);
    
        free(_img.data);
    
        return result;
    }
    

    All the credit for this solution goes to https://stackoverflow.com/users/4735340/james-bush, but I’d looked for so long, without success, for an image-oriented solution (as opposed to the video processing discussed in that question) that I think a ready-made reply like this one is relevant. Searching for ‘AutoLevels’ on OS X or iOS got me nowhere; I hope this is useful to someone else.
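
    For completeness, a hypothetical way to wire this into the original CALayer setup (names assumed; the function returns a +1 CGImageRef, so the caller has to release it):

    ```objc
    CIContext *context = [[CIContext alloc] init];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    CGImageRef equalised = CreateEqualisedCGImageFromCGImage(cgImage);
    layer.contents = (__bridge id)equalised;
    CGImageRelease(cgImage);
    CGImageRelease(equalised);
    ```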

  3. Swift 5

    import Accelerate
    import Metal
    
    extension UIImage {
    
    func whiteBalance() -> UIImage? {
        
        // Create a vImage_Buffer from the CGImage
        
        guard let sourceRef = cgImage else { return nil }
        
        var srcBuffer = vImage_Buffer()
        
        var format = vImage_CGImageFormat(bitsPerComponent: 8,
                                          bitsPerPixel: 32,
                                          colorSpace: nil,
                                          bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue),
                                          version: 0,
                                          decode: nil,
                                          renderingIntent: .defaultIntent)
        
        // init cgImage from sourceBuffer
        
        var err = vImageBuffer_InitWithCGImage(&srcBuffer, &format, nil, sourceRef, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            return nil
        }
        
        // Create dest buffers
                
        let dstWidth = sourceRef.width
        let dstHeight = sourceRef.height
                
        let pixelBits = UInt32(8 * MemoryLayout<UInt8>.size)
    
        var dstA = vImage_Buffer()
        var dstR = vImage_Buffer()
        var dstG = vImage_Buffer()
        var dstB = vImage_Buffer()
                        
        // dstA - Alpha
                        
        err = vImageBuffer_Init(&dstA, UInt(dstHeight), UInt(dstWidth), pixelBits, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageBuffer_Init (alpha) error: \(String(describing: err))")
            return nil
        }
        
        // dstR - Red
        
        err = vImageBuffer_Init(&dstR, UInt(dstHeight), UInt(dstWidth), pixelBits, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageBuffer_Init (red) error: \(String(describing: err))")
            return nil
        }
        
        // dstG - Green
        
        err = vImageBuffer_Init(&dstG, UInt(dstHeight), UInt(dstWidth), pixelBits, vImage_Flags(kvImageNoFlags))
    
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageBuffer_Init (green) error: \(String(describing: err))")
            return nil
        }
        
        // dstB - Blue
        
        err = vImageBuffer_Init(&dstB, UInt(dstHeight), UInt(dstWidth), pixelBits, vImage_Flags(kvImageNoFlags))
    
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageBuffer_Init (blue) error: \(String(describing: err))")
            return nil
        }
    
        // convert to planar8
        
        err = vImageConvert_ARGB8888toPlanar8(&srcBuffer, &dstA, &dstR, &dstG, &dstB, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageConvert_ARGB8888toPlanar8 error: \(String(describing: err))")
            return nil
        }
        
        // equalize red
    
        err = vImageEqualization_Planar8(&dstR, &dstR, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageEqualization_Planar8 (red) error: \(String(describing: err))")
            return nil
        }
    
        // equalize green
    
        err = vImageEqualization_Planar8(&dstG, &dstG, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageEqualization_Planar8 (green) error: \(String(describing: err))")
            return nil
        }
    
        // equalize blue
    
        err = vImageEqualization_Planar8(&dstB, &dstB, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageEqualization_Planar8 (blue) error: \(String(describing: err))")
            return nil
        }
    
        // planar8 to ARGB8888
    
        err = vImageConvert_Planar8toARGB8888(&dstA, &dstR, &dstG, &dstB, &srcBuffer, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageConvert_Planar8toARGB8888 error: \(String(describing: err))")
            return nil
        }
    
        // contrast stretch
    
        err = vImageContrastStretch_ARGB8888( &srcBuffer, &srcBuffer, vImage_Flags(kvImageNoFlags))
        
        guard err == kvImageNoError else {
            free(srcBuffer.data)
            print("vImageContrastStretch_ARGB8888 error: \(String(describing: err))")
            return nil
        }
    
        // free buffers
    
        free(dstA.data)
        free(dstR.data)
        free(dstG.data)
        free(dstB.data)
        
        // Create a CGImage from the vImage_Buffer. With kvImageNoAllocate the
        // CGImage takes ownership of srcBuffer.data, so we must not free it here.
        
        guard let result = vImageCreateCGImageFromBuffer(&srcBuffer, &format, nil, nil, vImage_Flags(kvImageNoAllocate), &err)?.takeRetainedValue() else {
            
            print("vImageCreateCGImageFromBuffer error: \(String(describing: err))")
    
            return nil
        }
    
        // Create UIImage
        
        let destImage = UIImage(cgImage: result, scale: scale, orientation: imageOrientation)
        
        return destImage
        
    }
    }
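
    Hypothetical usage (assumes an image named "scan" in the app bundle):

    ```swift
    let balanced = UIImage(named: "scan")?.whiteBalance()
    imageView.image = balanced
    ```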
    