
I want to create an effect that turns a colored, normal image into a scanned-looking image.

Like the two pictures below: the first is the original image and the second is the adjusted one. The effect should look like the second image, and the text showing through from the back of the page should also disappear:

First image:

[original image]

Second image:

[adjusted image]

I want to use Core Image and CIFilter to do this. Besides contrast and brightness, I think I should adjust levels, as in Photoshop. But how can I adjust that? Or is there another method?

I have tried adjusting it in Photoshop, and its Levels function can produce this effect with the values shown in the picture below:

[screenshot of the Photoshop Levels values]

But I don’t know how to do this with CIFilter or other methods. I don’t want to import third-party frameworks or sources, because the iOS 11 Notes app can achieve this effect, so I think the system’s methods and frameworks can do it.

2 Answers


  1. I believe that in order to manually implement a Photoshop-like Levels adjustment, you’ll need some core knowledge from the field of image processing.

    You may want to read the Core Image Programming Guide as a starting point:

    https://developer.apple.com/library/content/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_intro/ci_intro.html

    Or you may simply end up using a third-party framework, though you have already stated that you don’t want to do that. In case you change your mind, I would recommend GPUImage2:

    https://github.com/BradLarson/GPUImage2

    Finally, I found a similar question to this so you may want to take a look:

    How can I map Photoshop's level adjustment to a Core Image filter?

    Good luck.
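    To make the linked question concrete: Photoshop’s Levels is a per-channel linear stretch between a black point and a white point, followed by a midtone gamma. The linear part maps directly onto the built-in `CIColorMatrix` filter (a scale and a bias per channel) and the gamma part onto `CIGammaAdjust`. A minimal sketch, assuming channel values normalized to [0, 1] (the function names `levelsScaleAndBias`, `levels`, and `applyLevels` are illustrative, not Apple API):

    ```swift
    import Foundation

    // Levels: out = ((v - inBlack) / (inWhite - inBlack)) ^ (1 / gamma).
    // The linear stretch is a per-channel scale and bias, which is exactly
    // what CIColorMatrix expresses.
    func levelsScaleAndBias(inBlack: Double, inWhite: Double) -> (scale: Double, bias: Double) {
        let scale = 1.0 / (inWhite - inBlack)
        return (scale, -inBlack * scale)
    }

    // Pure reference implementation for a single channel value in [0, 1].
    func levels(_ v: Double, inBlack: Double, inWhite: Double, gamma: Double) -> Double {
        let t = min(max((v - inBlack) / (inWhite - inBlack), 0.0), 1.0)
        return pow(t, 1.0 / gamma)
    }

    #if canImport(CoreImage)
    import CoreImage

    // The same mapping expressed as a Core Image chain.
    func applyLevels(to image: CIImage, inBlack: Double, inWhite: Double, gamma: Double) -> CIImage {
        let (s, b) = levelsScaleAndBias(inBlack: inBlack, inWhite: inWhite)
        let stretched = image.applyingFilter("CIColorMatrix", parameters: [
            "inputRVector": CIVector(x: CGFloat(s), y: 0, z: 0, w: 0),
            "inputGVector": CIVector(x: 0, y: CGFloat(s), z: 0, w: 0),
            "inputBVector": CIVector(x: 0, y: 0, z: CGFloat(s), w: 0),
            "inputBiasVector": CIVector(x: CGFloat(b), y: CGFloat(b), z: CGFloat(b), w: 0),
        ])
        return stretched.applyingFilter("CIGammaAdjust", parameters: ["inputPower": 1.0 / gamma])
    }
    #endif
    ```

    Raising the black point is what makes the faint text showing through from the back of the page clip to white.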

  2. Try the code below.

    func getScannedImage(inputImage: UIImage) -> UIImage? {
        guard let coreImage = CIImage(image: inputImage),
              let filter = CIFilter(name: "CIColorControls") else { return nil }

        filter.setValue(coreImage, forKey: kCIInputImageKey)
        // The values below are adjustable according to your needs.
        filter.setValue(7, forKey: kCIInputContrastKey)
        filter.setValue(1, forKey: kCIInputSaturationKey)
        filter.setValue(1.2, forKey: kCIInputBrightnessKey)

        guard let outputImage = filter.outputImage,
              let openGLContext = EAGLContext(api: .openGLES2) else { return nil }
        let context = CIContext(eaglContext: openGLContext)

        guard let output = context.createCGImage(outputImage, from: outputImage.extent) else { return nil }
        return UIImage(cgImage: output)
    }
    

    You can call the function above like this:

    filterImage.image = getScannedImage(inputImage: filterImage.image!)
    

    Output (from a real device):

    [output screenshot]


    For a better understanding of Core Image filters, try the links and answers below.

    Get UIImage from function and Convert UIImage to grayscale keeping image quality

    Core Image Filter Reference from the Apple docs: https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html

    Note: For more specific requirements you need to create your own custom filter. The following link may help: https://spin.atomicobject.com/2016/10/20/ios-image-filters-in-swift/
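    One caveat about the snippet above: `EAGLContext`/OpenGL ES-backed contexts are deprecated as of iOS 12, and a plain `CIContext()` (Metal-backed on modern devices) does the same job. A sketch of the same filter chain with a default context and the same parameter values (the `ScanParameters` struct and `scannedImage` name are illustrative):

    ```swift
    import Foundation

    // Illustrative container for the values used in the answer above.
    struct ScanParameters {
        var contrast = 7.0
        var saturation = 1.0
        var brightness = 1.2
    }

    #if canImport(CoreImage) && canImport(UIKit)
    import CoreImage
    import UIKit

    // Same effect as getScannedImage, but with a default (Metal-backed)
    // CIContext instead of the deprecated OpenGL ES-backed one.
    func scannedImage(from input: UIImage,
                      parameters: ScanParameters = ScanParameters()) -> UIImage? {
        guard let ciImage = CIImage(image: input) else { return nil }
        let filtered = ciImage.applyingFilter("CIColorControls", parameters: [
            kCIInputContrastKey: parameters.contrast,
            kCIInputSaturationKey: parameters.saturation,
            kCIInputBrightnessKey: parameters.brightness,
        ])
        let context = CIContext()
        guard let cgImage = context.createCGImage(filtered, from: filtered.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
    #endif
    ```

    Creating a `CIContext` is expensive, so in a real app you would create it once and reuse it rather than building a new one per image.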

