I am using a piece of code from this link – Resize UIImage by keeping Aspect ratio and width – and it works perfectly, but I am wondering if it can be altered to preserve the hard edges of the pixels. I want to double the size of the image while keeping those hard pixel edges.
class func resizeImage(image: UIImage, newHeight: CGFloat) -> UIImage {
    let scale = newHeight / image.size.height
    let newWidth = image.size.width * scale
    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight))
    image.drawInRect(CGRectMake(0, 0, newWidth, newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
In Photoshop there is nearest neighbour interpolation when resizing; is there something like that in iOS?
3 Answers
Did a bit more digging and found the answer: https://stackoverflow.com/a/25430447/4196903. But where that answer sets the context's interpolation quality to a high-quality value, write the "none" value instead (kCGInterpolationNone, or CGInterpolationQuality.none in newer Swift); that is the nearest-neighbour setting.
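Applied to the function from the question, the change is a single extra step after the image context is created. A sketch in current Swift syntax (the interpolationQuality property on CGContext is the relevant switch; the rest is just the question's function modernised):

class func resizeImage(image: UIImage, newHeight: CGFloat) -> UIImage? {
    let scale = newHeight / image.size.height
    let newWidth = image.size.width * scale
    UIGraphicsBeginImageContext(CGSize(width: newWidth, height: newHeight))
    if let context = UIGraphicsGetCurrentContext() {
        // Turn off interpolation so enlarged pixels keep hard edges (nearest-neighbour look)
        context.interpolationQuality = .none
    }
    image.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}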
You need to use the CISampler class (which is only available from iOS 9 onwards), and I think you need to create your own custom image-processing filter for it. You can find more information here and here too.
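If you do want a Core Image route without writing a custom filter, a simpler alternative sketch (assuming iOS 10+; the helper name scalePixelArt is hypothetical, not from the answer above) is to ask the CIImage for nearest-neighbour sampling before scaling it:

import UIKit
import CoreImage

// Sketch: nearest-neighbour scaling via Core Image, no custom filter needed.
func scalePixelArt(_ image: UIImage, by factor: CGFloat) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }
    let scaled = input
        .samplingNearest() // keep hard pixel edges instead of smoothing
        .transformed(by: CGAffineTransform(scaleX: factor, y: factor))
    let context = CIContext()
    guard let cgImage = context.createCGImage(scaled, from: scaled.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}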
Inspired by the accepted answer, updated to Swift 5.
Swift 5
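A minimal sketch of a Swift 5 version, assuming the interpolation-quality approach from the first answer but drawn through UIGraphicsImageRenderer (the extension and method name are my own):

import UIKit

extension UIImage {
    // Resize to a new height, preserving aspect ratio and hard pixel edges.
    func resizedWithHardEdges(newHeight: CGFloat) -> UIImage {
        let scale = newHeight / size.height
        let newSize = CGSize(width: size.width * scale, height: newHeight)

        let format = UIGraphicsImageRendererFormat.default()
        format.scale = 1 // render at exactly newSize in pixels

        let renderer = UIGraphicsImageRenderer(size: newSize, format: format)
        return renderer.image { context in
            // Nearest-neighbour style scaling: no smoothing between pixels
            context.cgContext.interpolationQuality = .none
            draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}

Doubling an image would then look like: let doubled = sprite.resizedWithHardEdges(newHeight: sprite.size.height * 2), where sprite is any UIImage.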