
Whether I use Apple’s texture importer or my own, a white, soft-edged circle drawn in software (on a transparent background) or in Photoshop (saved as a PNG) ends up with its semi-transparent colors replaced by black once it is brought into Metal and rendered.

Below is a screen grab from Xcode’s Metal debugger; it shows the texture before it is sent to the shaders.

Image located here (I’m not high-ranked enough to embed it).

In Xcode, in Finder, and when placed into a UIImageView, the source texture does not show the dark ring. But somewhere along the UIImage -> CGContext -> MTLTexture process (I suspect the MTLTexture step specifically), the transparent sections are darkened.
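
Here is a quick check along those lines (a sketch in current Swift syntax, not what my app actually runs) to see whether the bitmap context is already handing back premultiplied, i.e. darkened, RGB values before Metal is even involved:

import UIKit
import CoreGraphics

// Draw the named image into a premultiplied RGBA8 bitmap context and print one pixel.
// For a white pixel at 50% alpha, premultiplied storage prints roughly (128, 128, 128, 128).
func dumpPixel(named name: String, x: Int, y: Int) {
    guard let cgImage = UIImage(named: name)?.cgImage else { return }
    let width = cgImage.width
    let height = cgImage.height
    let context = CGContext(data: nil,
                            width: width, height: height,
                            bitsPerComponent: 8, bytesPerRow: width * 4,
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    let bytes = context.data!.assumingMemoryBound(to: UInt8.self)
    let i = (y * width + x) * 4
    print("RGBA at (\(x), \(y)):", bytes[i], bytes[i + 1], bytes[i + 2], bytes[i + 3])
}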

I’ve been banging my head against the wall for the past couple of days, changing everything I could think of, but I can’t figure it out.

To be transparent (ha), here is my own import code:

import UIKit
import CoreGraphics
import Metal

class MetalTexture {

    class func imageToTexture(imageNamed: String, device: MTLDevice) -> MTLTexture {
        let bytesPerPixel = 4
        let bitsPerComponent = 8

        let image = UIImage(named: imageNamed)!

        let width = Int(image.size.width)
        let height = Int(image.size.height)
        let bounds = CGRectMake(0, 0, CGFloat(width), CGFloat(height))

        let rowBytes = width * bytesPerPixel
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Draw the UIImage into a premultiplied-alpha RGBA8 bitmap context.
        // The translate/scale flips the drawing in both axes before the image is drawn.
        let context = CGBitmapContextCreate(nil, width, height, bitsPerComponent, rowBytes, colorSpace, CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue))

        CGContextClearRect(context, bounds)
        CGContextTranslateCTM(context, CGFloat(width), CGFloat(height))
        CGContextScaleCTM(context, -1.0, -1.0)
        CGContextDrawImage(context, bounds, image.CGImage)

        // Create an empty RGBA8Unorm texture of matching size.
        let texDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(.RGBA8Unorm, width: width, height: height, mipmapped: false)

        let texture = device.newTextureWithDescriptor(texDescriptor)
        texture.label = imageNamed

        // Copy the raw bitmap bytes from the context into the texture.
        let pixelsData = CGBitmapContextGetData(context)

        let region = MTLRegionMake2D(0, 0, width, height)
        texture.replaceRegion(region, mipmapLevel: 0, withBytes: pixelsData, bytesPerRow: rowBytes)

        return texture
    }
}

But I don’t think that code is the problem, since it’s a Swift port of Apple’s importer, and using Apple’s version instead makes no difference.
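
For completeness, if MetalKit is available, its MTKTextureLoader can replace this hand-rolled path. A minimal sketch follows (current Swift syntax, not code from this post; the asset name and the options are illustrative):

import Metal
import MetalKit

// Load a texture from the asset catalog with MetalKit's loader.
func loadTexture(named name: String, device: MTLDevice) throws -> MTLTexture {
    let loader = MTKTextureLoader(device: device)
    return try loader.newTexture(name: name,
                                 scaleFactor: 1.0,
                                 bundle: nil,
                                 options: [.SRGB: false])
}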

Any leads at all would be super helpful.

2 Answers


  1. Chosen as BEST ANSWER

    Thanks to Jessy, I decided to look at how I was blending my alpha, and I’ve figured it out. The texture still looks darkened in the GPU debugger, but in the actual app everything renders correctly. The changes I made to my pipeline state descriptor are below.

    pipelineStateDescriptor.colorAttachments[0].blendingEnabled = true
    pipelineStateDescriptor.colorAttachments[0].rgbBlendOperation = .Add
    pipelineStateDescriptor.colorAttachments[0].alphaBlendOperation = .Add
    pipelineStateDescriptor.colorAttachments[0].sourceRGBBlendFactor = .DestinationAlpha
    pipelineStateDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .DestinationAlpha
    pipelineStateDescriptor.colorAttachments[0].destinationRGBBlendFactor = .OneMinusSourceAlpha
    pipelineStateDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .OneMinusBlendAlpha
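
    For context, this is roughly where that descriptor fits (a sketch in current Swift syntax; the shader function names are placeholders, not from this post):

    import Metal

    func makePipeline(device: MTLDevice, library: MTLLibrary) throws -> MTLRenderPipelineState {
        let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
        pipelineStateDescriptor.vertexFunction = library.makeFunction(name: "vertex_main")     // placeholder
        pipelineStateDescriptor.fragmentFunction = library.makeFunction(name: "fragment_main") // placeholder
        pipelineStateDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

        // ...set the blend properties shown above on colorAttachments[0]...

        // The blend factors only take effect once the pipeline state is built here
        // and bound with setRenderPipelineState(_:) on the render command encoder.
        return try device.makeRenderPipelineState(descriptor: pipelineStateDescriptor)
    }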
    

  2. Since you are using a CGContext configured for colors premultiplied by the alpha channel, the standard blend RGB = src.rgb * src.a + dst.rgb * (1 - src.a) will produce darkened areas, because the src.rgb values have already been multiplied by src.a. What you want instead is RGB = src.rgb + dst.rgb * (1 - src.a), which is configured like this:

    pipeline.colorAttachments[0].isBlendingEnabled = true
    pipeline.colorAttachments[0].rgbBlendOperation = .add
    pipeline.colorAttachments[0].alphaBlendOperation = .add
    pipeline.colorAttachments[0].sourceRGBBlendFactor = .one
    pipeline.colorAttachments[0].sourceAlphaBlendFactor = .sourceAlpha
    pipeline.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
    pipeline.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
    

    The .one source factor means the RGB values are used as-is. The point of premultiplied colors is that you multiply by alpha once, when the image is created, rather than every time you blend. Your configuration achieves this too, but indirectly.
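
    A quick numerical check of that claim (not from the original answer): blend a white pixel at 50% alpha over a black background. With premultiplied source data, the "standard" source factor scales by alpha a second time and darkens the result, while the .one source factor gives the expected 0.5 grey.

    // Straight (non-premultiplied) white at 50% alpha, and its premultiplied form.
    let straightR = 1.0, alpha = 0.5
    let premulR = straightR * alpha        // 0.5, as stored by the CGContext
    let dstR = 0.0                         // black destination

    let doubleMultiplied = premulR * alpha + dstR * (1 - alpha)   // 0.25 -- too dark
    let premultipliedAware = premulR * 1.0 + dstR * (1 - alpha)   // 0.5  -- expected
    print(doubleMultiplied, premultipliedAware)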
