
I've created a Broadcast Upload Extension to capture the screen recording, and my main app uses native WebRTC for audio/video calls.
I'm successfully receiving frames inside the SampleHandler class. My question is: how do I pass the data from this SampleHandler to my main app so that I can use these frames?
I tried a dependency called Wormhole to pass the data, but it isn't passing anything, and in any case it only supports passing strings, so it's useless for me.

2 Answers


  1. Chosen as BEST ANSWER

    I finally got to know that it's not possible. We can share data like strings using UserDefaults by creating an App Group, but that's not what I want to do here. I need to share CMSampleBuffer, so that is out of the question. The extension and the host app run in separate sandbox environments, and sharing sample buffers between the two isn't feasible.

    What we can do is write the complete video-call logic inside the extension itself: all the code responsible for sending these frames to the server for a particular room (the room name and other details can be shared using UserDefaults in a shared App Group; see the sketch below), streaming from the extension itself, which I believe is a long road. Or write common streaming code that is used by both the host app and the extension (again, sharing live data between these two won't be possible; each will create its own instance of the shared code).
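
    For the room-detail handoff, here is a minimal sketch of the App Group approach. The suite name group.com.example.screenshare and the keys are placeholders; substitute your own App Group identifier:

        import Foundation

        // Host app: write the room details into the shared App Group defaults.
        // "group.com.example.screenshare" is a hypothetical App Group ID.
        if let shared = UserDefaults(suiteName: "group.com.example.screenshare") {
            shared.set("room-1234", forKey: "roomName")
        }

        // Broadcast extension: read the details back before starting the stream.
        if let shared = UserDefaults(suiteName: "group.com.example.screenshare"),
           let roomName = shared.string(forKey: "roomName") {
            print("Joining room: \(roomName)")
        }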

    Sometimes enforcing security can be painful for devs!


  2. Use the processSampleBuffer() function

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {

        switch sampleBufferType {

        case RPSampleBufferType.video:                              // handle video sample buffer
            print("----- Got app VIDEO -----")
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            self.session.pushVideo(pixelBuffer)

        case RPSampleBufferType.audioApp:                           // handle audio sample buffer for app audio
            print("----- Got app audio -----")
            guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
            var lengthAtOffset = 0
            var totalLength    = 0
            var dataPointer: UnsafeMutablePointer<CChar>?
            let status = CMBlockBufferGetDataPointer(blockBuffer,
                                                     atOffset          : 0,
                                                     lengthAtOffsetOut : &lengthAtOffset,
                                                     totalLengthOut    : &totalLength,
                                                     dataPointerOut    : &dataPointer)
            // NOTE: reading totalLength bytes from the pointer assumes the
            // block buffer is contiguous (lengthAtOffset == totalLength).
            guard status == noErr, let pointer = dataPointer else {
                print("----- unable to get app audio data pointer -----")
                return
            }
            let audioData = Data(bytes: pointer, count: totalLength)
            self.session.pushAudio(audioData)

        case RPSampleBufferType.audioMic:                           // handle audio sample buffer for MIC audio
            print("----- Got MIC audio -----")
            guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
            let blockBufferDataLength = CMBlockBufferGetDataLength(blockBuffer)
            var blockBufferData       = [UInt8](repeating: 0, count: blockBufferDataLength)
            let status = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: blockBufferDataLength, destination: &blockBufferData)
            guard status == noErr else {
                print("----- unable to copy MIC audio data bytes -----")
                return
            }
            self.session.pushAudio(Data(bytes: blockBufferData, count: blockBufferDataLength))

        @unknown default:                                           // handle other sample buffer types
            fatalError("----- Unknown type of sample buffer -----")
        }
    }
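
    A couple of notes on this sketch: session, pushVideo, and pushAudio come from whatever streaming SDK is being used here; they are not ReplayKit APIs, so substitute your own sender. Also note the two audio paths: the app-audio case reads the block buffer's bytes in place via CMBlockBufferGetDataPointer (valid only while the buffer is contiguous), while the mic case copies them out with CMBlockBufferCopyDataBytes, which is the safer general-purpose route.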
    

    Good luck!
