I have managed to implement a function that lets the user record their screen by pressing a button. I accomplished that by adding an RPSystemBroadcastPickerView to the respective ViewController:
var picker: RPSystemBroadcastPickerView!
picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
self.view.addSubview(picker) // implicitly unwrapped, so no force-unwrap needed
But my biggest problem is that I don’t know how to access the video currently being recorded through that RPSystemBroadcastPickerView in real time. I need the recorded video as CMSampleBuffers because I have to send it to another user in my app. Does anybody know how to get the video content of the active recording as CMSampleBuffers?
Note: I am using Swift and Xcode 13.1
2 Answers
In Xcode, go to File → New → Target → Broadcast Upload Extension.
Name it however you like. This creates a new target containing a SampleHandler; that sample handler has the default methods you need to override in order to do what you need. The recorded screen content is delivered to it as CMSampleBuffers, so that is where you implement the logic for sending them on.
Zoom has a pretty good example of how this works: https://marketplace.zoom.us/docs/sdk/video/ios/advanced/screen-share/broadcast
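To make the above concrete, here is a minimal sketch of the SampleHandler the Broadcast Upload Extension template generates. The override names are the real RPBroadcastSampleHandler methods; the `forward(_:)` helper is hypothetical and stands in for whatever transport your app uses (App Group, socket, SDK call, …):

```swift
import ReplayKit

class SampleHandler: RPBroadcastSampleHandler {

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Called once when the user starts the broadcast from the picker.
        // Set up your transport (e.g. connect to your backend) here.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // This is the recorded screen content, delivered in real time
            // as CMSampleBuffers — exactly the data the question asks for.
            forward(sampleBuffer)
        case .audioApp, .audioMic:
            break // handle audio if you need it
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        // Tear down your transport here.
    }

    private func forward(_ sampleBuffer: CMSampleBuffer) {
        // Hypothetical: send the buffer to the other user.
    }
}
```

Note that the extension runs in its own process, so "sending to another user" from here usually means either talking to your streaming backend directly from the extension or passing data to the main app via an App Group.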
Streaming a screen recording is a bit complex on iOS, for security reasons among others. But we’ve made a guide showing how to do it with Agora, which I’m assuming you’re using based on the tags:
https://docs.agora.io/en/Video/screensharing_ios?platform=iOS
That is where you can catch the CMSampleBuffer and do with it as you wish. Currently, the example creates an AgoraVideoFrame from the buffer and pushes that to the engine.
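As a rough illustration of that last step, here is a hedged sketch of wrapping the video CMSampleBuffer in an AgoraVideoFrame and pushing it to the engine. Property names follow the Agora iOS SDK 3.x; check them against your SDK version, and consult the linked guide for the authoritative version:

```swift
import ReplayKit
import AgoraRtcKit

// Sketch, assuming an Agora 3.x engine with an external video source enabled.
func push(_ sampleBuffer: CMSampleBuffer, to agoraKit: AgoraRtcEngineKit) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    let frame = AgoraVideoFrame()
    frame.format = 12 // 12 = CVPixelBuffer input, per the Agora SDK
    frame.textureBuf = pixelBuffer
    frame.time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // Requires that the external video source was enabled beforehand, e.g.
    // agoraKit.setExternalVideoSource(true, useTexture: true, pushMode: true)
    agoraKit.pushExternalVideoFrame(frame)
}
```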