How do I add a blur effect to a WebRTC video track? I am building a video call app using WebRTC. I need to blur the background of the person using my app, and also blur all of the video coming from the other side for security reasons. (Random people can call in, so blurring all of their incoming video is a security measure that can be turned off once the user feels comfortable.)
Here is some of the code showing how I start local video capture:
private var videoCapturer: CameraVideoCapturer = getCamera()

private fun getCamera(isFrontFacing: Boolean = true): CameraVideoCapturer {
    return Camera1Enumerator(true).run {
        deviceNames.find {
            if (isFrontFacing)
                isFrontFacing(it)
            else
                isBackFacing(it)
        }?.let {
            createCapturer(it, null)
        } ?: throw IllegalStateException()
    }
}
fun startLocalVideoCapture(
    localVideoOutput: SurfaceViewRenderer,
    localVideoOutputPiP: SurfaceViewRenderer? = null,
    localVideoOutputInAppPip: SurfaceViewRenderer? = null,
    isMicOn: Boolean = true
) {
    localVideoOutput.setMirror(true)
    val localVideoSource = peerConnectionFactory.createVideoSource(false)
    val surfaceTextureHelper =
        SurfaceTextureHelper.create(Thread.currentThread().name, rootEglBase.eglBaseContext)
    (videoCapturer as VideoCapturer).initialize(
        surfaceTextureHelper, localVideoOutput.context, localVideoSource.capturerObserver
    )
    videoCapturer.startCapture(1280, 962, 24)
    localVideoTrack = peerConnectionFactory.createVideoTrack(LOCAL_TRACK_ID, localVideoSource)
    val localAudioTrack = peerConnectionFactory.createAudioTrack(
        LOCAL_AUDIO_TRACK_ID, peerConnectionFactory.createAudioSource(MediaConstraints())
    )
    localAudioTrack.setEnabled(isMicOn)
    localVideoTrack?.addSink(localVideoOutput)
    localStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID)
    localStream.audioTracks.add(localAudioTrack)
    localStream.videoTracks.add(localVideoTrack)
    videoSender = peerConnection?.addTrack(localVideoTrack, arrayListOf(LOCAL_STREAM_ID))
    peerConnection?.addTrack(localAudioTrack, arrayListOf(LOCAL_STREAM_ID))
}
And here is how I receive the video:
override fun onAddStream(p0: MediaStream?) {
    super.onAddStream(p0)
    if (p0?.videoTracks?.isNotEmpty() == true) {
        p0.videoTracks?.get(0)?.addSink(remote_view)
        remoteVideoTrack = p0.videoTracks?.get(0)
        callControlsViewModel.isClientCamOn.postValue(true)
    }
    if (p0?.audioTracks?.isNotEmpty() == true) {
        remoteAudioTrack = p0.audioTracks?.get(0)
        callControlsViewModel.isClientMicOn.postValue(true)
    }
}
2 Answers
For blurring backgrounds you’d need some kind of ML solution, e.g. https://google.github.io/mediapipe/solutions/selfie_segmentation.html. The problem is that this is very CPU/GPU intensive. If you can control the outgoing WebRTC VideoTrack, it is best to blur the background first and then send the modified track.
If you receive "normal" VideoTracks and want to blur them after receiving, that would probably be too much for any device to handle. On top of that, since you mention security reasons, it would be unwise to receive "vanilla" tracks and handle blurring in your application code; the video should already arrive in a "secured" state.
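To illustrate the segmentation idea: once a model such as MediaPipe Selfie Segmentation has produced a per-pixel "person" mask, blurring the background is just compositing the original frame with a blurred copy. The sketch below is pure Kotlin on grayscale pixel arrays, with a hypothetical boxBlur helper standing in for a real (GPU-accelerated) blur; names and sizes are made up for illustration:

```kotlin
// Naive box blur over a w*h grayscale frame stored as a flat FloatArray.
// Stand-in for a real blur (OpenCV / GPU shader) purely for illustration.
fun boxBlur(src: FloatArray, w: Int, h: Int, radius: Int): FloatArray {
    val out = FloatArray(src.size)
    for (y in 0 until h) for (x in 0 until w) {
        var sum = 0f
        var count = 0
        for (dy in -radius..radius) for (dx in -radius..radius) {
            val nx = x + dx
            val ny = y + dy
            if (nx in 0 until w && ny in 0 until h) {
                sum += src[ny * w + nx]
                count++
            }
        }
        out[y * w + x] = sum / count
    }
    return out
}

// Composite: where mask ~ 1.0 (person) keep the original pixel,
// where mask ~ 0.0 (background) use the blurred pixel.
fun blurBackground(frame: FloatArray, mask: FloatArray, w: Int, h: Int): FloatArray {
    val blurred = boxBlur(frame, w, h, radius = 2)
    return FloatArray(frame.size) { i ->
        mask[i] * frame[i] + (1f - mask[i]) * blurred[i]
    }
}
```

The linear blend also handles soft mask edges (values between 0 and 1), which is what segmentation models typically output at the person's silhouette.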
I’ll try to explain as much as possible.

1. To blur your own feed, you need to do some pre-processing on your video frames before sending them. How?

1.1 Instead of using CameraVideoCapturer/VideoCapturer directly, extend the VideoCapturer class and process your frames there before passing them on (e.g. in a sendFrame()/onFrameAvailable()-style callback).

1.2 First create a byte array/bitmap from your camera feed.

1.3 Use OpenCV to manipulate your frames (blurring the background, in your case).

1.4 Convert the frames from bitmap back to a WebRTC VideoFrame and pass it to the function inside your extended VideoCapturer so it gets sent to WebRTC (see step 1.1).
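The steps above can be sketched as a minimal pipeline. This is pure Kotlin with placeholder types (Frame, Sink) standing in for org.webrtc.VideoFrame and the capturer observer, so only the structure of steps 1.1–1.4 is shown; a real implementation would work on actual WebRTC frames and an OpenCV blur:

```kotlin
// Placeholder frame type; a real implementation would use org.webrtc.VideoFrame.
data class Frame(val pixels: IntArray, val width: Int, val height: Int)

// Placeholder for the downstream consumer (e.g. the capturer observer).
fun interface Sink {
    fun onFrame(frame: Frame)
}

// 1.1: a capturer-like wrapper that intercepts frames before WebRTC sees them.
class BlurringCapturer(
    private val downstream: Sink,                        // where processed frames go
    private val blur: (IntArray, Int, Int) -> IntArray   // 1.3: e.g. an OpenCV blur
) {
    // Called for every raw camera frame.
    fun onRawFrame(frame: Frame) {
        val raw = frame.pixels.copyOf()                  // 1.2: snapshot the camera feed
        val processed = blur(raw, frame.width, frame.height)  // 1.3: apply the blur
        // 1.4: wrap the processed pixels back into a frame and forward it to WebRTC.
        downstream.onFrame(Frame(processed, frame.width, frame.height))
    }
}
```

The key design point is that WebRTC only ever receives the already-processed frames, which is exactly why the outgoing track arrives "pre-secured" on the remote side.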
How to blur remote video?