
How do I add a blur effect to a WebRTC video track? I am building a video call app using WebRTC. I need to blur the background of the person using my app, and also blur all of the video coming from the other side for security reasons. (Random people can call in, so blurring all of their video is a security measure that can be turned off once you feel comfortable.)

Here is some of the code showing how I start local video capture:

private var videoCapturer: CameraVideoCapturer = getCamera()

private fun getCamera(isFrontFacing: Boolean = true): CameraVideoCapturer {
    return Camera1Enumerator(true).run {
        deviceNames.find {
            if (isFrontFacing)
                isFrontFacing(it)
            else
                isBackFacing(it)
        }?.let {
            createCapturer(it, null)
        } ?: throw IllegalStateException()
    }
}


fun startLocalVideoCapture(
    localVideoOutput: SurfaceViewRenderer,
    localVideoOutputPiP: SurfaceViewRenderer? = null,
    localVideoOutputInAppPip: SurfaceViewRenderer? = null,
    isMicOn: Boolean = true
) {
    localVideoOutput.setMirror(true)
    val localVideoSource = peerConnectionFactory.createVideoSource(false)
    val surfaceTextureHelper = SurfaceTextureHelper.create(Thread.currentThread().name, rootEglBase.eglBaseContext)
    (videoCapturer as VideoCapturer).initialize(surfaceTextureHelper, localVideoOutput.context, localVideoSource.capturerObserver)
    videoCapturer.startCapture(1280, 962, 24)
    localVideoTrack = peerConnectionFactory.createVideoTrack(LOCAL_TRACK_ID, localVideoSource)
    val localAudioTrack = peerConnectionFactory.createAudioTrack(
        LOCAL_AUDIO_TRACK_ID, peerConnectionFactory.createAudioSource(MediaConstraints())
    )
    localAudioTrack.setEnabled(isMicOn)
    localVideoTrack?.addSink(localVideoOutput)

    localStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID)
    localStream.audioTracks.add(localAudioTrack)
    localStream.videoTracks.add(localVideoTrack)
    videoSender = peerConnection?.addTrack(localVideoTrack, arrayListOf(LOCAL_STREAM_ID))
    peerConnection?.addTrack(localAudioTrack, arrayListOf(LOCAL_STREAM_ID))
}

And how I receive the video:

override fun onAddStream(p0: MediaStream?) {
    super.onAddStream(p0)
    if (p0?.videoTracks?.isNotEmpty() == true) {
        p0.videoTracks?.get(0)?.addSink(remote_view)
        remoteVideoTrack = p0.videoTracks?.get(0)
        callControlsViewModel.isClientCamOn.postValue(true)
    }
    if (p0?.audioTracks?.isNotEmpty() == true) {
        remoteAudioTrack = p0.audioTracks?.get(0)
        callControlsViewModel.isClientMicOn.postValue(true)
    }
}

2 Answers


  1. For blurring backgrounds you’d need to use some kind of AI segmentation solution, e.g. https://google.github.io/mediapipe/solutions/selfie_segmentation.html. The problem is that this is very CPU/GPU intensive. If you can control the outgoing WebRTC VideoTrack, it is best to blur the background first and send the modified track.

    If you receive "normal" VideoTracks and want to blur them after receiving, it would probably be too much for any device to handle. On top of that, since you mention security reasons, it would be unwise to receive "vanilla" tracks and handle blurring in your application code; the video should already arrive in a "secured" state.
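    If you do control the outgoing track, one way to blur before sending (without replacing the capturer) is the `VideoProcessor` hook that recent versions of the Android WebRTC SDK expose on `VideoSource`. The sketch below assumes such a version; `blurFrame` is a hypothetical placeholder for your actual segmentation/blur step, not a library function:

    ```kotlin
    import org.webrtc.VideoFrame
    import org.webrtc.VideoProcessor
    import org.webrtc.VideoSink
    import org.webrtc.VideoSource

    // Sits between the capturer and the encoder, so the blurred frames are
    // what actually go out on the wire (and what local sinks render).
    class BlurringVideoProcessor(
        private val blurFrame: (VideoFrame) -> VideoFrame  // hypothetical blur step
    ) : VideoProcessor {
        private var sink: VideoSink? = null

        override fun setSink(sink: VideoSink?) {
            this.sink = sink
        }

        override fun onCapturerStarted(success: Boolean) {}
        override fun onCapturerStopped() {}

        override fun onFrameCaptured(frame: VideoFrame) {
            // Process each captured frame before forwarding it downstream.
            sink?.onFrame(blurFrame(frame))
        }
    }

    // In startLocalVideoCapture(), after creating the source:
    // localVideoSource.setVideoProcessor(BlurringVideoProcessor { frame -> /* blur */ frame })
    ```

    This keeps the existing `CameraVideoCapturer` untouched, which is usually simpler than subclassing a capturer.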

  2. I’ll try to explain as much as possible.

    1. To blur your own feed, you need to do some pre-processing on your video frames before sending them. How?

      1.1 Instead of using CameraVideoCapturer/VideoCapturer directly, extend the VideoCapturer class and process your frames there before calling sendFrame()/onFrameAvailable().

      1.2 First, create a byte array/bitmap from your camera feed.

      1.3 Use OpenCV to manipulate your frames (blurring the background, in your case).

      1.4 Convert the frames from bitmap back to a WebRTC VideoFrame and pass it to the function inside your extended VideoCapturer to be sent to WebRTC (explained in step 1.1).

    2. How to blur remote video?

      • You can’t. Basically, native WebRTC on Android doesn’t give access to raw frame data for any remote participant. This applies to the audio stream as well. You’ll only get an ID, and that’s it. Also, it’s not recommended: it should be up to the remote participants whether they want to blur their background or not.
      • Still, if you want to achieve this, you’ll have to fork the WebRTC source code and add your own callback that passes you the remote user’s video samples. I haven’t tried it, but it is definitely possible.
      • Keep in mind, your processing overhead will increase if you also process remote users’ video streams, causing lag, heating, and battery drain. OpenCV is fast, but processing frames at even 15 FPS can be process-intensive, and you’re already doing it on your local video stream.
      • Not recommended, as explained by @Martin: you shouldn’t have access to anyone’s video feed. If you have access, you can record, save, etc. without anyone knowing, and that poses a great security issue.
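    To make step 1.3 above concrete, here is a library-free illustration of the blur operation itself: a naive 3x3 box blur over a single grayscale (Y) plane, written in plain Kotlin. This is only a stand-in for what OpenCV (e.g. a Gaussian blur) would do far faster, and in a real app you would apply it only to the pixels a segmentation mask marks as background:

    ```kotlin
    // Naive 3x3 box blur over a grayscale plane stored row-major in a ByteArray.
    // Each output pixel is the average of its in-bounds neighbours (including itself).
    fun boxBlurPlane(src: ByteArray, width: Int, height: Int): ByteArray {
        val dst = ByteArray(src.size)
        for (y in 0 until height) {
            for (x in 0 until width) {
                var sum = 0
                var count = 0
                for (dy in -1..1) {
                    for (dx in -1..1) {
                        val ny = y + dy
                        val nx = x + dx
                        if (ny in 0 until height && nx in 0 until width) {
                            sum += src[ny * width + nx].toInt() and 0xFF
                            count++
                        }
                    }
                }
                dst[y * width + x] = (sum / count).toByte()
            }
        }
        return dst
    }
    ```

    A uniform plane passes through unchanged, while a single bright pixel gets smeared across its neighbourhood, which is the visual effect you want for a background region.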