I need to connect the MediaStream of the provided track to a PannerNode so I can have 3D spatial audio, but something doesn't seem to work.

This is the function:

async function handleTrackSubscribed(  //SIMPLE
    track: RemoteTrack,
    publication: RemoteTrackPublication,
    participant: RemoteParticipant
) {
    if (track.kind === 'audio') {
        //const audioElement = document.createElement('audio');

        if (track.mediaStream) {

            //audioElement.autoplay = true;
            //audioElement.srcObject = track.mediaStream;

            const panner = new PannerNode(audioContext, {
                panningModel: "HRTF",
                distanceModel: "linear",
                positionX: window.innerWidth / 2,
                positionY: window.innerHeight / 2,
                positionZ: 300,
                orientationX: 0.0,
                orientationY: 0.0,
                orientationZ: -1.0,
                refDistance: 1,
                maxDistance: 20_000,
                rolloffFactor: 10,
                coneInnerAngle: 40,
                coneOuterAngle: 50,
                coneOuterGain: 0.4,
            });

            panner.connect(audioContext.destination);

            const audioSource = audioContext.createMediaStreamSource(track.mediaStream);
            audioSource.connect(panner);

            
            // const audioBuffer = await loadAudioBuffer('/sound/song.mp3');
            // const sourceNode = audioContext.createBufferSource();
            // sourceNode.buffer = audioBuffer;

            // sourceNode.connect(panner);

            // sourceNode.start();


            const audioForm = document.getElementById('audioForm') as HTMLFormElement;
            const moveButton = document.getElementById('moveButton') as HTMLButtonElement;

            moveButton.addEventListener('click', () => {
                const xValue = parseFloat((document.getElementById('xValue') as HTMLInputElement).value);
                const yValue = parseFloat((document.getElementById('yValue') as HTMLInputElement).value);
                const zValue = parseFloat((document.getElementById('zValue') as HTMLInputElement).value);
        
                panner.positionX.value = xValue;
                panner.positionY.value = yValue;
                panner.positionZ.value = zValue;
            });

        }
    }
}

As you can see, some parts are commented out. The first three commented lines create an audio element directly in the HTML; if I do that, the audio plays perfectly fine, but I can't have 3D. Still, it works, so the stream is definitely there.
The second commented part is literally a song I imported to test the panner, and the panner works perfectly fine with it. So the conclusion is that the mistake is in connecting the stream to the panner, but I don't know how to do it.
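
(For reference, loadAudioBuffer in that commented-out test is not shown above; a minimal sketch, assuming it is just a fetch plus decodeAudioData wrapper around the same audioContext:)

// Hypothetical sketch of the loadAudioBuffer helper referenced in the
// commented-out test; assumes a plain fetch + decodeAudioData implementation.
async function loadAudioBuffer(url: string): Promise<AudioBuffer> {
    const response = await fetch(url);
    const arrayBuffer = await response.arrayBuffer();
    // Decode the compressed file into raw PCM that a buffer source can play
    return audioContext.decodeAudioData(arrayBuffer);
}
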
Thanks in advance.

2 Answers


  1. Chosen as BEST ANSWER

    I made it work like this:

    // Wrap the track's stream in a fresh MediaStream and build the source
    // node explicitly via MediaStreamAudioSourceNode
    const mediaStream = new MediaStream(track.mediaStream);
    const options = {
        mediaStream: mediaStream,
    };
    const sourceNode = new MediaStreamAudioSourceNode(audioContext, options);

    // Route the source through the panner and on to the speakers
    sourceNode.connect(panner).connect(audioContext.destination);

    //audioContext.createMediaStreamSource(mediaStream);

    // Register the panner with the LiveKit track as a Web Audio plugin
    if (track instanceof RemoteAudioTrack) {
        track.setWebAudioPlugins([panner]);
        console.log("RemoteTrack");
    }
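
    For context, here is a rough sketch of how this fits back into handleTrackSubscribed from the question (same audioContext and PannerNode options as above; an illustration rather than a drop-in replacement):

    if (track.kind === 'audio' && track.mediaStream) {
        // Same PannerNode options as in the question, elided here
        const panner = new PannerNode(audioContext, { panningModel: "HRTF", /* ... */ });
        panner.connect(audioContext.destination);

        // Feed the remote stream into the Web Audio graph
        const sourceNode = new MediaStreamAudioSourceNode(audioContext, {
            mediaStream: new MediaStream(track.mediaStream),
        });
        sourceNode.connect(panner);

        // Let LiveKit route this remote track through the panner as well
        if (track instanceof RemoteAudioTrack) {
            track.setWebAudioPlugins([panner]);
        }
    }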


  2. I’m assuming you’re testing this in Chrome. In this case the MediaStream coming from WebRTC needs to be connected to a media element even if you don’t intend to play it.

    Uncommenting audioElement.srcObject = track.mediaStream without setting autoplay to true or calling play() should work.
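
    A minimal sketch of that workaround, assuming it runs inside the existing if (track.mediaStream) check and reuses the audioContext and panner from the question:

    // Attach the WebRTC stream to an audio element that is never played;
    // in Chrome this alone makes the stream start flowing into Web Audio.
    const audioElement = document.createElement('audio');
    audioElement.srcObject = track.mediaStream; // no autoplay, no play() needed

    // The audible path still goes through the panner as before.
    const audioSource = audioContext.createMediaStreamSource(track.mediaStream);
    audioSource.connect(panner);
    panner.connect(audioContext.destination);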
