@jimmywarting
Created January 1, 2019 11:05
How to easily make a WebRTC connection without dependencies
const pc1 = new RTCPeerConnection();
const pc2 = new RTCPeerConnection();

// Trickle each peer's ICE candidates straight into the other peer
// (both peers live on the same page, so no signaling server is needed).
const addCandidate = (pc, can) => can && pc.addIceCandidate(can).catch(console.error);
pc1.onicecandidate = e => addCandidate(pc2, e.candidate);
pc2.onicecandidate = e => addCandidate(pc1, e.candidate);
pc1.oniceconnectionstatechange = () => console.log("pc1 iceConnState:", pc1.iceConnectionState);
pc2.oniceconnectionstatechange = () => console.log("pc2 iceConnState:", pc2.iceConnectionState);

// Negotiated channels with a matching id are created symmetrically on both
// peers, so no in-band data channel negotiation is required.
const pc1dch = pc1.createDataChannel('dch', { negotiated: true, id: 1 });
const pc2dch = pc2.createDataChannel('dch', { negotiated: true, id: 1 });
pc1dch.binaryType = 'arraybuffer';
pc2dch.binaryType = 'arraybuffer';
pc1dch.onopen = () => console.log("pc1dch open");
pc2dch.onopen = () => console.log("pc2dch open");
pc1dch.onclose = () => console.log("pc1dch close");
pc2dch.onclose = () => console.log("pc2dch close");
pc1dch.onmessage = e => console.log("pc1dch message:", e);
pc2dch.onmessage = e => console.log("pc2dch message:", e);

function start() {
  pc1.createOffer()
    .then(d => pc1.setLocalDescription(d))
    .then(() => pc2.setRemoteDescription(pc1.localDescription))
    .then(() => pc2.createAnswer())
    .then(d => pc2.setLocalDescription(d))
    .then(() => pc1.setRemoteDescription(pc2.localDescription))
    .catch(console.error);
}
start();
@guest271314

@jimmywarting Re https://stackoverflow.com/q/56510151 there are several issues in the code, among them:

1) addTransceiver() is called after createOffer();
2) since the <canvas> is only drawn every 1 second, the MediaStreamTrack cycles between the muted attribute being true and false every 1 second (while muted, "black frames or silence" are recorded, see https://w3c.github.io/mediacapture-record/MediaRecorder.html);
3) MediaSource has issues playing ArrayBuffers from MediaRecorder that are not complete files. When using "segments" mode for MediaSource (the default mode), timestampOffset needs to be set by the application. MediaSource implementations in Chromium and Firefox vary widely;
4) in general, addTrack() should be used instead of addStream().
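Points 1) and 4) could be addressed with something like the following sketch (addLocalTracks is a made-up helper name): add every track with addTrack() before calling createOffer(), so the tracks are described in the initial offer.

```javascript
// Hypothetical helper (name invented): add every local track via addTrack(),
// not the deprecated addStream(), and do it before createOffer(), so the
// tracks are described in the initial SDP instead of forcing renegotiation.
function addLocalTracks(pc, stream) {
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }
}
```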

Perhaps the code at the branches of https://github.com/guest271314/MediaFragmentRecorder might be helpful, as it is relevant to using MediaSource, WebRTC and MediaRecorder.

BTW, is there any reason why the previous questions and answers https://stackoverflow.com/q/48257041; https://stackoverflow.com/a/48257086 and https://stackoverflow.com/q/47119426; https://stackoverflow.com/a/47172409 are not listed at https://stackoverflow.com/a/52079109 (which essentially makes the answer a duplicate of the previous answers)?

@guest271314

@jimmywarting Composing the same code using MediaRecorder and WebRTC that outputs the same results in Chromium/Chrome and Firefox is not straightforward. There might also be a Chromium/Chrome MediaRecorder bug involved. When not using MediaSource, but rather setting #videoD srcObject to stream within stream2mediaSorce, the resulting webm file has a frame rate that does not reflect the rate at which the images on the <canvas> change https://jsfiddle.net/bpr2m0h1/.

@jimmywarting
Author

addTransceiver() is called after createOffer()

So what? You will get an onnegotiationneeded event, at which point you will have to renegotiate (sending an offer and answer once again); that is what the localPeerConnectionLoop helps out with.
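For two peers on the same page, as in the gist above, that renegotiation round can be sketched roughly as follows (wireRenegotiation is a made-up name; a real application would exchange the descriptions over its signaling channel instead of calling the other peer directly):

```javascript
// Sketch: when a track or transceiver is added after the initial offer,
// onnegotiationneeded fires and a fresh offer/answer round is run.
function wireRenegotiation(localPc, remotePc) {
  localPc.onnegotiationneeded = async () => {
    try {
      await localPc.setLocalDescription(await localPc.createOffer());
      await remotePc.setRemoteDescription(localPc.localDescription);
      await remotePc.setLocalDescription(await remotePc.createAnswer());
      await localPc.setRemoteDescription(remotePc.localDescription);
    } catch (err) {
      console.error(err);
    }
  };
}
```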

BTW, is there any reason why the previous questions and answers ... are not listed at ... which essentially make the answer a duplicate

They are very much the same, and should maybe be merged. There are so many posts that it is hard to keep track of what exists and what does not, and finding them is not always the easiest either...


Hmm, frameRate you say 🤔 I will experiment with the frame rate a bit; maybe I can get it to work. The clock is just a dummy stream because I don't have a webcam on my computer, and I won't have a custom frame rate then; I just picked something.
I don't fully understand what a black screen has to do with muted...

Btw, I don't really care about Firefox, as this is going to be an internal application that I will only use myself, so just getting it to work in Chrome would be fine. And if I can't get it to work in Chrome, then maybe I can use some other browser.

@guest271314

@jimmywarting

They are very much the same, and should maybe be merged. There are so many posts that it is hard to keep track of what exists and what does not, and finding them is not always the easiest either...

Only two relevant duplicate targets are linked in the previous comment (https://stackoverflow.com/q/47515232 is a duplicate of the second linked question). See https://gist.github.com/guest271314/7eac2c21911f5e40f48933ac78e518bd; whatwg/html#3269 for the source of the code.

Don't fully understand what black screen has to do with muted...

It is not straightforward. See and follow links from w3c/mediacapture-main#583; https://bugzilla.mozilla.org/show_bug.cgi?id=1557394#c9.

Btw, I don't really care about Firefox, as this is going to be an internal application that I will only use myself, so just getting it to work in Chrome would be fine. And if I can't get it to work in Chrome, then maybe I can use some other browser.

Firefox has its own issues with the same code: https://bugzilla.mozilla.org/show_bug.cgi?id=1212237; https://bugzilla.mozilla.org/show_bug.cgi?id=1542616.

Also, MediaSource in Chromium has its issues as well, particularly when using "segments" mode. Chromium/Chrome still crashes the tab when captureStream() is called on a <video> element with a MediaSource set as src; see w3c/media-source#190 and the master branch of the above-linked MediaFragmentRecorder repository.

For one issue with the current code see https://plnkr.co/edit/mVDY4T?p=preview.

@guest271314

@jimmywarting Why is MediaSource being used to play the remote video when the <video> srcObject can be set to the remote MediaStream? When using "segments" mode with multiple buffers, timestampOffset will more than likely need to be set on the SourceBuffer. Even then, the waiting event of the <video> element might need to be used to append buffers to the SourceBuffer. First of all, at Chromium, MediaRecorder needs to produce a webm file having the correct frame rate; currently MediaRecorder is recording the 10 seconds of images drawn onto the input canvas in less than 1 second.
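If MediaSource is kept, the timestampOffset bookkeeping for "segments" mode might be sketched like this (createAppendQueue is a made-up helper; the fixed chunkDuration is an assumption for illustration, since real MediaRecorder chunk durations vary and would need to be measured):

```javascript
// Sketch: appendBuffer() throws while sourceBuffer.updating is true, so
// chunks are queued and flushed on "updateend"; in "segments" mode the
// application advances timestampOffset itself so each appended chunk
// lands after the previous one on the media timeline.
function createAppendQueue(sourceBuffer, chunkDuration) {
  const queue = [];
  let offset = 0;
  const flush = () => {
    if (sourceBuffer.updating || queue.length === 0) return;
    sourceBuffer.timestampOffset = offset;
    offset += chunkDuration;
    sourceBuffer.appendBuffer(queue.shift());
  };
  sourceBuffer.addEventListener("updateend", flush);
  return { push(chunk) { queue.push(chunk); flush(); } };
}
```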

@jimmywarting
Author

jimmywarting commented Jun 16, 2019

@guest271314

Why is MediaSource being used to play the remote video when the <video> srcObject can be set to the remote MediaStream?

I do it b/c I want to re-play the live stream with an added delay (a few seconds).
But if I could use and configure playout-delay, then I could just do what you said: set the video source to the remote stream directly, and not have to use MediaRecorder at all (which would be the best option).

If you get a working example I would be so happy.
Still struggling to get it to work.

@guest271314

I do it b/c I want to re-play the live stream with an added delay (a few seconds).

What do you mean by "delay"?

But if I could use and configure playout-delay, then I could just do what you said: set the video source to the remote stream directly, and not have to use MediaRecorder at all (which would be the best option).

If you get a working example I would be so happy.
Still struggling to get it to work.

That is possible now, as long as you are not expecting to record the rendering using MediaRecorder and get a webm file output that reflects the rendered playback.

@guest271314

The <video> element with the <span> having textContent "remoteVideoStream" above it at the linked plnkr renders playback at the same rate as the local MediaStream, though it has srcObject set to the remote MediaStream.

@jimmywarting
Author

jimmywarting commented Jun 16, 2019

That is possible now, as long as you are not expecting to record the rendering using MediaRecorder and get a webm file output that reflects the rendered playback.

Now you intrigue me!
How can I configure the playout-delay to be something like 10 seconds? I wouldn't have to record anything if I could do that.

@guest271314

@jimmywarting See line #223, remoteVideoStream.srcObject = mediaStream;, at https://plnkr.co/edit/mVDY4T?p=preview. For the code to output the same result at both Chromium/Chrome and Firefox (to avoid the DOMException: "The operation is insecure." error), comment out lines #224 through #257. That is, if MediaRecorder usage for the remote MediaStream is not an essential requirement.

@guest271314

@jimmywarting Again, what do you mean by "playout-delay"?

@jimmywarting
Author

jimmywarting commented Jun 16, 2019

But if I could use and configure playout-delay, then I could just do what you said: set the video source to the remote stream directly, and not have to use MediaRecorder at all (which would be the best option)

If you get a working example I would be so happy.
Still struggling to get it to work.

That is possible now, as long as you are not expecting to record the rendering using MediaRecorder and get a webm file output that reflects the rendered playback.

Hmm, maybe a miscommunication there...

This is what I was talking about: https://webrtc.org/experiments/rtp-hdrext/playout-delay/
I understand what playout-delay means, but not how to configure it, or whether it is possible at all.

@guest271314

@jimmywarting Not sure if that extension is implemented. I was referring to an approach similar to https://run.plnkr.co/plunks/jnHfKW/ where ImageCapture is used to save images and then, after the required delay, draw the images onto a <canvas> (which can be captured as a MediaStream and set as the srcObject of a <video>).
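A rough sketch of that capture-then-replay loop, with the frame grabber and renderer injected so the same timing logic works with ImageCapture.grabFrame() or createImageBitmap(canvas) alike (startDelayedReplay and its parameters are invented for illustration):

```javascript
// Sketch (names invented): capture one frame per tick with `grab`, hold each
// frame for `delayMs`, then pass it to `render`. With ImageCapture, grab
// could be () => imageCapture.grabFrame() and render could draw the bitmap
// onto a capture <canvas>; here both are parameters so the timing logic
// stands on its own.
function startDelayedReplay({ grab, render, delayMs, intervalMs }) {
  const timer = setInterval(async () => {
    const frame = await grab();
    setTimeout(() => render(frame), delayMs);
  }, intervalMs);
  return () => clearInterval(timer); // call the returned function to stop
}
```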

@jimmywarting
Author

jimmywarting commented Jun 16, 2019

I can see the extension implemented in the SDP (Chrome):

a=mid:0
a=extmap:14 urn:ietf:params:rtp-hdrext:toffset
a=extmap:13 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:12 urn:3gpp:video-orientation
a=extmap:2 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01

HERE
a=extmap:11 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay

a=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type
a=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-timing
a=extmap:8 http://tools.ietf.org/html/draft-ietf-avtext-framemarking-07
a=extmap:9 http://www.webrtc.org/experiments/rtp-hdrext/color-space
a=extmap:3 urn:ietf:params:rtp-hdrext:sdes:mid
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id
a=extmap:5 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id
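The dump above only shows that the extension is advertised. Whether a given offer contains it can be checked with a small string scan (hasPlayoutDelayExtension is a made-up helper name; this does not set any delay value, it only detects the a=extmap line):

```javascript
// Returns true if any a=extmap line in the SDP references the
// playout-delay RTP header extension, as in the dump above.
function hasPlayoutDelayExtension(sdp) {
  return sdp
    .split(/\r?\n/)
    .some(line => line.startsWith("a=extmap:") &&
                  line.includes("webrtc.org/experiments/rtp-hdrext/playout-delay"));
}
```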

@guest271314

@jimmywarting Does the playout-delay that you are referring to mean: (a) 1) when the remote MediaStream is received, 2) delay 10 seconds, 3) play the MediaStream; or (b) 1) wait 10 seconds between each "packet" of media streamed?

@guest271314

@jimmywarting Have not tried adjusting SDP line by line. Perhaps @fippo could help with the specific topic of setting the value of that extension and what the expected and actual results are (see w3c/webrtc-pc#2193 (comment))?

@guest271314

@jimmywarting Can you clarify what the expected output is?

Are you expecting

  • a 10 second delay before the remote portion of the code receives the first media packet?

  • a 10 second delay between each media packet the remote portion of the code receives?

@jimmywarting
Author

jimmywarting commented Jun 16, 2019

I believe it's option B.

a 10 second delay between each media packet the remote portion of the code receives?

I want to use one camera to transfer the live stream to another device, and throughout the whole session it should play the video with an added delay on the playback, so you can see something you did afterwards.

It shouldn't just record 10 seconds of video and then stop and play the final blob afterwards... it needs to be continuous.
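That continuous requirement can be modeled as a fixed-size frame queue (a sketch: DelayBuffer is an invented name, and the 60 fps figure is an assumed capture rate): frames go in as they are captured and come out once fps * delaySeconds frames have accumulated, so the output trails the input by the delay without ever stopping.

```javascript
// Sketch: a queue sized to fps * delaySeconds frames. push() returns null
// until the queue is full (the caller renders black during that window),
// then returns the frame captured delaySeconds earlier, so the output
// stream trails the live stream continuously.
class DelayBuffer {
  constructor(fps, delaySeconds) {
    this.capacity = fps * delaySeconds;
    this.frames = [];
  }
  push(frame) {
    this.frames.push(frame);
    return this.frames.length > this.capacity ? this.frames.shift() : null;
  }
}
```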

@guest271314

@jimmywarting At any given point during the live stream you can draw black frames for the effect of a delay, given the same code at the linked plnkr:

      let raf;
      let now = 0;
      let then = 60 * 10; // frames of black to draw first: ~60 fps * 10 s
      const draw = async() => {
        if (++now < then) {
          ctx.fillStyle = "black";
          ctx.fillRect(0, 0, width, height);
        } else {
          drawClock();
        }
        requestFrame();
        requestAnimationFrame(draw)
      };
      // https://github.com/w3c/mediacapture-fromelement/issues/76
      const requestFrame = _ => canvasStream.requestFrame ? canvasStream.requestFrame() : videoTrack.requestFrame();
      raf = requestAnimationFrame(draw);
      setTimeout(() => console.log(now), 10000);

@guest271314

Another alternative would be to use two <canvas> elements at the local portion of the code to draw and store images of the live stream while streaming black frames. At 10 seconds, begin drawing the stored frames, which should render at the remote portion of the code 10 seconds behind the live images being stored.

@guest271314

@jimmywarting Substitute the code below for the createMediaStreamTracks function at the linked plnkr:

    const createMediaStreamTracks = _ => {
      const canvas = document.createElement("canvas");
      canvas.id = "canvas";
      const span = document.createElement("span");
      span.textContent = canvas.id;
      canvas.width = width;
      canvas.height = height;
      document.body.appendChild(canvas);
      canvas.insertAdjacentElement("beforebegin", span);
      const ctx = canvas.getContext("2d");
      canvasStream = canvas.captureStream(0);
      const [videoTrack] = canvasStream.getVideoTracks();
      var radius = canvas.height / 2;
      ctx.translate(radius, radius);
      radius = radius * 0.90;

      function drawClock() {
        drawFace(ctx, radius);
        drawNumbers(ctx, radius);
        drawTime(ctx, radius);
      }

      function drawFace(ctx, radius) {
        var grad;
        ctx.beginPath();
        ctx.arc(0, 0, radius, 0, 2 * Math.PI);
        ctx.fillStyle = 'white';
        ctx.fill();
        grad = ctx.createRadialGradient(0, 0, radius * 0.95, 0, 0, radius * 1.05);
        grad.addColorStop(0, '#333');
        grad.addColorStop(0.5, 'white');
        grad.addColorStop(1, '#333');
        ctx.strokeStyle = grad;
        ctx.lineWidth = radius * 0.1;
        ctx.stroke();
        ctx.beginPath();
        ctx.arc(0, 0, radius * 0.1, 0, 2 * Math.PI);
        ctx.fillStyle = '#333';
        ctx.fill();
      }

      function drawNumbers(ctx, radius) {
        var ang;
        var num;
        ctx.font = radius * 0.15 + "px arial";
        ctx.textBaseline = "middle";
        ctx.textAlign = "center";
        for (num = 1; num < 13; num++) {
          ang = num * Math.PI / 6;
          ctx.rotate(ang);
          ctx.translate(0, -radius * 0.85);
          ctx.rotate(-ang);
          ctx.fillText(num.toString(), 0, 0);
          ctx.rotate(ang);
          ctx.translate(0, radius * 0.85);
          ctx.rotate(-ang);
        }
      }

      function drawTime(ctx, radius) {
        var now = new Date();
        var hour = now.getHours();
        var minute = now.getMinutes();
        var second = now.getSeconds();
        //hour
        hour = hour % 12;
        hour = (hour * Math.PI / 6) +
          (minute * Math.PI / (6 * 60)) +
          (second * Math.PI / (360 * 60));
        drawHand(ctx, hour, radius * 0.5, radius * 0.07);
        //minute
        minute = (minute * Math.PI / 30) + (second * Math.PI / (30 * 60));
        drawHand(ctx, minute, radius * 0.8, radius * 0.07);
        // second
        second = (second * Math.PI / 30);
        drawHand(ctx, second, radius * 0.9, radius * 0.02);
      }

      function drawHand(ctx, pos, length, width) {
        ctx.beginPath();
        ctx.lineWidth = width;
        ctx.lineCap = "round";
        ctx.moveTo(0, 0);
        ctx.rotate(pos);
        ctx.lineTo(0, -length);
        ctx.stroke();
        ctx.rotate(-pos);
      }
      // draw black frames for 10 seconds
      const delayStreamCanvas = document.createElement("canvas");
      delayStreamCanvas.width = width;
      delayStreamCanvas.height = height;
      const delayStreamContext = delayStreamCanvas.getContext("2d");
      const delayStream = delayStreamCanvas.captureStream(0);
      const [delayStreamTrack] = delayStream.getVideoTracks();

      let now = 0;
      let then = 60 * 10;
      let raf;
      const delayed = [];
      
      requestAnimationFrame(function drawDelay() {
        if (++now < then) {
          delayStreamContext.fillStyle = "black";
          delayStreamContext.fillRect(0, 0, width, height);
        } else {
          // stream stored images of stream
          delayStreamContext.drawImage(delayed.shift(), 0, 0);
        }
        requestFrame(delayStream);
        requestAnimationFrame(drawDelay);
      });

      const draw = async() => {
        // draw images
        drawClock();
        // store images
        delayed.push(await createImageBitmap(canvas));
        requestFrame(canvasStream);
        requestAnimationFrame(draw);
      };
      // https://github.com/w3c/mediacapture-fromelement/issues/76
      const requestFrame = stream => stream.requestFrame ? stream.requestFrame() : stream.getVideoTracks()[0].requestFrame();
      raf = requestAnimationFrame(draw);
      setTimeout(() => console.log(now), 10000);
      return {
        mediaStream: delayStream,
        videoTrack: delayStreamTrack,
        raf
      };
    }

@guest271314

@jimmywarting If the initial MediaStream is not derived from a canvas, the same approach can be employed by utilizing ImageCapture grabFrame() to store the current frame of a MediaStream as an ImageBitmap (see https://plnkr.co/edit/5bvp9xv0ciMYfVzG; https://github.com/guest271314/MediaFragmentRecorder/blob/imagecapture-audiocontext-readablestream-writablestream/MediaFragmentRecorder.html).

@guest271314

Using one requestAnimationFrame:

      const draw = async() => {
        drawClock();
        delayed.push(await createImageBitmap(canvas));
        if (++now < then) {
          delayStreamContext.fillStyle = "black";
          delayStreamContext.fillRect(0, 0, width, height);
        } else {
          delayStreamContext.drawImage(delayed.shift(), 0, 0);
        }
        requestFrame(canvasStream);
        requestFrame(delayStream);
        requestAnimationFrame(draw);
      };

@guest271314

@jimmywarting

https://bugs.chromium.org/p/webrtc/issues/detail?id=10759#c12 :

To answer your original question: It's not possible to set a playout delay of 10 seconds in WebRTC.

@jimmywarting
Author

Hmm, thanks for investigating the possibility.
