
@jimmywarting
Created January 1, 2019 11:05
How to easily make a WebRTC connection without dependencies
// Two in-page peer connections wired directly to each other: each peer's ICE
// candidates are handed straight to the other, so no signaling server is needed.
var pc1 = new RTCPeerConnection(),
    pc2 = new RTCPeerConnection();
var addCandidate = (pc, can) => can && pc.addIceCandidate(can).catch(console.error);
pc1.onicecandidate = e => { addCandidate(pc2, e.candidate); };
pc2.onicecandidate = e => { addCandidate(pc1, e.candidate); };
pc1.oniceconnectionstatechange = e => console.log("pc1 iceConnState:", pc1.iceConnectionState);
pc2.oniceconnectionstatechange = e => console.log("pc2 iceConnState:", pc2.iceConnectionState);

// Negotiated data channels with the same id on both sides open without any
// in-band channel negotiation.
var pc1dch = pc1.createDataChannel('dch', { negotiated: true, id: 1 });
var pc2dch = pc2.createDataChannel('dch', { negotiated: true, id: 1 });
pc1dch.binaryType = 'arraybuffer';
pc2dch.binaryType = 'arraybuffer';
pc1dch.onopen = e => { console.log("pc1dch open"); };
pc2dch.onopen = e => { console.log("pc2dch open"); };
pc1dch.onclose = e => { console.log("pc1dch close"); };
pc2dch.onclose = e => { console.log("pc2dch close"); };
pc1dch.onmessage = e => { console.log("pc1dch message: ", e); };
pc2dch.onmessage = e => { console.log("pc2dch message: ", e); };

// Offer/answer exchange, done entirely in-page.
function start() {
  pc1.createOffer()
    .then(d => pc1.setLocalDescription(d))
    .then(() => pc2.setRemoteDescription(pc1.localDescription))
    .then(() => pc2.createAnswer())
    .then(d => pc2.setLocalDescription(d))
    .then(() => pc1.setRemoteDescription(pc2.localDescription))
    .catch(console.error);
}
start();
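
Once both channels fire open, either side can send directly to the other. A minimal sketch of using the channel (assuming the snippet above has already run; it replaces the logging onopen handler on pc1dch):

pc1dch.onopen = e => {
  // pc2dch.onmessage above logs whatever arrives
  pc1dch.send('hello from pc1');
  pc1dch.send(new Uint8Array([1, 2, 3]).buffer); // binary arrives as an ArrayBuffer
};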
jimmywarting commented Jun 16, 2019

That is possible now, as long as you are not expecting to record the rendering using MediaRecorder and get a webm file output that reflects the rendered playback.

Now you intrigue me!
How can I configure the playout-delay to be something like 10 seconds? I don't have to record anything if I can do that.

@guest271314

@jimmywarting See line #223 remoteVideoStream.srcObject = mediaStream; at https://plnkr.co/edit/mVDY4T?p=preview. For the code to output the same result in both Chromium/Chrome and Firefox (and to avoid the DOMException: "The operation is insecure." error), comment out lines #224 through #257. That is, if MediaRecorder usage for the remote MediaStream is not an essential requirement.
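
In isolation, that assignment looks roughly like the following; a minimal sketch, assuming a receiving RTCPeerConnection (called pc2 here) that gets a video track and a <video id="remoteVideo"> element (both names are illustrative, not taken from the plnkr):

    // Render the incoming remote stream directly, no MediaRecorder involved
    const remoteVideo = document.getElementById("remoteVideo");
    pc2.ontrack = e => {
      const [mediaStream] = e.streams;
      if (remoteVideo.srcObject !== mediaStream) {
        remoteVideo.srcObject = mediaStream;
        remoteVideo.play().catch(console.error);
      }
    };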

@guest271314

@jimmywarting Again, what do you mean by "playout-delay"?

jimmywarting commented Jun 16, 2019

But if I could use and configure playout-delay, then I could just do what you said: set the video source to the remote stream directly and not have to use MediaRecorder at all (which would be the best option).

If you get a working example I would be so happy.
Still struggling to get it to work.

That is possible now, as long as you are not expecting to record the rendering using MediaRecorder and get a webm file output that reflects the rendered playback.

Hmm, maybe a miscommunication there...

This is what I was talking about: https://webrtc.org/experiments/rtp-hdrext/playout-delay/
I understand what playout-delay means, but not how to configure it, or whether it's possible at all.

@guest271314

@jimmywarting Not sure if that extension is implemented. I was referring to an approach similar to https://run.plnkr.co/plunks/jnHfKW/ where ImageCapture is used to save images, then - after the required delay - draw the images onto a <canvas> (which can be captured as a MediaStream and set as the srcObject of a <video>).

jimmywarting commented Jun 16, 2019

I can see the extension implemented in the SDP (Chrome):

a=mid:0
a=extmap:14 urn:ietf:params:rtp-hdrext:toffset
a=extmap:13 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:12 urn:3gpp:video-orientation
a=extmap:2 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01

HERE
a=extmap:11 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay

a=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type
a=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-timing
a=extmap:8 http://tools.ietf.org/html/draft-ietf-avtext-framemarking-07
a=extmap:9 http://www.webrtc.org/experiments/rtp-hdrext/color-space
a=extmap:3 urn:ietf:params:rtp-hdrext:sdes:mid
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id
a=extmap:5 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id

@guest271314

@jimmywarting Does the playout-delay that you are referring to mean (a) 1) when the remote MediaStream is received, 2) delay 10 seconds, 3) play the MediaStream; or (b) wait 10 seconds between each "packet" of media streamed?

@guest271314

@jimmywarting Have not tried adjusting SDP line by line. Perhaps @fippo could help with the specific topic of setting the value of that extension and what the expected and actual results are (see w3c/webrtc-pc#2193 (comment))?
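
For reference, inspecting or stripping that a=extmap line is plain string manipulation on the offer SDP before it is applied; a minimal sketch against the pc1/pc2 pair above (it only logs and removes the playout-delay extension line - an illustration of SDP munging, not a confirmed way to set a 10 second delay):

    pc1.createOffer()
      .then(offer => {
        // Log every offered RTP header extension, including playout-delay if present
        offer.sdp.split("\r\n")
          .filter(line => line.startsWith("a=extmap:"))
          .forEach(line => console.log(line));
        // Illustration only: drop the playout-delay extmap line before applying the offer
        const sdp = offer.sdp.split("\r\n")
          .filter(line => !line.includes("playout-delay"))
          .join("\r\n");
        return pc1.setLocalDescription({ type: offer.type, sdp });
      })
      .catch(console.error);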

@guest271314

@jimmywarting Can you clarify what the expected output is?

Are you expecting

  • a 10 second delay before the remote portion of the code receives the first media packet?

  • a 10 second delay between each media packet the remote portion of the code receives?

jimmywarting commented Jun 16, 2019

Believe it's option B

a 10 second delay between each media packet the remote portion of the code receives?

I want to use one camera to transfer the live stream to another device, and throughout the whole session it should play the video with an added delay on the playback, so you can see something you did afterwards.

It shouldn't just record 10 seconds of video and then stop and play the final blob afterwards... it needs to be continuous.

@guest271314

@jimmywarting At any given point during the live stream you can draw black frames for the effect of a delay, given the same code at the linked plnkr:

      // ctx, width, height, drawClock, canvasStream and videoTrack come from the linked plnkr
      let raf;
      let now = 0;
      let then = 60 * 10; // ~10 seconds of frames at 60fps
      const draw = async () => {
        if (++now < then) {
          // paint black frames until the delay has elapsed
          ctx.fillStyle = "black";
          ctx.fillRect(0, 0, width, height);
        } else {
          drawClock();
        }
        requestFrame();
        requestAnimationFrame(draw);
      };
      // https://github.com/w3c/mediacapture-fromelement/issues/76
      const requestFrame = _ => canvasStream.requestFrame ? canvasStream.requestFrame() : videoTrack.requestFrame();
      raf = requestAnimationFrame(draw);
      setTimeout(() => console.log(now), 10000);

@guest271314

Another alternative would be to use 2 canvas elements at the local portion of the code to draw and store images of the live stream while streaming black frames. At 10 seconds, begin drawing the stored frames, which should render at the remote portion of the code 10 seconds behind the live images being stored.

@guest271314

@jimmywarting Substitute the code below for the createMediaStreamTracks function at the linked plnkr:

    const createMediaStreamTracks = _ => {
      const canvas = document.createElement("canvas");
      canvas.id = "canvas";
      const span = document.createElement("span");
      span.textContent = canvas.id;
      canvas.width = width;
      canvas.height = height;
      document.body.appendChild(canvas);
      canvas.insertAdjacentElement("beforebegin", span);
      const ctx = canvas.getContext("2d");
      canvasStream = canvas.captureStream(0);
      const [videoTrack] = canvasStream.getVideoTracks();
      var radius = canvas.height / 2;
      ctx.translate(radius, radius);
      radius = radius * 0.90;

      function drawClock() {
        drawFace(ctx, radius);
        drawNumbers(ctx, radius);
        drawTime(ctx, radius);
      }

      function drawFace(ctx, radius) {
        var grad;
        ctx.beginPath();
        ctx.arc(0, 0, radius, 0, 2 * Math.PI);
        ctx.fillStyle = 'white';
        ctx.fill();
        grad = ctx.createRadialGradient(0, 0, radius * 0.95, 0, 0, radius * 1.05);
        grad.addColorStop(0, '#333');
        grad.addColorStop(0.5, 'white');
        grad.addColorStop(1, '#333');
        ctx.strokeStyle = grad;
        ctx.lineWidth = radius * 0.1;
        ctx.stroke();
        ctx.beginPath();
        ctx.arc(0, 0, radius * 0.1, 0, 2 * Math.PI);
        ctx.fillStyle = '#333';
        ctx.fill();
      }

      function drawNumbers(ctx, radius) {
        var ang;
        var num;
        ctx.font = radius * 0.15 + "px arial";
        ctx.textBaseline = "middle";
        ctx.textAlign = "center";
        for (num = 1; num < 13; num++) {
          ang = num * Math.PI / 6;
          ctx.rotate(ang);
          ctx.translate(0, -radius * 0.85);
          ctx.rotate(-ang);
          ctx.fillText(num.toString(), 0, 0);
          ctx.rotate(ang);
          ctx.translate(0, radius * 0.85);
          ctx.rotate(-ang);
        }
      }

      function drawTime(ctx, radius) {
        var now = new Date();
        var hour = now.getHours();
        var minute = now.getMinutes();
        var second = now.getSeconds();
        //hour
        hour = hour % 12;
        hour = (hour * Math.PI / 6) +
          (minute * Math.PI / (6 * 60)) +
          (second * Math.PI / (360 * 60));
        drawHand(ctx, hour, radius * 0.5, radius * 0.07);
        //minute
        minute = (minute * Math.PI / 30) + (second * Math.PI / (30 * 60));
        drawHand(ctx, minute, radius * 0.8, radius * 0.07);
        // second
        second = (second * Math.PI / 30);
        drawHand(ctx, second, radius * 0.9, radius * 0.02);
      }

      function drawHand(ctx, pos, length, width) {
        ctx.beginPath();
        ctx.lineWidth = width;
        ctx.lineCap = "round";
        ctx.moveTo(0, 0);
        ctx.rotate(pos);
        ctx.lineTo(0, -length);
        ctx.stroke();
        ctx.rotate(-pos);
      }
      // draw black frames for 10 seconds
      const delayStreamCanvas = document.createElement("canvas");
      delayStreamCanvas.width = width;
      delayStreamCanvas.height = height;
      const delayStreamContext = delayStreamCanvas.getContext("2d");
      const delayStream = delayStreamCanvas.captureStream(0);
      const [delayStreamTrack] = delayStream.getVideoTracks();

      let now = 0;
      let then = 60 * 10;
      let raf;
      const delayed = [];
      
      requestAnimationFrame(function drawDelay() {
        if (++now < then) {
          delayStreamContext.fillStyle = "black";
          delayStreamContext.fillRect(0, 0, width, height);
        } else {
          // stream stored images of stream
          delayStreamContext.drawImage(delayed.shift(), 0, 0);
        }
        requestFrame(delayStream);
        requestAnimationFrame(drawDelay);
      });

      const draw = async() => {
        // draw images
        drawClock();
        // store images
        delayed.push(await createImageBitmap(canvas));
        requestFrame(canvasStream);
        requestAnimationFrame(draw);
      };
      // https://github.com/w3c/mediacapture-fromelement/issues/76
      const requestFrame = stream => stream.requestFrame ? stream.requestFrame() : stream.getVideoTracks()[0].requestFrame();
      raf = requestAnimationFrame(draw);
      setTimeout(() => console.log(now), 10000);
      return {
        mediaStream: delayStream,
        videoTrack: delayStreamTrack,
        raf
      };
    }

@guest271314

@jimmywarting If the initial MediaStream is not derived from a canvas, the same approach can be employed by using ImageCapture grabFrame() to store the current frame of a MediaStream as an ImageBitmap (see https://plnkr.co/edit/5bvp9xv0ciMYfVzG; https://github.com/guest271314/MediaFragmentRecorder/blob/imagecapture-audiocontext-readablestream-writablestream/MediaFragmentRecorder.html).
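
Roughly, a minimal sketch of that, assuming a camera MediaStream from getUserMedia, a browser that exposes ImageCapture (Chromium), and the same delayed, now, then, delayStreamContext, delayStream and requestFrame names as in the code above:

    // Buffer live camera frames as ImageBitmaps with ImageCapture.grabFrame()
    // instead of drawing a local canvas first
    navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
      const [track] = stream.getVideoTracks();
      const imageCapture = new ImageCapture(track);
      const grab = async () => {
        delayed.push(await imageCapture.grabFrame()); // store the current frame
        if (++now >= then) {
          // after the delay, replay buffered frames onto the delayed canvas
          delayStreamContext.drawImage(delayed.shift(), 0, 0);
          requestFrame(delayStream);
        }
        requestAnimationFrame(grab);
      };
      requestAnimationFrame(grab);
    }).catch(console.error);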

@guest271314

Using one requestAnimationFrame

      const draw = async() => {
        drawClock();
        delayed.push(await createImageBitmap(canvas));
        if (++now < then) {
          delayStreamContext.fillStyle = "black";
          delayStreamContext.fillRect(0, 0, width, height);
        } else {
          delayStreamContext.drawImage(delayed.shift(), 0, 0);
        }
        requestFrame(canvasStream);
        requestFrame(delayStream);
        requestAnimationFrame(draw);
      };

@guest271314

@jimmywarting

https://bugs.chromium.org/p/webrtc/issues/detail?id=10759#c12 :

To answer your original question: It's not possible to set a playout delay of 10 seconds in WebRTC.

@jimmywarting

Hmm, thanks for investigating the possibility.
