Why is video not playing on recording of remote stream?
https://stackoverflow.com/questions/61022341/why-is-video-not-playing-on-recording-of-remote-stream
This works without a problem. If I replace the local stream with a remote stream, same code, only received over WebRTC, I see the first frame and nothing more... no errors, just one frame.
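For reference, a minimal sketch of the recording setup being described, assuming the remote stream arrives via the RTCPeerConnection ontrack event; the element id, mime type, and timeslice here are assumptions, not the original code:

```ts
// Sketch: record a remote WebRTC stream with MediaRecorder.
// Assumes `pc` is an already-negotiated RTCPeerConnection; "#remote" is a hypothetical <video> element.
const pc = new RTCPeerConnection();
const chunks: Blob[] = [];

pc.ontrack = (event: RTCTrackEvent) => {
  const [remoteStream] = event.streams;

  // Attach the stream to a <video> element so the tracks keep flowing.
  const video = document.querySelector<HTMLVideoElement>("#remote")!;
  video.srcObject = remoteStream;
  video.play();

  const recorder = new MediaRecorder(remoteStream, {
    mimeType: 'video/webm;codecs="vp8,opus"', // assumed codec choice
  });
  recorder.ondataavailable = (e: BlobEvent) => {
    if (e.data.size > 0) chunks.push(e.data);
  };
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: "video/webm" });
    // e.g. play it back via URL.createObjectURL(blob)
  };
  recorder.start(1000); // emit a chunk roughly every second
};
```

With a local stream (e.g. from getUserMedia) the same recorder code produces a playable recording, which matches the "works without a problem" case above.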
@cracker0dks

WebRTC the standard way (Client -> Server -> Client) is all working fine (video and audio). The only problem is the server load for video when many people join.
So to solve this I record the video on the server with MediaRecorder, send the chunks back to the clients via WebSockets (this way I avoid WebRTC video transcoding), and render it via MediaSource (a minimal sketch of that client-side path follows the list below). That is working, but with two little problems at the moment:

  1. If someone joins the session while the video is already running, they are not able to render the stream (because their first chunk does not start on a keyframe, or something like that).
  2. If I start the video (everything is working), stop it, and start again, I get something like "MediaStream is closed" on the client, even though I create a new MediaStream on every new stream start (but I will probably change that to the canvas solution anyway).
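A minimal sketch of the client-side rendering path described above, assuming MediaRecorder chunks arrive over a WebSocket as ArrayBuffers; the WebSocket URL, mime type, and element id are assumptions:

```ts
// Sketch: render MediaRecorder chunks received over a WebSocket via MediaSource.
const video = document.querySelector<HTMLVideoElement>("#remote")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm;codecs="vp8,opus"');
  const queue: ArrayBuffer[] = [];

  // Append queued chunks one at a time; a SourceBuffer rejects appends while it is updating.
  const appendNext = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener("updateend", appendNext);

  const ws = new WebSocket("wss://example.com/stream"); // hypothetical endpoint
  ws.binaryType = "arraybuffer";
  ws.onmessage = (e: MessageEvent<ArrayBuffer>) => {
    queue.push(e.data);
    appendNext();
  };
});

video.play();
```

Note that this only decodes if the first appended chunk contains the WebM initialization segment and a keyframe, which is exactly why a late joiner (problem 1) sees nothing: their first WebSocket message starts somewhere in the middle of the recorded stream.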
