Split one RTSP stream into multiple gstreamer pipelines
  1. Get the RTSP video, decode it, apply a framerate cap, and send it to a shared-memory buffer. Don't forget to set shm-size to something high, because you are now dealing with decoded raw video (see the sizing sketch after the command):

gst-launch-1.0 -v rtspsrc location="rtsp://192.168.86.249/live/sala" protocols=tcp latency=1000 ! watchdog timeout=10000 ! rtph264depay ! h264parse ! avdec_h264 ! videorate ! video/x-raw,framerate=30/1 ! shmsink wait-for-connection=false socket-path=/tmp/foo shm-size=100000000
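As a rough sizing sketch (using the 2560x1440 I420 caps printed in step 2, where an I420 frame is width x height x 1.5 bytes):

echo $((2560 * 1440 * 3 / 2))                   # 5529600 bytes per raw I420 frame (~5.5 MB)
echo $((100000000 / (2560 * 1440 * 3 / 2)))     # shm-size=100000000 holds roughly 18 such frames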

  2. Note the caps in the verbose output. The shared-memory socket carries only raw buffers, not caps, so the reading side has to re-declare them by hand (step 3):

/GstPipeline:pipeline0/GstShmSink:shmsink0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)1:0:0:0, framerate=(fraction)30/1

  3. Read the decoded video from the shared buffer, plugging in the width, height, format and framerate caps from above. Several readers can attach to the same socket at once, which is how one RTSP stream is split across multiple pipelines (see the second reader example below):

gst-launch-1.0 shmsrc do-timestamp=true is-live=true socket-path=/tmp/foo ! video/x-raw,width=2560,height=1440,format=I420,framerate=30/1 ! autovideosink
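A minimal sketch of a second, simultaneous reader that re-encodes and records the shared video; the x264enc settings and the /tmp/out.mkv output path are assumptions, not part of the original gist:

gst-launch-1.0 -e shmsrc do-timestamp=true is-live=true socket-path=/tmp/foo ! video/x-raw,width=2560,height=1440,format=I420,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency bitrate=4000 ! h264parse ! matroskamux ! filesink location=/tmp/out.mkv

Because the producer uses wait-for-connection=false, readers like this can attach and detach at any time without stalling the original pipeline.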
