A collection of GStreamer usage examples

Most GStreamer examples found online are either for Linux or for gstreamer 0.10.

This particular release note seems to cover the important changes, such as:

  • ffmpegcolorspace => videoconvert
  • ffmpeg => libav
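
For example, a 0.10 pipeline that used ffmpegcolorspace needs videoconvert on 1.0. A minimal before/after sketch (the 0.10 line is only for comparison and assumes the old binary is still installed):

gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink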

Applying -v will print out useful information, most importantly the caps negotiation results.

Before anything else, you will need to install GStreamer. On Mac OS, brew works most of the time. To check for x264 support, type brew options gst-plugins-ugly and you will see --with-x264 listed as an option. Sometimes you might need to brew reinstall to get some plugins (not sure why; reinstallation just seems to help).
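
A rough install sketch (assuming the Homebrew formula names and options from that era; adjust to your setup):

brew install gstreamer gst-plugins-base gst-plugins-good gst-plugins-bad gst-plugins-ugly gst-libav
brew options gst-plugins-ugly   # check that --with-x264 is listed
brew reinstall gst-plugins-ugly --with-x264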

From Swarmbox to Mac OS X

Sender:

gst-launch-1.0 -v v4l2src device=/dev/video1 ! 'video/x-h264,width=800,height=448,framerate=30/1' ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.0.100 port=9000

Receiver:

gst-launch-1.0 -v udpsrc port=9000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0JAKLtAZBy/gKJAAAADAEAAAA8YEAALcbAAtx73vheEQjU\=\,aM44gA\=\=", payload=(int)96, ssrc=(uint)3725838184, timestamp-offset=(uint)2716743768, seqnum-offset=(uint)769' ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! videoconvert ! facedetect ! videoconvert ! glimagesink

Need to understand what sprop-parameter-sets does. (It carries the base64-encoded H.264 SPS/PPS out-of-band in the caps/SDP, so the receiver can configure the decoder without waiting for the parameter sets to arrive in-band.)
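
Since the sender above uses rtph264pay config-interval=10, the SPS/PPS are also re-sent in-band every 10 seconds, so a simpler receiver caps string without sprop-parameter-sets should also work (untested sketch):

gst-launch-1.0 -v udpsrc port=9000 ! 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96' ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! videoconvert ! glimagesink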

Most of the commands below are tested on Mac OS X.

Simple Commands

gst-launch-1.0 videotestsrc ! osxvideosink
gst-launch-1.0 playbin uri=file:///<filepath>

H264 encoding and decoding

Start by putting the encoder and decoder in a single pipeline:

gst-launch-1.0 videotestsrc is-live=true ! x264enc ! h264parse ! avdec_h264 ! videoconvert ! osxvideosink

Another success I had is using VideoToolbox. The caps filters are needed so that negotiation succeeds. Not sure how this approach compares to the x264 one.

gst-launch-1.0 -v videotestsrc ! vtenc_h264 ! video/x-h264,width=640,height=480,framerate=30/1 ! vtdec_hw ! 'video/x-raw,format=NV12' ! videoconvert ! osxvideosink

You can save the stream as a file and play it with playbin:

gst-launch-1.0 videotestsrc ! x264enc ! mpegtsmux ! filesink location=testfile.ts
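
To play it back afterwards (assuming the file sits in the current directory):

gst-launch-1.0 playbin uri=file://$(pwd)/testfile.ts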

H264 encoding/decoding with RTP payload

gst-launch-1.0 videotestsrc is-live=true ! x264enc ! h264parse ! rtph264pay ! rtph264depay ! avdec_h264 ! videoconvert ! osxvideosink

H264 encoding/decoding with UDP-RTP

Sender:

gst-launch-1.0 videotestsrc is-live=true ! video/x-raw,framerate=25/1 ! videoconvert ! x264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000

Receiver:

gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp,clock-rate=90000,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! osxvideosink

This kind of works, except the receiving side keeps telling me things are too slow.

There may be a timestamping problem, or this computer is too slow.
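
One thing worth trying (untested sketch): add a jitter buffer like the Swarmbox receiver above, and/or disable clock sync on the sink so late frames are not dropped:

gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp,clock-rate=90000,payload=96 ! rtpjitterbuffer latency=200 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! osxvideosink sync=false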

H264 encoding/decoding with TCP

gst-launch-1.0 videotestsrc horizontal-speed=5 ! x264enc tune="zerolatency" threads=1 ! tcpserversink port=4444
gst-launch-1.0 tcpclientsrc port=4444 host=localhost ! h264parse ! avdec_h264 ! glimagesink

This works fine; I did have to wait a couple of seconds before seeing the rolling test source.

To stream to VLC:

GStreamer sender

gst-launch-1.0 videotestsrc ! vtenc_h264 ! rtph264pay config-interval=10 pt=96 ! udpsink host=127.0.0.1 port=5000

VLC Receiver

$ cat test.sdp
v=0
m=video 5000 RTP/AVP 96
c=IN IP4 127.0.0.1
a=rtpmap:96 H264/90000
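
Then point VLC at the SDP file (assuming vlc is on your PATH):

vlc test.sdp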

With OpenCV

You can install the OpenCV plugins in gst-plugins-bad with brew install gst-plugins-bad --with-opencv.

Then you can test face detection live in your pipeline:

gst-launch-1.0 avfvideosrc ! videoconvert ! facedetect ! videoconvert ! osxvideosink

Resolution

It seems that the OS X avfvideosrc only supports the following resolutions: 320x240, 352x288, 640x480, 960x540, 1280x720. Otherwise, it will report ERROR: from element /GstPipeline:pipeline0/GstAVFVideoSrc:avfvideosrc0: Internal data flow error.
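
To force one of the supported modes explicitly, a caps filter sketch (assuming your camera supports 640x480):

gst-launch-1.0 avfvideosrc ! 'video/x-raw,width=640,height=480' ! videoconvert ! osxvideosink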

Some typical ways people measure performance (using end-to-end FPS)

Generate N buffers and measure the execution time.

gst-launch-1.0 filesrc num-buffers=100 location=/dev/zero blocksize=8294400 ! videoparse format=rgba width=1920 height=1080 ! glimagesink sync=false

You would see something like:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayCocoa\)\ gldisplaycocoa0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.921270000
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

So we pushed 100 1920x1080 frames through the pipeline in 0.92 seconds on Mac OS, roughly 108 fps.

This is pretty impressive compared to the RPi. What an unfair comparison...

Pipeline module processing time

The gstreamer_timestamp_marking library is helpful for measuring processing time between its markin and markout elements. The fork fixed the autoconf compilation issue.

A typical usage:

gst-launch-1.0 -v --gst-debug=markout:5 --gst-plugin-path=$HOME/repos/gstreamer_timestamp_marking/src avfvideosrc num-buffers=300 ! videoconvert ! markin ! faceblur profile=/usr/local/Cellar/opencv/2.4.13/share/OpenCV/haarcascades/haarcascade_frontalface_alt2.xml ! markout ! videoconvert ! osxvideosink sync=true

I've been using this to measure the processing time of the OpenCV faceblur. On a MacBook Pro (2015), it's somewhere around 10 ms (at 320x240 resolution).

Three terminals:

A -> B (face detect box) -> C

  • A: gst-launch-1.0 avfvideosrc ! x264enc tune="zerolatency" threads=1 ! tcpserversink port=4444
  • B: gst-launch-1.0 tcpclientsrc port=4444 host=localhost ! h264parse ! avdec_h264 ! videoconvert ! facedetect ! videoconvert ! x264enc tune="zerolatency" threads=1 ! tcpserversink port=5555
  • C: gst-launch-1.0 tcpclientsrc port=5555 host=localhost ! h264parse ! avdec_h264 ! glimagesink
@ljfantin commented:

Hi,
Is it possible to create a virtual device on OS X?

I found
gst-launch-1.0 -v videotestsrc pattern=snow ! video/x-raw,width=1280,height=720 ! v4l2sink device=/dev/video1
but v4l2sink doesn't work on OS X.
Is it possible to create a virtual device with osxvideosink?
