How to make a Raspberry Pi an RTSP streamer, and how to consume the stream?

RTSP streaming from a Raspberry Pi

Introduction

This gist describes the software installation steps necessary to make a Raspberry Pi's camera act as an external camera for the Dragonfly Java application. It shows how to turn a Raspberry Pi into an RTSP streaming server. The resulting feed can then be used as input for the Dragonfly Java app or the Accuware Dragonfly Demo - Calibration Mode server. If calibration is a requirement, the RTSP server on the Raspberry Pi must be made publicly available.

Prerequisites

  • Raspberry Pi Zero W, 2, 3, 3B, 3B+ or 4 with a Raspberry Pi camera (5 MP) or a SainSmart wide-angle fish-eye camera (recommended).

  • USB cameras are not covered by this gist, even though they can work; the UV4L configuration is slightly different. We don't recommend using USB cameras with a Raspberry Pi for latency reasons.

  • Raspbian Stretch Lite or

  • Raspbian Buster/Buster Lite

  • Ubiquity Robotics Raspberry Pi image. This gist refers to the 2019-06-19 image available here. Basically, it is an Ubuntu 16.04 MATE derivative.

  • A recent version of GStreamer on your Mac or Linux PC (1.14 is sufficient, 1.16 recommended). Check the GStreamer documentation for installation instructions.

Please note: although any of the above-mentioned Raspberry Pi devices will technically work, we recommend using at least a Raspberry Pi 3B+, since it has sufficient computing power and supports 5 GHz WiFi, which is usually less crowded.

After flashing the SD card with the image (e.g. using Etcher) and before the first boot:

  • Enable headless SSH by placing an empty ssh file into the boot partition /boot of the SD
  • Enable headless WiFi by placing a wpa_supplicant.conf file into the boot partition /boot of the SD (5 GHz WiFi preferred, if possible):
country=<your-two-letter-code>
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
network={
       ssid="<your-ssid>"
       psk="<your-password>"
       key_mgmt=WPA-PSK
}
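
For example, with the SD card mounted on your PC (the boot partition's mount point varies by OS; /Volumes/boot is typical on a Mac), both files could be created like this:

touch /Volumes/boot/ssh
nano /Volumes/boot/wpa_supplicant.conf   # paste the configuration above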

If you run into WiFi problems: https://www.raspberrypi.org/forums/viewtopic.php?t=191061

  • Find the IP of the Pi (see below)
  • SSH to it
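
If mDNS works on your network, the default hostname often resolves directly; otherwise a ping scan helps. For example (nmap must be installed, and your subnet may differ):

ping raspberrypi.local
nmap -sn 192.168.1.0/24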

Installations

After having found out the IP of the Pi on your network, SSH to the Pi:

ssh pi@<ip-of-pi>

Initial password is raspberry.

Update and Upgrade

sudo apt-get update
sudo apt-get upgrade

Configure the Pi once you are at the console:

sudo raspi-config
  • Change the user password. From then on this is your SSH password. Strongly recommended.
  • Interfacing options / Enable camera (a scriptable alternative is shown below)
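
If you prefer scripting over the menu, recent raspi-config versions also offer a non-interactive mode (assuming your image ships it):

sudo raspi-config nonint do_camera 0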

Reboot the PI

sudo reboot

SSH to the PI

ssh pi@<ip-of-pi>

Use your changed password now.

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install gstreamer1.0-tools
sudo apt-get install gstreamer1.0-plugins-good
sudo apt-get install gstreamer1.0-plugins-bad
sudo apt-get install gstreamer1.0-plugins-ugly
sudo apt-get install gstreamer1.0-libav
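
The same packages can also be installed in one go:

sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav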

Test your installation by running this command on your Pi:

raspivid -t 0 -w 640 -h 480 -fps 48 -b 2000000 -awb tungsten  -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=5000

If you have installed GStreamer via the DMG file installation package, run this command in a terminal on your Mac:

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v tcpclientsrc host=<your_Pi's_IP> port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert  ! osxvideosink sync=false

Run this command in a terminal on your Linux PC. The same command works on your Mac if you installed GStreamer via brew:

gst-launch-1.0 -v tcpclientsrc host=<your_Pi's_IP> port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert  ! autovideosink sync=false

You should see a video. This is already H.264, but not yet RTSP, so we are going to enable that now. Terminate the raspivid server on your Pi by typing CTRL-C.

sudo apt-get install libglib2.0-dev
sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev

Download gst-rtsp-server from https://gstreamer.freedesktop.org/src/. The version should match the installed GStreamer version, which can be checked with:

dpkg -l | grep gstreamer

At the time of writing this is 1.14.4 for Buster Lite. Please edit the commands below for other versions (i.e. replace all occurrences of 1.14.4 with the version you are using).
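
Alternatively, the tools package installed above can report the version directly:

gst-launch-1.0 --version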

wget https://gstreamer.freedesktop.org/src/gst-rtsp-server/gst-rtsp-server-1.14.4.tar.xz
tar -xf gst-rtsp-server-1.14.4.tar.xz
cd gst-rtsp-server-1.14.4
./configure
make
sudo make install

Test your setup:

cd examples
./test-launch --gst-debug=3 '( videotestsrc !  x264enc ! rtph264pay name=pay0 pt=96 )'

If you have installed GStreamer via the DMG file installation package, run this command in a terminal on your Mac:

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v rtspsrc location=rtsp://<your_Pi's_IP>:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! osxvideosink sync=false

Run this command in a terminal on your Linux PC. The same command works on your Mac if you installed GStreamer via brew:

gst-launch-1.0 -v rtspsrc location=rtsp://<your_Pi's_IP>:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

You should see a test video.

Terminate the RTSP server on the Pi with CTRL-C.

Now we want to see the camera. For this we use the nice raspivid wrapper gst-rpicamsrc.

cd ~
sudo apt-get install git
git clone https://github.com/thaytan/gst-rpicamsrc.git

For Raspbian OS

Proceed with

cd gst-rpicamsrc
./autogen.sh
make
sudo make install

For Ubiquity Robotics OS

At the time of this writing the Raspbian build procedure does not work on Ubiquity. While compilation and linking go fine, there is a runtime issue:

mmal: mmal_component_create_core: could not find component 'vc.ril.camera'

The problem has been discussed and resolved here.

The workaround is to use meson and ninja to build the camera driver. Since the default version of meson is not sufficient for the build, install meson via the Python 3 installer (pip3) to get a more recent version.

Proceed with:

cd gst-rpicamsrc
pip3 install --user meson
mkdir build
~/.local/bin/meson --prefix=/usr build
ninja -C build -v
cd build
sudo ninja install

Finally, for both OSes

Check whether gst-rpicamsrc is installed:

gst-inspect-1.0 | grep rpicamsrc

Now for the final test:

cd ../gst-rtsp-server-1.14.4/examples
./test-launch --gst-debug=3 "( rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"

If it runs, remove --gst-debug=3 and let it run in the background by appending & to the command line above.
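
For example (a sketch; nohup keeps the server running after you log out, and the log file name is arbitrary):

nohup ./test-launch "( rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )" > rtsp.log 2>&1 &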

If you have installed GStreamer via the DMG file installation package, run this command in a terminal on your Mac:

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v rtspsrc location=rtsp://<your_Pi's_IP>:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! osxvideosink sync=false

Run this command in a terminal on your Linux PC. The same command works on your Mac if you installed GStreamer via brew:

gst-launch-1.0 -v rtspsrc location=rtsp://<your_Pi's_IP>:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

For more insight into gst-rpicamsrc and other possible parameters (some of them already used above):

https://sparkyflight.wordpress.com/tag/gst-rpicamsrc/

@adamoell commented Mar 19, 2020

Thank you so much for this - I have never worked with gstreamer before but your detailed instructions worked perfectly, first time! I especially appreciated the pauses to test each component as you led me through it. A really impressive tutorial.

@neilyoung (owner) commented Mar 20, 2020

Thanks for letting me know, and glad that it worked. I wrote this a year ago, mostly to remind myself. In case you are interested: there are also ways to use the Pi with WebRTC, a much more modern approach. You can easily use your browser to connect to it.

If you would like to consume the RTSP server from your browser, there is a proxy solution, described here: https://github.com/accuware/gstreamer

@neilyoung (owner) commented Mar 20, 2020

I have updated the gist to reflect the latest knowledge.

@adamoell commented Mar 20, 2020

Fantastic, thanks Neil - ultimately working on getting the stream into OBS with obs-gstreamer, and slowly exhausting all the blind alleys! I really appreciate the update; your help has already been invaluable.

@wvalcke commented Jun 1, 2020

Nice explanation of RTSP streaming, thanks for this.
I tried running the example at 1280x720 @ 25 fps; it starts well and the receiving side shows the Pi camera stream in real time.
But after some time you start seeing delays; there seems to be a buffer queuing up frames, and after a while the delay is really noticeable (more than 1 second). The longer you wait, the worse it gets. Restarting the client syncs again, only for the delay to build up once more.
Starting at 1920x1080 gives a serious delay right from the beginning.
Does anybody have an idea what could be done to avoid this? It must definitely be possible, as I can stream 1920x1080@25 fps to a Janus server and the web browser shows the image in real time (even after hours of playing).

@adamoell commented Jun 1, 2020

I found that the laptop I was streaming to was struggling to decode the 1080p stream in real time. There were actually no problems on the RPi (mine is a 4B, but it seems to be OK on a 3B as well), but the low-clock-speed i3 in the laptop was just about OK with 1280x720; at 1920x1080 it just wasn't keeping up, with a similar latency drift to the one you describe. Can you try with a faster receiving machine?

@wvalcke commented Jun 2, 2020

Hi adamoell,
I tried again with a powerful machine at the receiving side (Intel Core i9).
The Raspberry Pi side runs the following command:

./test-launch --gst-debug=3 "( rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false ! video/x-h264, width=1280, height=720, framerate=25/1, profile=constrained-baseline ! h264parse ! rtph264pay name=pay0 pt=96 )"

At the receiving side I run the following command:

gst-launch-1.0 -v rtspsrc location=rtsp://<ip of pi>:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

And I see the video stream in real time, with a very small delay (maybe a few hundred milliseconds). This is really fantastic: it works with multiple clients, and I also tested 1920x1080, which works as well as 1280x720. So it was definitely the receiving side that was responsible for the delay accumulating over time.
Thanks for the tip, and thanks also to neilyoung for sharing this info.

@neilyoung (owner) commented Jun 2, 2020

Cool :)

@adamoell commented Jun 2, 2020

Excellent!

@wvalcke commented Jun 2, 2020

If people are interested, I made a Python program using OpenCV to grab the RTSP stream and display it.
Here is the code (remember that you need an OpenCV build with GStreamer support):


import cv2

# Open the RTSP stream via a GStreamer pipeline; the appsink element hands the
# decoded frames to OpenCV (requires an OpenCV build with GStreamer support)
cap = cv2.VideoCapture()
res = cap.open("rtspsrc latency=0 buffer-mode=auto location=rtsp://raspberrypi:8554/test ! decodebin ! videoconvert ! appsink max-buffers=1 drop=true", cv2.CAP_GSTREAMER)

while True:
    err, frame = cap.read()
    cv2.imshow("Output", frame)
    key = cv2.waitKey(1)
    if key == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

This program shows the RTSP stream in real time, exactly the same as the GStreamer receiver. I tried running the GStreamer receiver and the Python program at the same time, and you cannot see any difference between the two. So, another step forward in my project.
My next step will be to combine, in the test-launch program, the RTSP server with an RTP stream towards the Janus gateway. This way we have a stream via RTSP and via WebRTC, which makes it possible to view the camera image in a web browser as well (with the same real-time display as the RTSP stream).

@adamoell commented Jun 2, 2020

Great stuff wvalcke - I would be interested in hearing of your progress! Just yesterday I knocked this together, which is the beginning of a Python script which will replace test-launch in my setup:

#!/usr/bin/env python3
# -*- coding:utf-8 vi:ts=4:noexpandtab
#
# Simple RTSP server. Run as-is or with a command line to replace the default pipeline.
#
# SETUP:
#   sudo pip3 install tendo
#   sudo apt install gir1.2-gst-rtsp-server-1.0

import sys
import gi
from tendo import singleton

me = singleton.SingleInstance()  # will quit with -1 if already running

gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GObject

loop = GObject.MainLoop()
GObject.threads_init()
Gst.init(None)

class MyFactory(GstRtspServer.RTSPMediaFactory):
    def __init__(self):
        GstRtspServer.RTSPMediaFactory.__init__(self)

    def do_create_element(self, url):
        # Default pipeline; can be overridden on the command line
        pipeline_str = "( rpicamsrc bitrate=20000000 awb-mode=off awb-gain-blue=1.8 awb-gain-red=1.6 exposure-compensation=5 preview=false vflip=false hflip=false ! video/x-h264, width=1920, height=1080, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"
        if len(sys.argv) > 1:
            pipeline_str = " ".join(sys.argv[1:])
        print(pipeline_str)
        return Gst.parse_launch(pipeline_str)

class GstServer():
    def __init__(self):
        self.server = GstRtspServer.RTSPServer()
        f = MyFactory()
        f.set_shared(True)
        m = self.server.get_mount_points()
        m.add_factory("/test", f)
        self.server.attach(None)

if __name__ == '__main__':
    s = GstServer()
    loop.run()
My plan is a multi-camera setup into OBS, with non-video comms (control etc.) brokered through MQTT. I still need to get obs-gstreamer working in my setup... one step at a time!

@wvalcke commented Jun 2, 2020

Using the test-launch software I was able to stream the camera data to a Janus server and set up an RTSP server at the same time.
The test-launch program runs on a Raspberry Pi 3.

./test-launch --gst-debug=3 "( rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false ! video/x-h264, width=1280, height=720, framerate=25/1, profile=constrained-baseline ! tee name=t ! queue  ! h264parse ! rtph264pay name=pay0 pt=96 t. ! queue ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=8004 sync=false )"

The test-launch software sets up an RTSP server which accepts multiple clients. The receiving side was tested with a Python OpenCV program (see above) and with the gst-launch software running at the same time (the receiver is a Linux PC, Core i9).
The test-launch pipeline also duplicates the raw H.264 data from the camera (via the tee element), creates another RTP stream and sends it to 127.0.0.1 on UDP port 8004, where the Janus WebRTC gateway listens.
Now we can launch a web browser as well and see the same camera image inside the browser via WebRTC.
The test I did had two clients on the RTSP side (gst-launch and Python OpenCV) and a web browser connected to the stream via the Janus WebRTC gateway. The result is a superb video stream where all three videos show exactly the same real-time picture; there is literally NO difference visible. The video resolution used was 1280x720@25fps.
The CPU load on the Raspberry Pi is about 12% for the janus process and 12% for the test-launch process, and this with three clients connected!
Once again, thanks for sharing the gist; it was a helpful start!

@jarhed commented Jun 11, 2020

Thanks so much for this. Fantastic post, really helpful.

@sukrugorgulu commented Jun 20, 2020

Hi, thanks for this detailed step-by-step guide. I managed to stream video from the picamera of my Raspberry Pi. I also have a USB sound card for audio input. Could you tell me how to add audio input to my stream? (The USB sound card's mic input is recognized and working; I could locally save an MP4, muxing audio to video, using ffmpeg in Python.) I want to use the Raspberry Pi as a generic video+audio RTSP IP cam.

Edit:
I have just succeeded in streaming live audio over RTSP:
./test-launch "( alsasrc device="default:CARD=Device" ! audioconvert ! rtpL16pay name=pay0 pt=11)"
I listened to the audio with:
gst-launch-1.0 rtspsrc location=rtsp://<rpi's IP address>:8554/test latency=0 ! rtpL16depay ! audioconvert ! volume volume=1.3 ! alsasink

The question is: how can I put video and audio together in one RTSP stream?

@MADXhh commented Sep 8, 2020

@sukrugorgulu


Try:

./gst-rtsp-server-1.14.4/examples/test-launch "( alsasrc device="hw:1,0" ! "audio/x-raw,channels=1,rate=48000" ! audioconvert ! opusenc ! rtpopuspay name=pay1 pt=97 rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false ! video/x-h264, width=1640, height=1232, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"

You may need to install:

sudo apt install gstreamer1.0-alsa
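
A client pipeline along these lines should then play both streams together (an untested sketch; rtspsrc exposes one pad per stream and gst-launch links each branch when its pad appears):

gst-launch-1.0 rtspsrc location=rtsp://<your_Pi's_IP>:8554/test latency=0 name=r r. ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false r. ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink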
@krambriw commented Oct 18, 2020

Hi, sounds great!
I know what you wrote about USB cams, but do you have any hint on how to use them anyway? I have a number of RPi 3s with good-quality USB cams at /dev/video0, /dev/video1 etc. Currently I am streaming with mjpg-streamer and it works great, but over HTTP. I would like to try RTSP instead.

Best regards, Walter

@neilyoung (owner) commented Oct 18, 2020

I'm not sure if I ever tried that, but in theory it should be possible to replace rpicamsrc bitrate=8000000 awb-mode=tungsten preview=false by v4l2src (https://gstreamer.freedesktop.org/documentation/video4linux2/v4l2src.html?gi-language=c).

Maybe like so: v4l2src device=/dev/video0 ...

@krambriw commented Oct 18, 2020

Tried this, but I get errors. Any further ideas?
Best regards, Walter

pi@raspberrypi:~/gst-rtsp-server-1.14.4/examples $ ./test-launch --gst-debug=3 "( v4l2src device=/dev/video0 ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"
stream ready at rtsp://127.0.0.1:8554/test

So far this seems good, but when connecting from the client:

pi@raspberrypi:~ $ gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayX11)\ gldisplayx11-0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not read from resource.
Additional debug info:
gstrtspsrc.c(5691): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Got error response: 503 (Service Unavailable).
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

and on the server side I get the following errors/warnings:

pi@raspberrypi:~/gst-rtsp-server-1.14.4/examples $ ./test-launch --gst-debug=3 "( v4l2src device=/dev/video0 ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! rtph264pay pt=96 name=pay0 )"
stream ready at rtsp://127.0.0.1:8554/test
0:00:10.289544715 4763 0x74d0fa60 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: Internal data stream error.
0:00:10.289695600 4763 0x74d0fa60 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop: error: streaming stopped, reason not-negotiated (-4)
0:00:10.290098880 4763 0x75b0d230 WARN rtspmedia rtsp-media.c:2722:default_handle_message: 0x75b2f1c0: got error Internal data stream error. (gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:media-pipeline/GstBin:bin0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4))
0:00:10.290276276 4763 0x1d64a90 WARN rtspmedia rtsp-media.c:2991:wait_preroll: failed to preroll pipeline
0:00:10.290338099 4763 0x1d64a90 WARN rtspmedia rtsp-media.c:3295:gst_rtsp_media_prepare: failed to preroll pipeline
0:00:10.293704079 4763 0x1d64a90 ERROR rtspclient rtsp-client.c:1044:find_media: client 0x1d6f090: can't prepare media
0:00:10.296189384 4763 0x1d64a90 ERROR rtspclient rtsp-client.c:2899:handle_describe_request: client 0x1d6f090: no media

@neilyoung (owner) commented Oct 18, 2020

I don't think your USB cam is able to deliver H.264 in hardware, so the entire source pipeline would need to look completely different. It is not sufficient to just replace the source as long as the hardware (the USB cam) does not provide the H.264 required by the rest of the pipeline. There are USB cams capable of doing this, but obviously not yours...

I haven't checked this reference completely (https://stackoverflow.com/questions/29236209/why-can-i-stream-h264-encoded-video-from-webcam-to-both-display-and-file-but-no), but basically this is the idea: get raw frames from the camera (preferably YUV) and pipe them to a software encoder on the Pi (H.264 or VP8)...
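
An untested sketch of that idea, assuming the camera delivers raw video on /dev/video0 (x264enc ships with the plugins-ugly package installed above; its bitrate property is in kbit/s):

./test-launch --gst-debug=3 "( v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 ! h264parse ! rtph264pay name=pay0 pt=96 )"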

@krambriw commented Oct 18, 2020

I don't think your USB cam will be able to deliver H.264 by hardware

Yes of course, you are right

Thanks anyway

@bobdavis512 commented Oct 19, 2020

Nice RTSP Raspberry Pi solution, thank you.

How can I configure a GStreamer source in OBS to capture the Raspberry Pi's RTSP stream?

The following attempts (and more) did not produce video in OBS Studio from the Raspberry Pi's IP address:
rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! video.
rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! video.
rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink ! video.
rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false ! video.

On the same client computer, the following gst-launch-1.0 command works fine on the command line:
gst-launch-1.0 -v rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

Any suggestions?

Thank you,
Bob512

@neilyoung (owner) commented Oct 19, 2020

I'm not familiar with OBS Studio, but since it is an app, you might try replacing "autovideosink sync=false" with "appsink", in the hope that there is some instance within OBS Studio capable of dealing with it. Just a wild guess...

@neilyoung (owner) commented Oct 19, 2020

BTW: What is that "! video" element? Is this required by OBS?

@sukrugorgulu commented Oct 19, 2020


I already found a solution. Thanks for the reply. If I need an alternative in the future, I will check this one.

@bobdavis512 commented Oct 19, 2020

BTW: What is that "! video" element? Is this required by OBS?

Yes, ! video. is required by OBS. Likewise, if using a GStreamer source for audio, ! audio. is required.

For example:
Gstreamer source test in OBS
videotestsrc is-live=true ! video/x-raw, framerate=30/1, width=960, height=540 ! video.
audiotestsrc wave=ticks is-live=true ! audio/x-raw, channels=2, rate=44100 ! audio.

@bobdavis512 commented Oct 19, 2020

Another question: how do I get the client gst-launch-1.0 to open in a separate X window?

The following gst-launch-1.0 command opens the video, but not in a desktop X window with minimize/maximize options.

gst-launch-1.0 -v rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

Thanks for the support.
Bob512

@neilyoung (owner) commented Oct 19, 2020

I'm sorry, but I think the plain window supplied by autovideosink is all you can get. If you write your own GStreamer client app, you are free to open your own windows.

@bobdavis512 commented Oct 19, 2020

On my client, although the video stream is working, it is nonetheless generating a GStreamer-CRITICAL message.

gst-launch-1.0 -v rtspsrc location=rtsp://raspberrypi:8554/test latency=0 buffer-mode=auto ! decodebin ! videoconvert ! autovideosink sync=false

Returns and repeats ...
(gst-launch-1.0:15567): GStreamer-CRITICAL **: 16:55:58.976: gst_buffer_resize_range: assertion 'bufmax >= bufoffs + offset + size' failed

Any suggestion?
Thanks,
Bob512

@neilyoung (owner) commented Oct 19, 2020

No, unfortunately not. I'm not deep enough into GStreamer to be able to help you out here, but it seems to be a common problem (I searched quickly on Google).
