@velovix
Last active April 11, 2024
The text version of my GStreamer talk at sunhacks 2020

Introduction

Hi everyone! Today I'm going to be giving you a crash course in video processing using Python. Coming out of this talk, you'll be able to take video from pretty much any source, decode it, apply visual effects, and display it on-screen. To do this, we're going to be using a library named GStreamer, an incredibly powerful and versatile framework. This is the same tool that the pros use, but don't feel intimidated! GStreamer actually makes it very easy to do impressive things with video and you'll be well on your way to making something great in just the time it takes to watch this talk.

If you fall behind at any point during the live presentation, don't worry! I have a text version of this talk available with the same content and more. There should be a link in the description.

Installing Dependencies

Let's start by installing everything we'll need to start using GStreamer in Python. This is probably the hardest part, so if you managed to do it before this talk, it's all smooth sailing from here! If not, no worries! I'm going to go through how to install everything on Windows 10 right here. I would recommend opening up the text version of this talk, because I have links to the stuff we'll be downloading and you'll probably want to copy and paste a few of the long commands we'll be running. If you're using macOS or Linux, you'll find separate instructions for how to install everything for those platforms there as well.

Windows 10

We're going to be using a tool called MSYS2 to download everything we need to get started. MSYS2 makes it easy to set up development environments on Windows.

Download the latest stable release of MSYS2 from the releases page. At the time of writing, the latest release (2020-09-03) is available here. Then run the installer, accepting all the defaults, but unchecking "Run MSYS2".

Once it's installed, start "MSYS2 MinGW 64-bit" from the Start Menu. This will open up the MSYS2 terminal.

Let's get MSYS2 up-to-date by running the following command:

pacman -Syu

After this command finishes, the terminal may close itself. Just open it right back up again!

Now, we're ready to install everything we need! The following command installs GStreamer, some plugins, Python, and the PyGObject library.

pacman -S mingw-w64-x86_64-gstreamer mingw-w64-x86_64-gst-devtools mingw-w64-x86_64-gst-plugins-{base,good,bad,ugly} mingw-w64-x86_64-python3 mingw-w64-x86_64-python3-gobject

Finally, you're going to need a text editor to write code! I use Visual Studio Code but you can use whatever you like. Even Notepad is fine!

macOS

Homebrew, a package manager for macOS, makes it easy to install everything we need for this project. To install Homebrew, simply run the following command in the terminal:

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"

Then, run this command to install everything we need:

brew install gstreamer gst-devtools gst-plugins-{base,good,bad,ugly} python@3 pygobject3

Ubuntu, Debian, elementary OS, Pop!_OS

Installing everything we need on Ubuntu and related operating systems is easy! Just run the following command in the terminal:

sudo apt install libgstreamer1.0-0 gstreamer1.0-plugins-{base,good,bad,ugly} gstreamer1.0-tools python3-gi gir1.2-gstreamer-1.0

Arch Linux, Manjaro

Like Ubuntu, installing everything on Arch Linux or Manjaro is just a matter of running the following command in the terminal:

sudo pacman -S gstreamer gst-plugins-{base,good,bad,ugly} python python-gobject

Digital Video Concepts

While you're waiting on everything to install, let's take a step back. A wise scholar once said: Before you decode the video, you must understand the video. You must be the video.

At a fundamental level, video is presented to viewers as a sequence of images shown one after another, fast enough for our eyes to perceive a moving picture. Pretty simple, right? Well, there's just one problem: storing all those thousands and thousands of images takes up a huge amount of space. Uncompressed, an average 10 minute YouTube video would require over 100 GB of storage, and a feature-length movie could take upwards of a terabyte! Where is our escape from this madness??
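To see where a number like "over 100 GB" comes from, here's a quick back-of-the-envelope calculation. The specific parameters (1080p, 30 frames per second, 3 bytes per pixel) are illustrative assumptions, not figures from the talk:

```python
# Back-of-the-envelope size of 10 minutes of *uncompressed* 1080p video.
width, height = 1920, 1080   # pixels per frame
bytes_per_pixel = 3          # 8-bit red, green, and blue values
fps = 30                     # frames per second
seconds = 10 * 60            # a 10 minute video

frame_size = width * height * bytes_per_pixel  # ~6.2 MB per frame
total_bytes = frame_size * fps * seconds

print(f"{total_bytes / 1e9:.0f} GB")  # → 112 GB
```

Compression formats like the ones below routinely shrink this by a factor of 100 or more.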

Well, luckily, mathematicians and computer scientists have found many clever and sophisticated ways to compress video data down to a fraction of its original size. These researchers have turned their work into standards that define exactly how their technology works and how video data that is compressed this way can be decoded. We call these standards "video compression formats", and some popular ones include H.264, VP8, and AV1 among many others.

All the while, other smart people needed to figure out in what way the compressed video data should be saved to a video file, or split into chunks and streamed over the internet. This resulted in the development of special formats that hold both the compressed video data and additional information, like the title of the video, its resolution, and other stuff. We call these "container formats", and some popular ones include MPEG-4 and WebM.

So, in the end, you use a video camera to record something, those raw images get compressed in a video compression format, and once you're done your video is wrapped up with a nice little bow using a container format. Now, the video file is ready to be stored on your computer or streamed out for all the world to see.

GStreamer Concepts

Now that we know how video works, we can begin to understand how GStreamer lets us work with it. Working with GStreamer is kind of like creating an assembly line in a factory. Each step in the assembly line is in charge of doing one thing, and the results of one step are passed on to the next step until the process is complete. GStreamer calls this assembly line a "pipeline", and the steps are known as "elements".

Every pipeline starts with a source element, has some number of elements that process the data in the middle, and ends with a sink element. The source element is in charge of getting video data from somewhere, like a file on your computer or a video stream hosted online. That data is then passed to the next element, which does some processing on the data, and the result is passed on to the next element in the pipeline and so on. Finally, the fully processed data is passed to the sink element, which will take care of making the data available somewhere. That might involve saving it to your computer, hosting it as a live video stream, or passing it back to your application.

GStreamer has a lot of elements that do all kinds of different things. Each one has a name that we refer to it by, and certain rules governing what kinds of data it can take as input and what it produces as output.

Now, putting together one of these pipelines might sound hard, but GStreamer makes it pretty easy. All you have to do is give GStreamer a string with the names of each element you want in your pipeline, separated by exclamation marks. And that's it! GStreamer will take care of creating these elements and attaching them to each other.
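In other words, a pipeline description is just element names joined by " ! ". A tiny sketch of my own (the element names here are real GStreamer elements, but any list of names works the same way):

```python
# A pipeline description is element names joined by " ! ".
# Properties, when needed, go right after an element's name (e.g. "appsink name=sink").
elements = ["videotestsrc", "videoconvert", "autovideosink"]
description = " ! ".join(elements)

print(description)  # → videotestsrc ! videoconvert ! autovideosink
```

This same string syntax works with the gst-launch-1.0 command-line tool, which is handy for testing pipelines before writing any Python.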

Let's Get to the Code

With all that background in mind, let's jump into the code. Now, we're going to be writing in Python, but the GStreamer library is written in C, so we're going to be using what's called a "binding". A binding is a layer that lets you call a library written in one language from another language. GStreamer's Python binding library is called PyGObject, and we import it like this:

import gi

Now we need to tell PyGObject the minimum version of GStreamer that our program requires, which the library shortens to "Gst". Once we've done that, we're ready to import the "Gst" module, as well as the "GLib" module which we will use shortly. Make sure to call Gst.init() to initialize GStreamer before doing anything else.

gi.require_version("Gst", "1.0")

from gi.repository import Gst, GLib


Gst.init()

After that, we need to start the main loop. The main loop is in charge of handling events and doing some other background tasks. Here we'll start it in a new thread, so that we can do other things in our main thread.

from threading import Thread

main_loop = GLib.MainLoop()
thread = Thread(target=main_loop.run)
thread.start()

Finally, we're ready to construct a simple pipeline! Like I mentioned earlier, all pipelines start with a source. Which source element we use will depend on where we want to get our video from. For now, let's try getting video from our webcam. On Windows, we can use the ksvideosrc element to do this. If you're on macOS, try autovideosrc. For Linux, it's v4l2src.
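Since the source element differs per platform, one way to keep a single script portable is to pick it from sys.platform. This helper is my own sketch, not part of the talk's code:

```python
import sys

def webcam_source(platform: str = sys.platform) -> str:
    """Return the webcam source element for the given platform string."""
    if platform.startswith("win"):
        return "ksvideosrc"   # Windows (Kernel Streaming)
    if platform == "darwin":
        return "autovideosrc" # macOS
    return "v4l2src"          # assume Linux with video4linux2

print(webcam_source("win32"))  # → ksvideosrc
```

You could then build the pipeline string with f"{webcam_source()} ! decodebin ! videoconvert ! autovideosink".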

Then, we're going to follow that up with a decodebin element. This is a super helpful element that takes care of figuring out what container format and video compression format a source is providing, and handles decoding it for us into raw images.

Next let's add a videoconvert element, another handy tool that takes care of any format differences between the images that decodebin provides and what our next element expects.

Our pipeline is almost done! Just like how every pipeline starts with a source, they also end with a sink! Our sink of choice today will be autovideosink, which will display our webcam footage on-screen.

pipeline = Gst.parse_launch("ksvideosrc ! decodebin ! videoconvert ! autovideosink")

We've defined our pipeline, but we're not quite done yet! We still need to start the pipeline up. To do this, we use the set_state method, which asks GStreamer to initialize our pipeline and start playing it. All that work happens in the background, so we can continue doing whatever we want in our program.

pipeline.set_state(Gst.State.PLAYING)

For this example, all we're going to do is wait around while our webcam footage is being played on-screen until you stop the program. At that point, we'll ask the pipeline to stop and clean up by setting it to the NULL state. Then, finally, we'll stop the main loop we started earlier.

from time import sleep

try:
    while True:
        sleep(0.1)
except KeyboardInterrupt:
    pass

pipeline.set_state(Gst.State.NULL)
main_loop.quit()

Here's the example in full. Again, make sure to replace ksvideosrc with your platform's equivalent if you're not running on Windows.

from threading import Thread
from time import sleep

import gi

gi.require_version("Gst", "1.0")

from gi.repository import Gst, GLib


Gst.init()

main_loop = GLib.MainLoop()
thread = Thread(target=main_loop.run)
thread.start()

pipeline = Gst.parse_launch("ksvideosrc ! decodebin ! videoconvert ! autovideosink")
pipeline.set_state(Gst.State.PLAYING)

try:
    while True:
        sleep(0.1)
except KeyboardInterrupt:
    pass

pipeline.set_state(Gst.State.NULL)
main_loop.quit()

Having Fun

Now that we've got our example application running, we can add some cool filters to our webcam stream! Some personal favorites of mine are edgetv and rippletv. Just make sure to add a videoconvert before and after them to ensure that the element is getting images in a format it's compatible with.

pipeline = Gst.parse_launch("ksvideosrc ! decodebin ! videoconvert ! edgetv ! "
                            "videoconvert ! autovideosink")

Doing Your Own Thing with Video

GStreamer has a huge number of fun and useful elements for just about everything, but what if you wanted to do something custom? Maybe you want to implement your own special filter, or send the images off to another service or library. For these cases, GStreamer provides the appsink element, which allows you to take data out of the pipeline. Let's check it out.

We're going to use our original webcam pipeline, but replace the autovideosink with appsink. We'll also give the element a name so that we can retrieve it from the pipeline and interact with it.

pipeline = Gst.parse_launch("ksvideosrc ! decodebin ! videoconvert ! "
                            "appsink name=sink")
appsink = pipeline.get_by_name("sink")

Now, we can pull images out of the appsink using the try_pull_sample method.

try:
    while True:
        sample = appsink.try_pull_sample(Gst.SECOND)
        if sample is None:
            continue

        print("Got a sample!")
except KeyboardInterrupt:
    pass
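If you want to actually look at the pixels, you can map the sample's buffer and reshape the raw bytes into an image array. The helper below is my own sketch: it assumes the appsink is receiving raw RGB frames (which you'd arrange with a capsfilter like video/x-raw,format=RGB before the appsink), and the commented lines show roughly how the standard Gst calls (get_buffer, map, get_caps) would feed it:

```python
import numpy as np

def raw_rgb_to_array(data: bytes, width: int, height: int) -> np.ndarray:
    """Reshape raw RGB bytes (as mapped from a Gst.Buffer) into an image array."""
    return np.frombuffer(data, dtype=np.uint8).reshape(height, width, 3)

# Inside the pull loop, you'd feed it data from the sample, roughly like this
# (untested sketch; assumes caps of video/x-raw,format=RGB):
#
#   buf = sample.get_buffer()
#   ok, info = buf.map(Gst.MapFlags.READ)
#   caps = sample.get_caps().get_structure(0)
#   frame = raw_rgb_to_array(info.data,
#                            caps.get_value("width"),
#                            caps.get_value("height"))
#   buf.unmap(info)
```

From there, frame is an ordinary (height, width, 3) NumPy array you can hand to OpenCV, a neural network, or whatever else your project needs.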

That's all I'm going to show you about appsink in this talk, just to whet your appetite. But, I have some more examples in the text version of this talk if this sounds like something your hack needs.

Conclusion

And... that's a wrap! Thanks so much for listening, and I hope you found it enjoyable. If you have any questions, please feel free to reach out to me on the sunhacks Discord. Again, my name is Tyler and I should be marked as a "mentor". Happy hacking!

Extra Credit

My pipeline isn't working. How do I find out why?

When GStreamer encounters a problem, it prints an error message to the console. However, by default, these logs are hidden. To see them, we need to set the GST_DEBUG environment variable.

For example, if you're running your program like this:

python3 main.py

Run it like this, instead:

GST_DEBUG=2 python3 main.py

However, reading GStreamer logs can sometimes feel like an art form. Feel free to reach out to me if you're having trouble understanding what these logs are telling you!
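If you'd rather not type the variable every time, you can also set it from inside your script. This is my own suggestion, not from the talk: it only works if you set the variable before GStreamer is initialized, because GStreamer reads GST_DEBUG during initialization.

```python
import os

# Must run before Gst.init() — GStreamer reads GST_DEBUG when it initializes.
# setdefault() lets a value set in the shell still take precedence.
os.environ.setdefault("GST_DEBUG", "2")

# ...then the usual imports follow:
# import gi
# gi.require_version("Gst", "1.0")
# from gi.repository import Gst
# Gst.init()
```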

@ehHan commented Mar 18, 2021

First of all, thanks for your tutorial.
I've got the same successful result with MSYS!
Btw, there were some issues when I tried it with Anaconda.
I installed several packages like below

$ conda install -c conda-forge gstreamer
$ conda install -c conda-forge gst-plugins-base
$ conda install -c conda-forge gst-plugins-good
$ conda install -c conda-forge pygobject

and executed the same code, but it says:
gst_parse_error: no element "ksvideosrc" (1)

I guess it's because there is no 'gst-plugins-bad'. (I think I read that somewhere.)

Instead, I tried it with 'videotestsrc'.
It seems to work, but it doesn't open any window. (When I tried it with MSYS2, it showed the default test screen.)

Do you have any idea about it?

@velovix (author) commented Mar 18, 2021

Could you try running your code with the GST_DEBUG environment variable set to 2 or higher and send me the logs? I'm guessing we'll see an error of some kind.

Just a hunch, but if your pipeline looks like this: videotestsrc ! decodebin ! videoconvert ! autovideosink, you may want to try removing the decodebin element from the pipeline. videotestsrc produces raw video frames as output that don't need to be decoded beforehand.

@ehHan commented Mar 19, 2021

Thank you for your reply.
I removed 'decodebin' part but nothing changed.

And, I found something interesting with GST_DEBUG.

GST_DEBUG=2 python3 a1.py

It doesn't work with the Anaconda command.
However, it works with the MSYS command.

So, I added two lines in the code:

Gst.debug_set_active(True)
Gst.debug_set_default_threshold(3)

And it says,

0:00:00.089890000 13684 0000022839D7B2D0 ERROR GST_PIPELINE gst/parse/grammar.y:1121:priv_gst_parse_yyparse: specified empty bin "bin", not allowed
0:00:00.092168000 13684 0000022839D7B2D0 ERROR GST_PIPELINE gst/parse/grammar.y:938:priv_gst_parse_yyparse: link has no source [sink=@0000022839A0D840]
0:00:00.096156000 13684 0000022839D7B2D0 ERROR GST_PIPELINE gst/parse/grammar.y:1121:priv_gst_parse_yyparse: specified empty bin "bin", not allowed
0:01:11.301930000 13684 0000022839D7B2D0 WARN GST_ELEMENT_FACTORY gstelementfactory.c:467:gst_element_factory_make: no such element factory "ksvideosrc"!
0:01:11.302488000 13684 0000022839D7B2D0 ERROR GST_PIPELINE gst/parse/grammar.y:850:priv_gst_parse_yyparse: no element "ksvideosrc"
0:01:11.302789000 13684 0000022839D7B2D0 ERROR GST_PIPELINE gst/parse/grammar.y:938:priv_gst_parse_yyparse: link has no source [sink=@0000022839A0FEC0]
0:01:33.381410000 13684 0000022839D7B2D0 WARN autodetect gstautodetect.c:351:gst_auto_detect_find_best: warning: Failed to find a usable video sink
0:01:33.382057000 13684 0000022839D4F940 FIXME default gstutils.c:4026:gst_pad_create_stream_id_internal:videotestsrc1:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:01:50.066782000 13684 0000022839D7B2D0 WARN autodetect gstautodetect.c:351:gst_auto_detect_find_best: warning: Failed to find a usable video sink
0:01:50.067396000 13684 0000022839D4F880 FIXME default gstutils.c:4026:gst_pad_create_stream_id_internal:videotestsrc2:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:02:02.276641000 13684 0000022839D7B2D0 WARN autodetect gstautodetect.c:351:gst_auto_detect_find_best: warning: Failed to find a usable video sink
0:02:02.277123000 13684 0000022839D4FB40 FIXME default gstutils.c:4026:gst_pad_create_stream_id_internal:videotestsrc3:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:02:36.614642000 13684 0000022839D7B2D0 WARN autodetect gstautodetect.c:351:gst_auto_detect_find_best: warning: Failed to find a usable video sink
0:02:36.615274000 13684 0000022839D4F8C0 FIXME default gstutils.c:4026:gst_pad_create_stream_id_internal:videotestsrc4:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:04:52.176123000 13684 0000022839D7B2D0 WARN autodetect gstautodetect.c:351:gst_auto_detect_find_best: warning: Failed to find a usable video sink
0:04:52.176671000 13684 0000022839D4FA40 FIXME default gstutils.c:4026:gst_pad_create_stream_id_internal:videotestsrc5:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id

I guess something is not installed properly?
Have you tried this example in an Anaconda environment, or do you know anything about it?

Thank you in advance.

FYI) full code that I wrote

from threading import Thread

import gi

gi.require_version("Gst", "1.0")

from gi.repository import Gst, GLib
from time import sleep


Gst.init(None)
Gst.debug_set_active(True)
Gst.debug_set_default_threshold(3)

main_loop = GLib.MainLoop()
thread = Thread(target=main_loop.run)
thread.start()

pipeline = Gst.parse_launch("videotestsrc ! decodebin ! videoconvert ! autovideosink")
pipeline.set_state(Gst.State.PLAYING)

try:
    while True:
        sleep(0.1)
        print("ing")
except KeyboardInterrupt:
    pass

pipeline.set_state(Gst.State.NULL)
main_loop.quit()

@velovix (author) commented Apr 2, 2021

It looks to me like your Anaconda environment doesn't have gst-plugins-bad installed. After a quick Google search, I'm not sure if Anaconda has a package for the bad plugins. You may have to stick with MSYS2 or install gst-plugins-bad manually. I'm not very familiar with Anaconda, so I can't provide much guidance on how to do that second option.

@2014bcs007

I'm using MSYS2 on Windows 10, but I'm getting this error:

error: mingw-w64-x86_64-mpfr: signature from "David Macek david.macek.0@gmail.com" is unknown trust

@sharique413

I tried to run it, but I got this error:

Traceback (most recent call last):
File "/home/lenov/gstreamer/main.py", line 3, in
import gi
ModuleNotFoundError: No module named 'gi'

How can I fix it?

@velovix (author) commented Dec 14, 2021

@sharique413 Have you installed these packages to your system already? If so, what command are you using to run your code?

@wxjames commented Jan 25, 2022

Hi, this is a very nice write-up. Do you have any ideas on how to get streaming video data from RTSP using Python + PyGObject?

@velovix (author) commented Jan 26, 2022

@wxjames GStreamer makes RTSP streaming pretty easy. The main element you'll be interested in is rtspsrc, which takes care of connecting to an RTSP stream and converting it into RTP packets. You can usually just attach it to a decodebin element and get raw frames from there. Your pipeline will look something like this:

rtspsrc location={your url} ! decodebin ! videoconvert ! appsink

@callTx commented Feb 11, 2022

After using pipeline = Gst.parse_launch("dx9screencapsrc ! videoconvert ! dshowvideosink"), I get the following error: gi.repository.GLib.GError: gst_parse_error: no element "dshowvideosink" (1). How could I use winscreencap to grab the Windows screen?

Edit:
Googling around, I found this pipeline, which works great: dxgiscreencapsrc monitor=1 ! videoscale method=0 ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 min-threshold-time=0 ! videoconvert ! autovideosink

@velovix (author) commented Feb 11, 2022

@callTx Glad you found something that works! Generally speaking, when you want to get frames on screen, autovideosink is the right choice. It will take care of finding the best way to do it in the background.

@grigmaros

Hello @velovix ,

thanks for the crash course, excellent job.
I could use some help with what I am trying to do. I need to stream video from a Nao robot to my PC so I can use a deep learning model to recognize emotional state from the human faces Nao is seeing live.
As a first step, I decided to start with a simple task: just getting video input from an RTSP test link, following your instructions.
My code is as follows:
from threading import Thread

import gi
import time
gi.require_version("Gst", "1.0")

from gi.repository import Gst, GLib

Gst.init()

main_loop = GLib.MainLoop()
thread = Thread(target=main_loop.run)
thread.start()

pipeline = Gst.parse_launch("rtspsrc location={rtsp://mpv.cdn3.bigCDN.com:554/bigCDN/mp4:bigbuckbunnyiphone_400.mp4} ! decodebin ! videoconvert ! appsink")
pipeline.set_state(Gst.State.PLAYING)

try:
    while True:
        time.sleep(0.1)
except KeyboardInterrupt:
    pass

pipeline.set_state(Gst.State.NULL)
main_loop.quit()

However, there is no output... Any ideas about what I'm getting wrong?

Thanks in advance

Greg

@velovix (author) commented Mar 7, 2022

@grigmaros I think the issue is that you have curly braces before and after your RTSP URL, and GStreamer doesn't know how to parse that. You can either remove the curly braces, or wrap the URL in single quotes instead, which is what I would recommend. So, your pipeline would look like this:

rtspsrc location='rtsp://mpv.cdn3.bigCDN.com:554/bigCDN/mp4:bigbuckbunnyiphone_400.mp4' ! decodebin ! videoconvert ! appsink

To confirm this, you could try setting the GST_DEBUG environment variable to 3, and I'd bet that you'd see GStreamer complaining about a parsing error.

Let me know how this goes for you!

@zhouzhou0322

Hi there. At the end of your video, you mentioned that the text version has some examples of getting the samples working with NumPy. Where can I find them? Thank you.

@velovix (author) commented Mar 12, 2022

@zhouzhou0322 I planned on having that in this doc but I ran out of time before the original hackathon that this talk was for. I'll see if I can get that information on here in the next few days.

@wren93 commented May 12, 2022

Hi @velovix, thanks for the great tutorial. I tried to run the program on an m1 mac, but got the following error:

gi.repository.GLib.GError: gst_parse_error: no element "autovideosrc" (1)

I believe I had the required packages installed correctly as all the gst-plugins-{bad, base, good, ugly}, gstreamer and pygobject3 can be found using the command "brew list". I am able to run the command "gst-launch-1.0 autovideosrc ! decodebin ! videoconvert ! autovideosink" in terminal after adding "/Library/Frameworks/GStreamer.framework/Commands" to PATH so I guess this is also related to environment variable issues. Do you have any idea how to fix this? Thank you very much!

@velovix (author) commented May 16, 2022

@wren93 Admittedly, I'm not familiar with how brew sets up GStreamer environments, but I have some guesses.

Are you running the Python code in the same terminal that you're able to run gst-launch-1.0 in? If not, give that a shot. If it works in that terminal, it suggests your guess about environment variables is correct. If it is an environment variable issue, you will probably be able to fix it by setting the GST_PLUGIN_PATH environment variable in the place you're running the Python code in. This environment variable adds to the paths that GStreamer searches for plugins in.

I'm not sure where brew puts plugins, but you can find out where autovideosink's plugin is by running gst-inspect-1.0 autovideosink and looking at the "Filename" field under the "Plugin Details" section. For example, on my machine I see /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstautodetect.so in that field, so I would set the GST_PLUGIN_PATH environment variable to /usr/lib/x86_64-linux-gnu/gstreamer-1.0.

@Suryansh1089

I am trying to make a program using GStreamer so that I can have multiple sinks in a pipeline and easily switch between those sinks.
PLEASE HELP.

@velovix (author) commented Jun 6, 2022

@Suryansh1089 What have you tried so far? It seems to me like the output-selector element would do what you want, but I haven't personally used it before.

@JaisonJHH

@velovix, I tried the same code after watching your YouTube video, and it gives this error:
TypeError: 'MainLoop' object is not callable.

Nevertheless, the code gives the desired output and I was able to display the stream, but why do I still get this error??🥴
BTW pretty good explanation 👍

@velovix (author) commented Jun 26, 2022

@JaisonJHH I'm not sure why that would be happening... Does it work if you use GLib.MainLoop.new() instead of GLib.MainLoop()? Either one should be equivalent. Can you tell me more about your environment?

The example may appear like it's working, but some key GStreamer functionality won't work unless the main loop is running properly.

@JaisonJHH commented Jun 27, 2022


@velovix Okay, I'll try with MainLoop.new(). Regarding the environment, I just followed your YouTube video exactly on an STM32MP1 board.

By the way, I tried another method of implementing the same code and it worked without any issue:

import gi

gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

class Pipeline(object):
    def __init__(self):
        pipe_desc = ("rtspsrc location=rtsp://onviftest:onviftest@192.168.1.12:554/stream1 latency=0 buffer-mode=auto ! rtph264depay ! decodebin ! videoconvert ! autovideosink sync=false")

        self.pipeline = Gst.parse_launch(pipe_desc)


loop = GLib.MainLoop()

p = Pipeline()

try:
    p.pipeline.set_state(Gst.State.PLAYING)
    loop.run()

except KeyboardInterrupt:
    p.pipeline.set_state(Gst.State.NULL)
    loop.quit()

This works fine, give it a try. Also, can you provide a basic boilerplate for using webrtcbin? Is it just a kind of appsink for WebRTC, or can it actually create a peer?

@Suryansh1089

I have to generate an RTSP link for my camera so that I can stream the feed to VLC from a remote location. How can this be done using GStreamer?
Please guide me here.
Thank you for your tutorials


@velovix (author) commented Jul 19, 2022

@Suryansh1089 This is a big topic, but you're going to need gst-rtsp-server in order to do this. They have a few examples that show you how you can give the library a GStreamer pipeline, and it will take care of making the output of that pipeline available along with an RTSP URL.

@faridelya

Hi @velovix, what about a series of GStreamer Python tutorials?
We would be happy if you started one.

@velovix (author) commented Feb 3, 2023

@faridelya Sorry, I'm not sure what you're asking. The official tutorials are great and I don't intend to replace them! I did this talk as part of a hackathon so that I could answer participants' questions while the event was ongoing.

@franhidalgocavs

Hi, I need to stream from 3 X11 windows with 3 different stream keys, and I need to do it with 0 ms delay. Can I do it with the GStreamer command line tool?
PS: sorry for my English.

@velovix (author) commented Feb 21, 2023

@franhidalgocavs You may be able to use ximagesrc to accomplish this. You can set the xid parameter to select which window to use. You can then encode it in some format and stream it out using gst-rtsp-server. Achieving 0ms latency will be very difficult, though. Make sure to read up on the available parameters for whatever encoder you decide to use, as most of them have ways to tweak their behavior for real-time streaming.

@AhmedYasserrr commented Mar 4, 2024

That was awesome, thank you for sharing your knowledge! I hope you can do more advanced videos about GStreamer and other interesting topics, like working with cameras, computer vision models, etc.
