Gstreamer example preview, save and stream video
#!/bin/sh
# NVIDIA Jetson TK1
# Use GStreamer to grab the H.264 video and audio streams from a Logitech C920 webcam
# Preview video on screen
# Save Video and Audio to a file
# Send video as RTSP stream over TCP
# IP address of the machine hosting the TCP stream
IP_ADDRESS=<ENTER IP ADDRESS HERE, e.g. 10.10.10.10>
# You can list devices:
# $ v4l2-ctl --list-devices
VELEM="v4l2src device=/dev/video0" #video0 is a Logitech c920 webcam with built-in H.264 compression
# Video capabilities from the camera - choose one; this size will be sent out over the network
# VCAPS="video/x-h264, width=800, height=448, framerate=30/1"
# VCAPS="video/x-h264, width=1280, height=720, framerate=30/1"
VCAPS="video/x-h264, width=1920, height=1080, framerate=30/1"
# Video Source
VSOURCE="$VELEM ! $VCAPS"
# Decode the video - parse the h264 from the camera and then decode it
# Hardware accelerated by using omxh264dec
VIDEO_DEC="h264parse ! omxh264dec"
# Size of the preview window (optional; you can remove it by modifying VIDEO_SINK)
# Included here for demo purposes
PREVIEW_SCALE="video/x-raw, width=1280, height=720"
# VIDEO_SINK is the preview window
VIDEO_SINK="videoconvert ! videoscale ! $PREVIEW_SCALE ! xvimagesink sync=false"
#AUDIO
AELEM="pulsesrc device=alsa_input.usb-046d_HD_Pro_Webcam_C920_A116B66F-02-C920.analog-stereo do-timestamp=true"
AUDIO_CAPS="audio/x-raw"
AUDIO_ENC="audioconvert ! voaacenc"
ASOURCE="$AELEM ! $AUDIO_CAPS"
# FILE_SINK is the name of the file that the video will be saved in
# File is a .mp4, Video is H.264 encoded, audio is aac encoded
FILE_SINK="filesink location=gtest1.mp4"
# Address and port to serve the video stream; check to make sure ports are available and firewalls don't block it!
TCP_SINK="tcpserversink host=$IP_ADDRESS port=5000"
# Show the gst-launch command on the command line; can be useful for debugging
echo gst-launch-1.0 -vvv -e \
mp4mux name=mux ! $FILE_SINK \
$VSOURCE ! tee name=tsplit \
! queue ! $VIDEO_DEC ! $VIDEO_SINK tsplit. \
! queue ! h264parse ! mux.video_0 tsplit. \
! queue ! h264parse ! mpegtsmux ! $TCP_SINK \
$ASOURCE ! queue ! $AUDIO_ENC ! queue ! mux.audio_0
# first queue is for the preview
# second queue writes to the file gtest1.mp4
# third queue sends H.264 in an MPEG-TS container over TCP
gst-launch-1.0 -vvv -e \
mp4mux name=mux ! $FILE_SINK \
$VSOURCE ! tee name=tsplit \
! queue ! $VIDEO_DEC ! $VIDEO_SINK tsplit. \
! queue ! h264parse ! mux.video_0 tsplit. \
! queue ! h264parse ! mpegtsmux ! $TCP_SINK \
$ASOURCE ! queue ! $AUDIO_ENC ! queue ! mux.audio_0
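To view the stream from another machine, a client pipeline along these lines should work. This is a hedged sketch, not tested against this exact setup: `SERVER_IP` is a placeholder you must replace with the Jetson's `IP_ADDRESS`, the port must match `TCP_SINK` above (5000), and `avdec_h264`/`autovideosink` are assumed to be available on the client.

```shell
#!/bin/sh
# Hedged sketch: receive the MPEG-TS stream published by tcpserversink above.
# SERVER_IP is an assumption - replace with the Jetson's IP_ADDRESS; port matches TCP_SINK.
SERVER_IP=10.10.10.10
CLIENT_PIPELINE="tcpclientsrc host=$SERVER_IP port=5000 \
 ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false"
# Print the full command first (same debugging trick as the script above),
# then run it yourself: gst-launch-1.0 -v $CLIENT_PIPELINE
echo gst-launch-1.0 -v $CLIENT_PIPELINE
```

The `tsdemux` element undoes the `mpegtsmux` on the server side; since the camera already delivers H.264, the client is the first place the video is actually decoded.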
OferBaharUnisor commented Sep 18, 2017

Hello

I have a Jetson TX2 running Ubuntu 14.04, as does my host machine, and I am developing a C++ GStreamer application. I have the IDE (Eclipse Nsight) installed and working with remote debugging for CUDA programs and basic C++ programs, and I run many GStreamer pipelines successfully using gst-launch-1.0.

But I cannot compile against gstreamer-1.0 on my host when cross-compiling; it looks like the installed GStreamer library is built for x86 rather than arm64.

Do you have any experience with this?
I'd appreciate your help.

Regards

Ofer
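One common approach to the problem above is to point pkg-config at the target's arm64 libraries instead of the host's x86 ones. This is a sketch under stated assumptions: `SYSROOT` is a hypothetical path to a copy of the Jetson's root filesystem on the host, and the `aarch64-linux-gnu-g++` cross-compiler and the target's `gstreamer-1.0.pc` files are assumed to be present.

```shell
#!/bin/sh
# Hedged sketch: cross-compiling against the target's (arm64) GStreamer from an x86 host.
# SYSROOT is an assumption - point it at your copy of the Jetson rootfs on the host.
SYSROOT=$HOME/jetson-rootfs
# Make pkg-config resolve .pc files (and the paths inside them) from the sysroot,
# not from the host's own /usr/lib:
export PKG_CONFIG_SYSROOT_DIR=$SYSROOT
export PKG_CONFIG_LIBDIR=$SYSROOT/usr/lib/aarch64-linux-gnu/pkgconfig:$SYSROOT/usr/share/pkgconfig
# With those set, a cross-compile command would look like this
# (printed literally here rather than executed):
echo 'aarch64-linux-gnu-g++ myapp.cpp $(pkg-config --cflags --libs gstreamer-1.0)'
```

If pkg-config still reports the x86 flags after this, the host's own gstreamer development package is likely shadowing the sysroot; check which `.pc` file is found with `pkg-config --debug gstreamer-1.0`.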

akmalhisyam36 commented Jun 6, 2018

Hi Jetsonhacks,

I tried your pipeline with 'nvcamerasrc' as the video source and it works! I also modified the pipeline to record video and send the stream over the network by TCP only, without the preview queue. It goes like this:

gst-launch-1.0 avimux name=mux \
  ! filesink location=/media/nvidia/SSDJetson/test.mp4 nvcamerasrc fpsRange="30.0 30.0" \
  ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' \
  ! omxh264enc bitrate=14000000 control-rate=variable ! tee name=tsplit \
  ! queue ! h264parse ! mux.video_0 tsplit. \
  ! queue ! h264parse ! queue ! matroskamux \
  ! queue leaky=2 ! tcpserversink host=192.x.x.x port=7001

Thanks to your guide, I managed to launch this from the command line. I usually translate GStreamer pipelines into C/C++ code, but I'm still a beginner. However, this is the first time I have seen a pipeline of this form, where it starts with the muxer and the "filesink" and "nvcamerasrc" elements sit side by side in the same segment, "! filesink location=/media/nvidia/SSDJetson/test.mp4 nvcamerasrc fpsRange="30.0 30.0" !". I'm not familiar with this form of pipeline at all, but I want to translate it into code.

I have two problems: I don't know how to create the "filesink" and "nvcamerasrc" elements in the same 'gst_element_factory_make' call, and I don't know how to express "mux.video_0 tsplit." in code.

Do you have any examples of this? So far, I haven't managed to find any related code examples on the internet.

Thanks in advance!
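A note on the question above: the muxer-first ordering is just gst-launch description syntax, not two elements fused together. `name=mux` labels an element, a bare `tsplit.` resumes the description from the element named tsplit, and `mux.video_0` links into the pad named video_0 of the element named mux; in C the whole string can be handed to `gst_parse_launch()`, which understands exactly this syntax, instead of building each element with `gst_element_factory_make`. A sketch of the same structure rewritten in source-first order (hedged: `videotestsrc` and `fakesink` are substituted here so the sketch needs no camera or network, and the string is only printed, not executed):

```shell
#!/bin/sh
# Hedged sketch: the name=/dot syntax from the comment above, reordered source-first.
# videotestsrc replaces nvcamerasrc and fakesink replaces tcpserversink for illustration.
PIPELINE="videotestsrc num-buffers=300 ! x264enc ! tee name=tsplit \
 tsplit. ! queue ! h264parse ! mux.video_0 \
 tsplit. ! queue ! h264parse ! mpegtsmux ! fakesink \
 avimux name=mux ! filesink location=test.avi"
# "tsplit." resumes the description from the element named tsplit;
# "mux.video_0" requests and links the video_0 pad of the element named mux.
echo gst-launch-1.0 -e $PIPELINE
```

Because names let branches refer back to any element, the author's original ordering and this one describe the same graph.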
