@nitheeshkl — Last active April 12, 2024

Using GigE cam as webcam for Zoom/Skype/Teams

TL;DR: A Gstreamer pipeline that reads camera frames from a GigE camera using the Aravis library and publishes them as a V4l2 camera source using V4l2loopback, so that video conferencing programs like Zoom/Skype/Teams can read them.

gst-launch-1.0 aravissrc blocksize=5013504 h-binning=1 v-binning=1 ! video/x-bayer,format=rggb,framerate=100/5,width=2448,height=2048 ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2 !  aspectratiocrop aspect-ratio=16/9 ! videoscale ! video/x-raw,width=1280,height=720 ! queue ! v4l2sink device=/dev/video0

The Basics

Gigabit-Ethernet (GigE) cameras are cameras that operate over a gigabit Ethernet interface, unlike the usual cameras that operate over USB. They receive their power via power-over-ethernet (PoE) and transfer frames as network packets over sockets. Examples: FLIR BlackFly, Basler Ace. These are generally industrial cameras that conform to and operate with the GigE Vision standard.

Video for Linux (V4l2), as the name suggests, is the default standard on Linux for image and video capture. V4l2 comprises a set of kernel drivers and userspace libraries that support capturing images/frames from different kinds of cameras. By default it captures from USB cameras and exposes a /dev/videoX character device. Most applications and libraries on Linux read camera images through this /dev/videoX device.

Gstreamer is one of the most popular multimedia frameworks on Linux. It provides a set of libraries, plugins, and binaries to process audio and video data. It models processing as graph-based pipelines and has modules supporting a vast array of image manipulation operations.

Aravis is a Linux-based open source project for working with GigE cameras. It provides a set of libraries and tools to configure and read data from GigE cams. It is based on GObject and provides Gstreamer plugins that can capture camera frames from a GigE camera.

V4l2Loopback: A loopback device in Linux is a dummy/virtual device used for testing purposes. It takes in some data as input and emits the same data as output. V4l2loopback is a kernel driver that creates a v4l2 loopback device, i.e., a dummy /dev/videoX device that reads in image data and outputs the same image data as a v4l2-compatible stream.


The Problem Statement

V4l2 has no native support for GigE cams. Plugging a GigE cam into a Linux system does not create a /dev/videoX device for standard camera capture applications to use. As a result, video conferencing applications like Zoom/Teams/Skype fail to recognize GigE cameras, since they only look for /dev/videoX as a camera source.

So, how can we make such video conferencing applications recognize and read frames from a GigE camera source?

The Solution

Since these video conferencing applications only recognise a v4l2 source, the only option is to convert the GigE source into a v4l2 source. To achieve this, a corresponding /dev/videoX device must be created, the data from the GigE source must be routed to this v4l2 device, and the data must be converted into a format the video conferencing applications can interpret.

Since v4l2 does not have any native support for GigE Vision, Aravis can be used for reading and controlling the GigE cam. V4l2loopback provides a /dev/videoX device that can receive the GigE camera frames as input and dump the same as output. However, the frames from the GigE cam have to be formatted correctly before they are written to the v4l2loopback device. Gstreamer can be used to perform this image processing and formatting. Since Aravis provides Gstreamer plugins, and Gstreamer has native support to read and write to a v4l2 device, the entire process can be expressed as a Gstreamer pipeline.

Therefore, we can build a Gstreamer pipeline that reads data from a GigE cam, converts it into a compatible format, and writes it to a /dev/videoX v4l2 device. Video conferencing applications can then read from this /dev/videoX device and function as though they were receiving image frames from a standard webcam/USB camera.


Implementation

This solution has been implemented and tested on an Ubuntu 20.04 system. The same configuration should work on most other Linux systems as well.

Setup

GigE Camera

GigE cams operate over PoE. Therefore, the camera must be connected to a PoE port on the system, or via a PoE router/switch. Use at least a CAT-5e cable for the connection. The GigE cam can be configured to use a static IP or a dynamic IP (DHCP). The camera manual and guides from the manufacturer provide details on how to configure the camera.
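As an illustration of the host side of this setup — the interface name and subnet below are assumptions for this sketch, not values from any camera docs — the camera-facing NIC can be given a static address on the camera's subnet, and jumbo frames enabled, which typically helps GigE Vision streaming:

```shell
# Assumed NIC name (eth1) and subnet; adjust to your network.
sudo ip addr add 192.168.1.1/24 dev eth1   # host address on the camera subnet
sudo ip link set eth1 mtu 9000             # jumbo frames reduce per-packet overhead
```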

For this setup, I have used a PointGrey (PtGry) BlackFly-S camera that operates at 24 fps, connected to the system using CAT-7 cables via a POE router.

Gstreamer

Install Gstreamer from source following the official docs, or install the pre-compiled binaries for your distro. It is important to install the good and bad plugin sets, and also the development libs, as they are needed to build and install Aravis.

sudo apt-get install libgstreamer1.0-0 gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio libgstreamer-plugins-base1.0-dev

Aravis

Aravis is not available as a pre-built package for Ubuntu and has to be built from source. Get Aravis from https://github.com/AravisProject/aravis (0.6 is the latest stable version at the time of writing). Install the required packages and dependencies to build Aravis.

sudo apt install libxml++2.6-dev libusb-1.0-0-dev libnotify-dev libaudit-dev libaudit1

Configure Aravis to build the Gstreamer plugin, as it is not enabled by default. Other features are optional and can be turned off. Do not enable gst-0.10-plugin; that is for Gstreamer 0.10, and we use Gstreamer 1.0. Compile and install as usual.

./configure --enable-usb --enable-packet-socket --enable-viewer --enable-gst-plugin --enable-zlib-pc --enable-fast-heartbeat --enable-cpp-test
make
make install

This should install libaravis-0.6.so to /usr/local/lib/ and the gst plugin libgstaravis.0.6.so to /usr/local/lib/gstreamer-1.0/. It also provides arv-viewer, a GUI application to view the camera frames, and arv-tool-0.6, a CLI program to query and configure the GigE cam.
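Before building any pipelines, it is worth confirming that Gstreamer can actually see the new element (paths as installed above):

```shell
# Point Gstreamer at the locally installed Aravis plugin and check
# that the aravissrc element is registered.
export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/:$GST_PLUGIN_PATH
gst-inspect-1.0 --exists aravissrc && echo "aravissrc found" || echo "aravissrc NOT found"
```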

V4l2 and V4l2loopback

Most standard distributions come with v4l2 installed by default. If not, follow your distribution docs to install and set up v4l2. V4l2loopback can be installed either from source or from pre-built packages; however, the pre-built package for my distribution did not work for me.

Download v4l2loopback from https://github.com/umlaeute/v4l2loopback. Build and install as shown in their docs.

make
make install

This should create and install the v4l2loopback.ko kernel driver.

(optional) Xawtv

While the Gstreamer plugins are sufficient to test reading from a v4l2 device and displaying the frames, I find xawtv handier for such testing. Install xawtv for viewing/testing.

sudo apt install xawtv

Testing GigE Cam

arv-viewer

Use arv-viewer to check that the camera comes up. A successfully set-up camera is shown with its IP address in the viewer. Selecting the camera shows camera properties like pixel format, size, etc. Click the play button to start capturing frames from the camera and show them in the viewer.

If successful, you should see a smooth flow of frames in the viewer at the default fps.

Note:

  1. Aravis supports Genicam-based models only.
  2. Exposure and Gain are set to auto by default. If you see a dark image, change the settings on the camera itself.
  3. The capture mode must be set to continuous for Aravis to capture and receive frames continuously.
  4. If the fps is low, it is probably due to the pixel format. Change it to an appropriate format that supports the required fps.
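For points 2 and 3, the settings can usually also be changed over the wire instead of on the camera body. A hedged sketch: the feature names below are the standard GenICam ones and may differ on your camera model; the address is the one used later in this setup:

```shell
# Set continuous acquisition and auto exposure/gain via arv-tool.
# Feature names (AcquisitionMode, ExposureAuto, GainAuto) are assumptions
# based on the GenICam standard feature naming convention; check
# "arv-tool-0.6 features" for your camera's actual names.
arv-tool-0.6 -a 192.168.1.17 control AcquisitionMode=Continuous
arv-tool-0.6 -a 192.168.1.17 control ExposureAuto=Continuous GainAuto=Continuous
```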

arv-tool

arv-tool enables us to query and set the camera parameters. GigE Vision provides a lot of tunable parameters. arv-tool-0.6 features lists all the available features. arv-tool-0.6 control reads/writes camera features. For example,

arv-tool-0.6 features
arv-tool-0.6 control Width Height
arv-tool-0.6 -a 192.168.1.17 control Width=128 Height=128 PixelFormat

See arv-tool-0.6 --help for more details.

For this setup, we use the following configuration

$ arv-tool-0.6 -a 192.168.1.17 control AcquisitionFrameRateEnable AcquisitionFrameRate AcquisitionResultingFrameRate Height Width GainAuto PixelFormat PixelSize PixelColorFilter BinningHorizontal BinningVertical PayloadSize
AcquisitionFrameRateEnable = false
AcquisitionFrameRate = 21.9515 Hz (min:1;max:22.0957)
AcquisitionResultingFrameRate = 21.9515 Hz
Height = 2048 (min:6;max:2048)
Width = 2448 (min:8;max:2448)
GainAuto = Continuous
PixelFormat = BayerRG8
PixelSize = Bpp8
PixelColorFilter = BayerRG
BinningHorizontal = 1 (min:1;max:2)
BinningVertical = 1 (min:1;max:2)
PayloadSize = 5013504

Note: Experiment with various PixelFormat, size, and other parameters that suit your needs. These values will be used to define the pipelines, as shown in the sections below.
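These values also cross-check each other: for an 8-bit format, PayloadSize is just width × height, and the resulting wire bandwidth explains why AcquisitionFrameRate maxes out near 22 fps on a gigabit link:

```shell
# PayloadSize check and back-of-envelope bandwidth for the values above.
width=2448; height=2048
bytes_per_frame=$((width * height))                 # 1 byte/pixel for BayerRG8
mbits_per_sec=$((bytes_per_frame * 8 * 22 / 1000000))
echo "$bytes_per_frame bytes/frame, ~$mbits_per_sec Mb/s at 22 fps"
# -> 5013504 bytes/frame (= PayloadSize), ~882 Mb/s, close to the ~1 Gb/s link limit
```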


V4l2loopback Device

A v4l2loopback device can be set up by loading the kernel module as follows

sudo modprobe v4l2loopback card_label="MyCam"

This creates a /dev/videoX device with the device name set to "MyCam", where 'X' is the next available number for a video device under /dev/ (for example, /dev/video1). See the v4l2loopback docs for more details.
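Since 'X' depends on what video devices already exist, a small sketch to find which device received the label, by reading sysfs:

```shell
# Print each video4linux device alongside its card label.
for dev in /sys/class/video4linux/*; do
    printf '%s -> %s\n' "/dev/$(basename "$dev")" "$(cat "$dev/name")"
done
```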


Pipelines

In the following sections, we will use the aravissrc gst element, provided by the libgstaravis.so plugin, to read from the GigE camera. GST_PLUGIN_PATH must be set correctly for Gstreamer to find and load this plugin. Therefore, each of the pipelines below presumes the gst plugin path is set as follows.

export GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/:$GST_PLUGIN_PATH

Basic read and display

This is the most basic pipeline. It reads from the aravis source and displays it without any other formatting in between.

gst-launch-1.0 aravissrc ! videoconvert ! xvimagesink

If this works, then you should be able to continue with the rest of the pipelines below. If not, review the instructions provided in the previous sections and ensure everything is setup correctly.

Note: If you receive a grayscale image instead of color, it is because aravissrc by default presumes the Mono8 pixel format. Color space conversion has to be performed according to the configured camera settings, e.g. by adding a caps filter such as video/x-raw,format=GRAY8 or video/x-bayer,format=rggb after aravissrc. More on this topic later.

Basic write to v4l2loopback device

The next step is to write the frames to the /dev/videoX v4l2loopback device that we set up earlier. This pipeline reads from the aravis source and writes it to the v4l2sink (v4l2 device) without any other modifications in between.

gst-launch-1.0 aravissrc  ! videoconvert ! v4l2sink device=/dev/video0

Use xawtv to see the output from /dev/videoX

xawtv -c /dev/video0

If the pipeline is successful, you should now be able to see the camera frames in xawtv. This confirms that the pipeline can read from the GigE camera source and write to the v4l2loopback device, which a camera application can then read and correctly display.

Note: If you notice a lot of frame drops in the pipeline, it is possibly due to read-write sync/race conflicts. This can be solved by introducing a queue element into the pipeline as shown below

gst-launch-1.0 aravissrc  ! videoconvert ! queue ! v4l2sink device=/dev/video0

PixelFormat conversions

The next step is to ensure correct pixel format conversion in our pipeline. Standard webcams generate frames in a YUV422 (YUY2) format, which is well recognized by most camera and video conferencing applications like Zoom/Skype/Teams. Therefore, the input to our v4l2loopback device must be in YUV422. The pixel format configured in the GigE cam can be obtained using arv-tool control PixelFormat. Gstreamer provides several elements, like videoconvert, to convert from one pixel format to another. Use the appropriate Gstreamer elements to convert from your camera's pixel format to the required YUV422 format.

In my case, the GigE camera provides images in an 8-bit BayerRG format. The following pipeline converts the GigE frames from Bayer to YUV422 format.

gst-launch-1.0 aravissrc ! video/x-bayer,format=rggb ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=UYVY ! queue ! v4l2sink device=/dev/video0

Use xawtv like before to see the output from /dev/videoX. If the pipeline is successful, you should now be able to see the camera feed with correct colors.

FPS, Image size, etc.

As a last step, we need to ensure aravissrc reads the correct blocksize of data corresponding to the pixel format and image size, and at the required fps. Use arv-tool to find the right parameters for your camera and update the pipeline to use these values.

My camera is configured as shown below:

$ arv-tool-0.6 -a 192.168.1.17 control AcquisitionFrameRateEnable AcquisitionFrameRate AcquisitionResultingFrameRate Height Width GainAuto PixelFormat PixelSize PixelColorFilter BinningHorizontal BinningVertical PayloadSize
AcquisitionFrameRateEnable = false
AcquisitionFrameRate = 21.9515 Hz (min:1;max:22.0957)
AcquisitionResultingFrameRate = 21.9515 Hz
Height = 2048 (min:6;max:2048)
Width = 2448 (min:8;max:2448)
GainAuto = Continuous
PixelFormat = BayerRG8
PixelSize = Bpp8
PixelColorFilter = BayerRG
BinningHorizontal = 1 (min:1;max:2)
BinningVertical = 1 (min:1;max:2)
PayloadSize = 5013504

The corresponding pipeline for these values is

gst-launch-1.0 aravissrc blocksize=5013504 h-binning=1 v-binning=1 ! video/x-bayer,format=rggb,framerate=21/1 ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2 ! queue ! v4l2sink device=/dev/video0

This should result in a smooth video capture and playback. The fps can be verified as follows

gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! fpsdisplaysink
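The blocksize in the pipeline above is not arbitrary: it must equal the camera's PayloadSize, which for an 8-bit pixel format is simply width × height. A quick check of both values used in this post:

```shell
# 1 byte per pixel for 8-bit formats, so blocksize = width x height.
echo $((2448 * 2048))   # full-resolution BayerRG8 -> 5013504
echo $((1224 * 1024))   # 2x2-binned Mono8         -> 1253376
```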

Fine tuning for applications

The last pipeline shown above achieves the solution described earlier. When tested with Zoom, Skype, and Teams, Zoom easily picks up the camera frames from our /dev/videoX ("MyCam") device and works as expected. However, Skype and Teams failed to recognize the camera feed from /dev/videoX. After further experimentation, it turned out that Skype and Teams can correctly read from /dev/videoX only if the frame has a standard aspect ratio, like 3:2 or 16:9. I could not find out why this limitation is enforced, so I updated the pipeline to crop and resize the frames to conform to such an aspect ratio.

The following pipelines use aspectratiocrop and videoscale elements to resize and scale the camera frames to a standard aspect ratio and size.

3:2 at 1080x720:

gst-launch-1.0 aravissrc blocksize=5013504 h-binning=1 v-binning=1 ! video/x-bayer,format=rggb,framerate=21/1,width=2448,height=2048 ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2 !  aspectratiocrop aspect-ratio=3/2 ! videoscale ! video/x-raw,width=1080,height=720 ! queue ! v4l2sink device=/dev/video0

16:9 at 1280x720:

gst-launch-1.0 aravissrc blocksize=5013504 h-binning=1 v-binning=1 ! video/x-bayer,format=rggb,framerate=21/1,width=2448,height=2048 ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2 !  aspectratiocrop aspect-ratio=16/9 ! videoscale ! video/x-raw,width=1280,height=720 ! queue ! v4l2sink device=/dev/video0

Note: Even though the images were scaled to a 3:2 or 16:9 aspect ratio, Skype & Teams failed to recognize any stream with an image size exceeding 2000x2000. Hence the above pipelines scale the frames down to smaller sizes.

16:9 at 1280x720 in Mono8:

gst-launch-1.0 aravissrc blocksize=1253376 h-binning=2 v-binning=2 ! video/x-raw,format=GRAY8,framerate=21/1,width=1224,height=1024 ! videoconvert ! video/x-raw,format=YUY2 !  aspectratiocrop aspect-ratio=16/9 ! videoscale ! video/x-raw,width=1280,height=720 ! queue ! v4l2sink device=/dev/video0

Launch Scripts (WIP)

With the pipelines working, the final step is to convert this Gstreamer pipeline into a launcher script that can be triggered when needed. I'm currently working on this, and will update in the near future.


Conclusion

"All of this is fine, but who in the real world has access to such industrial GigE cameras, and who'd want to use them as webcams?" - you ask. Well, that is true. Like most things in life, this too was done just for fun, since I had access to a GigE cam. It took me quite some time to dig up the details and get this working. During this research, I noticed the lack of documentation and examples online for working with such GigE cameras using open source libraries. While FLIR and Basler have their own SDKs, Spinnaker and Pylon respectively, I did not want to use their closed source products and wanted an open source alternative. Using these GigE cams as webcams for video conferencing was a fun use case. The higher motive was to identify an open source alternative like Aravis and showcase its capabilities in some real-world use cases.


PS: I've been using this setup successfully for video conferencing for the last two weeks and it is working spectacularly well; far better than I had imagined.

My only issue so far is that the camera picks up a flickering effect when the tube light is on at night. This flickering does not occur with an incandescent bulb or in daylight. I have no idea why this occurs or how to fix it. Perhaps it is due to the low fps of 21?!


Update

The flickering issue was due to a mismatch between the camera fps and the electrical mains frequency. Although I suspected this, I wasn't particularly confident because the math wasn't adding up. The key was finding out that many types of artificial lighting flicker at twice the rate of the power source. See https://www.red.com/red-101/flicker-free-video-tutorial for details. (Thanks Chitra & Mayank for pointing me to this source.) Here in India, the electrical frequency is 50Hz, so the tube lights flicker at 100Hz. Therefore, setting the fps to any factor of 100 does the trick. (Side note: this also explains why the Gstreamer framerate is specified as a fraction; rather than specifying 20/1, specifying 100/5, or any 100/x, is far more useful.)
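Spelled out as arithmetic (the mains frequency is region-specific; 50 Hz is assumed here):

```shell
# Lights flicker at twice the mains frequency; a flicker-free fps must
# divide that flicker rate evenly.
mains=50
flicker=$((mains * 2))   # 100 Hz for 50 Hz mains
fps=20                   # 100 % 20 == 0 -> whole flicker cycles per frame
echo "flicker=${flicker}Hz fps=${fps} cycles/frame=$((flicker / fps))"
```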

The Final pipeline:

gst-launch-1.0 aravissrc blocksize=5013504 h-binning=1 v-binning=1 ! video/x-bayer,format=rggb,framerate=100/5,width=2448,height=2048 ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2 !  aspectratiocrop aspect-ratio=16/9 ! videoscale ! video/x-raw,width=1280,height=720 ! queue ! v4l2sink device=/dev/video0

#!/bin/bash

CAM_NAME="kln_cam"
GIGE_ADDR="192.168.1.17"

# parse cmdline options
function usage() {
    echo "Usage: $0 [-a 3:2|16:9|raw] [-f mono|rgb] [-t fps] [-h]"
    exit 1
}

while getopts a:f:t:h flag; do
    case "${flag}" in
        a) resolution=${OPTARG};;
        f) format=${OPTARG};;
        t) test=${OPTARG};;
        h) usage;;
        :) echo "Invalid option: $OPTARG requires an argument" 1>&2; usage;;
    esac
done
shift $((OPTIND - 1))

# setup v4l2loopback device
if lsmod | grep -q v4l2loopback; then
    echo "removing v4l2loopback driver..."
    sudo rmmod v4l2loopback
    sleep 1
fi
echo "loading v4l2loopback driver with card_label=$CAM_NAME"
sudo modprobe v4l2loopback card_label=$CAM_NAME
sleep 1

# find the /dev/videoX device created by v4l2loopback
function findV4l2Cam() {
    local cam_name="$1"
    local _dev _name _index
    for _dev in /sys/class/video4linux/*; do
        _name=$(cat "$_dev/name")
        if [ "$_name" == "$cam_name" ]; then
            _index=$(cat "$_dev/index")
            echo "/dev/video${_index}"
            return 0
        fi
    done
    return 1
}

camdev=$(findV4l2Cam $CAM_NAME)
if [ -z "$camdev" ]; then
    echo "Couldn't find v4l2loopback device. Exiting!" >&2
    exit 1
fi
echo "using $camdev as v4l2loopback device..."

# Get GigE cam properties
echo "reading GigE cam properties..."
payloadsize=$(arv-tool-0.6 -a $GIGE_ADDR control PayloadSize | cut -d '=' -f2 | cut -d ' ' -f2)
echo "PayloadSize = $payloadsize"
h_binning=$(arv-tool-0.6 -a $GIGE_ADDR control BinningHorizontal | cut -d '=' -f2 | cut -d ' ' -f2)
echo "H-Binning = $h_binning"
v_binning=$(arv-tool-0.6 -a $GIGE_ADDR control BinningVertical | cut -d '=' -f2 | cut -d ' ' -f2)
echo "V-Binning = $v_binning"
width=$(arv-tool-0.6 -a $GIGE_ADDR control Width | cut -d '=' -f2 | cut -d ' ' -f2)
echo "Width = $width"
height=$(arv-tool-0.6 -a $GIGE_ADDR control Height | cut -d '=' -f2 | cut -d ' ' -f2)
echo "Height = $height"

# setup GST pipeline
gst_launch=/usr/bin/gst-launch-1.0
export GST_ARAVIS_PLUGIN=/usr/local/lib/gstreamer-1.0/libgstaravis.0.6.so
echo "using $gst_launch ..."
echo "loading $GST_ARAVIS_PLUGIN ..."
gst_cmd="$gst_launch --gst-plugin-load=$GST_ARAVIS_PLUGIN"

echo "building pipeline..."
pipeline="aravissrc blocksize=$payloadsize h-binning=$h_binning v-binning=$v_binning"
case "${format}" in
    "mono" | "MONO" | "Mono" | "mono8")
        echo "using PixelFormat = ${format}"
        # Mono8 frames are raw grayscale (video/x-raw), not Bayer
        pipeline="$pipeline ! video/x-raw,format=GRAY8,framerate=100/5,width=$width,height=$height ! videoconvert ! video/x-raw,format=YUY2"
        ;;
    "rgb" | "RGB")
        ;&
    *)
        echo "using PixelFormat = rgb"
        pipeline="$pipeline ! video/x-bayer,format=rggb,framerate=100/5,width=$width,height=$height ! bayer2rgb ! video/x-raw,format=RGBx ! videoconvert ! video/x-raw,format=YUY2"
        ;;
esac
case ${resolution} in
    "3:2")
        echo "setting output aspect-ratio = ${resolution}"
        pipeline="$pipeline ! aspectratiocrop aspect-ratio=3/2 ! videoscale ! video/x-raw,width=1080,height=720"
        ;;
    "16:9")
        echo "setting output aspect-ratio = ${resolution}"
        pipeline="$pipeline ! aspectratiocrop aspect-ratio=16/9 ! videoscale ! video/x-raw,width=1280,height=720"
        ;;
    "raw" | "Raw")
        ;&
    *)
        echo "setting output aspect-ratio = raw"
        ;;
esac
pipeline="$pipeline ! queue ! v4l2sink device=$camdev"

if [ -n "$test" ]; then
    test_cmd=""
    case "${test}" in
        fps)
            test_cmd="$gst_launch v4l2src device=$camdev ! videoconvert ! fpsdisplaysink"
            ;;
        *)
            echo "${test} : Invalid test option!"
            usage
            ;;
    esac
    echo "starting test..."
    gnome-terminal --window -t "Gige-to-V4l2 Pipeline" -- bash -c "$gst_cmd $pipeline"
    sleep 2
    gnome-terminal --window -t "${test} test" -- bash -c "$test_cmd"
else
    echo "starting pipeline..."
    echo "$gst_cmd $pipeline"
    $gst_cmd $pipeline
fi