
Hardware

Logitech C920 h264 direct output

ffmpeg with -vcodec for input streaming

$ ffmpeg -s 1920x1080 -f v4l2 -vcodec h264 -i /dev/video0 \
                   -vcodec copy -f rtp rtp://192.168.0.10:8090/

## Microsoft LifeCam VX-2000

Motion Video: 640x480 pixel resolution

Actual settings used for the project (they can also be set with avconv):

$ v4l2-ctl -d /dev/video0 --list-formats-ext
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=352,height=288,pixelformat=1
$ v4l2-ctl -d /dev/video0 -V # query for camera output format

## BeagleBone Black rev B2

### Device Tree Overlay related issues

Since kernel 3.13, capemgr has been removed from the kernel and everything is easier. The steps:

Install a kernel > 3.13 (one of the "bone" variants):

> sudo apt-get install linux-image-3.18.5-bone1

Modify /boot/uEnv.txt:

Load the dtb file from /boot/dtbs/{uname -r} (for example the audio cape), then enable it while disabling the HDMI audio overlay, which conflicts with it.

Docs: http://elinux.org/Beagleboard:U-boot_partitioning_layout_2.0

    uname_r=3.18.5-bone1
    dtb=am335x-boneblack-audio-revb.dtb
    cmdline=quiet

    ##Example
    #cape_disable=capemgr.disable_partno=
    cape_enable=capemgr.enable_partno=BB-BONE-AUDI-02
    cape_disable=capemgr.disable_partno=BB-BONELT-HDMI

    ##enable BBB: eMMC Flasher:
    #cmdline=init=/opt/scripts/tools/eMMC/init-eMMC-flasher-v3.sh

    uuid=95a7ff4f-07ac-42cb-9123-360b930c88a6

## Audio cape rev B1 (kernel 3.8 with capemgr, device tree overlay)

arecord

A very comprehensive tutorial

List available devices
 > arecord --list-devices

Configure different devices for audio playback and capture:

Create a ~/.asoundrc file and set hw:[card],[device]

pcm.!default {
    type asym
    playback.pcm {
        type plug
        slave.pcm "hw:0,0"
    }

    capture.pcm {
        type plug
        slave.pcm "hw:1,0"
    }
}

For the root user, this file should also be present in the root home directory (/root/.asoundrc).

test the playback and capture

Test two channels in wav format

> speaker-test -c 2 -t wav

Record the sound from capture device to a wav file

> arecord -r 48000 -D hw:1,0 -c1 -f S16_LE -t wav -vv -d 10 /tmp/example.wav

the parameters can be found here.

## GPIO (e.g. button)

The GPIO pin needs to be exported to GPIOlib and set to in or out (high, low):

 ls -al /sys/class/gpio
 echo 60 > /sys/class/gpio/export # BBB rev B2 pinout P9_12 
 echo in > /sys/class/gpio/gpio60/direction
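Once exported and set as an input, the current level can be read back from the same sysfs node (gpio60 / P9_12 as above):

 > cat /sys/class/gpio/gpio60/value   # prints 0 or 1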

A comprehensive tutorial can be found here

Adafruit provides a Python package for GPIO (see tutorial)

 >>> import Adafruit_BBIO.GPIO as GPIO
 >>> GPIO.setup("P9_12", GPIO.IN)
 >>> GPIO.input("P9_12")

Listen for GPIO interrupts with Python's select/epoll, as in the sketch below.
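A minimal sketch of that approach, assuming the pin was already exported as gpio60 above and that its edge file has been configured:

    import select

    VALUE_PATH = "/sys/class/gpio/gpio60/value"   # exported above (P9_12)

    # Beforehand, tell the kernel which edges generate interrupts:
    #   echo both > /sys/class/gpio/gpio60/edge
    f = open(VALUE_PATH, "r")
    poller = select.epoll()
    # sysfs GPIO interrupts are reported as urgent/exceptional conditions
    poller.register(f.fileno(), select.EPOLLPRI | select.EPOLLERR)

    # the first event fires immediately with the current state; drain it
    poller.poll()
    f.seek(0)
    f.read()

    while True:
        poller.poll()                 # blocks until the next edge
        f.seek(0)
        print("pin level:", f.read().strip())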

Protocol

SIP tutorial

Reference (Video)

  • SIP
  • SIP Trunking

Software

  • libx264-dev for h264
  • libvpx-dev for vp8 webm

Install Gunicorn with Nginx

An official tutorial can be found here.

Be aware of:

  • Turn off proxy_buffering in Nginx (see the sketch after this list)
  • Use Upstart on Ubuntu 14.04
  • Use a Unix socket between Gunicorn and Nginx (same host)
  • Each Gunicorn worker creates its own instance of the app, so a "singleton" object is created once per worker
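A minimal Nginx site sketch covering the first and third points; the socket path /tmp/gunicorn.sock and server name are assumptions and must match how Gunicorn is started (e.g. gunicorn --bind unix:/tmp/gunicorn.sock app:app):

    upstream app_server {
        # Gunicorn listening on a Unix socket on the same host
        server unix:/tmp/gunicorn.sock fail_timeout=0;
    }

    server {
        listen 80;
        server_name example.com;

        location / {
            # disable buffering so streamed responses are not delayed
            proxy_buffering off;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_pass http://app_server;
        }
    }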

ffmpeg (or avconv) streaming

For some reason, ffmpeg does not recognize the -pixel_format parameter of video4linux2.

     > ffmpeg -f video4linux2 -vcodec mjpeg -video_size 160x120 -i /dev/video0 \
       -c:v libx264 -tune zerolatency -an -f flv rtmp://127.0.0.1/live/cam1 > /dev/null 2>&1 &
  • -an: no audio

  • -vcodec: the video compression format

    avconv -f video4linux2 -vcodec mjpeg -video_size 160x120 -i /dev/video0
    -c:v libx264 -tune zerolatency -an -f flv rtmp://127.0.0.1/live/cam1

Note: avconv seems more stable on the BBB. Be careful not to set -framerate too low, as it causes a longer streaming delay. The smaller the video size, the lower the latency.

Camera output

Check device format

 > avconv -f video4linux2 -list_formats all -i /dev/video0

Compile ffmpeg with libvpx support

  • Install libvpx

    sudo apt-get install libvpx-dev

  • Compile ffmpeg

    ./configure --enable-gpl --enable-libx264 --enable-libvpx --disable-yasm

The fastest (lowest CPU usage)

 > avconv -f video4linux2 -pix_fmt yuv420p -video_size 80x60 -i /dev/video0 \
   -c:v libx264 -an -f flv -preset ultrafast rtmp://127.0.0.1/live/cam1

Several examples

    > /usr/local/bin/ffmpeg -rtsp_transport tcp -i rtsp://admin:123456@172.16.0.201 -vcodec copy -f flv -muxdelay 0.1 -r 25 -s 400x268 -an rtmp://localhost/live/cam1

    > /usr/local/bin/ffmpeg -rtsp_transport tcp -i 'rtsp://172.16.0.202/user=admin&password=admin&channel=1&stream=0.sdp' -vcodec copy -f flv -an rtmp://localhost/live/cam2

Reference

SIP application server

### Asterisk

#### Configuration

##### Install from repository

 > apt-get install asterisk

##### Configure it as

  1. Create users in Asterisk (file /etc/asterisk/sip.conf)
  2. Add numbers for those users in the internal section of /etc/asterisk/extensions.conf; don't forget to reload after adding a user (see the sketch below)
  3. Configuration needed for the SIP JS client
  4. WebSocket connection troubleshooting on Asterisk 11, and a WebRTC issue
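A minimal sketch for steps 1 and 2 (chan_sip style; the extension number 6001 and the secret are made up, adjust contexts and security for a real setup):

    ; /etc/asterisk/sip.conf -- one peer per user
    [6001]
    type=friend
    context=internal
    host=dynamic
    secret=changeme

    ; /etc/asterisk/extensions.conf -- dialable number for that user
    [internal]
    exten => 6001,1,Dial(SIP/6001,20)

    ; reload after editing:
    ;   asterisk -rx "sip reload"
    ;   asterisk -rx "dialplan reload"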

### NkSIP

NkSIP is a SIP application server based on Erlang.

 nksip:start(c2, nksip_tutorial_sipapp_client, [],
             [{plugins, [nksip_uac_auto_auth]}, {from, "sip:c2@nksip"},
              {transports, [{udp, {127,0,0,1}, 5080}, {tls, {127,0,0,1}, 5081}]}
             ]).

## crtmpserver

Git repo with README about how to configure the server

Linphone

Reference for the command line tool linphonecsh and the Linphone configuration file .linphonerc
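A rough command-line session with linphonecsh (a sketch; the exact option names should be checked against the linphonecsh reference above, and the host, user, password and destination below are placeholders):

     > linphonecsh init                                        # start the linphonec daemon
     > linphonecsh register --host sip.example.org --username alice --password secret
     > linphonecsh generic "call sip:6001@sip.example.org"     # pass a linphonec command through
     > linphonecsh exit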

Compile the source of linphone

compile source

Configuration on Asterisk

  • The linphone SIP account configuration in Asterisk
  • Another configuration Part 2

# SIP.js browser side

WebRTC with Asterisk configuration

webRTC + cordova

Approach

Problems

No audio from SIP client to WebRTC with asterisk server in middle

### Stack Overflow threads

Attachment

Bluetooth C protocol stack

BlueZ

FFserver configuration file


HTTPPort 8090
#BindAddress 0.0.0.0
MaxHTTPConnections 20
MaxClients 10
MaxBandwidth 1000000
#NoDaemon

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 50M
ACL allow 127.0.0.1
</Feed>

<Stream live.webm>
Feed feed1.ffm
Format webm
NoAudio
VideoCodec libvpx
VideoSize 160x120
VideoFrameRate 24
AVOptionVideo flags +global_header
AVOptionVideo quality realtime
PreRoll 20
StartSendOnKey
#VideoBitRate 256
#VideoFrameRate 1/24
VideoBufferSize 0
</Stream>

<Stream stat.html>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 172.16.0.0 172.16.255.255
</Stream>
# Redirect index.html to the appropriate site

<Redirect index.html>
URL http://www.ffmpeg.org/
</Redirect>


ffmpeg webm setting

     > ffmpeg -f video4linux2 -video_size 80x60 -i /dev/video0 http://127.0.0.1:8090/feed1.ffm
     > ffserver -f configfile

Streaming with node, websocket and js

blog

streaming with crtmp

  • Something about streaming out RTSP
  • Great demo for streaming via WebSocket

Asterisk 13 Installation

  1. https://wiki.asterisk.org/wiki/display/AST/Building+and+Installing+pjproject
  2. https://kunjans.wordpress.com/2015/01/09/web-sip-client-sipml5-with-asterisk-13-on-centos-6-6/

Add Asterisk to Upstart (Ubuntu)

This link also gives a tutorial about Upstart.
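A minimal Upstart job along those lines (a sketch; the file path /etc/init/asterisk.conf and the asterisk binary path are assumptions for a default install):

    # /etc/init/asterisk.conf
    description "Asterisk PBX"
    start on runlevel [2345]
    stop on runlevel [016]
    respawn
    # -f keeps Asterisk in the foreground so Upstart can supervise it
    exec /usr/sbin/asterisk -f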

Dependencies

	 > aptitude install g++ build-essential git subversion wget

install pjproject

     > git clone https://github.com/asterisk/pjproject.git
     > cd pjproject
     > git checkout tags/pjproject-2.3
     > ./configure --prefix=/usr --enable-shared --disable-sound --disable-resample --disable-video --disable-opencore-amr
     > make dep && make && make install && ldconfig
     > ldconfig -p | grep pj
     # the remaining commands run in the Asterisk source tree (see the asterisk-13 checkout below)
     > ./configure
     $ ./configure --with-crypto --with-ssl --with-srtp # here or not
     > make menuselect # check packages
     > ./contrib/scripts/install_prereq install
     > ./contrib/scripts/install_prereq install-unpackaged
	 

     > aptitude install libsrtp-dev libjansson-dev libncurses-dev uuid-dev libgnutls-dev libneon27-gnutls-dev libsnmp-dev libsqlite3-dev sqlite3 libspeex-dev libgsm1-dev

     > svn checkout http://svn.asterisk.org/svn/asterisk/branches/13 asterisk-13
	 > aptitude install libsrtp-dev
	 > aptitude install libjansson-dev
	 > aptitude install libncurses-dev
	 > aptitude install uuid-dev
	 > aptitude install libgnutls-dev
	 > aptitude install libneon27-gnutls-dev
	 #> aptitude install linux-kernel-headers
	 > aptitude install libsnmp-dev
	 > aptitude install libsqlite3-dev
	 > aptitude install sqlite3
	 > aptitude install libspeex-dev
	 > aptitude install libgsm1-dev
	 #> aptitude install libpjsip2
	 #> aptitude install libpjsip-ua2
     > ./configure --with-crypto --with-ssl --with-srtp

Create the DTLS certificates (replace pbx.mycompany.com with your ip address or dns name, replace My Super Company with your company name):

$ ./ast_tls_cert -C pbx.mycompany.com -O "My Super Company" -d /etc/asterisk/keys
  1. compile with libsrtp problem
  2. install asterisk 13 on ubuntu 14.04
  3. asterisk configuration with SIP.JS

Motion detection

  1. A simple method with OpenCV, and low-cost camera computer vision software (see the sketch below)
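A minimal frame-differencing sketch of that idea with OpenCV (camera index 0, the blur kernel, and the thresholds are assumptions to tune):

    import cv2

    cap = cv2.VideoCapture(0)          # low-cost USB camera
    prev = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (21, 21), 0)
        if prev is None:
            prev = gray
            continue
        # difference against the previous frame, then threshold
        delta = cv2.absdiff(prev, gray)
        mask = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
        if cv2.countNonZero(mask) > 500:   # enough changed pixels => motion
            print("motion detected")
        prev = gray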

OverSIP

  1. configuration for WebRTC
