Video and Audio Streaming from A20SOM-EVB using crtmpserver

To stream video and audio from the A20SOM-EVB, ffmpeg and crtmpserver are used. FFmpeg encodes the video and audio from the A20SOM-EVB's CSI camera and mic-in and feeds the stream to crtmpserver. Crtmpserver acts as the streaming server, providing an RTMP stream for Flash players and an RTSP stream for video players such as VideoLAN.

If the CSI camera is not yet functioning on the default Debian distro, please follow my previous article on how to enable the CSI camera on the A20SOM-EVB.

FFmpeg Installation

If we want H.264 hardware encoding support, we need to compile FFmpeg with the cedrus driver.
  • First, install the required libraries:
sudo aptitude install libvdpau-dev libx264-dev libmp3lame-dev libpulse-dev libv4l-dev libtheora-dev libvorbis-dev libopencore-amrnb-dev libopencore-amrwb-dev libvo-amrwbenc-dev libasound2-dev
  • Download and compile FFmpeg with cedrus support from git.
git clone https://github.com/Alcantor/FFmpeg.git -b sunxi-cedrus
cd FFmpeg
./configure --prefix=/usr --enable-nonfree --enable-gpl --enable-version3 \
--enable-vdpau --enable-libx264 --enable-libmp3lame --enable-libpulse \
--enable-libv4l2 --enable-libtheora --enable-libvorbis \
--enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc
make
sudo make install
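Once the build finishes, it is worth confirming that the cedrus encoder actually made it into the binary before going further. On the board you would pipe the output of ffmpeg -encoders into a check like the one below; the has_encoder helper and the sample input line are illustrative, not part of FFmpeg.

```shell
# Hypothetical helper: report whether a named encoder appears in
# `ffmpeg -encoders` output read from stdin.
has_encoder() {
  if grep -q "[[:space:]]$1[[:space:]]"; then
    echo "found"
  else
    echo "missing"
  fi
}

# On the board you would run:  ffmpeg -encoders | has_encoder cedrus264
# The printf line stands in for real output, so the check can be dry-run:
printf ' V..... cedrus264            Cedrus H264 encoder\n' | has_encoder cedrus264
```

If the result is "missing", the configure step most likely picked up a different FFmpeg source tree; re-check that the sunxi-cedrus branch was cloned.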

Crtmpserver installation

  • Install crtmpserver.
sudo aptitude install crtmpserver
Edit /etc/crtmpserver/applications/flvplayback.lua and adjust the configuration to look like the following:
application=
{
        description="FLV Playback Sample",
        name="flvplayback",
        protocol="dynamiclinklibrary",
        mediaFolder="/var/lib/crtmpserver/mediaFolder",
        aliases=
        {
                "simpleLive",
                "vod",
                "live",
                "WeeklyQuest",
                "SOSample",
                "oflaDemo",
        },
        acceptors =
        {
                {
                        ip="0.0.0.0",
                        port=6666,
                        protocol="inboundLiveFlv",
                        waitForMetadata=true,
                },
                {
                        ip="0.0.0.0",
                        port=554,
                        protocol="inboundRtsp"
                },
        },
        validateHandshake=false,
        keyframeSeek=false,
        seekGranularity=0.1,
        clientSideBuffer=30,
}
Edit /etc/default/crtmpserver, and change ENABLED="no" to ENABLED="yes".
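The ENABLED switch can also be flipped non-interactively with sed. The snippet below demonstrates the substitution on a temporary copy so it can be dry-run anywhere; on the board you would point sed at /etc/default/crtmpserver instead (this assumes the stock file contains the line ENABLED="no").

```shell
# Demonstrate the edit on a throwaway copy of the file.
tmp=$(mktemp)
printf 'ENABLED="no"\n' > "$tmp"

# On the board: sudo sed -i 's/^ENABLED="no"/ENABLED="yes"/' /etc/default/crtmpserver
sed -i 's/^ENABLED="no"/ENABLED="yes"/' "$tmp"

result=$(cat "$tmp")
echo "$result"    # ENABLED="yes"
rm -f "$tmp"
```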
  • Start crtmpserver.
sudo service crtmpserver start

Encoding and init script

Now we need to encode the video and audio from the A20SOM-EVB and feed the result to crtmpserver.
Plug a microphone into the mic-in port so that we can stream captured audio from the board.

Create the /etc/campipe directory, create a capture.sh file in it, paste the following code, and make the file executable with chmod +x:
#!/bin/bash

ffmpeg -f v4l2 -s hd720 -pix_fmt nv12 -i /dev/video0 -f alsa -i sysdefault:CARD=sunxicodec -pix_fmt nv12 -qp 20 -c:v cedrus264 -b:v 300k -r 30 -vewait 3600 -ar 44.1k -b:a 128k -c:a aac -strict -2 -f matroska - | ffmpeg -i - -c:a copy -c:v copy -f flv -metadata streamName=livestream tcp://0.0.0.0:6666
The script above captures video from the CSI camera on /dev/video0 and audio from the mic-in port, encodes them with the H.264 hardware encoder and the AAC audio codec, muxes them into a Matroska container, and pipes the stream to a second FFmpeg instance, which remuxes it into an FLV container and feeds it to crtmpserver. The stream must pass through the second FFmpeg instance because otherwise the video cannot be played by the client player (Flash player). I don't know why, but this is what I found after several hours of struggling with the problem.
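For experimenting, the same pipeline can be written with the options grouped into variables, so the resolution, bitrates, and ALSA device are easy to change in one place. This is only a readability sketch of the one-liner above: it echoes the assembled command instead of running it, so you can inspect it before launching.

```shell
#!/bin/bash
# Grouped form of the capture one-liner; change any variable and re-check
# the assembled command before running it for real.
VIDEO_IN='-f v4l2 -s hd720 -pix_fmt nv12 -i /dev/video0'
AUDIO_IN='-f alsa -i sysdefault:CARD=sunxicodec'
VIDEO_ENC='-pix_fmt nv12 -qp 20 -c:v cedrus264 -b:v 300k -r 30 -vewait 3600'
AUDIO_ENC='-ar 44.1k -b:a 128k -c:a aac -strict -2'
FEED='ffmpeg -i - -c:a copy -c:v copy -f flv -metadata streamName=livestream tcp://0.0.0.0:6666'

CMD="ffmpeg $VIDEO_IN $AUDIO_IN $VIDEO_ENC $AUDIO_ENC -f matroska - | $FEED"
echo "$CMD"
# To actually run the assembled pipeline:  eval "$CMD"
```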

Now, to run the script at boot time, we create an init script /etc/init.d/campipe with this content:
#!/bin/sh
### BEGIN INIT INFO
# Provides:          campipe
# Required-Start:    $network crtmpserver
# Required-Stop:     $remote_fs
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Piping and encoding camera to streaming server
# Description:       Piping and encoding using ffmpeg and send to streaming server
### END INIT INFO

# Author: Arieedzig 

PATH=/sbin:/usr/sbin:/bin:/usr/bin
DESC="Camera pipe"
NAME=campipe
DAEMON=/usr/bin/screen
SCREEN_NAME="Capture"
DAEMON_ARGS=" -dmS $SCREEN_NAME "
STOP_ARGS=" -XS $SCREEN_NAME quit"
DAEMON_CONF="/etc/campipe/capture.sh"
DAEMON_USER="root"
PIDFILE=/var/run/$NAME.pid
SCRIPTNAME=/etc/init.d/$NAME
ENABLED="yes"

#[ -r /etc/default/$NAME ] && . /etc/default/$NAME

[ -x $DAEMON ] || exit 0
[ -r $DAEMON_CONF ] || exit 0

[ $ENABLED = "yes" ] || exit 0

. /lib/init/vars.sh
. /lib/lsb/init-functions

do_start()
{
    if [ -f $PIDFILE ]; then
      RETVAL=2
      echo -n " is already running ($PIDFILE found)"
    else
        start-stop-daemon --start --quiet --background -m --pidfile $PIDFILE --exec $DAEMON --test > /dev/null \
                || return 1
        start-stop-daemon --start --quiet --background -m --pidfile $PIDFILE --exec $DAEMON -- \
               $DAEMON_ARGS $DAEMON_CONF \
                || return 2
        RETVAL="$?"
   fi
   return "$RETVAL"
}

do_stop()
{
    if ! [ -f $PIDFILE ]; then
      RETVAL=2
      echo -n " not running ($PIDFILE not found)"
    else
        # Ask screen to terminate the capture session cleanly.
        $DAEMON $STOP_ARGS
        RETVAL="$?"
        if [ $RETVAL -eq 2 ]; then
                start-stop-daemon --stop --quiet --retry=TERM/30/KILL/5 --pidfile $PIDFILE --name $NAME
                [ "$RETVAL" = 2 ] && return 2

                start-stop-daemon --stop --quiet --oknodo --retry=INT/30/KILL/5 --exec $DAEMON
                [ "$?" = 2 ] && return 2
        fi

        case "$?" in
                0|1) rm -f $PIDFILE ;;
        esac
     fi
     return "$RETVAL"
}

case "$1" in
  start)
    log_daemon_msg "Starting $DESC " "$NAME"
    do_start
    case "$?" in
                0|1) log_end_msg 0 ;;
                2) log_end_msg 1 ;;
        esac
        ;;
  stop)
        log_daemon_msg "Stopping $DESC" "$NAME"
        do_stop
        case "$?" in
                0|1) log_end_msg 0 ;;
                2) log_end_msg 1 ;;
        esac
        ;;
  status)
       status_of_proc "$DAEMON" "$NAME" && exit 0 || exit $?
       ;;
  restart)
        log_daemon_msg "Restarting $DESC" "$NAME"
        do_stop
        case "$?" in
          0|1)
                do_start
                case "$?" in
                        0) log_end_msg 0 ;;
                        1) log_end_msg 1 ;; # Old process is still running
                        *) log_end_msg 1 ;; # Failed to start
                esac
                ;;
          *)
                # Failed to stop
                log_end_msg 1
                ;;
        esac
        ;;
  force-reload)
        $SCRIPTNAME restart
        ;;
  *)
        echo "Usage: $SCRIPTNAME {start|stop|status|restart}" >&2
        exit 3
        ;;
esac

:
Then activate the init script at boot time:
sudo update-rc.d campipe defaults
and start it:
sudo service campipe start
At this point, we should be able to play the video and audio stream at this address:
rtsp://A20SOM_IP/live/livestream
using a player such as VLC.
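From a desktop on the same network, the playback URL can be assembled and handed to a player as sketched below. BOARD_IP is a placeholder; substitute your board's actual address.

```shell
# Replace with the A20SOM-EVB's address on your network (example value).
BOARD_IP=192.168.1.50

RTSP_URL="rtsp://$BOARD_IP/live/livestream"
echo "$RTSP_URL"

# Then play it with, for example:
#   vlc "$RTSP_URL"
#   ffplay "$RTSP_URL"
```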

Adding a web interface

To make it cooler, we add a web interface so we can watch the video in a web browser.
  • First, install a web server. This example uses Apache.
sudo aptitude install apache2
  • Download Flowplayer from their website and extract it into /var/www.
Edit index.html and change the content as follows:
<!doctype html>

<head>
   <!-- player skin -->
   <link rel="stylesheet" href="skin/minimalist.css">

   <!-- site specific styling -->
   <style>
   body { font: 12px "Myriad Pro", "Lucida Grande", sans-serif; text-align: center; padding-top: 5%; }
   .flowplayer { max-width: 1280px; }
   </style>

   <!-- flowplayer depends on jQuery 1.7.1+ (for now) -->
   <script src="//code.jquery.com/jquery-1.11.0.min.js"></script>

   <!-- include flowplayer -->
   <script src="flowplayer.min.js"></script>
</head>

<body>
   <!-- the player -->
   <div class="flowplayer" data-swf="flowplayer.swf" data-ratio="0.5625" >
      <video autoplay>
         <source type="video/flash" src="livestream">
      </video>
   </div>

<script>
flowplayer(function (api) {
  // work around autoplay live stream bug in Flash engine
  api.bind("ready", function (e, api) {
    api.resume();
  });
});

$(".flowplayer").flowplayer({
   live: true,
   rtmp: 'rtmp://'+window.location.host+'/live'
});
</script>
</body>
Now we should have our video stream playing in a web browser at this address:
http://A20SOM_IP
You can see that the stream has about one second of delay, but I think it is still acceptable. I have experimented with different FFmpeg combinations but have not succeeded in reducing it yet. When I try to use FFmpeg/arecord to capture video and audio independently and then combine them, I get video without delay, but the audio is either too early or too late. If someone knows how to improve this, please share with me.

You can get the scripts on my GitHub.


References

https://flowplayer.org/docs/setup.html#configuration
http://www.cubieforums.com/index.php/topic,2810.msg19893.html#msg19893
http://wiki.rtmpd.com/documentation


Comments

Unknown said…
Hi arieedzig,

I tried the CSI camera on the Olimex A20 board, which works fine at a frame rate of 15 fps, but I need to use a USB camera.
With the USB camera, both recording and streaming are not good; the frame rate is only 4 fps.
Could you tell me what ffmpeg command I need for streaming and recording?

Regards
Punith
Unknown said…
Hi,

When I tried to stream audio I got the
following error:

Unknown input format: 'alsa'
pipe:: Invalid data found when processing input

Regards
Punith
Unknown said…
hi Arieedzig,

I am able to stream both the USB camera and the CSI camera
fine, and I also solved the unknown input format 'alsa' error:
for that I installed libasound2-dev and compiled ffmpeg again.
But now I am not able to stream with audio; I run ffmpeg as shown below and my log is as follows.

ffmpeg -f v4l2 -s hd720 -pix_fmt nv12 -i /dev/video0 -f alsa -i sysdefault:CARD=sunxicodec -pix_fmt nv12 -qp 20 -c:v cedrus264 -b:v 300k -r 30 -vewait 3600 -ar 44.1k -b:c 128k -c:a aac -strict -2 -f matroska - | ffmpeg -i - -c:a copy -c:v copy -f flv -metadata streamName=livestream tcp://0.0.0.0:6666

log:

[cedrus264 @ 0x1bbdcf0] VE in use!
Output #0, matroska, to 'pipe:':
Stream #0:0: Video: h264, q=2-31, 128 kb/s, 30 fps
Metadata:
encoder : Lavc56.0.101 cedrus264
Stream #0:1: Audio: aac, 0 channels, 128 kb/s
Metadata:
encoder : Lavc56.0.101 aac
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (cedrus264))
Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native))
Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
pipe:: Invalid data found when processing input


I also tried the following command; it streams fine with audio, but no audio arrives on the client side.


ffmpeg -f v4l2 -s hd720 -pix_fmt nv12 -i /dev/video0 -f alsa -i sysdefault:CARD=sunxicodec -pix_fmt nv12 -qp 20 -c:v cedrus264 -b:v 300k -r 30 -vewait 3600 -ar 44.1k -b:c 128k -acodec libmp3lame -ab 96k -f matroska - | ffmpeg -i - -c:a copy -c:v copy -f flv -metadata streamName=livestream tcp://0.0.0.0:6666
arieedzig said…
I once had the same problem with audio, but I don't remember the fix. One of the containers, flv or matroska, is not able to stream mp3; try changing the container. As for the "VE in use!" problem, I think there is a problem with the video encoder driver compiled into ffmpeg: the driver thinks the hardware VE is still in use even though ffmpeg has already stopped. Try restarting, or combine the ffmpeg commands.
Unknown said…
Hi ArieedZig,

I found the cause of the problem: it is a root permission issue. Once I ran the same command with sudo I got the following log:


sudo ffmpeg -f v4l2 -s hd720 -pix_fmt nv12 -i /dev/video0 -f alsa -i sysdefault:CARD=sunxicodec -pix_fmt nv12 -qp 20 -c:v cedrus264 -b:v 300k -r 30 -vewait 3600 -ar 44.1k -b:c 128k -c:a aac -strict -2 -f matroska - | ffmpeg -i - -c:a copy -c:v copy -f flv -metadata streamName=livestream tcp://0.0.0.0:6666

LOG:
Codec AVOption qp (Constant quantization parameter rate control method) specified for output file #0 (pipe:) has not been used for any stream. .
Codec AVOption vewait (Time to wait if the VE is busy (default 0)) specified for output file #0 (pipe:) has not been used for any stream. The m.
Codec AVOption b (set bitrate (in bits/s)) specified for output file #0 (pipe:) has not been used for any stream. The most likely reason is eit.
[aac @ 0x26b94c0] The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
[lrc @ 0x1880460] Format lrc detected only with low score of 5, misdetection possible!
Input #0, lrc, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0:0: Subtitle: text
Output #0, flv, to 'tcp://0.0.0.0:6666':
Metadata:
streamName : livestream
Output file #0 does not contain any stream
root@a20-Lime2-SOM:/#

Do you have any idea how to fix this?
I am testing with the CSI camera now.
Also, will this command work for a USB camera?

Regards
Punith
arieedzig said…
You should use sudo for both ffmpeg commands, but I don't think that would work; try using the root user. For a USB camera it depends on your camera; you should adjust the parameters. You must change -s and -pix_fmt in "-s hd720 -pix_fmt nv12 -i /dev/video0".
Unknown said…
Hi Arieedzig,

I tried as root as well and the log is the same.
Could you tell me how to fix this?

Regards
Punith
arieedzig said…
Do you use my init script example? Try using that script and restarting the board, or try removing the -vewait parameter. I think it is better to discuss this in this forum: https://www.olimex.com/forum/index.php?topic=3871.0
