So, I’ve been trying to stream audio off of a USB microphone connected to an Arduino Yun.
Looking into it online I found some examples using ffserver & ffmpeg, which sounded like they could do the trick.
However, right from the start I’ve had many problems playing the streams on Android and iOS devices.
It seems Android likes a certain list of codecs (http://developer.android.com/guide/appendix/media-formats.html) and iOS likes a different set (Link here), but they do have one codec in common – good ol’ MP3.
Unfortunately, the OpenWRT on the Arduino Yun has an ffmpeg build which does not provide MP3 encoding… it does have the MP3 muxer/container format, but streaming anything other than MP3 in it (for example MP2, which the Yun’s ffmpeg does have) simply doesn’t work on Android/iOS.
From experiments streaming an ffmpeg/libmp3lame MP3 stream from my PC, it looks like the mobile devices are quite happy with it – so I will need to recompile ffmpeg with Lame MP3 support to be able to stream it from the Yun.
Toolchain to build for the Yun
First we need to setup the toolchain for the OpenWRT-Yun.
Fortunately, the guys at Arduino have made this much simpler than the original OpenWRT toolchain building at: http://wiki.openwrt.org/about/toolchain
The instructions are here: https://github.com/arduino/openwrt-yun
I’ve setup a Debian Wheezy in a VirtualBox VM, and followed the instructions.
Quickly though, I discovered I hadn’t allocated enough space for the VM, so I had to extend it using GParted, following this: http://blog.mwpreston.net/2012/06/22/expanding-a-linux-disk-with-gparted-and-getting-swap-out-of-the-way/
That was painless.
I ended up needing 40 GB of space, not the 30 GB the Arduino guide says.
Apparently I also had to remove the ‘kissdx’ Makefile to make it compile successfully all the way through. That library is missing its SVN source repo…
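For anyone hitting the same failure, removing the package from the checked-out feed looked roughly like this (the exact path inside the feed is from memory, so locate it first):

cd ${OPENWRT}                               # the openwrt-yun checkout
find feeds -type d -name kissdx             # confirm where the package actually lives
rm -rf feeds/packages/multimedia/kissdx     # assumed path; use whatever 'find' reported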
Modifications to the ffmpeg package
So the ffmpeg package on OpenWRT-Yun does not include libmp3lame; I changed the package Makefile to add it. (You can pick up this diff and patch it – be sure to be in the ${OPENWRT}/feeds/packages/ directory so the patch paths line up.)
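Applying it would look something like this; the patch filename is just a placeholder for the diff, and the -p level has to match how the diff paths were generated:

cd ${OPENWRT}/feeds/packages
patch -p1 < ~/ffmpeg-lame.diff    # 'ffmpeg-lame.diff' is a placeholder name for the diff above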
The major difficulties were to:
– identify the right dependency library (which is ‘lame-lib’ and not ‘lame’, because only lame-lib supplies the libmp3lame.so.0 file that ffmpeg needs), and
– add the ‘eval’ section at the end.
Both changes were needed for the package to build properly and not fail at the end with “missing dependencies” on the libmp3lame.so.0 file.
To build just the package run:
make package/ffmpeg/compile
You can add “V=s” for verbosity
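If you edit the Makefile after an earlier build, forcing a clean rebuild of just this package is the usual OpenWRT buildroot idiom (a general buildroot convention, nothing Yun-specific):

make package/ffmpeg/{clean,compile} V=s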
Download the compiled libraries here:
Building, Uploading and Testing
Once it’s built, I had to move the packages over to the Yun with scp so they can be installed with opkg.
The packages end up in ${BASE_DIR}/bin/ar71xx/packages
And the packages to move are:
- ffmpeg_0.8.7-2_ar71xx.ipk
- libffmpeg-full_0.8.7-2_ar71xx.ipk
- ffserver_0.8.7-2_ar71xx.ipk
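Copying them over looks something like this; I’m assuming the Yun’s default hostname (arduino.local) and the root account, so adjust to your setup:

cd ${BASE_DIR}/bin/ar71xx/packages
scp ffmpeg_0.8.7-2_ar71xx.ipk libffmpeg-full_0.8.7-2_ar71xx.ipk ffserver_0.8.7-2_ar71xx.ipk root@arduino.local:/root/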
On the Yun, I installed ‘lame-lib’ from the ordinary repo:
$ opkg update
$ opkg install lame-lib
Installing lame-lib (398-2-3) to root...
Downloading http://downloads.arduino.cc/openwrtyun/1/packages/lame-lib_398-2-3_ar71xx.ipk.
Configuring lame-lib.
Then I installed the ffmpeg libs, starting with libffmpeg-full:
$ opkg install libffmpeg-full_0.8.7-2_ar71xx.ipk
Installing libffmpeg-full (0.8.7-2) to root...
Collected errors:
 * opkg_install_pkg: Package libffmpeg-full md5sum mismatch. Either the opkg or the package index are corrupt. Try 'opkg update'.
 * opkg_install_cmd: Cannot install package libffmpeg-full
Oh oh… it can’t install because opkg checks the md5sum against the cached package index, which describes the stock repository build rather than our locally-built one. Never mind, the solution is to wipe the Yun’s cached list of md5sums:
$ rm /tmp/opkg-lists/attitude_adjustment
Now, don’t run opkg update again or it will bring the md5sums back…
$ opkg install libffmpeg-full_0.8.7-2_ar71xx.ipk
Installing libffmpeg-full (0.8.7-2) to root...
Configuring libffmpeg-full.
Then install the rest the same way, using the direct filenames.
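For these two packages (assuming the .ipk files are in the current directory) that’s just:

$ opkg install ffmpeg_0.8.7-2_ar71xx.ipk
$ opkg install ffserver_0.8.7-2_ar71xx.ipk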
ffmpeg should now be happy about its libmp3lame encoder:
# ffmpeg -codecs | grep mp3
ffmpeg version 0.8.7, Copyright (c) 2000-2011 the FFmpeg developers
  built on Sep 29 2014 11:31:59 with gcc 4.6.3 20120201 (prerelease)
  configuration: --enable-cross-compile --cross-prefix=mips-openwrt-linux-uclibc- --arch=mips --target-os=linux --prefix=/usr --enable-shared --enable-static --disable-debug --pkg-config=pkg-config --enable-gpl --enable-version3 --disable-asm --disable-doc --disable-dxva2 --enable-pthreads --disable-optimizations --enable-small --disable-stripping --enable-zlib --disable-outdevs --enable-libmp3lame
  libavutil    51.  9. 1 / 51.  9. 1
  libavcodec   53.  8. 0 / 53.  8. 0
  libavformat  53.  5. 0 / 53.  5. 0
  libavdevice  53.  1. 1 / 53.  1. 1
  libavfilter   2. 23. 0 /  2. 23. 0
  libswscale    2.  0. 0 /  2.  0. 0
  libpostproc  51.  2. 0 / 51.  2. 0
 EA    libmp3lame
Huzzah!
Streamin’
So streaming MP3s is now easier, but not completely solved, because the Yun can’t handle the MP3 encoding fast enough in real time… WTF
I found a workaround. It sucks, because it won’t help with streaming a live feed from the microphone, but it will stream an MP3 file.
First I encode the MP3 with Lame offline, and then use -acodec copy on ffmpeg to make it simply copy the stream instead of transcoding it.
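The offline encode can be done on the PC, or on the Yun itself if the lame command-line frontend is available there (I’m not sure it is, so treat this as a sketch). The settings mirror the 32 kbps mono stream below, and input.wav stands for whatever source recording you have:

lame -b 32 -m m --resample 22.05 input.wav /mnt/sda1/sample.mp3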
The ffserver.conf file looks like:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
CustomLog -
NoDaemon

<Feed feed1.ffm>
  File /tmp/feed1.ffm
  FileMaxSize 200K
  # WORKAROUND!
  Launch ffmpeg -re -i /mnt/sda1/sample.mp3 -acodec copy
  ACL allow 127.0.0.1
</Feed>

<Stream test.mp3>
  Feed feed1.ffm
  Format mp3
  AudioCodec libmp3lame
  AVOptionAudio flags +global_header
  AudioBitRate 32
  AudioChannels 1
  NoVideo
</Stream>
Then simply running ffserver -f ffserver.conf -d will begin streaming.
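From another machine you can sanity-check the stream with something like the following (arduino.local is the Yun’s default hostname, so substitute the right address), or just open the URL in the phone’s browser:

ffplay http://arduino.local:8090/test.mp3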
For convenience, I also find it nice to set up an HTML file with a link to the stream, or an <audio> tag, to make it easier to open from a mobile browser, serving it either with python -m SimpleHTTPServer 8000 or Node.js.
Capturing sound
But my original plan was to stream from the microphone… wuuut?
So I’ve purchased a USB audio adapter (cheapest on Amazon: http://amzn.to/2tXy6tb) and the Yun is happy with it (after following: http://wiki.openwrt.org/doc/howto/usb.audio).
To set up ffmpeg to read the stream from the audio driver, I first needed to look up the format:
# cat /proc/asound/card0/stream0
C-Media USB Headphone Set at usb-ehci-platform-1.1, full speed : USB Audio

Playback:
  Status: Stop
  Interface 1
    Altset 1
    Format: S16_LE
    Channels: 2
    Endpoint: 1 OUT (ADAPTIVE)
    Rates: 48000, 44100
Capture:
  Status: Stop
  Interface 2
    Altset 1
    Format: S16_LE
    Channels: 1
    Endpoint: 2 IN (ASYNC)
    Rates: 48000, 44100
If I used ffmpeg with a wrong sample format it would simply say:
[alsa @ 0x78e410] cannot set sample format 0x10001 3 (Invalid argument)
So the right way is:
ffmpeg -f alsa -acodec pcm_s16le -ac 1 -ar 44100 -i hw:0,0,0 -acodec mp2 output.mp3
Notice the first -acodec pcm_s16le will match the S16_LE from the ALSA card description.
I’m also converting it to MP2 (in an .mp3 container) so it won’t take up too much space on the Yun’s little disk, but you can encode to anything else.
Also note the -i hw:0,0,0 matches my configuration, where the card gets ID 0, the device ID is also 0, and the stream ID is 0. In another configuration it could be different, so explore the /proc/asound/cards and /proc/asound/devices descriptor files.
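To tie the capture back to ffserver: in principle the live setup would drop the Launch workaround from the config above and feed the microphone straight into the feed URL with something along these lines (a sketch only; as the next section explains, the Yun couldn’t keep up with the real-time MP3 encode):

ffmpeg -f alsa -acodec pcm_s16le -ac 1 -ar 44100 -i hw:0,0,0 http://localhost:8090/feed1.ffm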
Discussion
With all the pieces in place, this could potentially stream the microphone in an MP3 format readable by both Android and iOS.
But, alas, the Yun can’t handle the Lame encoding…
I’ve tried streaming just about every other format and container the Yun’s ffmpeg knows (MP3/MP2, MP4/AAC, MPEGTS/…, WAV/PCM, OGG/Vorbis) and nothing works consistently with both Android and iOS; even Chrome and Safari on the PC differ – Chrome seems to like MP3/MP2 but Safari likes WAV/PCM…
The Yun also seems to be capped in terms of outgoing bandwidth; I’ve had no luck streaming at more than about 100 kbps. Luckily compressed streams can be as low as 32 kbps (22050 Hz, mono, s16), and even uncompressed PCM can fit if you drop to u8 at 8 kHz (8000 samples/s × 8 bits × 1 channel = 64 kbps).
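For what it’s worth, that ~100 kbps figure is from informal tests; a crude way to repeat them from a PC is to pull the stream for a fixed time and read off curl’s reported average download speed (again assuming the arduino.local hostname):

curl -o /dev/null -m 30 http://arduino.local:8090/test.mp3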
So, there you have it. I was partially successful, but I’ve learned a lot from messing around with the Yun.
Now it’s your turn to fix everything!
Enjoy
Roy.
2 replies on “FFMpeg with Lame MP3 and streaming for the Arduino Yun”
What was the audio quality from the microphone like?
Hi Roy,
I am attempting the same thing as you. Where did you find that the bandwidth of the Yun is capped? If it isn’t, it may be able to support a small number of listeners on a live WAV/PCM stream without delay.
I am also trying different codecs at the moment. The sound quality of the MP2 encoder at 64 kbps is okay, but I get delays on the client side every 10 to 20 seconds. Next up are AAC and MP3 using Lame.