I wish to capture an rtsp stream and convert it to an mjpeg (over http) stream using ffmpeg. I am running Ubuntu 20. I have searched and searched for the solution, and mostly find:
a) solutions requiring ffserver (deprecated)
b) solutions converting from mjpeg to rtsp
c) solutions converting from rtsp to hls (nginx, wowza, etc…) which doesn’t work in my application. I need http output as mjpeg.
d) vlc – which does work but requires way too much of my available processor (80%)
e) rtsp2mjpg – github project which I installed, but could not get to work and can’t get any support.
I am not an ffmpeg expert, so if someone could step me through an ffmpeg solution to this, if it exists, I’d really appreciate it.
2 Answers
I’ve very recently solved this myself, after finding the exact same things as you. The two parts you need are (1) the ffmpeg conversion in a script, and (2) something like lighttpd+cgi-bin or nginx+FastCGI to serve it over http/https. I don’t expect you’ll be able to do much better than vlc in terms of CPU use, though.
This bash script will do the ffmpeg conversion to MJPEG, and send the output to stdout. Put this in lighttpd’s cgi-bin folder (/var/www/cgi-bin for me). Call it something like "webcamstream", and adjust the rtsp:// URL to suit your camera:
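Something along these lines works as the script; this is a sketch, and the rtsp:// URL, credentials and quality value are placeholders you’ll need to change. The mpjpeg muxer writes multipart JPEG with the default boundary tag "ffmpeg", which is what the Content-Type header announces:

    #!/bin/bash
    # CGI/HTTP headers; the boundary must match ffmpeg's mpjpeg default ("ffmpeg")
    echo "Content-Type: multipart/x-mixed-replace;boundary=ffmpeg"
    echo ""
    # Pull the RTSP stream, re-encode it to MJPEG, and write multipart JPEG to stdout
    exec ffmpeg -nostdin -loglevel error \
        -i "rtsp://user:pass@192.168.1.10:554/stream1" \
        -c:v mjpeg -q:v 8 -an -f mpjpeg -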
Enable cgi-bin for lighttpd:
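On Debian/Ubuntu this is usually done with the bundled helper:

    sudo lighty-enable-mod cgi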
…and then adjust lighttpd’s cgi-bin configuration (/etc/lighttpd/conf-enabled/10-cgi.conf) as shown below. The stream-response-body setting is important: it both stops the stream when the client disconnects and avoids having lighttpd try to buffer the entire infinite stream before sending anything to the client.
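A sketch of that file, assuming the /var/www/cgi-bin path used above; your stock 10-cgi.conf may differ slightly, the key addition is the stream-response-body line (2 means don’t buffer, and kill the backend when the client goes away):

    server.modules += ( "mod_cgi" )

    $HTTP["url"] =~ "^/cgi-bin/" {
        alias.url += ( "/cgi-bin/" => "/var/www/cgi-bin/" )
        cgi.assign = ( "" => "" )
        server.stream-response-body = 2
    }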
Make the cgi-bin script executable and restart lighttpd:
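With the example path and script name above, that would be something like:

    sudo chmod +x /var/www/cgi-bin/webcamstream
    sudo systemctl restart lighttpd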
…and that should be it. You can then access the MJPEG stream at a URL like this, where the last part is your script’s name:
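For the example names above (substitute your own hostname and script name):

    http://yourserver/cgi-bin/webcamstream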
I’ve written it up in more detail here: Converting RTSP to HTTP on demand
As far as I can tell, you can’t avoid taking the CPU hit of the conversion: the stream coming over RTSP is encoded differently from MJPEG, so every frame has to be decoded and re-encoded. I reduced my CPU load by configuring the camera to lower the source framerate and resolution until the load on ffmpeg was acceptable. You can also change the resolution and framerate with ffmpeg arguments, but it still has to decode the full frames first and do the work of resizing.
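For example, adding a scale filter and a lower output framerate to the ffmpeg command in the CGI script (the values here are arbitrary) reduces the encoding cost somewhat, though the decode still happens at full size:

    ffmpeg -nostdin -i "rtsp://user:pass@192.168.1.10:554/stream1" \
        -vf scale=640:-2 -r 5 -c:v mjpeg -q:v 8 -an -f mpjpeg -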
The paths above are on Debian, so you may need to adjust them to suit your Ubuntu system.
Convert RTSP to MJPEG via FFSERVER
ffmpeg download:
Choose a version older than 3.4 (ffserver was removed starting with 3.4); 3.2.16 is recommended.
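For example, the source tarball from the official release archive (check the exact filename on the releases page):

    wget https://ffmpeg.org/releases/ffmpeg-3.2.16.tar.xz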
Compile FFSERVER
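A typical source build; this is a sketch, assuming you install into the /u/tool/ffserver prefix used in the path below:

    tar xf ffmpeg-3.2.16.tar.xz
    cd ffmpeg-3.2.16
    ./configure --prefix=/u/tool/ffserver
    make
    make install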
cd /u/tool/ffserver/bin
edit ffserver.conf:
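A minimal sketch of ffserver.conf; the port, feed name and stream names (camera.mjpg, snapshot.jpg, stat.html) are assumptions that the URLs further down reuse:

    HTTPPort 8090
    HTTPBindAddress 0.0.0.0
    MaxHTTPConnections 200
    MaxClients 100
    MaxBandwidth 10000

    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 5M
    ACL allow 127.0.0.1
    </Feed>

    <Stream camera.mjpg>
    Feed feed1.ffm
    Format mpjpeg
    VideoFrameRate 15
    VideoSize 720x404
    VideoBitRate 2048
    VideoQMin 1
    VideoQMax 10
    NoAudio
    Strict -1
    </Stream>

    <Stream snapshot.jpg>
    Feed feed1.ffm
    Format jpeg
    VideoFrameRate 2
    VideoSize 720x404
    NoAudio
    Strict -1
    </Stream>

    <Stream stat.html>
    Format status
    </Stream>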
Run FFSERVER
Pixel size error
Scaling 1280×720 at the same aspect ratio gives 720×405, but if you use VideoSize 720x405 the startup error message tells you to fix 405 to 404, so use VideoSize 720x404 instead.
FEED STREAMING
For the ffmpeg feed, do not use the system-packaged ffmpeg: the ffserver feed support was removed after version 3.4. Use the ffmpeg you just compiled, which sits in the same directory as ffserver, for example:
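The RTSP URL is a placeholder; feed1.ffm must match the Feed block in ffserver.conf:

    cd /u/tool/ffserver/bin
    ./ffmpeg -i "rtsp://user:pass@192.168.1.10:554/stream1" http://localhost:8090/feed1.ffm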
Browse mjpeg:
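With the sample configuration above (camera.mjpg is an assumed stream name), the stream URL looks like:

    http://<server-ip>:8090/camera.mjpg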
Browse snap image:
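Again assuming the snapshot.jpg stream from the sample configuration:

    http://<server-ip>:8090/snapshot.jpg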
Browse the ffserver status page:
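Assuming a status stream named stat.html as in the sample configuration:

    http://<server-ip>:8090/stat.html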
To keep the MJPEG image updating even if the RTSP source stops and restarts, reload the image path in JavaScript every N seconds (e.g. 15).
Run the server; add -d for debug mode:
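For example, from the install directory used above (-d is ffserver's debug mode):

    cd /u/tool/ffserver/bin
    ./ffserver -f ffserver.conf        # normal run
    ./ffserver -f ffserver.conf -d     # debug mode, logs to the console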
If you want to run ffmpeg in the background of a console session, pass -nostdin to ffmpeg, or run it in a terminal multiplexer such as screen or tmux, for example:
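A sketch, reusing the example feed command from above:

    # -nostdin stops ffmpeg from grabbing the terminal's stdin, so it can be backgrounded
    ./ffmpeg -nostdin -i "rtsp://user:pass@192.168.1.10:554/stream1" http://localhost:8090/feed1.ffm &

    # ...or keep it running in a detached tmux session instead
    tmux new -d -s feed './ffmpeg -i "rtsp://user:pass@192.168.1.10:554/stream1" http://localhost:8090/feed1.ffm'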