- Fresh Raspbian Install
- Build Dependencies
- Git, Make, Install
- Test The Build
- Tweaking configuration
- Running On Startup
After my recent move, the streaming and ‘workshop/den/laboratory’ setup has been a lot of work from the ground up on the computer system side. I used to have a Raspberry Pi 4 set up as a remote webcam covering my old place’s 3D printing area. While that worked, I never liked the idea of dedicating a whole Raspberry Pi 4 to the job, so I wanted to try a Raspberry Pi Zero W and see if it could handle the load. Utilizing the popular MJPG-Streamer package, I was able to get that installed and running, though I had a few hiccups which I’ll reference at the bottom.
This article is a combination of various articles I found on the web, each referencing older versions of one part or another, and serves as a reference for my setup should I need to rebuild it.
- Raspberry Pi Zero W: I purchased one of those kits on Amazon, and it came with the various cases and cables I needed, so I was happy with it.
- Logitech USB Webcam
Fresh Raspbian Install
I’m utilizing the latest version of Raspbian as of this writing, released in February 2020. Flashing that to a simple 16 GB Micro SD card gives a blank slate:
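The flash can be done with the Raspberry Pi Imager GUI, or from the command line with something like the following sketch. The image filename and the `/dev/sdX` device name are placeholders, not my actual values; double-check the device with `lsblk` before writing, since `dd` to the wrong disk is destructive.

```shell
# Identify the SD card's device name first (e.g. /dev/sdb, or /dev/mmcblk0)
lsblk

# Write the Raspbian image to the card (replace the filename and /dev/sdX)
sudo dd if=raspbian-buster-lite.img of=/dev/sdX bs=4M status=progress conv=fsync
```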
Always fresh, never frozen
Assuming you’re relatively familiar with the Pi, here’s the minimum I generally do as post-boot-up housekeeping:
- Changing default password
- Setting Up Wifi
- Enabling SSH
- Setting Hostname
- Setting Locale/Internationalization Options
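All of these can be done from the interactive raspi-config menus, or scripted with raspi-config’s non-interactive mode. A sketch, with placeholder hostname and Wi-Fi values (substitute your own):

```shell
# Interactive menu covers everything in the list above
sudo raspi-config

# Or non-interactively (hostname/SSID/passphrase below are placeholders)
sudo raspi-config nonint do_hostname cam-pi-zero
sudo raspi-config nonint do_wifi_ssid_passphrase "MySSID" "MyPassphrase"
sudo raspi-config nonint do_ssh 0    # 0 = enable the SSH server
passwd                               # change the default 'pi' password
```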
After that, the obligatory ‘update’ install:
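That is the usual apt pair, bringing the package index and installed packages up to date:

```shell
sudo apt-get update
sudo apt-get upgrade -y
```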
I can now SSH into the Pi Zero W, disconnect the keyboard, and move along with the install.
We need to download some things before we can build mjpg-streamer. I found references to articles that pulled from the SVN repo to retrieve and compile, but I ended up with a large number of compile errors that looked like some library had been swapped without the source being updated; it’s the ‘redefinition of ‘struct statx_timestamp’’ error.
Instead of pulling the SVN version, I’m going to be pulling the Git repo of mjpg-streamer that was created by jacksonliam. This (https://github.com/jacksonliam/mjpg-streamer) is the ‘official’ successor to the now abandoned SVN version at https://sourceforge.net/projects/mjpg-streamer/.
We’ll now install our dependencies before we download and compile mjpg-streamer code:
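The build needs git, cmake, a C/C++ toolchain, and the JPEG development headers. The exact libjpeg package name varies by Raspbian release (the repo’s README mentions libjpeg8-dev; on Buster the equivalent is libjpeg62-turbo-dev), so adjust to taste:

```shell
# Toolchain plus the JPEG headers mjpg-streamer compiles against
sudo apt-get install -y git cmake gcc g++ make libjpeg62-turbo-dev
```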
Git, Make, Install
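The build follows the steps in the jacksonliam repo’s README: clone, build the experimental tree, and install. On a Pi Zero the compile takes a few minutes.

```shell
git clone https://github.com/jacksonliam/mjpg-streamer.git
cd mjpg-streamer/mjpg-streamer-experimental
make
sudo make install
```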
The make install will copy binaries, libraries and the www pages to the /usr/local/ directory structure:
- /usr/local/bin/mjpg_streamer The primary binary
- /usr/local/lib/mjpg-streamer/ The directory of input/output modules
- /usr/local/share/mjpg-streamer/www The www server interface
Test The Build
I’m going to test the build by plugging in a webcam into the Pi Zero W’s lone USB port. Executing dmesg shows that it was loaded properly:
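A few quick ways to confirm the kernel picked the camera up (the exact log lines vary by camera model, so I won’t reproduce mine here):

```shell
lsusb                 # the webcam should appear in the USB device list
dmesg | grep -i uvc   # look for the uvcvideo driver claiming the device
ls /dev/video*        # the camera typically registers as /dev/video0
```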
Let’s test by running mjpg-streamer against the input_uvc.so plugin, since I’m not using a Raspi-Camera but a USB one instead. (Otherwise I’d use input_raspicam.so and enable the camera module in raspi-config.) I’ll also output using the HTTP plugin. So my testing command line looks like this:
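A typical invocation with those two plugins looks like this; the www path matches where make install puts the web interface, and LD_LIBRARY_PATH lets the binary find the plugin .so files:

```shell
export LD_LIBRARY_PATH=/usr/local/lib/mjpg-streamer
mjpg_streamer -i "input_uvc.so" \
              -o "output_http.so -w /usr/local/share/mjpg-streamer/www -p 8080"
```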
Executing that line gives us a dump of data; here’s the relevant info:
Our key variables to tweak are -f for frame rate, and -r for resolution. I’ll change those later, but for a quick test I head on out to my browser and point it at the www server at http://cam-pi-zero.local:8080/
I always feel like… somebody’s watching me…
Not only do we see the web interface, but a snapshot from the webcam, which was, clearly, lying on a workbench behind me while I tried this out.
Validating the stream worked (by clicking ‘Stream’) and turning around, I could see myself moving:
Hello you… come here often?
You can now hit CTRL-C in your SSH window (or terminal) and quit the stream. After this point, I could delete the build files I downloaded.
I wanted to run my camera at its intended resolution and at least a frame rate of 30fps. To do that I modified my command line:
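Resolution and frame rate are arguments to the input plugin (`-r` and `-f`), so the tuned line looks something like:

```shell
export LD_LIBRARY_PATH=/usr/local/lib/mjpg-streamer
mjpg_streamer -i "input_uvc.so -r 1920x1080 -f 30" \
              -o "output_http.so -w /usr/local/share/mjpg-streamer/www -p 8080"
```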
I then loaded my own stream up in VLC by opening the direct network path to the stream:
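The direct stream is the www endpoint with the stream action appended, which VLC can open as a network stream (the hostname here assumes the cam-pi-zero name set earlier):

```shell
# Media > Open Network Stream in the VLC GUI, or from a terminal:
vlc http://cam-pi-zero.local:8080/?action=stream
```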
That worked, and I get about a 1–2 second delay from Pi to VLC. So I wouldn’t use this with audio feeds unless I was planning on creating a sync delay. I’m using this to monitor 3D prints, or watch birds outside, so my use case does not demand low latency. Is it exactly 30 frames per second? No… no, it’s not.
At 1280×720, it seemed closer, but at 1920×1080, when set at 30fps, it definitely wasn’t even close. I’d guess more like 15.
The Pi wasn’t maxed out on CPU, but it could just be the nature of USB 2.0 at this point. I’m not sure how I can tell whether it’s overloaded there or not, but, once again, this is a light monitoring video stream; I don’t care too much about latency. It could be WiFi as well. Someday I may dig in a bit more and find out where the bottleneck is.
I think you could choose either the 1920×1080 15fps route or the 1280×720 30fps route and be okay.
Running On Startup
I liked Jacob Salmela‘s script; all I did was change the command line in the stop and restart sections for my resolution and frame rate. Save the following script by doing a sudo vi:
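I won’t reproduce Jacob Salmela’s script verbatim; as a minimal sketch of the shape of an init script saved at /etc/init.d/livestream, with my command line substituted in (treat this as an outline, not the original):

```shell
#!/bin/bash
# /etc/init.d/livestream -- minimal sketch of an init script for mjpg-streamer
### BEGIN INIT INFO
# Provides:          livestream
# Required-Start:    $network
# Required-Stop:     $network
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: mjpg-streamer webcam feed
### END INIT INFO

start() {
    export LD_LIBRARY_PATH=/usr/local/lib/mjpg-streamer
    mjpg_streamer -i "input_uvc.so -r 1920x1080 -f 30" \
        -o "output_http.so -w /usr/local/share/mjpg-streamer/www -p 8080" &
}

stop() {
    # mjpg_streamer runs as a single process; kill it by name
    pkill mjpg_streamer
}

case "$1" in
    start)   start ;;
    stop)    stop ;;
    restart) stop; sleep 1; start ;;
    *)       echo "Usage: $0 {start|stop|restart}" ;;
esac
```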
Then we enable this on startup by performing:
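For an init.d script that means marking it executable and registering it with the default runlevels:

```shell
sudo chmod +x /etc/init.d/livestream
sudo update-rc.d livestream defaults
```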
Now you can reboot your Pi, or use commands like ‘sudo service livestream start’. I rebooted my Pi, my feed was running, and running a status command on my service yields:
Using the Raspberry Pi for video streams is good enough if we’re looking for low-frame-rate monitoring without audio. I’ve yet to find anything that really gives me the frame rate and audio (regardless of latency) that a standard USB webcam plugged directly into my OBS machine would give. Maybe if NDI ever makes it onto the Raspberry Pi, or if there is ever SLDP support.
Have you tried streaming across the network on a Raspberry Pi with a USB webcam? Did you fare better than I? Let me know!
Nevertheless, with that, I’m done! I can now embed this into OBS via the VLC media source or anything that can handle an HTTP Motion JPEG video stream. I won’t have audio, but that’s okay for what I’m using this for.