Building a Video Baby Monitor with a Raspberry Pi and Wt

How it looks on the Raspberry Pi LCD touchscreen.

There are a lot of baby monitors out there on the market. But none of the ones with video really seemed like a good fit for us. Most either work over wifi, in which case who knows where else that video is going, or talk directly to a monitoring device, which then means keeping track of and charging another thing. I thought I could do a little better.

The task seemed straightforward enough: somehow get pictures from a camera pointed at the baby onto my local Raspberry Pi server, where they could then be viewed by anybody with a web browser on the LAN. This way, everything stays local to our home network and we can use our smartphones to check in on the little sleeper.

I tackled the receive-an-image-and-show-it-on-a-webpage side of the project first. For this I used the Wt C++ toolkit. Wt is kind of like Qt, but for the web: it makes designing layouts and interfaces feel like building any other GUI app. It's also possible to create REST endpoints with Wt, which meant I could receive images and serve the viewing interface from the same application. This simplified things a lot, since all the business logic could live in a single executable.

One cool side-effect of working with pictures is that it makes creating timelapses, like the one below, extremely easy. Watching a full-night timelapse is pretty entertaining: you get to see exactly how the little one ends up in such weird positions. Sometimes, the baby is most definitely not how you left it.

Timelapse shot using a few frames.

The second piece, the camera, is built from a Raspberry Pi Zero W with an infrared camera and a couple of IR LEDs, all mounted to a mini-tripod. All the camera does is take a picture every few seconds and send it, via an HTTP POST request, to the local server running the Wt application. Because all the important logic lives in the Wt application, the camera unit itself stays relatively dumb. This is important so that the camera can be unplugged or moved without affecting the viewer application. No need to worry about pulling the power cord at any time.
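For a sense of the shape of the camera script, here is a minimal sketch in R (the endpoint path, hostname, and interval are all assumptions, and the real unit could just as well run a shell or Python script):

library(httr)

# Hypothetical upload endpoint exposed by the Wt application
upload_url <- "http://server.local:8080/upload"

repeat {
  system("raspistill -o frame.jpg")                # grab a still from the Pi camera
  POST(upload_url, body = list(image = upload_file("frame.jpg")))
  Sys.sleep(5)                                     # wait a few seconds between frames
}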

Any web browser on the local network can access the viewing page. In practice this usually means a smartphone, but sometimes also a browser window set to 'always on top' on a PC. The viewer shows the most recent picture it received and the timestamp it was taken at. The timestamp is important so it's easy to tell if something has stopped working. The viewer also pushes the newest image to the browser over websockets and updates itself without any page reloads.

Overall, this turned out to be one of my more useful household projects. Both my wife and I have been using it to keep an eye on the little one while napping or in the evenings during bedtime.

Check out the full code on GitHub if you're interested in the specifics.

I should also note that this isn't - and shouldn't be - used as the main means of monitoring a child or baby. Our apartment is rather small, so we are always in earshot. This is not a safety device! It's simply to check in on how the little one is sleeping.




A Year of Job Postings from the Yukon Government

Preamble

Skip ahead to the next section if you just want to see some pictures

About a year ago, I unfortunately had to turn down a job in Whitehorse with the Yukon Government. So to make sure that I didn't miss any postings in the future, I wrote a short R script to scrape the Yukon's job listings page.

Every morning the script would run on my local Raspberry Pi server and check whether there were any new job listings. If there were, it would send me an email with some basic information like the title and department. In addition to the emails, the script saved new listings to an SQLite database for preservation. This is also how the script knew that a job post was new: it compared the scraped postings to those already saved in the database.

Just anecdotally, I noticed the daily emails stopped for a while around April. I guess recruitment was put on hold during the early days of the Corona lockdown. Once the emails did resume, it was mainly for medical positions like nurses.

Sadly, no suitable position popped up during the past year but, being a data analyst, I didn't want all the data I scraped to go to waste. So I thought it would be interesting to take a look at a few stats about Yukon Government job postings over the past year.

Note: Just to preface these numbers, when I talk about a new job listing what I mean is a job listing with either a new ID or a new closing date. I chose this definition because I noticed that sometimes the same job listing would get re-posted with a new closing date. Also, I think some job listings can be for multiple open positions. So a new posting doesn't necessarily equate to only a single job opening.
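As a rough sketch of how that check might look in R (the table and column names here are assumptions, not necessarily what the script uses):

library(DBI)
library(RSQLite)
library(dplyr)

con <- dbConnect(RSQLite::SQLite(), "yukon_jobs.sqlite")
saved <- dbReadTable(con, "postings")

# A scraped posting counts as new if its (id, closing date) pair
# isn't already saved in the database
new_postings <- anti_join(scraped, saved, by = c("id", "close_date"))

if (nrow(new_postings) > 0) {
  dbWriteTable(con, "postings", new_postings, append = TRUE)
  # ...and send the notification email from here
}
dbDisconnect(con)

Here scraped would be the data frame of postings pulled from the listings page that morning.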

Totals

In total, I collected 440 job postings throughout 2020. Collection ran from late January 2020 until the end of December 2020, so not quite a full year, but most of the postings from the beginning of January were probably picked up in the initial scrape. I would say it's roughly the full year.

51 job listings contained the title 'nurse' or 'RN', by far the most common. That means over 10% of job listings were for some type of nurse. Become a nurse if you want an easy time finding work in the Yukon.

36 job postings were re-listings (same ID but a different closing date). I don't know the specifics of why each posting was re-listed; there could be a lot of reasons. One reason, however, might be that the position couldn't be filled, which would make these difficult-to-fill positions.

Of these 36 re-postings, 7 (almost 20%!) were nurses or RNs. Did I already mention you should become a nurse if you want to find work in the Yukon? Several others were generally high-level positions like directors, managers, and supervisors. And some were more specialized jobs like Infection Control Coordinator.

Cities


It shouldn't be much of a surprise that the vast majority of job postings are for positions in Whitehorse, followed by Dawson City at a distant second.

Job postings that contained multiple locations got counted for each city. So the percentages shown add up to more than 100%. There were quite a few job postings applicable to more than one location.
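For the curious, that multi-location counting only takes a couple of lines of tidyr/dplyr; a sketch, assuming locations are stored as a comma-separated string in a location column:

library(dplyr)
library(tidyr)

city_counts <- postings %>%
  separate_rows(location, sep = ",\\s*") %>%  # one row per (posting, city) pair
  count(location, sort = TRUE) %>%
  mutate(pct = n / nrow(postings) * 100)      # denominator is total postings, so percentages can sum past 100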

Departments


Here we see the number of job postings aggregated by department. I left out sub-departments because there are too many and they would clutter the chart.

Similar to locations, departments are also quite skewed towards a few big ones. With Corona putting a freeze on a lot of hiring, and only health-related recruitment resuming at first, it makes sense that the Health and Social Services department had the most job postings over the past year. Highways and Public Works also had quite a few open positions this year. Education having so few job postings surprised me a little; I thought it would have been higher.

It would be very interesting to see how this compares to previous non-Coronavirus years.

All Job Postings

Just for fun, below is a sunburst plot of all the job postings I collected, grouped by department and then job title. Try clicking on a department to view all the job postings under it over the past year.
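For anyone wanting to recreate it, a sunburst like this can be built with the plotly package; here's a rough sketch, assuming a postings data frame with department and title columns:

library(dplyr)
library(plotly)

counts <- count(postings, department, title)

sunburst_df <- bind_rows(
  # inner ring: one slice per department
  counts %>% count(department, wt = n, name = "values") %>%
    transmute(ids = department, labels = department, parents = "", values),
  # outer ring: job titles nested under their department
  counts %>% transmute(ids = paste(department, title, sep = "|"),
                       labels = title, parents = department, values = n)
)

plot_ly(sunburst_df, ids = ~ids, labels = ~labels, parents = ~parents,
        values = ~values, type = "sunburst", branchvalues = "total")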

Data and Scripts

For anybody interested in the data I collected, or the scripts I used, visit the project on my GitHub. I have the job postings there in CSV format as well if somebody wants to take a look themselves.

One thing I didn't collect (that I really should have, had I had more forethought) was the day each job posting was listed. It would have been cool to see recruitment across different departments change drastically week by week during the lockdown.

Thanks for reading!




Rewriting LaserChess in the Godot Engine


Download link at the bottom of this post

I'm not a professional developer by any metric, but I do like to dabble. About 8 years ago now, I created a computer version of the board game Khet, an Egyptian-themed, chess-like game involving mirrors and lasers. I changed it to a space theme and gave it the creative title of LaserChess.

LaserChess was written with the now-ancient SDL 1.2 and targeted the GPH Caanoo open source handheld. SDL has since moved on, and this made trying to recompile it on a modern Linux installation a bit of a pain. So I took this as a fun excuse to dive a little bit into the Godot Engine. Godot is an open source 2D/3D game engine and editor.

With Godot's editor and scripting language, it was pretty easy to get a quick prototype of the basic gameplay functionality up and running. Implementing the game's logic in GDScript was also surprisingly easy. LaserChess being a turn-based board game with strict rules certainly helped make things a bit simpler. Most of my time was actually spent on the user experience, trying to make buttons and menus look fancy (I hope somewhat successfully).

LaserChess also features an AI opponent, built from scratch in C using well-established techniques from chess AI engines. Godot has a feature called GDNative which allows you to compile C code for integration into a Godot application. Using GDNative, I was able to re-use my old AI code in the Godot version of LaserChess. Despite the AI missing many optimizations, it still poses quite a challenge and I have yet to beat it on higher difficulties.

In the end, I was really impressed at what Godot was able to do with relative ease. Getting a functional prototype running was so quick that I can see it being a great tool for evaluating gameplay ideas without having to start something from scratch. GDNative is also a pretty attractive feature, letting you interface with so much other code; you're not restricted to staying within Godot's scripting language.

If you're interested in the game itself, give it a try on Windows or Linux.




Historical Linux Statistics from Steam's Hardware & Software Survey

Steam is a popular gaming storefront and platform on Windows, macOS, and Linux. Every month they publish their Hardware & Software Survey with overall summary statistics about their users. The data can also be broken down by operating system.

I wanted to see how the share of Linux users has changed over time on Steam. Unfortunately, the survey data is only ever available for the previous month. So I wrote a small R script (GitHub link) to scrape historical survey results from the Internet Archive's Wayback Machine and current data directly from Valve. A few months were missing from the Wayback Machine, which was a bummer, but enough data was available to get a feel for how the metrics have changed over time.
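The Wayback Machine side of the scraping goes through its CDX API; a minimal sketch in R (the query parameters here are my reconstruction, not necessarily what the script uses):

library(httr)
library(jsonlite)

# Ask the Wayback Machine's CDX API for one capture per month of the survey page
# (collapse = "timestamp:6" keeps a single snapshot per year+month)
resp <- GET("http://web.archive.org/cdx/search/cdx",
            query = list(url = "store.steampowered.com/hwsurvey",
                         output = "json", collapse = "timestamp:6"))
snaps <- fromJSON(content(resp, as = "text"))  # first row holds the field names

# Each capture can then be fetched and parsed with rvest via a URL like:
# paste0("http://web.archive.org/web/", timestamp, "/", original_url)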

Steam Survey Linux User Percentage for this month

Linux User Percentage

The drop in Linux user share from late 2017 to early 2018 was due to a combination of factors. The first was a counting error in the survey that Valve admitted to and later fixed; the error resulted in over-inflated user numbers from net cafes. Additionally, this period coincided with the peak player count of the hugely successful PlayerUnknown's Battlegrounds. PUBG brought a lot of new players to Steam from regions where net cafes are a popular way to play games. Both factors combined to substantially deflate the Linux user share on Steam.

Because of the overall growth of Steam, however, a drop in Linux share does not necessarily mean an absolute drop in Linux players. In Steam's 2019 Year in Review, they mention a monthly active user count of almost 95 million. At a Linux share of roughly 0.9%, that equates to about 850k monthly Linux players during 2019.

Steam Survey Linux User Percentage

Linux User Percentage by OS Language

Restricting to the reported OS language shows some interesting results with regards to the Linux share on Steam. The percentage of Linux users among those with an English-language OS is around twice that of the general population. It's unclear whether this difference is due to more English speakers preferring Linux, or more Linux users preferring English. But it's a surprising difference nonetheless.

Steam Survey Linux User Percentage by OS Language

Processor Preference of Linux Users

Since the release of AMD's Ryzen CPU line in early 2017, more and more Linux users have been forgoing Intel processors in favour of AMD. However, Intel still has a clear market lead.

Steam Survey Linux User Processor Vendor Usage

GPU Vendor Preference of Linux Users

AMD is making up ground in the GPU space among Linux users on Steam. AMD's open-source driver initiative began to bear fruit in 2017/18 as game performance approached that of some of Nvidia's offerings.

Despite its closed-source video driver, Nvidia remains the main choice among Linux users.

Steam Survey Linux User GPU Vendor Usage

Most Popular Linux Distros

Steam (unfortunately) does not report many different Linux distributions, preferring to group most into the 'Other' category.

Ubuntu remains the most popular Linux distribution on Steam, and many Linux games specifically target Ubuntu as a supported OS. This generally makes it the smoothest experience for new users.

Outside of Ubuntu, there is a great variety of Linux distributions, many of which will also have no issues running games on Steam.

Steam Survey Linux Top Distros

In addition to the snapshot of data above, I've set up a page on my Linux gaming blog with the same charts, updated automatically by an R script whenever new data becomes available.




Overlaying Frames-per-Second on a Benchmark Video Using R, ffmpeg, and Kdenlive


Feral Interactive is a UK-based porting house that specializes in bringing Windows games to other platforms like Linux and macOS. One of their most recent projects was bringing Shadow of the Tomb Raider to Linux. I wanted to compare the performance of their native Linux version of the game against running it in Linux using a popular compatibility layer called Wine. Running games on Linux with Wine often incurs some performance cost compared to Windows, so there is still a market for native Linux ports that can recover some of that lost performance.

Conveniently, Shadow of the Tomb Raider contains a built-in benchmark tool that will spit out its results to a text file, where they can then be analyzed with R. The raw data looks a little like this:

  frame  time delta memory
  <int> <dbl> <dbl>  <dbl>
1     1   0     0     2341
2     2  14.4  14.4   4462
3     3  35.7  21.3   4462
4     4  53    17.3   4462
5     5  72.1  19.1   4462
6     6  91.6  19.5   4462

Frame is the ID of the current frame, time is the milliseconds since the start of the benchmark, and delta is the amount of time it took to draw the frame. Most gamers don't really care about these numbers though; the most relatable metric is frames-per-second, the number of frames that can be drawn in one second. To calculate this I just look at the time it took to draw the previous 50 frames; 50 divided by that time (in seconds) gives the rolling FPS.

With FPS calculated, it's easy to use R and ggplot2 to make a nice graph showing the performance of the benchmark over time.


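A minimal sketch of that calculation and plot (assuming the results have been read into a data frame matching the columns above, with time in milliseconds; the file name is an assumption):

library(dplyr)
library(ggplot2)

frames <- read.table("benchmark_results.txt", header = TRUE)

# Rolling FPS: 50 frames divided by the seconds those 50 frames took to draw
frames <- frames %>%
  mutate(fps = 50 / ((time - lag(time, 50)) / 1000))

ggplot(frames, aes(x = time / 1000, y = fps)) +
  geom_line() +
  labs(x = "Benchmark time (s)", y = "Frames per second")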
That's neat, but what I really wanted was to overlay the chart on footage of the actual benchmark so that people could see how different in-game scenes affect the frames-per-second. To do this I used a few tools: R again for the chart generation, ffmpeg to turn the pictures into a video, and then Kdenlive to edit the video.

Generating Charts:

To embed a moving chart in a video, I used R and ggplot2 to generate one chart per video frame. That works out to 4000 individual charts, the benchmark being 160 seconds long at 25 frames per second. Each new chart shifts the window forward by 1/25th of a second, with 10 seconds' worth of data visible across the whole image.

To make things look a bit nicer in the final video, the background of the charts had to be a colour that could easily be chroma-keyed out. Chroma keying removes a certain colour from a video layer, basically green screening. So all 4000 charts looked something like the following beautiful image.


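The generation loop itself might look something like this, reusing the frames data frame from the earlier sketch (the exact theme and image sizes are assumptions; the file names match the pattern the ffmpeg command below expects):

library(ggplot2)

chroma_green <- "#00FF00"  # any colour works, as long as it keys out cleanly

for (i in seq_len(4000)) {
  t_end <- i / 25                                    # each chart advances 1/25th of a second
  p <- ggplot(frames, aes(x = time / 1000, y = fps)) +
    geom_line(colour = "white", linewidth = 1) +
    coord_cartesian(xlim = c(t_end - 10, t_end)) +   # keep 10 seconds of data in view
    theme(plot.background = element_rect(fill = chroma_green, colour = chroma_green),
          panel.background = element_rect(fill = chroma_green))
  ggsave(sprintf("plots/fps_%d.png", i), plot = p, width = 8, height = 2, dpi = 100)
}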
Turning Charts into a Video:

Thankfully, turning a series of images into a video is a rather common problem and there are a lot of examples online of using ffmpeg to do this conversion. So I shamelessly borrowed the following command to turn all 4000 charts into a video. I won't pretend to know what all of the arguments do, but importantly it is set to 25 frames per second to match the timing of the generated charts. Without this, the scrolling chart would be too fast or too slow and would not line up with the benchmark footage.

ffmpeg -r 25 -f image2 -start_number 1 -i plots/fps_%d.png -vcodec libx264 -profile:v high444 -crf 0  -pix_fmt yuv420p sottr_fps.mp4

Overlay FPS Video on top of Benchmark:

Kdenlive is an open-source video editor for Linux. Video editing is one of the areas where desktop Linux is still a bit lacking, but Kdenlive crucially has a chroma key feature, which is the key component in this step. The video generated from the bright green charts is overlaid on the footage of the Tomb Raider benchmark, and then the chroma key is applied.

In this screenshot you can see the chroma key effect being applied to the bright green of the chart video. It removes the background and turns the charts into a very nice-looking overlay.


So that's it. I really enjoyed this little project because it was the combination of several tools (R, ffmpeg, and Kdenlive) that really made it possible. Each had a specific task and it all came together nicely.

Check out the final result on YouTube.

