All posts by Kevin Campbell

Digital Photo Frame – DIY

If you’ve ever fancied a digital display frame, but perhaps wanted more than just a simple carousel of your pictures, or greater control over how the device behaves, then read on.

What we’ll build

A digital display board that showcases your photographs, with an overlaid calendar and weather forecast. For those of you into home automation, I’ll also cover how to turn the display on and off by integrating with Samsung SmartThings home hub – the basic approach could serve as a starting point for other integrations.

How we’ll do it

A Raspberry Pi Zero W (about $26 with case and power supply) combined with a Dell 24″ monitor (about $120), and the Dakboard web digital display service.


The simplest way to get started is to follow this handy guide, which will help you get your Pi up and running. I found that with the release of Raspbian current as of December 2019, I needed to follow the alternative instructions for updating the autostart configuration:


Automating the Pi display

We’re going to create a RESTful web service (API) that can be invoked with a simple URL to turn the display on and off; this is what our home automation will call. We’ll build it with Python 3, Flask, and NGINX.


NGINX is a popular web server for the Pi; you could use Apache, but the tutorials I was following used NGINX.

sudo apt-get install nginx


If your Raspbian install is fresh (you used NOOBS to get up and running) you probably don’t need to run sudo apt-get update, but the install might not include Python 3, so let’s add that.

Log into your Pi (I use SSH from my MacBook):

sudo apt-get install python3-pip

uWSGI and Flask

This will install the Flask API framework, which greatly simplifies the work required to create RESTful APIs in Python, along with the uWSGI application server:

sudo pip3 install flask uwsgi
sudo pip3 install flask-restful

Python API program

You can clone the code, or use it as displayed below.

from flask import Flask
from flask_restful import Resource, Api
import os

app = Flask(__name__)
api = Api(app)

class ScreenOff(Resource):
    def get(self):
        os.system("vcgencmd display_power 0")
        return "off"

class ScreenOn(Resource):
    def get(self):
        os.system("vcgencmd display_power 1")
        return "on"

api.add_resource(ScreenOff, '/screen/off')
api.add_resource(ScreenOn, '/screen/on')

if __name__ == '__main__':
    app.run(debug=True)


We should now be able to test the Flask API using the debug environment:

uwsgi --socket 0.0.0.0:8000 --protocol=http -w api:app

and then invoke the URL:


You should see the single word “off” in your browser, and in your SSH session output you should see something like this:

*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI worker 1 (and the only) (pid: 7822, cores: 1)
[pid: 7822|app: 0|req: 1/1] () {32 vars in 644 bytes} [Wed Jan  1 12:16:54 2020] GET /screen/off => generated 6 bytes in 273 msecs (HTTP/1.1 200) 2 headers in 70 bytes (1 switches on core 0)
[pid: 7822|app: 0|req: 2/2] () {32 vars in 574 bytes} [Wed Jan  1 12:16:55 2020] GET /favicon.ico => generated 233 bytes in 330 msecs (HTTP/1.1 404) 2 headers in 72 bytes (1 switches on core 0)

Oh, and your HDMI display is now off 🙂 Type Ctrl-C in your SSH session to stop the uWSGI test server. To make things more robust, we’re now going to integrate uWSGI with the NGINX web server.

nginx integration

If you didn’t download the project from Git, you’ll need to create a file uwsgi.ini in the directory containing your .py file:

chdir = /home/pi/Documents/pirest
module = api:app
master = true
processes = 1
threads = 2

uid = www-data
gid = www-data

socket = /tmp/pirest.sock
chmod-socket = 664
vacuum = true
die-on-term = true
touch-reload = /home/pi/Documents/pirest/

This provides the startup instructions for uWSGI when it’s invoked from NGINX.
Now we’ll remove the default NGINX site:

sudo rm /etc/nginx/sites-enabled/default

and now create a proxy file

sudo nano /etc/nginx/sites-available/pirest_proxy

containing the following:

server {
    listen 80;
    server_name localhost;
    location / { try_files $uri @app; }
    location @app {
        include uwsgi_params;
        uwsgi_pass unix:/tmp/pirest.sock;
    }
}

Now we need to create a symbolic link to this file

sudo ln -s /etc/nginx/sites-available/pirest_proxy /etc/nginx/sites-enabled

then restart nginx

sudo systemctl restart nginx

Set uWSGI to run on boot

cd /etc/systemd/system
sudo nano uwsgi.service

and into this new file, paste the following:

[Unit]
Description=uWSGI Service

[Service]
ExecStart=/usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini

[Install]
WantedBy=multi-user.target


Reload systemd so the new unit file is picked up:

sudo systemctl daemon-reload

and start the service we’ve just defined:

sudo systemctl start uwsgi.service

You can check the status of the running service like this:

sudo systemctl status uwsgi.service

you should see something like this:

● uwsgi.service - uWSGI Service
   Loaded: loaded (/etc/systemd/system/uwsgi.service; disabled; vendor preset: enabled)
   Active: active (running) since Wed 2020-01-01 12:49:52 CST; 1min 3s ago

 Main PID: 8038 (uwsgi)
   Memory: 13.1M
   CGroup: /system.slice/uwsgi.service
           ├─8038 /usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini
           └─8044 /usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini
Jan 01 12:49:55 piboard02 uwsgi[8038]: mapped 143936 bytes (140 KB) for 2 cores
Jan 01 12:49:55 piboard02 uwsgi[8038]: *** Operational MODE: threaded ***
Jan 01 12:50:07 piboard02 uwsgi[8038]: WSGI app 0 (mountpoint='') ready in 12 seconds on interpreter 0x16984f0 pid: 8038 (de
Jan 01 12:50:07 piboard02 uwsgi[8038]: *** uWSGI is running in multiple interpreter mode ***
Jan 01 12:50:07 piboard02 uwsgi[8038]: spawned uWSGI master process (pid: 8038)
Jan 01 12:50:07 piboard02 uwsgi[8038]: spawned uWSGI worker 1 (pid: 8044, cores: 2)
Jan 01 12:50:07 piboard02 uwsgi[8038]: display_power=0
Jan 01 12:50:07 piboard02 uwsgi[8038]: [pid: 8044|app: 0|req: 1/1] () {42 vars in 767 bytes} [Wed Jan  1 12:50
Jan 01 12:50:15 piboard02 uwsgi[8038]: display_power=1
Jan 01 12:50:15 piboard02 uwsgi[8038]: [pid: 8044|app: 0|req: 2/2] () {40 vars in 734 bytes} [Wed Jan  1 12:50

You’ll see here that I tested turning the display on and off by entering the URLs in my browser:


The final step is to ensure that this starts with every boot:

sudo systemctl enable uwsgi.service

and you’ll see something along these lines:

“Created symlink /etc/systemd/system/multi-user.target.wants/uwsgi.service → /etc/systemd/system/uwsgi.service”

Reboot and test using the URL – give it time to boot, especially if you’re using a Pi Zero, as they’re pretty slow.

SmartThings Integration

Device Handler

OK, so you’ve got a pi running a display board, and you can turn the HDMI output on and off by opening a URL. Now to create a device handler for SmartThings that enables you to include this in automations, and of course control it from your SmartThings app.

Go grab the source over at my GitHub repo. For now, I’m going to assume the reader knows how to create an account at the Samsung developer site.

Go to My Device Handlers, then Create – From Code, and paste the code from Git. Or you could just import my repo into your account. You’ll then need to publish it; choosing “just me” should be sufficient.

New Device

Using the IDE, create a new device:


Complete the new device screen – use your own values for Network ID (which must be unique), name, label, etc. I’ve used the IP address for the network ID, but what you use doesn’t actually matter.

When you get to the Type pulldown, scroll all the way down past the commercial handlers until you see the REST Switch device handler we just created. Location and Hub must be set to your SmartThings hub; you’ll typically see only a single choice in those pulldowns.

Click on the Edit link alongside Settings in the device display, and then enter the IP address and port 80 for your Pi. Your hub must be on the same network as your Pi, as the hub itself will run the device handler code we just established.

After a couple of minutes the newly created device should appear in your app:

The Late Show with Stephen Colbert

You may have read other commentaries about what it takes to see The Late Show with Stephen Colbert, and this is mine!

Getting Tickets

Tickets are free, and there’s only one place to get them:

You’ll need to create an account, and then you’re able to request tickets for the show.

Tickets go quickly so don’t think you can apply for them on a whim. If you really want to be certain of seeing the show you’ll need priority tickets, as these are the only ones guaranteed admittance. Here’s the way it works:

  • At the beginning of the calendar month prior to the month of the show, tickets become available through the site. Want tickets for a date in April? Log in and look for them at the beginning of March.
  • Request tickets for the date/time you want; you’ll only be offered dates when taping is happening. Pay attention to the description, as sometimes the taping is only for a guest segment, or only for a house band performance.
  • You’re on a waitlist, with everyone else, so wait. Nothing will happen.
  • 14 days before the date you’ve requested (generally), you’ll get an email telling you your tickets are available to be claimed. You don’t have tickets yet!
  • Log in quickly and claim your tickets. You’ll see immediately whether you have Priority tickets or General Admission.

If you have Priority tickets you are guaranteed admission if you arrive before the check-in time shown; if you have General Admission tickets you’ll get in only if there’s room after the Priority ticket holders are seated.

The day of the show

If you have Priority Tickets, the only question is: do you care where you sit, and are you willing to work for it?

If YES, then here’s the deal:

  • Somewhere around 2:00-2:30pm a line will start to form outside the Ed Sullivan Theatre, often close in to the building on the left side of the sidewalk
  • Between 2:30 and 3:00 the show ushers will start to move the line over to the street side of the curb, and install stanchions with tape barriers to define the line.
  • Starting at around 3:00, the Ushers will ask all those with General Admission tickets to follow them to the other side of the street, and form a new line there.
  • Check-in now begins out on the street. Ushers will examine photo IDs to compare with your tickets, and issue you an arm band. You’re going to be asked to squash up close to the others in line, forming rows four across. They’re going to really insist on squashing you together.
  • At about 4:00 you’ll be moved inside, after going through a metal detector, and asked to continue standing in a very squished line.
  • Once everyone has been moved in off the street, they’ll start opening each section/group up to a bathroom break. This is the only one you’ll get.
  • You’ve been standing for 2 hours or so, and your back is probably killing you. It is what it is.
  • At about 4:15 you’ll see groups of people being ushered into the front of the line, in a separate waiting area. These are the Priority ticket holders who decided to forego waiting, and turn up at the last minute.
  • Somewhere around 4:45 they’ll start getting ready for seating. The last-minute Priority ticket holders will be given different colored wrist bands and taken upstairs to the back of the balcony. They were admitted, but they’ll sit right at the back of the nosebleeds.
  • The doors open and you’re told where to sit. We were about 20th in line, and were seated directly opposite Stephen’s desk in the 3rd row. Pretty easy to see ourselves in the audience sweep shots 🙂
  • You’ve been standing for almost 3 hours, and sitting never felt so good.

If NO then turn up before check-in closes (4:30 typically), and you’ll spend only a small amount of time standing in line. You’ll be seated at the very rear of the balcony.

General Admission Tickets – it’s complicated. If you want to see the show, be there early. On the day we attended none of the front runners in line were general admission – when the ushers split the lines and took General Admission across the street the first 100 or so people stayed right where they were.

Your process will be the same as I’ve detailed above, but success will be merely getting in at all.

The Show

You’ll be briefed about what to expect by the Floor Manager, who actually has a pretty intimate role to play with Stephen during the show – he’s almost like a real time acting coach.

Paul Mecurio will come out to warm up the crowd, and he’ll have had 30 Red Bulls and a dozen espressos beforehand. He wants you loud and pumped. Every night it’s a new crowd, and every night they’ve got 20 minutes to turn you into the frenzied, cheering throng that they want as an audio backdrop to the show.

Stephen will come out in advance and answer a few questions, the band is introduced, and then it’s time to start the monologue. When the show starts for real you’ve been told to keep screaming “Stephen! Stephen!” even after he tells you to pipe down, and so you do.

Once Stephen starts speaking to camera you won’t initially hear him, as the crowd is still yelling, but of course he’s mic’d and so the TV audience hears him over you just fine. If you’re seated front and center, as we were, you’ll actually see almost nothing of the monologue, as the cameras and floor manager are in the way!

The show unfolds as you’d expect – except it didn’t in our case. We were told we were lucky, because something was going to happen that almost never happens – they were taping two shows! It was the Tuesday before Thanksgiving, and there was to be no show taping on the Wednesday, so they taped two opening bits, one of which pretended it was Wednesday.

More bits are filmed, and the band performs two numbers on a heavily dressed set; then Stephen suddenly announces that there are no guests, because the guests have already taped their segment.

At this point in the proceedings we realize that we’re pretty much just a prop; cattle to be ushered in, minimally coached for background noise and energy, then disposed of. They could have told us earlier that we wouldn’t see any guests, but then would we have been as willing to participate in all the frenzied whooping and clapping?

Canon Lenses on a Fujifilm body

I’ve shot with Canon digital gear since the Rebel XT/EOS 350D I acquired in 2004, and naturally I’ve accumulated a fair few lenses along the way.

I’ve been shooting with Fujifilm, both fixed and interchangeable lens models, since I bought an X-PRO1 in 2012. I like the size, the controls, and the image quality.

With the X-T3 my Fuji APS-C body now has a higher resolution sensor than my 5DII, and I use the 5DII less and less due simply to weight and bulk. So, the natural question: can I get more life from my Canon lenses on the X-T3?

The TS-E 17mm doing what only it can: multiple shifted shots stitched into a seamless image inside a small space.

The primary goal – TS-E lenses.

Each of the TS-E lenses I own cost more than the X-T3, and there’s nothing equivalent from Fuji to take their place. I use them extensively for landscape and architectural photography – almost never with tilt, but almost always with shift, either to compose without distortion or to stitch multiple images for an even wider field of view.

In order for them to be as useful as possible, I would prefer to preserve the original field of view, so the 17mm would still give me close to the 93° horizontal field I enjoyed with the 5DII. There’s really only one way to do this: the Viltrox EF-FX2 “speedbooster” adapter.

This comes very close to preserving the original field of view, and provides full support for electronic aperture control and autofocus. I paid about $160 for mine through eBay.

I only expect this adapter to work with full-frame EF lenses, not the EF-S models intended for Canon’s APS-C bodies. It’s designed to take the image circle formed by a full-frame lens and reduce it to a size suitable for the smaller APS-C sensor, so putting the smaller APS-C-intended image circle in front of it (by using an EF-S lens) seems like a recipe for disappointment.
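A quick back-of-the-envelope sketch of that reasoning. The sensor and image-circle figures below are nominal published dimensions, not Viltrox specifications, and the 0.71x factor is simply applied to the circle diameter:

```python
import math

def diagonal(width_mm, height_mm):
    """Sensor diagonal -- the minimum image circle a lens must cover."""
    return math.hypot(width_mm, height_mm)

fuji_apsc = diagonal(23.5, 15.6)   # X-T3 sensor: needs ~28.2 mm of coverage
ff_circle = diagonal(36.0, 24.0)   # full-frame EF image circle: ~43.3 mm
efs_circle = diagonal(22.3, 14.9)  # Canon APS-C (EF-S) circle: ~26.8 mm

# The 0.71x focal reducer shrinks whatever circle the lens projects:
print(ff_circle * 0.71)   # ~30.7 mm -> still covers the ~28.2 mm Fuji sensor
print(efs_circle * 0.71)  # ~19.0 mm -> well short; expect heavy vignetting
```

Which is why a full-frame EF lens should cover the sensor through the reducer, while an EF-S lens shouldn't.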

If preserving the field of view wasn’t important, I’d probably go with the Fringer EF-FX Pro II, from which I would expect no degradation in image quality at all.

Field of View

Canon 5DII top, Fujifilm X-T3 with Viltrox adapter bottom

Pretty close – Viltrox claims 0.71x, which is only slightly off from the 0.66x needed to fully counteract the “1.5 crop” of the APS-C sensor.

Both cameras were on the same tripod, at the same location, with no adjustments applied at all during ACR processing.
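The comparison above can be sanity-checked with the standard rectilinear angle-of-view formula. Sensor widths here are nominal (36 mm full frame, roughly 23.5 mm for the X-T3), so treat the results as approximate:

```python
import math

def h_fov_deg(focal_mm, sensor_width_mm):
    """Horizontal angle of view of a rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# TS-E 17mm native on the full-frame 5D Mark II:
print(h_fov_deg(17.0, 36.0))         # ~93.3 degrees -- the "93°" quoted earlier

# Same lens behind the 0.71x reducer on the APS-C X-T3:
print(h_fov_deg(17.0 * 0.71, 23.5))  # ~88.5 degrees -- close, but a touch narrower
```

The few degrees lost line up with the small difference between the claimed 0.71x and the ideal 0.66x.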

Corner sharpness

Adapted Fuji left, Canon 5DII right

This sample is the worst-case scenario: the lens shifted vertically to its maximum, with the sample taken from the very top right corner. Both examples are at f/8 with no sharpening or other adjustments during ACR processing. I adjusted the zoom of the Fuji image so that the two would look about the same on screen.

I should also note that I’m far from proficient at manual focusing with the X-T3, and it’s entirely possible that I could have been more precise. Stopping down a little further would probably improve things too, though I tend to shoot this from f/8 to f/11 as that seems to be the sweet spot.

X-T3, Viltrox EF-FX2, Canon TS-E 17mm fully shifted vertically

This second sample is at 100% after my standard raw sharpening, and it looks pretty darn promising.


Autofocus

It works, and it’s accurate. I haven’t used it enough yet to form a stronger opinion than that, and it certainly seems a little less snappy than with a native Fuji lens, but it does work.


Conclusion

For my specific use case this seems to be an entirely effective solution. It’s inexpensive, and the image quality is only slightly degraded from the results I get with the 5DII; field tests will tell whether there are flare or chromatic aberration issues to be concerned about.

Farewell A380, we hardly knew you

I’ve flown across the Atlantic for decades in a variety of 747s, and occasionally a 777 or two, but mainly 747s. Thinking that they’d be retired sometime in the not too distant future, we’ve been looking for chances to ride upstairs whenever we could, while we still could.

On our most recent return trip from LHR to ORD we had our first chance to sample the A380, the Airbus answer to the 747. We flew on the upper deck, partly because the slightly lower seating density made me think it might be a little quieter.

The overall feel is of a 777 or similar, with no sense at all that there’s what amounts to another plane load of people underneath you. The fuselage is pudgy, bulging, and a little bloated looking, but inside it’s all very familiar. The first clue that things were different came on takeoff, which was as quiet as, or quieter than, any commercial aircraft I can think of. There’s very little wind noise and almost no engine noise; it’s almost eerie. Looking out the window across the wing, I suspect that the composite construction is responsible for some of this – there are no seams, rivets, gaps, or openings. I have the feeling that Airbus also invested heavily in sound insulation behind the trim panels, but however they achieved it, the level of quiet is impressive.

Our flight was uneventful, and our descent into O’Hare felt steep and fast – perhaps it was? Landing was buttery smooth, with none of the bone jarring drama that I’ve experienced more than once on a 747, including one occasion where the luggage bins flew open and oxygen masks fell from their cubby holes, so great was the impact.

The 747 will always be a more iconic aircraft, in part because it was so bold when introduced over 50 years ago, but as a passenger I’m inclined to take the A380 given the choice.

In praise of small airports

Thanksgiving 2018 – we travel to Tucson, AZ for a week of sunshine in the South West. Flying out of O’Hare is about what you’d expect the weekend before Thanksgiving – busy, mainly with recreational travelers. That means many, many more “service animals” than you’d see on a more typical business travel day.

Upon landing in Tucson the first impression of the airport is that it’s small – the taxi from landing to the gate is short. Upon deplaning the impression continues – is there simply no one else here on a Saturday afternoon? Everywhere seems deserted, the only cluster of activity being around the only operating baggage carousel.

Bags collected, we start the walk to the car hire center, which is on site within the airport grounds. Other than being clean, flat, and nicely decorated, the walk is reminiscent of Aberdeen. We’re quickly introduced to an ocotillo growing in an outdoor space between the main terminal and the garage.

Hertz Gold customers bypass the (utterly empty) main rental hall and walk through to the parking area and a dedicated kiosk. We’re the only customers visible anywhere and, as might be expected, we’re quickly walking to the car, keys in hand. The complimentary bottles of water were welcome, thank you.

We load up the bags, install the GPS, find something not obnoxious on the radio, and prepare to hit the road. It takes a moment for it to sink in – there’s no exit procedure, we’re suddenly just driving on the public road – I can’t remember the last time that happened!

The return journey a week later was much the same, in reverse. Security was quick and uneventful, though the departures area was much busier than when we arrived. Our flight was delayed an hour by winter storms (elsewhere, needless to say) and so we had extra time to sit and read. While busy, it wasn’t hectic, and felt relaxed by comparison with O’Hare on almost any day.

Cause & effect in amateur photography

Glass plates and room sized cameras

In the early part of the 20th century photographic chemistry was comparatively crude. Emulsions were not very light sensitive, meaning that large surface areas were required in order to expose enough of the material to light – more surface area captures more light. As a consequence, photography was largely the province of the well-heeled or of portrait artists. Equipment was big, expensive, and handmade, and required lots of time to set up and operate. Look at photographs from the late 1890s through to 1920 and you’ll see little that is spontaneous or casual; making a picture was an undertaking.

Faster film on a flexible base

A large film medium requires a large image circle, which in turn results in a lens that’s sited further from the film plane – and as a result the whole light box that is the camera must itself be large too. Various techniques were tried to combat this, the most common being the bellows unit and the folding or collapsing lens mechanism. Although invented in 1885, flexible, transparent film as we know it was not widely used until 1910 or so – this was another major step forward in democratizing access to photography, as a separate, expensive glass plate was no longer required for each image.


From the 1920s to the 1940s several things changed: film emulsions became more sensitive, which led to the ability to use smaller negatives, and manufacturing techniques scaled up, reducing unit costs. Across every sphere of human industry manufacturing was automating, and every manner of manufactured item was becoming more economical and accessible.

The advent of 35mm and rangefinders

Everything changed profoundly when Kodak introduced the 135 film canister in 1934 – we’d come to know it as 35mm film. This addressed ease of handling by holding the film stock in a light-tight canister, and promised many more exposures per roll, with easier loading, than had ever been possible with 120 roll film. This new, smaller film format opened the door to the rapid expansion of photography and cameras, starting with the rangefinder.

The first Leica rangefinders sold for around $4,500 in 2013 terms – a premium product then and now. By the late 1930s other compact rangefinders were entering the market – the Argus C3 sold for the equivalent of $350, and the Kodak 35RF for $780 (2013) when it was introduced in 1940. Rangefinders were great for all manner of everyday photography, especially the recently emerged field of photojournalism, travel, and family keepsakes. Over the next 20 years the photographic equipment market would blossom, and names from far-flung corners of the world would become commonplace.

War and post-war

Images were captured during World War II that would have been much more challenging in the days of TLRs; a field reporter or infantryman could carry a compact 35mm camera almost anywhere. In the aftermath of their defeat the Japanese needed a way to boost their economy; they needed something that required a minimum of raw materials, was cost-effective to ship great distances, and could then be sold for a premium price. Just as for the Swiss before them with watches, for the Japanese manufacturers of the late 1940s cameras were a boon. Several manufacturers had been producing specialized lenses for X-ray and other applications, and their first in-house cameras started appearing in the late 1940s.

In a pattern that was to be repeated in several other fields, early Japanese models were heavily influenced by older designs from traditional names, such as Leitz.