Category Archives: Technique

Digital Photo Frame – DIY

If you’ve ever fancied a digital display frame, but perhaps wanted more than just a simple carousel of your pictures, or greater control over how the device behaves, then read on.

What we’ll build

A digital display board that showcases your photographs, with an overlaid calendar and weather forecast. For those of you into home automation, I’ll also cover how to turn the display on and off by integrating with Samsung SmartThings home hub – the basic approach could serve as a starting point for other integrations.

How we’ll do it

A Raspberry Pi Zero W (about $26 with case and power supply) combined with a Dell 24″ monitor (about $120), and the DAKboard web digital display service.

DAKboard

The simplest way to get started is to follow this handy guide. It will help you get your Pi up and running, though I found that for the release of Raspbian current as of December 2019, I needed to follow the alternative instructions for updating the autostart file:

/etc/xdg/lxsession/LXDE-pi/autostart
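For reference, the autostart file ends up with entries along these lines (the screen URL here is a placeholder – use the private URL from your own DAKboard account, and the exact flags may vary depending on which guide you followed):

```
# Disable screen blanking so the display stays on
@xset s off
@xset -dpms
@xset s noblank
# Launch Chromium in kiosk mode pointing at your DAKboard screen
@chromium-browser --noerrdialogs --incognito --kiosk https://dakboard.com/app/screenPredefined?p=YOUR_SCREEN_ID
```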

Automating the Pi display

We’re going to create a RESTful web service (API) that can be invoked with a simple URL to turn the display on and off; this is what our home automation will call. We’ll build it with Python 3, Flask, and NGINX.

NGINX

This is a popular web server for the Pi. You could use Apache instead, but the tutorials I was following used NGINX.

sudo apt-get install nginx

Python3

If your Raspbian install is fresh (you used NOOBS to get up and running) you probably don’t need to run sudo apt-get update first, but the install may not include pip for Python 3, so let’s add that.

Log into your Pi (I use SSH from my Macbook)

sudo apt-get install python3-pip

uWSGI and Flask

This will install the Flask API framework, which greatly simplifies the work required to create RESTful APIs in Python, along with the uWSGI application server:

sudo pip3 install flask uwsgi
sudo pip3 install flask-restful

Python API program

You can clone from https://github.com/klcampbell6502/pirest.git or use the code as displayed there.

from flask import Flask
from flask_restful import Resource, Api
import os

app = Flask(__name__)
api = Api(app)

class ScreenOff(Resource):
    def get(self):
        # Ask the Pi firmware to switch the HDMI output off
        os.system("vcgencmd display_power 0")
        return "off"

class ScreenOn(Resource):
    def get(self):
        # Switch the HDMI output back on
        os.system("vcgencmd display_power 1")
        return "on"

api.add_resource(ScreenOff, '/screen/off')
api.add_resource(ScreenOn, '/screen/on')

if __name__ == '__main__':
    # Bind to all interfaces so other devices on the LAN can reach the API
    app.run(host='0.0.0.0')

Testing

We should now be able to test the Flask API by running uWSGI directly in HTTP mode:

uwsgi --socket 0.0.0.0:8000 --protocol=http -w api:app

and then invoke the URL:

http://yourpiaddress:8000/screen/off

You should see the single word “off” in your browser, and if you look in your ssh output you should see something like this:

*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI worker 1 (and the only) (pid: 7822, cores: 1)
display_power=0
[pid: 7822|app: 0|req: 1/1] 192.168.86.24 () {32 vars in 644 bytes} [Wed Jan  1 12:16:54 2020] GET /screen/off => generated 6 bytes in 273 msecs (HTTP/1.1 200) 2 headers in 70 bytes (1 switches on core 0)
[pid: 7822|app: 0|req: 2/2] 192.168.86.24 () {32 vars in 574 bytes} [Wed Jan  1 12:16:55 2020] GET /favicon.ico => generated 233 bytes in 330 msecs (HTTP/1.1 404) 2 headers in 72 bytes (1 switches on core 0)

Oh, and your HDMI display is now off 🙂 Press Ctrl-C in your SSH session to stop the uWSGI test server. To make things more robust, we’re now going to integrate uWSGI with the NGINX web server.

NGINX integration

If you didn’t download the project from git then you’ll need to create a file uwsgi.ini in the directory containing your .py file:

[uwsgi]
chdir = /home/pi/Documents/pirest
module = api:app
master = true
processes = 1
threads = 2

uid = www-data
gid = www-data

socket = /tmp/pirest.sock
chmod-socket = 664
vacuum = true
die-on-term = true
touch-reload = /home/pi/Documents/pirest/api.py

This provides the startup instructions for uWSGI when it’s invoked from NGINX.
Now we’ll get rid of the default NGINX site

sudo rm /etc/nginx/sites-enabled/default

and now create a proxy file

sudo nano /etc/nginx/sites-available/pirest_proxy

containing the following:

server {
    listen 80;
    server_name localhost;
    location / { try_files $uri @app; }
    location @app {
        include uwsgi_params;
        uwsgi_pass unix:/tmp/pirest.sock;
    }
}

Now we need to create a symbolic link to this file

sudo ln -s /etc/nginx/sites-available/pirest_proxy /etc/nginx/sites-enabled

then restart nginx

sudo systemctl restart nginx

Set uWSGI to run on boot

cd /etc/systemd/system
sudo nano uwsgi.service

and into this new file, paste the following:

[Unit]
Description=uWSGI Service
After=network.target

[Service]
User=pi
Group=www-data
WorkingDirectory=/home/pi/Documents/pirest
ExecStart=/usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:$

[Install]
WantedBy=multi-user.target

Reload the systemd daemon so the new configuration is picked up:

sudo systemctl daemon-reload

and start the service we’ve just defined:

sudo systemctl start uwsgi.service

You can check the status of the running service like this:

sudo systemctl status uwsgi.service

You should see something like this:

● uwsgi.service - uWSGI Service
   Loaded: loaded (/etc/systemd/system/uwsgi.service; disabled; vendor preset: enabled)
   Active: active (running) since Wed 2020-01-01 12:49:52 CST; 1min 3s ago

 Main PID: 8038 (uwsgi)
   Memory: 13.1M
   CGroup: /system.slice/uwsgi.service
           ├─8038 /usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini
           └─8044 /usr/local/bin/uwsgi --ini /home/pi/Documents/pirest/uwsgi.ini
Jan 01 12:49:55 piboard02 uwsgi[8038]: mapped 143936 bytes (140 KB) for 2 cores
Jan 01 12:49:55 piboard02 uwsgi[8038]: *** Operational MODE: threaded ***
Jan 01 12:50:07 piboard02 uwsgi[8038]: WSGI app 0 (mountpoint='') ready in 12 seconds on interpreter 0x16984f0 pid: 8038 (de
Jan 01 12:50:07 piboard02 uwsgi[8038]: *** uWSGI is running in multiple interpreter mode ***
Jan 01 12:50:07 piboard02 uwsgi[8038]: spawned uWSGI master process (pid: 8038)
Jan 01 12:50:07 piboard02 uwsgi[8038]: spawned uWSGI worker 1 (pid: 8044, cores: 2)
Jan 01 12:50:07 piboard02 uwsgi[8038]: display_power=0
Jan 01 12:50:07 piboard02 uwsgi[8038]: [pid: 8044|app: 0|req: 1/1] 192.168.86.24 () {42 vars in 767 bytes} [Wed Jan  1 12:50
Jan 01 12:50:15 piboard02 uwsgi[8038]: display_power=1
Jan 01 12:50:15 piboard02 uwsgi[8038]: [pid: 8044|app: 0|req: 2/2] 192.168.86.24 () {40 vars in 734 bytes} [Wed Jan  1 12:50

You’ll see from the log that I toggled the display on and off by entering this URL in my browser:

http://piboard02/screen/on

The final step is to ensure that this starts with every boot:

sudo systemctl enable uwsgi.service

and you’ll see something along these lines:

“Created symlink /etc/systemd/system/multi-user.target.wants/uwsgi.service → /etc/systemd/system/uwsgi.service”

Reboot and test using the URL – give it time to boot, especially if you’re using a Pi Zero, as they’re pretty slow.

SmartThings Integration

Device Handler

OK, so you’ve got a pi running a display board, and you can turn the HDMI output on and off by opening a URL. Now to create a device handler for SmartThings that enables you to include this in automations, and of course control it from your SmartThings app.

Go grab the source over at my GitHub repo. For now, I’m going to assume the reader knows how to create an account at the Samsung developer site: https://graph.api.smartthings.com/

Go to My Device Handlers, then Create – From Code, and paste in the code from GitHub. Or you could simply import my repo into your account. You’ll then need to publish it; choosing “just me” should be sufficient.

New Device

Using the IDE, create a new device:


Complete the new device screen – use your own values for Network ID (which must be unique), name, label, and so on. I’ve used the IP address for the Network ID, but what you use doesn’t actually matter.

When you get to the Type pulldown, scroll all the way down past the commercial handlers until you see the REST Switch device handler we just created. Location and Hub must be set to your SmartThings hub; you’ll typically see only a single choice in those pulldowns.

Click on the Edit link alongside Settings in the device display, and then enter the IP address of your Pi and port 80. Your hub must be on the same network as your Pi, as the hub itself runs the device handler code we just installed.

After a couple of minutes the newly created device should appear in your app:

Canon Lenses on a Fujifilm body

I’ve shot with Canon digital gear since the Rebel XT/EOS 350D I acquired in 2004, and naturally I’ve accumulated a fair few lenses along the way.

I’ve been shooting with Fujifilm, both fixed and interchangeable lens models, since I bought an X-PRO1 in 2012. I like the size, the controls, and the image quality.

With the X-T3 my Fuji APS-C body now has a higher resolution sensor than my 5DII, and I use the 5DII less and less due simply to weight and bulk. So, the natural question: can I get more life from my Canon lenses on the X-T3?

The TS-E 17mm doing what only it can: multiple shifted shots stitched into a seamless image inside a small space.

The primary goal – TS-E lenses.

Each of the TS-E lenses I own cost more than the X-T3, and there’s nothing equivalent from Fuji to take their place. I use them extensively for landscape and architectural photography – almost never with tilt, but almost always with shift, either to compose without distortion or to stitch multiple images for an even wider field of view.

For them to be as useful as possible, I’d prefer to preserve the original field of view, so the 17mm would still give me close to the 93° horizontal coverage I enjoyed on the 5DII. There’s really only one way to do this: the Viltrox EF-FX2 “speedbooster” adapter.

This comes very close to preserving the original field of view, and provides full support for electronic aperture control and auto-focus. I paid about $160 for mine through eBay.

I only expect this adapter to work with full-frame EF lenses, not the EF-S models intended for Canon’s APS-C bodies. It’s designed to take the image circle formed by a full-frame lens and reduce it to a size suitable for the smaller APS-C sensor, so putting the smaller image circle of an EF-S lens in front of it seems like a recipe for disappointment.
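The image-circle mismatch is easy to quantify: a lens need only cover the sensor’s diagonal. Here’s a rough sketch (the sensor dimensions are my assumptions – 36×24mm for full frame, 22.3×14.9mm for Canon APS-C, 23.5×15.6mm for the X-T3 – and the 0.71x figure is Viltrox’s claim):

```python
import math

def image_circle_mm(width_mm, height_mm):
    """Minimum image circle diameter a lens must project: the sensor diagonal."""
    return math.hypot(width_mm, height_mm)

full_frame = image_circle_mm(36, 24)      # ~43.3mm - what an EF lens must cover
canon_apsc = image_circle_mm(22.3, 14.9)  # ~26.8mm - all an EF-S lens must cover
xt3 = image_circle_mm(23.5, 15.6)         # ~28.2mm - what the X-T3 needs covered

# The speedbooster squeezes the incoming circle down by 0.71x
print(full_frame * 0.71)  # ~30.7mm - a full-frame circle still covers the X-T3
print(canon_apsc * 0.71)  # ~19mm - an EF-S circle falls well short
```

Which is the arithmetic behind the “recipe for disappointment”: a reduced EF-S circle can no longer cover the X-T3’s sensor.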

If preserving the field of view wasn’t important, I’d probably go with the Fringer EF-FX Pro II, from which I would expect no degradation in image quality at all.

Field of View

Canon 5DII top, Fujifilm X-T3 with Viltrox adapter bottom

Pretty close – Viltrox claims 0.71x, which is only slightly off from the 0.67x needed to fully offset the “1.5 crop” of the APS-C sensor.

Both cameras were on the same tripod, at the same location, with no adjustments applied at all during ACR processing.
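The gap is easy to sanity-check with a quick back-of-the-envelope calculation (sensor widths are my assumptions: 36mm for full frame, 23.5mm for the X-T3; the field-of-view formula is the standard thin-lens approximation):

```python
import math

def h_fov(focal_mm, sensor_width_mm):
    """Horizontal field of view, in degrees, for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# TS-E 17mm on the full-frame 5DII (36mm-wide sensor)
ff = h_fov(17, 36)                  # ~93 degrees

# Same lens through the 0.71x Viltrox on the X-T3 (23.5mm-wide sensor):
# the adapter shortens the effective focal length to 17 * 0.71 = ~12.1mm
adapted = h_fov(17 * 0.71, 23.5)    # ~88.5 degrees - close, but not identical

print(round(ff, 1), round(adapted, 1))
```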

Corner sharpness

Adapted Fuji left, Canon 5DII right

This sample is the worst-case scenario: the lens shifted vertically to its maximum, with the crop taken from the very top-right corner. Both examples are at f/8 with no sharpening or other adjustments during ACR processing. I adjusted the zoom of the Fuji image so that the two would look about the same on screen.

I should also note that I’m far from proficient at manual focusing with the X-T3, and it’s entirely possible that I could have been more precise. Stopping down a little further would probably improve things too, though I tend to shoot this from f/8 to f/11 as that seems to be the sweet spot.

X-T3, Viltrox EF-FX2, Canon TS-E 17mm fully shifted vertically

This second sample is at 100% after my standard raw sharpening, and it looks pretty darn promising.

Auto-focus

It works, and it’s accurate. I haven’t used it enough yet to form a stronger opinion than that, and it certainly seems a little less snappy than with a native Fuji lens, but it does work.

Conclusion

For my specific use case this seems to be an entirely effective solution. It’s inexpensive, and the image quality is only slightly degraded from the results I get with the 5DII; field tests will tell whether there is flare or chromatic aberration to be concerned about.

Field of view and 35mm equivalency

This is another topic that frequently confuses people, and about which they seem unable to engage in even a small amount of self-education.

The focal length of a lens is the distance between its optical center and the point at which parallel rays of light are brought to focus. It is an optical characteristic of the lens and never changes, regardless of the size or type of imaging medium upon which the image is focused (film or digital). This isn’t a matter of interpretation or opinion; it’s physics.

Lenses produce a circular image – no surprise, given that lens elements are circular. The brightness of the image falls off towards the edges of the circle when the lens is wide open; the effect diminishes as the aperture is closed down. The recording medium, whether it’s film or digital, must fit within the usable portion of the image circle:

[Diagram: the film/sensor rectangle fitting within the usable image circle]

Here the largest rectangle that fits within the usable image circle is the film/sensor size – making a large image circle is expensive, so the lens is designed to create a usable circle only as big as necessary. Note that in reality it’s not a crisply defined circle – more a case of a bright center that falls off to very dim the further you move from the center.

If the same lens projects the same image circle onto a smaller sensor, all that happens is that a smaller section of the image circle is used – the lens hasn’t changed; we’ve merely cropped the image being projected and used only a portion of it:

[Diagram: a smaller sensor capturing only a central crop of the same image circle]

From the point of view of the camera, the field of view has changed – the angle of the scene that the picture occupies.

[Diagram: the narrower field of view that results from the smaller sensor]

It’s the same lens, at the same distance, but a smaller sensor means that only a portion of the original image is captured, yielding a narrower field of view.

35mm Equivalency

Photographers need an easy way to describe what’s happening here, in terms of what they see. The most accurate way would be to talk in terms of field of view: a 50mm lens on a 35mm sensor has a 39° horizontal field of view, while the same lens on a “1.6 crop” sensor has a field of view of 25°. To describe this we talk about the 35mm-equivalent focal length: to get a 25° horizontal field of view on a 35mm sensor you’d need an 80mm lens, so a 50mm lens on a 1.6-crop sensor produces an image considered equivalent to that of an 80mm lens on a full-frame 35mm sensor.
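The arithmetic behind those numbers is simple enough to sketch (36mm is the full-frame sensor width; the field-of-view formula is the standard thin-lens approximation for a rectilinear lens):

```python
import math

FULL_FRAME_WIDTH = 36.0  # mm

def equivalent_focal(focal_mm, crop_factor):
    """35mm-equivalent focal length: the focal length that would give the
    same field of view on a full-frame sensor."""
    return focal_mm * crop_factor

def h_fov(focal_mm, sensor_width_mm):
    """Horizontal field of view in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# A 50mm lens on full frame vs. the same lens on a 1.6-crop sensor
print(h_fov(50, FULL_FRAME_WIDTH))        # ~39.6 degrees
print(h_fov(50, FULL_FRAME_WIDTH / 1.6))  # ~25.4 degrees
print(equivalent_focal(50, 1.6))          # 80.0 - the "equivalent" focal length
```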


Pick a new camera for me

Read through the Beginners Questions forum on dpreview.com and you’ll probably see that one question in four asks “what should I buy?”, or a variation on the theme. This on a site that is filled with reviews and any number of tools aimed at answering exactly this question, and yet still the posts appear day after day.

There are no bad cameras, just cameras that are somewhat more or less suitable for specific types of photography.

If you’re a true beginner and all the specifications confuse you, it doesn’t matter what you buy – they’ll all be more capable than your skills at this point. Here are some suggestions to help seal the deal:

  • Pick a price.
  • Go to Amazon and use search, entering a price range that starts 15% below what you’d like to spend and goes to 10% over. Want to spend $200? Search from $170 to $220.
  • Sort by Avg User Review.
  • Anything at the top of the list with at least 30 or 40 reviews should work just fine.
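The price-window arithmetic above is trivial, but here it is as a quick sketch (price_window is just an illustrative helper):

```python
def price_window(budget):
    """Search range: 15% below the budget up to 10% over it."""
    return round(budget * 0.85), round(budget * 1.10)

print(price_window(200))  # (170, 220), per the $200 example above
```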

As an alternative, how about your local independent camera store? Not a big box that also sells appliances and TVs, but an old-fashioned camera store – they still exist. Tell the helpful staff what you want to pay, handle cameras in person, then pick whatever feels comfortable to use.

Search, where art thou?

This blog started as a result of me encountering one of the most frequently asked questions on a photography forum, for what felt like the thousandth time. If you regularly visit some of the consumer oriented forums you’ll find certain questions arise, with excruciating predictability, several times each week.

This continues to puzzle me: with apparently no expenditure of effort on their part, at all, someone expects a complete stranger to give up their time answering a question that has been asked, and answered, many times before. Often many times within the last few days.

How does this thought process work? The forum features clear instructions telling anyone starting a new thread to use the search to see if their question has arisen before, but they ignore that. Some complain that there are simply too many messages to read, and that it’s too hard to find an answer. Why? Because everyone keeps asking the same bloody questions over and over again!

In many cases merely typing the title of their post into a Google search box would yield a workable answer within the first 3 results, but even that is apparently too much to ask.

What, ultimately, will become the fate of forums if this continues? Will there inevitably be an evolutionary process where those seeking knowledge gain it, share it, become frustrated and leave – at an ever increasing pace?

Filters, the great debate

Right up there in the list of most re-asked questions seems to be “what UV filter should I buy”.

There are really only two camps in this debate: every lens needs a filter, and no lens needs a filter. I’m firmly in the latter camp: UV filters are a waste of money and will quite probably degrade your image quality. There are specific lenses (the Canon 100-400, for example) which exhibit quite predictable image quality degradation when using certain filters for protection.

But what about scratches to the front element of my lens?
How often do you worry about scratches to the windshield/windscreen of your car? Think about the constant abuse your windscreen takes – can you still see through it? Do you know how hard glass is, and what it takes to scratch it? Glass is very, very hard and extremely difficult to scratch. Even if you could easily scratch the front element, it wouldn’t matter:

http://www.lensrentals.com/blog/2008/10/front-element-scratches

What if I drop my lens, or bump into something?
Then your extremely thin filter will break, and the small, hard, sharp pieces will be ground into the front of your lens.

I’d rather clean the filter than my lens
Why do you need to clean either with any regularity? Every now and then – probably no more than once a year – it occurs to me to look at the front element of my lenses. I’ll probably dust them off with a lens brush. End of story.

What about protection?
Use a rigid lens hood at all times. Unlike a filter, it will improve image quality and physically protect the front of your lens from bumping into things. And since digital sensors are not sensitive to UV, there’s no need to filter it out anyway.

What about special effects?
With only a couple of exceptions you can recreate just about any effect imaginable in post-processing, with the added benefit that you can experiment with different effects on the same scene without retaking the photograph.

The exceptions? You cannot reproduce the effect a polarizing filter has on reflections in post-processing, nor can you hold back highlights to the same degree that a graduated neutral density filter will.

So: save your money, use a lens hood, and forget about useless filters.