This year I decided to do some making for Christmas, and now that the gifts have been given and the surprise is over I can share them with you! I managed two projects, and they turned into a lot of work! This is the first, which I called TweeterPiTweets:
You can probably guess a little about what this device does from the images – it is an automatic camera that responds to motion and then tweets the captured images. I made this project for my parents, as they have a perfectly placed bird feeder on their kitchen windowsill – the idea is to capture images of the visiting garden birds. After a little tweaking I think I got the device working effectively, and from the first day it has been running we already have some impressive results!
You can view the twitter feed here: https://twitter.com/tweeterpitweets
Great pics from day 1:
Some notes on the build:
I made the enclosure in Cardiff FabLab – the front half is 3D printed and the back pieces are laser-cut layers. I had intended for there to be just two separate pieces, with a single laser-cut back piece fitting into the 3D-printed front, but I underestimated the depth of the parts inside, hence the extra layers.
The electronics are made up of a Raspberry Pi Zero W, a PIR sensor, a 160-degree variable-focus camera, a Pimoroni Wide Input Shim and a cool Rainbow LED – links are all to the Pimoroni store. I managed to get most of the parts in a pre-Christmas sale, which made it quite inexpensive to build 🙂
In the end I ditched the PIR sensor – there are so many aggravating nuances to getting it working correctly, and it turns out that infrared wavelengths don’t pass through glass very well anyway. (Just as a small tip though, the PIR sensor seems to work a lot better if you bypass its 5V regulator and power it directly from the Pi’s 3v3 rail, despite having a clean regulated output from the Wide Input Shim!)
It turned out that using the Pi Camera itself as a motion sensor was much, much better. Gareth Halfacree (@ghalfacree on Twitter and @ghalfacree on GitHub) has created an awesome bash script designed for camera-trap setups. The idea is that the camera constantly takes low-resolution images and scans for pixel changes between the latest and preceding images; if a change is detected, a high-resolution image is taken. A number of variables in the script let you define a particular area of the image to scan for change (so you can ignore things like branches waving in the wind), and you can also tweak the sensitivity and thresholds for successful triggers.
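To illustrate the idea (not the actual bash script – this is my own rough Python sketch with made-up helper names), the detection boils down to diffing two low-res grayscale frames pixel by pixel, optionally restricted to a region of interest, and triggering once enough pixels have changed by more than a threshold:

```python
# Rough sketch of the camera-trap detection idea: compare two consecutive
# low-resolution grayscale frames and trigger when enough pixels change.
# Function and parameter names here are illustrative, not from the script.

def count_changed_pixels(prev, curr, threshold=20, region=None):
    """Count pixels whose brightness changed by more than `threshold`.

    `prev` and `curr` are 2D lists of 0-255 grayscale values.
    `region` is an optional (x1, y1, x2, y2) box, so movement outside
    the area of interest (waving branches, say) can be ignored.
    """
    h, w = len(prev), len(prev[0])
    x1, y1, x2, y2 = region if region else (0, 0, w, h)
    changed = 0
    for y in range(y1, y2):
        for x in range(x1, x2):
            if abs(prev[y][x] - curr[y][x]) > threshold:
                changed += 1
    return changed

def motion_detected(prev, curr, threshold=20, sensitivity=10, region=None):
    """Trigger when more than `sensitivity` pixels changed noticeably."""
    return count_changed_pixels(prev, curr, threshold, region) > sensitivity

# Tiny demo: a 4x4 "frame" in which three pixels brighten sharply,
# as if a bird just landed in shot.
prev = [[10] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][1] = curr[1][2] = curr[2][1] = 200

print(count_changed_pixels(prev, curr))             # → 3
print(motion_detected(prev, curr, sensitivity=2))   # → True
```

In the real setup the two frames would come from back-to-back low-resolution captures, and a trigger would kick off a full-resolution still.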
I combined this script with Tweepy, a Twitter client library for Python, to get the final result.
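The tweeting side looks roughly like this (a hedged sketch rather than my exact script – the credential placeholders and helper names are mine, and the Tweepy call shown is the standard media-upload API):

```python
# Illustrative sketch of posting a captured image with Tweepy.
# The credential strings are placeholders; fill in your own app keys.

def build_status(count):
    """Compose a simple status line for the nth capture of the day."""
    return "Visitor spotted at the feeder! (capture #%d today)" % count

def tweet_image(image_path, status,
                api_key, api_secret, access_token, access_secret):
    """Upload an image with a status via Tweepy.

    The import is kept local so the rest of the module still runs
    on a machine without tweepy installed.
    """
    import tweepy
    auth = tweepy.OAuthHandler(api_key, api_secret)
    auth.set_access_token(access_token, access_secret)
    api = tweepy.API(auth)
    api.update_status_with_media(status, image_path)

# Example usage, called after the motion script saves a high-res capture
# (needs real credentials, so commented out here):
# tweet_image("capture.jpg", build_status(1),
#             "API_KEY", "API_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")
```

The nice thing about this split is that the motion script only has to save a file and invoke one Python function; everything Twitter-specific stays in one place.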
I had to build this project to a pretty strict deadline, which is why some hardware issues haven’t been totally resolved. If you liked this post, let me know in the comments below, and in the new year I will find time to upload a full tutorial 🙂