Traffic Pi

Using my Raspberry Pi, PiGlow and the traffic API feeds, I have created a script that gives me a visual representation of the journey time to work. This gives me an idea of the traffic before I leave the house in the morning, and when I’m working at home I can look at it and see how glad I am that I’m not sitting in traffic on the way to work :)

https://github.com/rickymoorhouse/trafficpi
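
The actual script is in the repository above; as a rough illustration of the idea, the sketch below maps a journey time onto a traffic-light style colour that could then be shown on the PiGlow. The normal journey time and the example reading are placeholders, and the PiGlow call itself is left as a comment because the exact function names depend on which PiGlow library is installed.

<code class="language-python">
#!/usr/bin/python
# Rough sketch: turn a journey time into a traffic-light colour.

NORMAL_MINUTES = 25.0   # placeholder: the usual clear-road journey time

def journey_colour(minutes):
    """Pick a colour based on how much longer than normal the journey is."""
    ratio = minutes / NORMAL_MINUTES
    if ratio < 1.2:
        return "green"    # close to normal
    elif ratio < 1.5:
        return "orange"   # noticeably slower
    return "red"          # be glad you are working from home

if __name__ == "__main__":
    minutes = 32.0   # placeholder: this would come from the traffic API feed
    print(journey_colour(minutes))
    # ...then light the PiGlow LEDs in that colour using whichever
    # PiGlow library is installed (see the repository for the real script).
</code>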

Elasticsearch Server Second Edition is a good book to read if you’re getting started with Elasticsearch or considering using it. It goes through all the main areas of getting your data indexed and then searching and analysing it.

The book is well written and easy to read, and serves well as a reference guide to refer back to later. It has helped me get an overview of some of the features of Elasticsearch that I’ve not yet used, some of which I hope to explore in further depth following on from the examples in the book. All of the chapters include useful references to sources of further information on the topics covered, and for more in-depth coverage the authors recommend going on to read their other book, Mastering Elasticsearch, which I hope to read as a follow-on.
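
As a flavour of the kind of thing the book walks through, here is a minimal index-and-search sketch using the elasticsearch Python client - the index name, document and query are made up for the example, and it assumes a node running locally on the default port:

<code class="language-python">
#!/usr/bin/python
# Minimal sketch: index a document, then search for it again.
# Assumes the elasticsearch Python client and a node on localhost:9200.
from elasticsearch import Elasticsearch

es = Elasticsearch()

# Index a single blog-post style document
es.index(index="blog", doc_type="post", id=1,
         body={"title": "Traffic Pi", "tags": ["raspberrypi", "piglow"]})

# Make it visible to search straight away
es.indices.refresh(index="blog")

# Search with a simple match query and print the matching titles
results = es.search(index="blog",
                    body={"query": {"match": {"title": "traffic"}}})
for hit in results["hits"]["hits"]:
    print(hit["_source"]["title"])
</code>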

  1. Boot from Live CD / USB

  2. Decrypt the filesystem

<code class="language-bash">cryptsetup luksOpen /dev/sda5 *hostname*
</code>
  3. Mount the filesystems
<code class="language-bash">mount /dev/dm-2 /mnt
mount /dev/dm-3 /mnt/home
mount /dev/sda1 /mnt/boot
mount --bind /dev /mnt/dev
mount --bind /sys /mnt/sys
mount --bind /proc /mnt/proc
</code>
  4. Enter the chroot
<code class="language-bash">chroot /mnt
</code>

/etc/crypttab should have a line like: sda5_crypt UUID=<UUID of /dev/sda5> (the UUID can be found with blkid /dev/sda5)

A while back I had an e-mail from my web hosting company saying they were increasing the price for the package I was using. This got me thinking about whether the route I’d taken for hosting was the best option and whether I was getting my money’s worth. For reference, I had a reseller hosting package, hosting a few sites for family members using not very much disk space or bandwidth - certainly nowhere near the allowance on the package. So I started thinking about what I would do if it was just my site; this is me thinking out loud and somewhere to document my ideas and findings:

Requirements

  • Host a blog

  • Easy to update

  • Ability to experiment with styling

  • Use my existing URLs

After considering a few options including wordpress.com, github pages, Scriptogr.am, my Raspberry Pi and various static site generators - I decided to move my sites to run on a Droplet at DigitalOcean which gives me the flexibility I want for my site, whilst still being able to host the other sites in the same place.

Currently I’m still using WordPress for my blog, but I’m experimenting with a static site generator for the next round of changes :)

For Christmas the girls gave me a Raspberry Pi, and for my first project I decided to try recording how bright the sunlight at our window is, using an old webcam.


The basic idea is:

  1.  Capture a photo

  2. Analyse the brightness of the photo

  3. Log it (and eventually publish to cosm)

So first of all, getting the webcam set up. My webcam is a Logitech Quickcam Express, which proved to work nicely with the Raspberry Pi - after plugging it in, it showed up straight away in the output of lsusb:

<code class="language-bash">ricky@pi ~ $ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp.
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.
Bus 001 Device 004: ID 046d:0840 Logitech, Inc. QuickCam Express
</code>

To get the photo from the webcam I used fswebcam, which was simple to install (sudo apt-get install fswebcam) and use:

<code class="language-bash">fswebcam --no-banner -d /dev/video0 webcam.jpg
</code>

The --no-banner option removes the default date and time banner at the bottom of the image, /dev/video0 is where the webcam appeared, and webcam.jpg is the file to save the image to.

I found a Python function on StackExchange to calculate the brightness of an image, so I put it all together and here is the Python script I’m using:

<code class="language-python">
#!/usr/bin/python
from PIL import Image, ImageStat
import math
import os
import datetime

# Capture a small frame from the webcam to disk
os.system("fswebcam --no-banner --scale 50x50 -d /dev/video0 webcam.jpg")

# Work out the perceived brightness of the image
im = Image.open("webcam.jpg")
stat = ImageStat.Stat(im)
r, g, b = stat.mean
brightness = math.sqrt(0.241*(r**2) + 0.691*(g**2) + 0.068*(b**2))

# Append a timestamped reading to the log file
dt = datetime.datetime.now().strftime("%Y%m%d-%H:%M:%S")
data = '%s,%s\n' % (dt, brightness)
open("brightness.csv", 'a').write(data)
</code>
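
For the “eventually publish to cosm” step, a sketch along these lines should work against the Cosm v2 API - the feed ID, API key and datastream name below are placeholders, and the endpoint and payload format are from memory so are worth checking against the Cosm docs:

<code class="language-python">
#!/usr/bin/python
# Sketch: push a brightness reading to Cosm (formerly Pachube).
# FEED_ID, API_KEY and the datastream id are placeholders; the endpoint
# and JSON format are assumptions to verify against the Cosm docs.
import json
import urllib2

FEED_ID = "12345"          # placeholder feed id
API_KEY = "YOUR_API_KEY"   # placeholder API key

def publish_brightness(value):
    body = json.dumps({
        "version": "1.0.0",
        "datastreams": [{"id": "brightness", "current_value": str(value)}],
    })
    request = urllib2.Request(
        "http://api.cosm.com/v2/feeds/%s" % FEED_ID,
        data=body,
        headers={"X-ApiKey": API_KEY, "Content-Type": "application/json"})
    request.get_method = lambda: "PUT"   # feed updates are HTTP PUTs
    urllib2.urlopen(request)

if __name__ == "__main__":
    publish_brightness(42.0)
</code>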

The capture script could be tidied up quite a bit, and I’m sure there’s a way to capture the image within Python without having to write it to disk first as well. My first day’s readings, taken every 10 minutes, look something like this: [chart of the day’s brightness readings]
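
On the capturing-without-writing-to-disk point, something like the sketch below should do it using pygame.camera (assuming pygame is installed and the webcam plays nicely with V4L2) - the brightness calculation is the same as above:

<code class="language-python">
#!/usr/bin/python
# Sketch: grab a frame in memory with pygame.camera instead of shelling
# out to fswebcam and re-reading the file from disk.
import math
import pygame
import pygame.camera

pygame.camera.init()
cam = pygame.camera.Camera("/dev/video0", (50, 50))
cam.start()
frame = cam.get_image()     # a pygame Surface, never written to disk
cam.stop()

# Average colour of the frame, then the same brightness formula as before
r, g, b, _ = pygame.transform.average_color(frame)
brightness = math.sqrt(0.241*(r**2) + 0.691*(g**2) + 0.068*(b**2))
print(brightness)
</code>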

Mini Great South Run

This morning I had fun doing the Mini Great South Run with Abi. Unfortunately RunKeeper crashed as we went through the start line, but I think we did the 1.5k in roughly 11 minutes. She had a great time and did it with a mixture of running at a good pace and then walking for a little bit until she saw too many people overtake; then she would pick up the pace and dive through gaps in front of her. She enjoyed the stretch on grass much more than the road and path sections and was thrilled with her medal at the end.
