For a while I’ve been meaning to write up the details of the Raspberry Pi weather station that I built with my eldest daughter. This project builds on a number of examples I’ve seen across the internet for sensing the weather. Here’s how our system is put together.

Temperature monitoring

We took two temperature sensors and mounted them in a garden post, with one pushed down to the bottom for soil temperature and one in the cap for air temperature. The 1-Wire sensors can share the same three wires, so both are connected to a single cable leading back to the Raspberry Pi through a hole drilled into the side of the post. For waterproofing we surrounded the hole with hot glue. The post is situated in a shady spot and pushed about 30 centimetres into the soil.

Wind speed

You can see my graphs of the data, and the code is on GitHub.

For a while I’ve had a variation on my map of the places I’ve visited - here’s a summary of how the current version works.

The whole site is currently generated by Hugo, a static site generator with no server-side component. The map is powered by Mapbox GL, which lets me use any of the Mapbox styles for my map. I create a markdown file for each place on the map, with the latitude and longitude in the ‘front matter’ for the post, which looks something like this:

    ---
    title: "Salto"
    layout: travel
    datePosted: 2003
    photo: "/travel/image.jpg"
    lat: -31.387022
    lng: -57.968802
    ---

I then have a list layout for travel items which generates GeoJSON data from the list of places. In my current version of the map this is inline within the page and fed directly into the Mapbox JavaScript method like this (roughly based on the Mapbox GeoJSON points tutorial):

        ...
        map.addLayer({
            "id": "places",
            "type": "circle",
            "source": {
                "type": "geojson",
                "data": {
                    "type": "FeatureCollection",
                    "features": [
        {{ range .Pages }}
                    {
                        "type": "Feature",
                        "geometry": {
                            "type": "Point",
                            "coordinates": [{{ .Params.lng }}, {{ .Params.lat }}]
                        },
                        "properties": {
                            "title": "{{ .Title }}",
                            "description": "{{ .Content }}"
                        }
                    },
        {{ end }}
                    ]
        ...
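Once Hugo renders that template, the inlined GeoJSON data ends up looking something like this (using the Salto front matter above as the example; the description comes from the page content, elided here):

```json
"data": {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [-57.968802, -31.387022]
            },
            "properties": {
                "title": "Salto",
                "description": "..."
            }
        }
    ]
}
```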

Introducing hem

hem is a synthetic monitoring tool which monitors HTTP resources on a regular schedule, storing details of the time taken and the response code returned.

I’ve been using Uptime at work for a while for endpoint monitoring, and over the time we’ve been using it I’ve made a few tweaks and plugins for it - in particular to send metrics from Uptime to Graphite. There were also some more substantial changes we were considering, and we’d built up a number of supporting scripts to populate the checks via the Uptime API when hosts changed. We also have all our other monitoring dashboards in Grafana. In this context I decided that what would be nice is a simple tool that could replace the checking piece and feed that data into our Graphite data store, to be viewed and alerted on from Grafana.

hem runs from a simple config file with three main sections - discovery, tests and metrics. Both discovery and metrics have been designed as pluggable to give hem versatility - so far I’ve built discovery drivers for DNS, Consul and JSON/YAML, and metrics drivers for Graphite, Kafka and the console. hem iterates over the tests on a configurable interval, performing discovery each time to ensure it has the latest list of hosts for each test.

hem stats in Grafana

Getting started with hem

To start using hem, you can install it from PyPI with pip:

pip install hemApp

Then create a config file - it will look something like this:

    discovery:
      type: dns
    metrics:
      type: graphite
      server: 127.0.0.1
      port: 2003
    tests:
      homepage:
        path: /index.html
        secure: false
        hosts:
           - example.com
           - example.org

Run hem and you should start to see metrics flowing to Graphite:

hem -c config.yaml

In Grafana I have the Discrete plugin installed to give the coloured bar look you see above.

Over the last few months we’ve started to try and reduce our use of plastics - especially single use plastics. Here are some of the areas where we’ve made changes.

Reusable drinks containers/straws

We’ve been making an effort to take our reusable cups, bottles and stainless steel straws with us when we’re out and about - and if we don’t have them asking for drinks in non-plastic cups.

Refillable Splosh cleaning

Splosh is a great new way to buy essentials like washing up liquid, laundry detergent and surface cleaners. It’s great value, more convenient than the supermarkets and miles better for the environment. You can read all about it on their website. If you use this special code 151F69 when you buy from Splosh you’ll get money off your first order.

Milk bottles

We’ve started to have milk (and occasionally juice) delivered by milk and more in glass bottles, which can then be returned for reuse.

Present wrapping

For birthdays and Christmas we’ve tried to wrap presents with tissue paper and string, or by reusing paper - for any wrapping material we’ve bought, we’ve tried to avoid the foil and laminated papers.

Soap instead of shower gel

Another area where we seem to end up with a lot of unnecessary plastic is shower gels and shampoos, so I’ve been attempting to reduce this with bar soap. Soap works well in place of shower gel and can also replace shaving foam, but I’ve not found one that’s a good shampoo substitute yet. The other issue is that a lot of the soaps themselves are packaged in plastic as well, so avoiding that adds another complication.

Supermarket awareness

Not buying drinks in plastic bottles - choosing glass bottles or cans instead. Choosing individual vegetables over prepackaged ones, and not putting them in bags for weighing.

When we set up our office earlier in the year I decided on a standing desk which I put together using the Ikea Algot system.
In order to avoid using up desk space with a light I originally planned on a clip-on light, but then saw my Pi Zero and Unicorn pHAT and thought they could make a good alternative.

Hardware

The Raspberry Pi Zero is in a simple case, mounted onto the underside of the shelf above my working space. Attached are its power cable, a PiHut wireless adapter connected via USB and of course the Unicorn pHAT. The power cable is routed down the side of the shelf to my PowerCube, which will eventually be mounted under my work surface, but its sticky pad wasn’t strong enough to hold it on the underside of the desk!

Software

As I’ve not yet added any switch for my light, it all has to be controllable remotely, so I set up an API to set the colour of the light, which I initially controlled via a web browser with URLs like:

http://192.168.0.15:8009/colour/<red>/<green>/<blue>
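A minimal reconstruction of an API with that route shape, using only the Python standard library (the route format is taken from the URL above; with the hardware attached the stub below would call the unicornhat module’s set_all() and show() - this is a sketch, not my actual code):

```python
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

def set_colour(red, green, blue):
    # Hardware stub - with the Unicorn pHAT attached this would be:
    #   unicornhat.set_all(red, green, blue); unicornhat.show()
    print("light set to", red, green, blue)

class ColourHandler(BaseHTTPRequestHandler):
    # Matches /colour/<red>/<green>/<blue> with numeric components
    ROUTE = re.compile(r"^/colour/(\d{1,3})/(\d{1,3})/(\d{1,3})$")

    def do_GET(self):
        match = self.ROUTE.match(self.path)
        if match:
            set_colour(*(int(value) for value in match.groups()))
            self.send_response(200)
        else:
            self.send_response(404)
        self.end_headers()

# To serve on the port used above:
#   HTTPServer(("0.0.0.0", 8009), ColourHandler).serve_forever()
```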

As you can imagine that got a bit tedious - especially turning it off after I’d shut down my laptop! The next step was to add a simpler way to control the light through my phone, so I set up iControl Web with buttons to adjust the light settings. Then when I saw the Home app in iOS 10, I researched ways to get my custom light controllable through it and came across Homebridge, which I could point at my API via its Better HTTP RGB plugin, with a bit of config and a couple of changes to my API.

All the code for my API is on GitHub and is very much a work in progress!

At Jessica’s school, where I am a governor and Laura works, they’ve got an incubator of eggs from Living Eggs, so that the school can watch as they hatch into chicks.

On Monday I got a text from Laura wondering if it would be possible to set up a webcam to watch them hatch. We happened to have a wireless webcam that wasn’t being used, so that evening I got it out to make sure it was working and configured it to upload photos to an FTP server every minute and whenever motion was detected.

The next day I took the webcam into school and got it set up with the help of James who looks after the school computers and network. You can see the latest pictures from the webcam on the school website.

In the early hours of this morning the first egg hatched:

Updated 11th October 2016 for API Connect

In less than half an hour I could update my project to automatically publish my API to IBM API Connect - here are the steps…

Sign up for API Connect through Bluemix by creating an API Connect service instance - if you don’t already have a Bluemix account you can sign up for a free trial account.

Install and configure the new toolkit CLI, replacing eu with au or us if you chose a different Bluemix region:

npm install -g apiconnect 
apic config:set server=eu.apiconnect.ibmcloud.com
apic login

Create a product definition for your API:

apic create --type product --title "Travel Information" --apis product.yaml

Adjust the product definition as needed in your favourite editor.

Add the x-ibm-configuration extensions to your Swagger document to configure what happens when someone calls the API - in my case, invoking the backend API:

x-ibm-configuration:
  enforced: true
  phase: realized
  testable: true
  cors:
    enabled: true
  assembly:
    execute:
      - invoke:
          title: invoke
          target-url: '<backend url>'

Now switch over to your CodeShip account, load your project and go to the Deployment section of your project.

Add a custom script option and configure the following script (adding your details as needed):

npm install -g apiconnect
apic config:set server=eu.apiconnect.ibmcloud.com
apic login -u <username> -p <password>
apic config:set organization=<org>
apic push docs/swagger.yaml
apic stage --catalog=sb docs/travel-information.yaml
apic publish --catalog=sb docs/travel-information.yaml

Commit and push to your repository, and your updated API will be published to API Connect - here is my example API.

If you don’t already have a CodeShip account you can sign up to CodeShip with your GitHub account and link it to your GitHub repository. You can then set up the test and deployment steps in the project settings.

Great South Run weekend is here! Today we had the 5k run which Laura, Anne and Des took part in and all did very well, and Abi’s 1.5k Mini Run - even Jessica was enjoying running on the race track they had there and is keen to do the mini run next time round.

Now all that’s left is my one tomorrow - I’m going to be running the 10 mile Great South Run for the first time to raise money for gain. If the technology works you should be able to watch live at http://runkeeper.com/user/rickymoorhouse and you can sponsor me at http://justgiving.com/rickymoorhouse . I’ll update this again tomorrow after the race!

My run went well - I really enjoyed it and there was a fantastic atmosphere around the course.  I managed to beat my target and come in with a time of 1:59:42

Disabling SSLv3

With POODLE, the time has come to disable SSLv3 everywhere. There will be clients that break and need fixing, but it needs doing. You can read more details and background on the vulnerability.

Here’s a few useful snippets from my experience with it this week:

Apache

Make sure the combination you have on the SSLProtocol line disables SSLv2 and SSLv3 - something like: SSLProtocol All -SSLv2 -SSLv3

DataPower

Ensure your crypto profiles have SSLv2 and v3 disabled in the options line:

    switch
    co
    crypto
    profile option-string OpenSSL-default+Disable-SSLv2+Disable-SSLv3
    exit
    exit
    write mem

Java

If you have problems with handshakes from a Java client process, force the protocols to use with -Dhttps.protocols=TLSv1

nginx

Make sure the ssl_protocols line in your SSL configuration doesn’t have SSLv3 in it: ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

nodejs

Make sure you don’t have secureProtocol: SSLv3_method anywhere in your https options - use TLSv1_method instead if it’s really needed.

WebSphere

See Security bulletin