Open IoT Assembly June 16-17th

Not my preferred way to spend a weekend but I wanted to support the initiative and I think a useful start was made.  Many aspects were good: venue, overall organisation, networking opportunities etc. Fri/Sat would have been better days for getting more people there.

The results are published. A lot of comment is also coming in on the Twitter feed. See also a writeup of the two days from @9600.

A lot of groups are thinking about best practice in all kinds of forums and many papers have already been circulated so I wonder whether we need another initiative.  However, the outcome of the weekend was to invite a broader participation – I could not disagree with that.

So I’ll follow this unless and until a better initiative turns up.  It can be successful if there is:

  • recognition of the whole picture including overall architecture, applications, scenarios/ use cases, testing, certification etc.
  • involvement of a community of potential “user” organisations (sorry, still don’t have a better word for them)
  • links with all the other activities in this space

We’ll see what happens and I’ll make a point of tracking the next steps and giving further input.  I’ll try to assemble a list of the other initiatives as I think this should have been done beforehand.

Self-Hacking/ Quantified Self OpenSpace 9th June ’12

This one day event came out of the now well-established London QS meetups. An opportunity to draw out some ideas about areas the group can pursue. Thanks as always to Adriana, Ken, guest speakers and others for putting this together.

I’m just reporting here on one openspace session.  For the rest please see the London QS website and

@rainycat and I initiated a discussion on what Open Source approaches can do for QS. This could be hardware, software or a combination of both, addressing any of the elements of QS solutions, namely:

  • sensors and data capture
  • data communications and storage
  • data analysis and visualisation

We discussed why we like open source so much: community, sharing, “the right to repair” etc. In some cases we end up with better quality solutions than commercial equivalents, Apache being one classic example.

In early stage markets it’s usually not possible to launch a commercial solution when the problem is not well-defined and the size of audience uncertain. However, by sharing the task of building and testing prototypes the state of the art can be advanced.

We looked at a possible wish-list for open source solutions. The immediate suggestion was to produce tools for analysing data from established commercial products such as

  • Zeo (event sponsor)
  • FitBit
  • Withings
  • Jawbone Up
  • Polar Wearlink etc (see QS guide)

At this point we ran out of time for discussion. The above off-the-shelf devices have the sensors and some of the communication, data storage and visualisation, which leaves me with a question: what more do we need to do with them? Based on Adriana’s input I think we are talking about getting the data out and presenting it in different ways. Rain and I are equally interested in the sensors themselves. To be further discussed.
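To make “tools for analysing data” a little more concrete, here is the sort of thing I have in mind, in Python. The CSV layout and column names are invented for illustration only; every vendor’s real export looks different:

```python
import csv
from io import StringIO

# Hypothetical export format -- real devices (FitBit etc.) each have their own.
SAMPLE = """date,steps
2012-06-01,8200
2012-06-02,10450
2012-06-03,4300
2012-06-04,12100
"""

def mean_daily_steps(csv_text):
    """Return the mean daily step count from a date,steps CSV."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    total = sum(int(r['steps']) for r in rows)
    return total / float(len(rows))

print(mean_daily_steps(SAMPLE))  # mean over the sample days
```

Trivial, of course, but even this level of summary is out of reach if you can’t get your data out of the walled garden in the first place.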

The next step is to formulate a project. However, it’s very important not to waste effort and momentum on re-inventing any wheels as significant effort may be needed regardless of what we choose to do.  It’s also important to go after challenges that are both important and useful.  Boiling the whole ocean is not an option without significant funding so we need to start with a specific and achievable objective and build on it from there.

I therefore proposed a survey of the London group to establish what should be addressed first. I will put this out in another blog or poll shortly. This will be about priorities and potential “quick wins”.

PS. As I am personally interested in heart-rate monitoring at the moment I googled around to see what’s cooking in the open source community in relation to this. I was pleasantly surprised to see that, if we wanted to attack this area, we would certainly not be starting from scratch. This confirms my theory that QS and Open Source go well together and that, with a bit of research we may find that some of our challenges are already being addressed.

PPS. As Adriana noted: there may be loads of s/w for data analysis and visualisation but is it usable and user-friendly for people who have no software expertise? I found even the simple hack to extract my data from FitBit difficult… And if they are usable and easy to use, where would we begin to look for them? This suggests that cataloguing the existing stuff may be one of the priority tasks. Based on that we can look at usability by QS people who are not IT specialists.

Jatrobot at OTA2012 Bletchley Park

Off we went to Station X for another OverTheAir mobile hack event.

The event was of the usual high quality with great sessions and people to hack with.  Our project was aimed at disruption of the farming industry – an idea courtesy of @herx.  This went pretty well and was recognised in a couple of the awards. Presentation slides here.

The Jatrobot collects soil temperature, moisture, GPS coordinates and other data in the field and transmits them to Cosm in real time.  From there, various real-time visualisations can be seen on a web page or tablet device.

The technologies included were Android, Arduino, various sensors, Cosm and javascript (for the real-time visualisation).  Although we could have used a USB/GSM dongle for the real-time comms we chose an Android phone instead as that provides GPS and other useful sensors.
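For the Cosm leg, an update is essentially a JSON document PUT to the feed’s URL with your API key in a header. A minimal sketch of building that document follows; the feed ID, key and datastream names are placeholders, and the endpoint details should be checked against the Cosm v2 API docs:

```python
import json

def cosm_payload(readings):
    """Build a Cosm v2 update document from {datastream_id: value} readings."""
    return json.dumps({
        "version": "1.0.0",
        "datastreams": [
            {"id": k, "current_value": str(v)} for k, v in sorted(readings.items())
        ],
    })

body = cosm_payload({"soil_temp": 14.2, "soil_moisture": 41})
print(body)
# To send: PUT https://api.cosm.com/v2/feeds/<FEED_ID>
# with header X-ApiKey: <YOUR_KEY> and the JSON above as the body.
```

On the Jatrobot the Android phone did the actual HTTP leg; the payload shape is the same either way.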

More on this to follow…

Remote temperature monitor with BeagleBone and XRF

One of the first projects I did when I got the BeagleBone was to continuously read the temperature outside our house.  This is a small part of a much bigger project but maybe of interest to others coming to this platform.

To sense the temperature I am using a sensor device from CISECO.  This comes in two flavours and I chose the wrong one.  The DALLAS version uses a lot more power and therefore needs a much larger battery whereas the “thermistor” version will run for a decent length of time on a coin cell.  Into this I placed one of their XRFs, essentially a low-cost substitute for an Xbee.  (This needs a little setup but can also be purchased ready programmed for the temperature sensor.)

At the other end I needed another XRF wired up to my BeagleBone.  Given that I was in prototyping mode I used the ‘Bone’s proto-cape (cape == shield in ‘Bone parlance).  This connects the XRF, in an Xbee-to-breadboard adapter, to UART1 on the ‘Bone.

As the XRFs are based on 868MHz radios I am confident about the range (though I have yet to fully verify this).

Now for some code.  I wanted to do this in node.js, which is supported out of the box on the ‘Bone.  Well, almost.  There’s also an IDE called Cloud9 which is very handy but runs on an older node.js version not compatible with the node module “serialport”.  I did not want to break this with a node.js upgrade, particularly as I did not have a spare uSD card to play with.  So I did this in python, another well-supported language on the ‘Bone.

Thanks to quasiben’s blog it was easy to find out how to set up the ‘Bone without reading the enormous manual cover to cover.  This is what I ended up with:

import serial, os
import sys
import time

## for beaglebone serial IO
uart1_pin_mux = [
  ('uart1_rxd', ( 0 | (1<<5) )), # Bit 5 enables receiver on RX Pin
  ('uart1_txd', ( 0 )),          # No bits need be set for TX Pin
]
for (fname, mode) in uart1_pin_mux:
  with open(os.path.join('/sys/kernel/debug/omap_mux', fname), 'wb') as f:
    f.write("%X" % mode)

ser = serial.Serial('/dev/ttyO1', 9600, timeout=1)

## temperature data
i = 0
while i < 12: # or True once tested
  c = ser.read(1)
  while c != 'a':          # sync on the start of an LLAP message
    c = ser.read(1)
  s = 'a' + ser.read(11)   # full 12-char message: a--TEMPdd.dd
  if s[3:7] == "TEMP":     # ignore anything except temperature
    print s[7:]
  i = i + 1
This just prints the temperature every 3 seconds.  Note that this is far from a complete implementation as it does not set the device ID nor tell it how often to send the reading.  The definition of the LLAP protocol used by the XRF firmware is at
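Setting the device ID and reporting interval uses the same framing as the readings: an ‘a’, a two-character device ID, and a body padded out to 12 characters with dashes. A quick sketch of building such a message follows; the INTVL command name here is from memory, so verify it against the LLAP definition before relying on it:

```python
def llap(dev_id, body):
    """Frame an LLAP message: 'a', two-char device ID, padded to 12 chars."""
    msg = 'a' + dev_id + body
    return msg.ljust(12, '-')   # LLAP messages are always 12 characters

# Hypothetical setup sequence -- check the LLAP definition for real command names
print(llap('TS', 'INTVL005S'))  # e.g. ask device TS to report every 5 seconds
```

You would write the resulting string to the same serial port the reader uses.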

Watch this space for a version combined with other sensors and actuators, probably node.js based.  Alternatively, some other way of making this asynchronous would be needed.

Quest for a low-power home hub

I’m talking about this at #iotlondon next week.  Objective: to let people know what I’ve found so far and see if others want to collaborate.

It all started when my router blew up because I had 16 devices talking like the clappers to multiple services.  Got a new router but that’s no permanent solution.

I thought: we’ve got all these devices with their own protocols talking to all these cloud services also with their own protocols.  We need a hub in the middle to make sense of this.  But, this hub needs to be powerful, versatile, low-cost and low in power consumption (as it’ll be always on).

That combination of attributes turns out to be challenging.  I’m looking at a number of hardware and software platforms and have some interim results.  Also building some demo apps to apply a little stress testing.

This hasn’t been plain sailing.  Glad I gave myself a deadline for this talk.

The quest continues.  Anyone got any spare coconut shells?

Update:  The session seemed to go well and useful input was received. Slides on slideshare.

@andypiper mentioned AIKO, an interesting platform that deserves a detailed look.  qp is another one I need to look at further.  The emotional pull of node.js is irresistible at the moment (such a time-saver) so support for this could end up as a key requirement.  Off to lnug (London) meetup tonight to top up my knowledge and contacts.

The more I think about it the more this is indeed a grail quest.

Hack Yourself (QS) Session at NESTA 3rd May

8am start in London means I’m up no later than 5:30.  Or was I dreaming?  No, the Quantified Self “movement” is gathering momentum.  In London this is thanks to the QS Meetup led by Adriana Lukas, one of the speakers at this session.  Great that NESTA picked this as hot topic.  They even posted a video!

Because of the document referred to in the second link, there is little I need to say about the talks themselves other than that they were very well done. In the questions after the talks some issues cropped up that are dear to my heart (with my “Internet of Things” hat on).

Architecture:  Many people seem to be developing architectures but there’s little sign of an emerging de-facto standard.  This is for the usual reasons.  Product vendors (Adriana cited Fitbit) produce end-to-end “stovepipe” solutions that sit in “walled gardens” that they control.  This allows them to get continuing income from selling us back our data to help fund product development (goodness) at the expense of making integration between systems extremely painful.  We need an architecture that’s vendor independent and based on open data principles, with privacy controls.

Tools:  The situation may be better than people think as there are a lot of good tools around.  These tools are not, however, well suited to people who just want to use the data.  For example, getting data out of a proprietary system and uploading it to somewhere else for analysis requires a fair amount of programming skill and not everyone wants to learn that.  This may be inevitable at this stage of market development.

Gateways:  Given that we’ll be facing this stovepipe problem for years to come and tools may be too technical for most users, it will be useful to have devices that are designed to translate formats and protocols straight out of the box.  With colleagues, we are looking into what could be done in this area.

Privacy/ Security:  As was pointed out in the talks, this is fairly fundamental if we are talking about personal data.  We need to be able to access our own data (all of it), grant access to others with a legitimate need and feel confident that unauthorised access will not occur.  Part of this is technical; there are well-proven mechanisms for encryption, access control etc.  The commercial, legal and cultural issues are more complex.

“Bill of Rights”: Attempts to codify appropriate behaviour in relation to data are underway. At the Open Internet of Things Assembly in June this will be on the agenda and there’s already an active googlegroup on the subject (iot-open-data).  It’s too early to say how this will progress and to what extent it will be adopted or enforced.

Data Analysis: This is a tougher challenge as the needed techniques will vary widely.  A good starting point is to look at analysis techniques in other fields of endeavour to see what can be re-used.  The case studies presented at regional QS groups should yield ideas on techniques and tools to implement them.

I look forward to further QS sessions and will continue to engage as much as I can.

OSHUG #18 – Pi and Bone

OSHUG, the UK’s Open Source Hardware User Group, goes from strength to strength so it was a pleasure to attend yesterday.

This was a special edition that included the first Raspberry Pi meetup.  The conversation went round the houses but was none the less useful.  Whilst this device is coming in at an attractive price it is currently lacking in an ecosystem to rival Arduino and MBED.  It is also suffering from an acute shortage in supply; RS guys were on hand to explain how this is being addressed.

The ecosystem discussion was interesting.  It may evolve organically.  However, IMHO some strong leadership is needed to ensure that this happens in an orderly fashion so that a collection of coherent and interchangeable building blocks evolves.

Once #OSHUG proper got started we had three great talks:

  • Open Compute: interesting to those who may need to build server farms.  In direct contrast to Google’s approach of keeping the tech to themselves.  This is more large scale engineering than hacking, even if it started that way. DC power distribution is interesting – that could find wider application. Chris Swan probably works for the smallest kind of company that would need this.
  • Beagleboards, and especially BeagleBone has matured into a really interesting platform now.  Crucially, there is a comprehensive ecosystem around it.   This looks ideal for home hubs and such like projects.  I’m particularly impressed with the IDE options and software that comes with it.  Sorry, Raspberry Pi, but even with the steep cost differential this looks like better value for money, especially if you value your time. Amongst the demos Roger Monk brought along was a BeagleBoard running Android!  Looking forward to getting my hands on one.
  • Henk Muller gave us another talk about XMOS.  This differed from OSHUG #1 in that it was more about concurrent software execution than XMOS’s rather unique hardware.  Interesting if you have the sort of project where highly predictable timing is important.

Thanks again to @9600 for impeccable stewardship of the group and to #C4CC for hosting and the regulars for showing up.

IOTLondon Meetup 28th March ’12

First, and very important after last time, the organisers had found an excellent venue in a convenient location and they even provided free beer. What’s not to like? This is really essential when you have great talks – which again we did.

First up was @andrewlindsay talking about the gateway software he’s created for Sukkin’s wireless hub device. This can handle arbitrary numbers of wireless sensors communicating at 433 or 868 Mhz. Thanks to an included socket it’s also amenable to Xbee or XRF wireless connections. It’s MBED based so there’s enough space to do the software properly, something that’s almost impossible on current Arduinos.

At 2 watts, the device is eminently suitable to being always on, something that many boxes are not. Being open source we’ll never be limited in the directions we could take the software that Andrew is creating.

Andrew has only tried a few sensors so far but, interestingly, these include air pressure and dust particles. He’s using Jeenodes at the remote end.  He plans all sorts of further enhancements, notably a web-based configuration facility.  Excellent stuff!

@rollinson (Jeremy) then talked about “Open Telematics”, a mobile (currently Android) platform for telemetry that posts data to pachube. The first outing of this is a vehicle app that hooks up to the CAN and OBDII buses now legally required in all cars.

The platform uses the phone for GPRS, accelerometer, GPS and screen and hooks up to the CANBUS via a proprietary dongle. I want one. I’ve signed up for their beta programme.

As Jeremy pointed out, this is unique in that people will be able to get their own data, something not possible with commercial vehicle tracking systems. Datastreams will include, at least, fuel flow, rpm, speed etc. As well as offering the app for a reasonable price they have plans to make it social and add premium functionality. However, as we can access the data there’s nothing to stop others from innovating around this. Apparently Open Energy Monitor are already looking at how they might integrate this with their system.

So, I thought, good job I did not go to the trouble of building this. It does the job and the price is right.

Third session was from Cesar Garcia Saez and colleague from Medialab Prado Madrid. This sounds like a fabulous facility where people can come in and work together on interesting projects.

It’s open to all and inclusive, bringing together a wide range of disciplines from art to technology.  It also reaches people who are not Madrid-based through the use of wikis, video streaming and regular open calls to bring people in from other places. Something like 50-100 collaborators are involved.

As this is government funded, one of the rules of play is that the results are open sourced. The projects they do at the Medialab tend to be pre-competitive, not products ready for market so this open policy is not a problem.

Their “Smart City” project was given as an example. In this they collected air quality data and displayed it on the side of a building. Following up on this they are getting involved in the Air Quality Egg project (now active on kickstarter).

All in all an excellent meetup. Many thanks to Ed and Alex. Keep up the good work.

EcoBuild 2012

Ecobuild came around again this week and I braved crowded tubes to make it down to ExCel. If you needed a solar panel look no further. I did not. Instead, I was there to see what was happening on the energy gadget front. In particular, I was looking for energy monitors and heating controls that I could recommend to friends and associates in our energy coop “Low Carbon Chilterns”.

In a much earlier blog I mentioned my fortunate chance to evaluate one of the first advanced heating controllers dubbed “Intuition”. This has now come to market under the OWL brand with numerous improvements and has become something that no home should be without.

We used to have a “state of the art” system with wireless thermostat and multi-period programmer – or so we thought. However, through careful monitoring of gas used I discovered that our system was far from optimised. With continually rising costs this was getting expensive. The main reason is that most of the currently-available programmers are not designed for humans to use. Three buttons pressed in some hideous combination allow any setting of time period. Our programmer not being linked to the thermostats, we only had one temperature setting for hot water and one for air. Moreover, this programmer is so tedious that most people do not set it optimally or change it as often as necessary (some don’t change it at all). This is especially inefficient in spring and autumn when outside temperatures are volatile.

Management of an Intuition system is a breeze. You have a web-based user interface (my preference) and controls on your smartphones. The fact that you can easily change any setting makes a huge difference. This is pretty fundamental to modern product design. Machines should make our life easier, not try to enslave us (think last-generation video recorders).

You don’t want to keep the water shower-hot 24/7 just in case you might feel like a shower. Instead, have it hot when you know you will need it and any other time just punch one button to bring it up to temperature.

You may have heard of the “optimum start” feature of some modern boilers. Intuition’s version of this is based on your settings that say something like “give us 45 degrees for our showers at 6am M-F and 8am on weekends”. The system looks at the outside temperature and calculates when to turn the boiler on and off.

The latest version of this system is also extremely easy to install, it simply replaces your existing thermostats so anyone with basic electrical skills can do it.

The net effect in our case has been a significant saving of energy. Since heating accounts for 80% of energy used domestically this is something that everyone should pay attention to. In our case the new heating controls are only one of many things we have done to effect a 30% saving in our energy bill. However, I would attribute at least half of this to accurate control of heating. This constitutes a sub-one-year payback with the current model.
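To make that payback arithmetic concrete, here is the back-of-envelope calculation with purely illustrative numbers; substitute your own bill and controller price:

```python
# Purely illustrative figures -- substitute your own bill and controller price.
annual_bill = 1200.0       # hypothetical annual energy spend, in pounds
total_saving = 0.30        # our overall measured saving across all measures
heating_share = 0.5        # fraction of that saving attributed to heating control
controller_cost = 150.0    # hypothetical installed price

saving_per_year = annual_bill * total_saving * heating_share
payback_years = controller_cost / saving_per_year
print("Saving: %.0f/yr, payback: %.2f years" % (saving_per_year, payback_years))
```

With any remotely typical bill the payback lands comfortably under a year, which is the point.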

BTW. A modern boiler is a pre-requisite that I sometimes take for granted. Old ones may be more reliable but they are rarely efficient. Make sure there’s a “diverter valve” – not all systems have the ability to heat just the air or just the hot water. You need a plumber to install one of these, hopefully you have one already. With this in place you have separate control.

Back to Ecobuild and some other products that caught my eye as I rushed past all the panels.

Energy monitors are coming of age. I don’t subscribe to the idea that a little box on your windowsill will give you enough information to justify changes in energy-use behaviour. Put simply, you’ve got to have graphs of energy vs. time. CurrentCost were first to market with affordable web-based electricity monitoring and I’ve been using theirs for some time. What I always wanted first and foremost was gas so for that I had to build my own. However, electricity monitors are widely available and inexpensive considering the savings they can produce if properly used.

Now, a number of other suppliers are jumping into this, OWL and Energeco being two displaying their wares at the show. The OWL device, also dubbed “Intuition” is a neat little box that adds onto the familiar OWL monitor. This looks particularly interesting for installations with PV as well as grid power. I will evaluate this ASAP.

Another interesting type of gadget is one that diverts any excess energy from a PV array into heating hot water.  This is a relatively new category and I confess not to understand exactly how these work.  Something to watch out for.

As ever, watch this space for thoughts and developments on energy gadgets.

New Pachube Apps Facility

In the beginning (April 2011) there was a beta apps platform (announced at the hackathon) that allowed the creation of apps that other users could use with their datastreams.  This has now been superseded and a new facility is being tested prior to launch.  This is based on OAuth as the way of allowing a given app to use one’s data.

The new process for the user

  • Go to the app’s home page
  • Click on the install link
  • Authenticate at pachube
  • Configure the app, associating the user’s data feeds
  • App is now ready to use

How it works

The diagram shows a four step process for the user.  The grey boxes are the app and the white ones are pachube.  R shows redirects and dotted lines show server-server communication.

The pachube OAuth API spec shows how to implement this.  I did a version in PHP which took an hour or so to implement.  This is currently specific to my apps but I plan to generalise it in due course.  It’s not the whole solution because the configuration piece will tend to be app-specific.
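For anyone wanting to try the same without PHP, the first redirect is easy to sketch in Python. The endpoint URL and parameter names below are placeholders; the real values come from the pachube OAuth API spec:

```python
try:
    from urllib.parse import urlencode   # Python 3
except ImportError:
    from urllib import urlencode         # Python 2

def authorize_url(base, client_id, redirect_uri):
    """Build the URL the user is redirected to in step 1 of the OAuth flow."""
    params = urlencode([
        ('response_type', 'code'),       # ask for an authorization code
        ('client_id', client_id),
        ('redirect_uri', redirect_uri),
    ])
    return base + '?' + params

# Placeholder endpoint -- take the real one from the pachube OAuth spec
print(authorize_url('https://example.com/oauth/authorize',
                    'MY_APP_ID', 'https://myapp.example.com/callback'))
```

The server-to-server leg (exchanging the returned code for a token) follows the same pattern and is where the app secret gets used.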

Associating data to your app

This used to be handled by the pachube apps installer which is no more.  This is no great loss as the original UI, being generic, could cause user confusion.  Now that the developer can roll their own UI you can make it as usable as you like.

A first demo of this approach

Here’s an example.  After authenticating with pachube I take the user here where they can select the feeds and datastreams to use.  This needs explanations of course but the important feature is that both the app and the data semantics are visible together and clearly related.

Once the user confirms this configuration we can save it and take them straight to the app itself and set it running.  At this point there is no data store within pachube for the configuration.  Again, not a problem as you are free to store this wherever you like.

Limitations/ outlook/ suggestions

The current OAuth mechanism gives an app access to all the feeds belonging to the currently logged-in user.  This is fine for now.  However, I foresee that users will have certain feeds and/or datastreams that need to be more private than that.  I understand that this is in the pipeline from pachube. It should be quite easy to retrofit this capability when the time comes if you cater for it in your design (each datastream will need to know what access the key provides).

Another issue is that each app needs to replicate the fields in pachube’s “App” entity (description, urls etc).  It would be useful if this could be accessible via an API so that the two could be sync’d (using the app owner’s master key).  Ideally, the app should be able to instantiate itself within pachube using an API rather than manually – this would avoid any accidental transcription errors.

The current (beta) versions of a couple of apps are at  I have asked how you get apps onto (ie more generally available).  No answer on that while the overall facility is still being completed.

Watch this space for more info as we test this more fully.