Remote temperature monitor with BeagleBone and XRF

One of the first projects I did when I got the BeagleBone was to continuously read the temperature outside our house.  This is a small part of a much bigger project, but may be of interest to others coming to this platform.

To sense the temperature I am using a sensor device from CISECO.  This comes in two flavours and I chose the wrong one: the DALLAS version uses a lot more power and therefore needs a much larger battery, whereas the “thermistor” version will run for a decent length of time on a coin cell.  Into this I placed one of their XRFs, essentially a low-cost substitute for an XBee.  (This needs a little setup but can also be purchased ready-programmed for the temperature sensor.)

At the other end I needed another XRF wired up to my BeagleBone.  As I was in prototyping mode I used the ’Bone’s proto-cape (cape == shield in ’Bone parlance).  This connects the XRF, sitting in an XBee-to-breadboard adapter, to UART1 on the ’Bone.

As the XRFs are based on 868MHz radios I am confident about the range (though I have yet to fully verify this).

Now for some code.  I wanted to do this in node.js, which is supported out of the box on the ’Bone.  Well, almost.  There’s also an IDE called Cloud9, which is very handy but runs on an older node.js version that is not compatible with the “serialport” node module.  I did not want to break this with a node.js upgrade, particularly as I did not have a spare uSD card to play with.  So I did this in Python, another well-supported language on the ’Bone.

Thanks to quasiben’s blog it was easy to find out how to set up the ’Bone without reading the enormous manual cover to cover.  This is what I ended up with:

import serial, os
import time

## BeagleBone pin mux setup for serial IO on UART1
uart1_pin_mux = [
  ('uart1_rxd', (0 | (1 << 5))),  # bit 5 enables the receiver on the RX pin
  ('uart1_txd', 0),               # no bits need be set for the TX pin
]
for (fname, mode) in uart1_pin_mux:
  with open(os.path.join('/sys/kernel/debug/omap_mux', fname), 'wb') as f:
    f.write("%X" % mode)
ser = serial.Serial('/dev/ttyO1', 9600, timeout=1)

## temperature data
i = 0
while i < 12:  # or True once tested
  c = ser.read()
  while c != 'a':  # hunt for the LLAP start character
    c = ser.read()
  s = ser.read(11)  # rest of the 12-char message: a--TEMPdd.dd
  t = s[2:6]
  if t == "TEMP":  # ignore anything except temperature
    print s[6:]
  i += 1
  if i >= 6:
    time.sleep(3)
    ser.write("a--TEMP-----")  # poll for a reading; LLAP messages are 12 chars

This prints readings as the sensor pushes them, then polls it every 3 seconds.  Note that this is far from a complete implementation: it does not set the device ID, nor tell the sensor how often to send its readings.  The definition of the LLAP protocol used by the XRF firmware is at openmicros.org.
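For the record, those two configuration steps would only take a couple more writes.  The sketch below is based on my reading of the LLAP reference (CHDEVID and INTVL are the documented commands for device ID and reporting interval); I have not tried them against this firmware, so treat the exact messages as an assumption to be checked.

# Sketch only: configure the sensor over LLAP.  CHDEVID and INTVL are
# taken from the LLAP reference at openmicros.org; check them against
# your firmware version.  Every LLAP message is exactly 12 characters,
# padded with '-'.
def llap_send(ser, dev_id, body):
  ser.write(('a' + dev_id + body).ljust(12, '-'))

llap_send(ser, '--', 'CHDEVIDTA')  # rename the device (default ID '--') to 'TA'
llap_send(ser, 'TA', 'INTVL005M')  # ask 'TA' to send a reading every 5 minutes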

Watch this space for a version combined with other sensors and actuators, probably node.js based.  Failing that, some other way of making this asynchronous will be needed.
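In the meantime, one stopgap is to push the blocking serial reads onto a background thread with Python’s standard threading module.  A minimal sketch, reusing the ser port opened above:

import threading

def reader(ser):
  # Block on serial reads in the background and print temperatures as
  # they arrive, leaving the main thread free for other work.
  while True:
    if ser.read() != 'a':  # hunt for the LLAP start character
      continue
    s = ser.read(11)       # rest of the 12-char message
    if s[2:6] == 'TEMP':
      print s[6:]

t = threading.Thread(target=reader, args=(ser,))
t.daemon = True  # don't keep the interpreter alive on exit
t.start()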

Quest for a low-power home hub

I’m talking about this at #iotlondon next week.  Objective: to let people know what I’ve found so far and see if others want to collaborate.

It all started when my router blew up because I had 16 devices talking like the clappers to multiple services.  I got a new router, but that’s no permanent solution.

I thought: we’ve got all these devices with their own protocols talking to all these cloud services also with their own protocols.  We need a hub in the middle to make sense of this.  But, this hub needs to be powerful, versatile, low-cost and low in power consumption (as it’ll be always on).

That combination of attributes turns out to be challenging.  I’m looking at a number of hardware and software platforms and have some interim results.  Also building some demo apps to apply a little stress testing.

This hasn’t been plain sailing.  Glad I gave myself a deadline for this talk.

The quest continues.  Anyone got any spare coconut shells?

Update:  The session seemed to go well and useful input was received. Slides on slideshare.

@andypiper mentioned AIKO, an interesting platform that deserves a detailed look.  qp is another one I need to look at further.  The emotional pull of node.js is irresistible at the moment (such a time-saver), so support for it could end up as a key requirement.  Off to the lnug (London) meetup tonight to top up my knowledge and contacts.

The more I think about it the more this is indeed a grail quest.

Hack Yourself (QS) Session at NESTA 3rd May

An 8am start in London means I’m up no later than 5:30.  Or was I dreaming?  No, the Quantified Self “movement” is gathering momentum.  In London this is thanks to the QS Meetup led by Adriana Lukas, one of the speakers at this session.  Great that NESTA picked this as a hot topic.  They even posted a video!

Because of the document referred to in the second link, there is little I need to say about the talks themselves other than that they were very well done.  In the questions that followed, some issues cropped up that are dear to my heart (with my “Internet of Things” hat on).

Architecture:  Many people seem to be developing architectures but there’s little sign of an emerging de-facto standard.  This is for the usual reasons.  Product vendors (Adriana cited Fitbit) produce end-to-end “stovepipe” solutions that sit in “walled gardens” they control.  This lets them earn a continuing income from selling us back our own data, which helps fund product development (goodness), at the expense of making integration between systems extremely painful.  We need an architecture that’s vendor independent and based on open data principles, with privacy controls.

Tools:  The situation may be better than people think as there are a lot of good tools around.  These tools are not, however, well suited to people who just want to use the data.  For example, getting data out of a proprietary system and uploading it somewhere else for analysis requires a fair amount of programming skill, and not everyone wants to learn that.  This may be inevitable at this stage of market development.

Gateways:  Given that we’ll be facing this stovepipe problem for years to come, and that tools may be too technical for most users, it will be useful to have devices designed to translate formats and protocols straight out of the box.  Colleagues and I are looking into what could be done in this area; a sketch of the idea follows.
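To make that concrete: the heart of such a gateway is little more than a set of per-vendor translators into one common representation.  A toy sketch in Python follows; the vendor payloads and field names are invented for illustration, and real APIs will differ.

# Toy sketch of a format-translating gateway.  The payloads and field
# names below are invented; real vendor APIs will differ.
def from_vendor_a(raw):
  # e.g. {'temp_c': 21.5, 'ts': 1335600000}
  return {'quantity': 'temperature', 'value': raw['temp_c'],
          'unit': 'C', 'time': raw['ts']}

def from_vendor_b(raw):
  # e.g. {'reading': '70.7F', 'when': '2012-05-03T08:00:00Z'}
  return {'quantity': 'temperature', 'value': float(raw['reading'][:-1]),
          'unit': raw['reading'][-1], 'time': raw['when']}

TRANSLATORS = {'vendor_a': from_vendor_a, 'vendor_b': from_vendor_b}

def translate(vendor, raw):
  # Dispatch a raw reading to the right translator.
  return TRANSLATORS[vendor](raw)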

Privacy/Security:  As was pointed out in the talks, this is fairly fundamental if we are talking about personal data.  We need to be able to access our own data (all of it), grant access to others with a legitimate need, and feel confident that unauthorised access will not occur.  Part of this is technical; there are well-proven mechanisms for encryption, access control and so on.  The commercial, legal and cultural issues are more complex.

“Bill of Rights”:  Attempts to codify appropriate behaviour in relation to data are underway.  At the Open Internet of Things Assembly in June this will be on the agenda, and there’s already an active Google group on the subject (iot-open-data).  It’s too early to say how this will progress and to what extent it will be adopted or enforced.

Data Analysis:  This is a tougher challenge as the techniques needed will vary widely.  A good starting point is to look at analysis techniques in other fields of endeavour to see what can be re-used.  The case studies presented at regional QS groups should yield ideas on techniques and tools to implement them.

I look forward to further QS sessions and will continue to engage as much as I can.