Software still at the heart of IoT

Earlier today, I was quoted in Drew Turney’s Tech giants get ready for Internet of Things operating systems article for The Age.

The article explores the relevance of ‘dedicated’ IoT systems, like GE’s Predix.

I’d like to expand on this quote:

“The opportunity of IoT lies in integrating physical intelligence right through to business processes, and back out again”

Much of the current discussion around IoT is focussed on cheap sensors, platform interoperability, and data analytics. These are all important building blocks, but they don’t really speak to the power of IoT for me.

We’ve spent two decades mashing up databases. Disconnected datasets now annoy even our least technical friends.

We spent the last decade mashing up web services. It’s mind-boggling that I can add a high-quality, interactive map with global coverage straight into an app, and then spend longer trying to generate all the different icon sizes required to publish it.

We’ll spend this decade mashing up the physical world. We’re nearing the point that it’s as easy to connect to your toothbrush as it is to connect to a web service.

Software remains at the heart of all this: it’s just that we can now reach further than ever before. Rather than waiting for input data, we can just go and get it. Rather than sending an alert, we can just go and start/stop/repair/move/etc. whatever we need to.

Separately, it was encouraging to see security raised several times. A device that’s too small and dumb to run the math required for encryption is probably not something to be exposed to the public internet.

And of course, it’s always nice to see Readify’s name alongside the likes of Intel, GE, and CSIRO. :)

Make one thing better, starting with your bed.

U.S. Navy Admiral Bill McRaven:

It was a simple task — mundane at best. But every morning we were required to make our bed to perfection. It seemed a little ridiculous at the time, particularly in light of the fact that we were aspiring to be real warriors, tough battle hardened SEALs — but the wisdom of this simple act has been proven to me many times over. If you make your bed every morning you will have accomplished the first task of the day. It will give you a small sense of pride and it will encourage you to do another task and another and another. By the end of the day, that one task completed will have turned into many tasks completed. Making your bed will also reinforce the fact that little things in life matter. If you can’t do the little things right, you will never do the big things right. And, if by chance you have a miserable day, you will come home to a bed that is made — that you made — and a made bed gives you encouragement that tomorrow will be better. If you want to change the world, start off by making your bed.



Readify is a very culturally diverse organisation: a quick scan of my inbox right now shows names like Korczynski, Mutandwa, Shah, Saikovski and The. “Tatham” isn’t exactly simple either. We’re also distributed across many different client sites spread around the country, which means a lot of our conversation is via email.

I recently came across an Aussie startup called, via a Shoestring article. I’m a sucker for well implemented simple ideas, so I gave it a go.

This is what my email signature now looks like, with an extra “Listen” link:

Tatham Oddie (Listen) | Readify Chief Information Officer | m +61 123 456 789

I like it, not because I’m precious about my own name (I’m really not), but because I like knowing how to pronounce other people’s. There’s been a bit of adoption across Readify already, and I look forward to seeing it grow.

It’s hard to tell how useful it is, as it’s not the type of thing people call out a lot. I have been tracking click-throughs though, and it’s getting a few.

PS. If you’re wondering, here’s how you pronounce “The”:

Nerd Corner: Convert a Mercurial (Hg) repo to Git, with full fidelity, on any OS

Fortunately or unfortunately, Git won over Mercurial. I placed a few bets on Mercurial at the time, so I have a bit of a tail of repositories left to convert.

Converting on Windows with full fidelity isn’t really possible. None of the scripts work well, and the case-insensitive file system can cause issues. Luckily, Windows Azure makes it super easy to borrow a small Linux instance quickly.

I’ve documented what I do in this post. Anybody with a web browser can follow these steps, on any platform. It looks like a lot of steps, but that’s just because I’m spelling out every last detail for clarity.

Create a Linux VM in Windows Azure

  1. Sign in to the Windows Azure Management Portal
  2. Create a new VM from the gallery
  3. Choose an Ubuntu release. As of this post, I chose Ubuntu Server 13.10.
  4. Name the VM anything you want
  5. Untick “Upload compatible SSH key for authentication”, unless you know what you’re doing there
  6. Tick “Provide a password”
  7. Leave all the rest of the defaults, and just keep clicking Next
  8. Wait a moment for the VM to get provisioned
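
If you’d rather script the VM creation than click through the wizard, the cross-platform azure command line tool can do the same job. This is a minimal sketch rather than the wizard flow above: it assumes you have the CLI installed and have looked up a real Ubuntu image name first (the <placeholders> below are mine, not real values):

    # Find a current Ubuntu image name to use
    azure vm image list | grep -i ubuntu

    # Create a small VM with password authentication and SSH enabled
    azure vm create git-convert <ubuntu-image-name> azureuser <password> --ssh --location "West US"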

Connect to the VM

For this, we’ll just be connecting to a command line via SSH: no GUIs will be harmed.

Because SSH is so prevalent, there are tool chains available for every platform. I’m actually writing this post on my Surface RT (not Pro), using an app called SSH-RT from the Windows Store.

  1. Connect to the DNS name for your new VM (mine was git-convert.cloudapp.net)
  2. Use the username and password you established during the wizard
  3. You should now be at a command line like azureuser@git-convert:~$
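
If you’re on a Mac or Linux machine, the whole connection step is a single command in the terminal (using my example VM name from above; substitute your own):

    ssh azureuser@git-convert.cloudapp.net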

Install Git and Hg on the VM

Ubuntu doesn’t ship with Git or Mercurial installed by default, but it does have an awesome package manager called apt-get.

  1. Run sudo apt-get install git
  2. Run sudo apt-get install mercurial

The sudo prefix is a command to elevate your permissions, kind of like a UAC prompt on Windows.
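
As a side note, apt-get happily accepts multiple packages at once, so you can collapse both installs into one command if you prefer:

    sudo apt-get install git mercurial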

Clone hg-fast-export on to the VM

We’ll be using a tool called hg-fast-export to convert the Mercurial repository to Git, without having to replay each individual changeset like some tools do. This tool is in a Git repo, so we’ll just clone that repository down in order to get it onto the VM.

  1. Run git clone https://github.com/frej/fast-export.git

Clone your Mercurial repository on to the VM

For the sake of simplicity, we’re just going to use HTTPS instead of SSH.

  1. Run hg clone https://your/hg/repo/address

Export your Mercurial repository to a new Git one

  1. Create a new folder for your Git repository: mkdir your-repo-git
  2. Change to that folder: cd your-repo-git
  3. Initialize an empty Git repository there: git init
  4. Do the fast export: ../fast-export/hg-fast-export.sh -r ../your-repo/
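
One gotcha worth knowing: the export writes straight into Git’s history, so the working folder will still look empty when it finishes. The commits are all there; if you want to see the files on disk before pushing, reset the working tree to the imported head:

    git reset --hard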

Upload your Git repository to your Git hosting

  1. Add the remote: git remote add origin https://your/git/repo/address
  2. Push up all branches: git push -u origin --all
  3. Push up all tags (the --all flag only covers branches): git push origin --tags
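
Before moving on, it’s worth a quick sanity check that the tips of the two repositories match up:

    cd ../your-repo && hg log -l 1        # newest changeset in the Mercurial clone
    cd ../your-repo-git && git log -1     # newest commit in the converted Git repo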

Convert Hg-specific config to Git

Take the opportunity now to convert your .hgignore file to an equivalent .gitignore one. You can go and do this back on your own machine.
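
As a hypothetical example, an .hgignore that uses glob syntax:

    syntax: glob
    bin/
    *.user

becomes this .gitignore (Git patterns are glob-style by default, so the syntax line simply disappears):

    bin/
    *.user

If your .hgignore uses Mercurial’s default regexp syntax instead, each pattern will need translating by hand.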

Delete the VM

Back in the Azure Management Portal, delete the VM. When you do this, choose to “delete the attached disks”. (It will ask you.)
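
If you went the command line route earlier, the same CLI can clean up too. I believe the classic tool exposed a blob-delete flag for exactly this; treat the flag names as an assumption and confirm with azure vm delete --help first:

    # Delete the VM and its underlying disk blobs, without prompting
    azure vm delete git-convert --blob-delete --quiet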

All done!

You’re all done. Wasn’t that just a perfect, easy use of the cloud?

Being an open recipient

Lately, I’ve been reading One Strategy: Organization, Planning, and Decision Making. It’s a collection of Steven Sinofsky’s internal blog posts while he ran the Windows division, with some light analysis by Marco Iansiti. (So far, the blog posts are more interesting than the analysis.)

This quote stuck with me (page 30, Kindle location 746):

We are still not sending around (locally) enough meeting notes and not sharing information more freely. And part of that is asking people to be more receptive to “raw data” and less demanding of “tell me what is important” because with empowerment comes the need to process more data and manage the flow of information. For our process to work smoothly we do need more communication.

Within Readify, we’re seeing a fast growth of shared OneNote notebooks. They’re like wikis on steroids: near real-time multi-user editing, ink, images, audio, no save button, multi-device, offline support, web app, deep linking right down to a specific paragraph, and more. They’re an insanely useful part of our information flow, and deserving of their own post another time.

The ease of access that comes with these pervasive notebooks has lowered the bar for content capture. And it’s great.

Instead of some formal documentation requirement that gets missed, we’re now able to capture the as-it-happens notes. After 10 years of consulting, we’re finally seeing a really rich knowledge base about our engagements get synchronized back into SharePoint instead of living in the heads of individual consultants. Call notes, meeting notes, architecture diagrams, sprint reviews, pre-sales meetings, org charts and whiteboard photos all end up in the notebook now. When I go to a pre-sales meeting, the account manager and I are both recording our different notes straight into the same page of a notebook in real-time, then one of us can snap a pic of the whiteboard as we leave. (SharePoint + 4G-enabled devices are the back-end plumbing here.)

These notes don’t provide the full context of a project, but they capture a series of events that cumulatively provide much of that context. They aren’t an analysis of the events either; they’re a summary, closer to a transcript. But that’s all ok, because they create visibility across our teams and open conversations we weren’t having before. Seeing this transition sweep across our business, I have to say that I wholeheartedly agree with Steven’s views.