When I was at high school in the 80s, computers were about the most boring things I could imagine. They couldn’t do anything cool, unless your idea of cool was maths, and to program them was like talking very slowly to a barely literate person with an IQ of 50.
In the 90s, things changed.
By the 90s, computers had become capable of doing things for you that you couldn’t do better by hand. In the 90s, they started connecting to one another and becoming part of the Internet we all take for granted today. In the 90s, computers started waiting for us, instead of the other way around.
Let’s do stuff!
I got my first computer in the 90s, and immediately started a business doing digital imaging using Photoshop. Kodak at this point was still sleeping peacefully, figuring all this digital stuff was a fad that would be over soon.
The World Wide Web was hitting the news in the 90s. It took a while for people to figure out what it was, but when they did, the web took off exponentially. Even the dot-com meltdown in 2001 couldn’t really slow it down.
I started building for the web in the mid-90s and have been doing it ever since. In that time I have seen many incredible advances, and some monumental follies*, from vector animation to 3D, streaming audio and video, WebSockets and WebGL, through to initiatives like WebAssembly. The web just keeps getting stronger and more capable. Importantly, it has also stayed open, despite the best attempts by some companies to subvert it.
But even 10 years ago, few would have foreseen how different computing was going to be today.
Clouds appear
In the last 10 years, computing has gone from something we do at a desk or in server rooms to something we do everywhere, all the time. We are all carrying around computers in our pockets. We are seeing tiny, cheap computers being built into every corner of our environments – from the smart TV to the wearable activity tracker and the smart watch, to the lightbulbs in our houses and the locks on our doors. And it’s all connected via the web.
This is why, for me, the two most exciting technology trends today are Cloud computing and the Internet of Things.
Cloud computing got a lot of hype in the early years and some of it was just silly. The cloud’s infrastructure isn’t that different to what preceded it – it’s still run on servers in data centres, just like we did things in the past. What is different is how commoditised computing resources are changing the nature of computing itself. Servers are no longer purpose-built boxes in a data centre, configured to do one thing. Now they are simply a source of computing resources that can be abstracted away by higher-level services. This means they can deliver the outcome we want without all the configuration and systems administration we used to have to do to get it.
So while virtual machines that scale are nice, they are far from the most important thing that cloud computing has unlocked. By removing the need for me to manage my own servers, cloud computing has freed me to focus on the value I want my application to provide. Almost inevitably, this has led to the concept of serverless architectures, where my application is only the code I need and nothing more. The cloud replaces the server stack I would otherwise spend my time maintaining.
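To make that concrete, here’s a minimal sketch of the serverless idea, assuming AWS Lambda’s Python runtime. The function below is the entire application artefact; the event shape and the greeting logic are invented for illustration.

```python
# A minimal sketch of a serverless function, assuming AWS Lambda's Python
# runtime. There is no server stack here to maintain: the cloud invokes
# handler() when the triggering event arrives. The event fields are
# hypothetical.
import json

def handler(event, context):
    """Entry point called by the cloud provider with the triggering event."""
    # The provider passes the event (an HTTP request, a queue message,
    # a file upload...) as a plain dict.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```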
New ways to think of software
This kind of thinking is opening up new ways to build applications. An example of this is AWS Step Functions, where an entire application can be pulled together via a visual workflow. Likewise, tools like Amazon Simple Workflow Service offer ways to orchestrate your code in a serverless environment, and then to build it out and connect it to systems hosted elsewhere, and even to processes that exist in the non-virtual world. Tools like these are facilitating an increased connectedness, which in turn opens up new ideas as to what a software application is, and what it could be.
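As a hedged sketch of what that looks like in practice, the snippet below describes a two-step workflow in Amazon States Language and registers it with Step Functions via boto3. The state names, Lambda ARNs and IAM role are hypothetical placeholders, not anything from a real account.

```python
# A sketch of wiring an application together as a Step Functions workflow.
# The whole "application" is the declarative definition below; the cloud
# runs it. All ARNs and names here are hypothetical.
import json
import boto3

definition = {
    "StartAt": "FetchOrder",
    "States": {
        "FetchOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:fetchOrder",
            "Next": "ChargeCard",
        },
        "ChargeCard": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:chargeCard",
            "End": True,
        },
    },
}

sfn = boto3.client("stepfunctions")
sfn.create_state_machine(
    name="OrderPipeline",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsRole",
)
```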
And then, humming at the edges of all that new cloud-enabled capability are the huge numbers of IoT devices that are popping up daily in our lives.
Devices everywhere
Before we had smartphones, who would have thought it would be useful for everyone to carry around a GPS receiver? Now we can’t live without them. This is just one familiar example of the IoT world that is heading our way, as we measure, monitor and report on more and more of the metrics we encounter in our everyday lives. Heart rate, steps taken, how much electricity we’re consuming, room temperature, environmental noise, pollution levels, security camera footage… it’s all being picked up and turned into knowledge we can use to improve our lives.
In industry, condition monitoring is a huge growth area, again driven in large part by low-cost computer hardware. You can now put a $100 vibration monitor on a truck and collect a continuous stream of vibration data. That data can allow you to predict when the truck will need servicing, which can save your company the cost of unscheduled downtime. The economics of this are becoming a no-brainer as computing hardware gets cheaper and smaller and wireless networking becomes increasingly ubiquitous.
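As a rough illustration of what such a monitor might do on the device, here is a minimal sketch that computes an RMS vibration level over a window of samples and flags the truck for servicing when it drifts above a baseline. The sensor interface, baseline and threshold are all assumptions made for the example.

```python
# A hypothetical sketch of on-device condition monitoring: compute the
# RMS vibration level over a sample window and flag servicing when it
# exceeds a baseline. read_accelerometer() and both constants are invented.
import math

BASELINE_RMS = 0.8   # assumed "healthy" vibration level, in g
ALERT_FACTOR = 1.5   # flag servicing when RMS drifts 50% above baseline

def rms(samples):
    """Root-mean-square of a window of accelerometer readings."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def needs_service(samples):
    return rms(samples) > BASELINE_RMS * ALERT_FACTOR

# e.g. needs_service(read_accelerometer(window_seconds=10))
```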
One interesting result of the rise of IoT is how the cutting edge of computing has come full circle. In a world where servers are now being commoditised and abstracted away, there is renewed interest in physical computing. People are building their own devices, and plugging them into the cloud. They are getting reacquainted with low-level knowledge, like how serial communications work. They are learning how to gather data from sensors over GPIO pins on a circuit board. It’s an interesting development and one that bodes well for humanity, I think. It gets us back in touch with the magic of what, as a species, we’ve achieved over the last century.
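For a flavour of that low-level work, here is a minimal sketch of polling a digital sensor over GPIO, assuming a Raspberry Pi with the RPi.GPIO library; the pin number, the sensor and the polling interval are assumptions for the example.

```python
# A minimal sketch of gathering data from a sensor over GPIO, assuming a
# Raspberry Pi and a simple digital sensor (e.g. a PIR motion detector)
# wired to pin 17. Pin number and polling interval are assumptions.
import time
import RPi.GPIO as GPIO

SENSOR_PIN = 17

GPIO.setmode(GPIO.BCM)           # use Broadcom pin numbering
GPIO.setup(SENSOR_PIN, GPIO.IN)  # configure the pin as an input

try:
    while True:
        if GPIO.input(SENSOR_PIN):  # pin reads HIGH when triggered
            print("Sensor triggered")
        time.sleep(0.5)
finally:
    GPIO.cleanup()               # release the pin on exit
```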
* You can put the proprietary Microsoft Network and Rupert Murdoch’s purchase of a dying MySpace in that column.