But does it work in IE?

The state of the OS and web browser market

UPDATE: Android surpasses Windows as the world’s most popular operating system for the first time. Windows’ decline continues.

Original article follows…

An update for you with some interesting stats for Internet-using computers of all shapes and sizes. You will see they are continuing to follow a long-term trend. (TL;DR people are continuing to move from desktop PCs to devices, and Windows market share is dropping as a result.)

Firstly, it looks like Android may surpass Windows as the most popular OS for Internet users (globally) sometime in the next few months.

Of course, these figures are somewhat skewed by the bazillions of users in Asia, but the trend is also visible in other markets as people turn away from desktops and towards devices.

Europe remains strongest for Windows, overall, though its decline there is still pretty consistent.

North America reveals, perhaps predictably, a higher share of Apple devotees than other regions. iOS is the second-most popular OS there.

Think about that for a moment.

In the US, 38% of users are now using an Apple-branded computer or device to access the Internet. 60% are now using an OS other than Windows.

A mere five years ago Windows had 75% of the market, and such a decline would have seemed unthinkable. If the long-term trend continues, in six months Windows’ share will be half that figure.

Turning to the Web Browser market, we see the market share of IE is now below 5% globally. Millions of web developers cry “hurrah!”.

Perhaps surprisingly, Edge does not even rate a mention globally yet, lagging behind even the perpetual bridesmaid Opera in popularity. Chrome continues to grow in popularity and market share.

Even in the US, traditionally a market with strong IE support, IE continues its steady downward trend and now sits at 8.1%. Edge is growing, but almost imperceptibly – certainly not as fast as IE is shedding market share – and now sits at 3.4%. By contrast, check out those Safari numbers!

In Europe, IE and Edge combined are at about 8% of the market, with Edge again barely growing share as IE continues to drop year on year.

What does it all mean?

Well, if you’re a web developer it’s all pretty good news, showing that the worst browsers ever invented™ continue their slide into obscurity. It also tells us that if we plan to target China and India, we probably need to start testing web apps in the UC browser on Android devices.

If you’re a desktop app developer, you might want to consider what is compelling about the desktop environment and make sure you play to its strengths. You might also consider getting some experience in developing for mobile.

A counterbalance to all this: many of the customers we service run corporate networks that are slow to embrace change, and we can take some comfort from that inertia acting as a brake on those customers asking for something different.

Unfortunately, it also means we’ll keep getting the question “yes, but does it work in IE?”.

As my teenagers would say, “kill me now”.

What a time to be alive!

When I was at high school in the 80s, computers were about the most boring things I could imagine. They couldn’t do anything cool, unless your idea of cool was maths, and to program them was like talking very slowly to a barely literate person with an IQ of 50.

In the 90s, things changed.

By the 90s, computers had become capable of doing things for you that you couldn’t do better by hand. In the 90s, they started connecting to one another and becoming part of the Internet we all take for granted today. In the 90s, computers started waiting for us, instead of the other way around.

Let’s do stuff!

I got my first computer in the 90s, and immediately started a business doing digital imaging using Photoshop. Kodak at this point was still sleeping peacefully, figuring all this digital stuff was a fad that would be over soon.

The world wide web was hitting the news in the 90s. It took a while for people to figure out what it was, but when they did, the web took off exponentially. Even the dot-com meltdown in 2001 couldn’t really slow it down.

I started building for the web in the mid 90s and have been doing it ever since. In that time I have seen many incredible advances, and some monumental follies*, from vector-animation to 3D, streaming audio and video, WebSockets and WebGL through to initiatives like WebAssembly. The web just keeps getting stronger and more capable. Importantly, it has also stayed open, despite the best attempts by some companies to subvert it.

But even 10 years ago, few would have foreseen how different computing was going to be today.

Clouds appear

In the last 10 years, computing has gone from something we do at a desk or in server rooms to something we do everywhere, all the time. We are all carrying around computers in our pockets. We are seeing tiny cheap computers being built into every corner of our environments – from the smart TV to the wearable activity tracker and the smart watch, to the lightbulbs in your house and the locks on your doors. And it’s all connected via the web.

This is why, for me, the two most exciting technology trends today are Cloud computing and the Internet of Things.

Cloud computing got a lot of hype in the early years and some of it was just silly. The cloud’s infrastructure isn’t that different to what preceded it – it’s still run on servers in data centres, just like we did things in the past. What is different is how commoditised computing resources are changing the nature of computing itself. Servers are no longer purpose-built boxes in a DC that are configured to do one thing. Now they are simply a source of computing resources that can be abstracted away by higher level services. This means they can deliver the outcome we want without all the configuration and systems admin we used to have to do to get that outcome.

So while virtual machines that scale are nice, they are far from the most important thing that cloud computing has unlocked. By removing the need for me to manage my own servers, cloud computing has freed me to focus on the value I want my application to provide. Almost inevitably, this has led to the concept of serverless architectures, where my application is only the code I need and nothing more. The cloud replaces the server stack I would otherwise spend my time maintaining.
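To make “only the code I need” concrete, here is a minimal sketch assuming an AWS Lambda-style Python handler. The event fields and the greeting logic are invented for the example; a real deployment would also need a trigger (API Gateway, a queue, etc.) and an IAM role configured in the cloud.

```python
# A sketch of a "serverless" application: this function is the whole app.
# No web server, no OS patching, no server stack to maintain -- the cloud
# platform supplies all of that and simply invokes the handler.

def handler(event, context):
    """Return a greeting for the name supplied in the event.

    The event shape here ({"name": ...}) is hypothetical; in practice
    it is defined by whatever service triggers the function.
    """
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}


if __name__ == "__main__":
    # The platform would invoke handler() for us; locally we can just call it.
    print(handler({"name": "serverless"}, None))
```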

New ways to think of software

This kind of thinking is opening up new ways to build applications. An example of this is AWS Step Functions, where an entire application can be pulled together via a visual workflow. Likewise, tools like AWS Simple Workflow Service offer ways to orchestrate your code in a serverless environment, then build it out and connect it to systems hosted elsewhere, and even to processes that exist in the non-virtual world. Tools like these are facilitating an increased connectedness, which in turn opens up new ideas as to what a software application is, and what it could be.
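For a flavour of what such a workflow looks like under the visual editor, here is a small Amazon States Language state machine expressed as a Python dict. The state names, the order-validation flow and the Lambda ARNs are hypothetical placeholders, not a real deployment.

```python
import json

# A sketch of an AWS Step Functions state machine: validate an order,
# then branch to a charge step or a failure state. Each "Task" state
# would point at real code (e.g. a Lambda function) in a deployment.
state_machine = {
    "Comment": "Order pipeline sketch: validate, then charge or reject.",
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:validate",
            "Next": "IsValid",
        },
        "IsValid": {
            "Type": "Choice",
            "Choices": [
                {"Variable": "$.valid", "BooleanEquals": True, "Next": "ChargeCard"}
            ],
            "Default": "RejectOrder",
        },
        "ChargeCard": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:charge",
            "End": True,
        },
        "RejectOrder": {"Type": "Fail", "Error": "InvalidOrder"},
    },
}

# The definition is plain JSON; the visual workflow is just a rendering of it.
print(json.dumps(state_machine, indent=2))
```

The point is that the entire control flow of the application lives in this declarative definition, with the individual states wired to whatever code or external systems do the actual work.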

And then, humming at the edges of all that new cloud-enabled capability are the huge numbers of IoT devices that are popping up daily in our lives.

Devices everywhere

Before we had smartphones, who would have thought it would be useful for everyone to carry around a GPS receiver? Now we can’t live without them. This is just one familiar example of the IoT world that is heading our way, as we measure, monitor and report on more and more of the metrics we encounter in our everyday lives. Heart rate, steps taken, how much electricity we’re consuming, room temperature, environmental noise, pollution levels, security camera footage… it’s all being picked up and turned into knowledge we can use to improve our lives.

In industry, condition monitoring is a huge growth area, again driven in large part by low cost computer hardware. You can now put a $100 vibration monitor on a truck and collect that data. The data can allow you to predict when it will need servicing, which can save your company the cost of unscheduled downtime. The economics of this are becoming a no-brainer as computing hardware gets cheaper and smaller and wireless networking becomes increasingly ubiquitous.
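As a toy illustration of the idea, a condition monitor can be as simple as a rolling RMS check over vibration samples. The window size and threshold below are made-up numbers; real alarm limits come from baselining the specific machine (and from standards such as ISO 10816).

```python
import math

def rms(window):
    """Root-mean-square of one window of vibration samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def needs_service(samples, window_size=4, threshold=2.0):
    """Flag a machine for servicing when any window's RMS vibration
    exceeds the threshold.

    window_size and threshold are illustrative values only; in practice
    they are tuned per machine from baseline measurements.
    """
    return any(
        rms(samples[i:i + window_size]) > threshold
        for i in range(0, len(samples) - window_size + 1)
    )

# A healthy hum vs. a developing fault:
print(needs_service([0.5, 0.4, 0.6, 0.5, 0.4, 0.5]))   # steady low vibration
print(needs_service([0.5, 0.4, 3.1, 3.3, 2.9, 3.0]))   # sustained spike
```

Trivial as it is, this is the core loop of predictive maintenance: cheap sensor, cheap computation, and a service callout before the truck fails on the road.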

One interesting result of the rise of IoT is how the cutting edge of computing has come full circle. In a world where servers are now being commoditised and abstracted away, there is renewed interest in physical computing. People are building their own devices, and plugging them into the cloud. They are getting reacquainted with low-level knowledge, like how serial communications work. They are learning how to gather data from sensors over GPIO pins on a circuit board. It’s an interesting development and one that bodes well for humanity, I think. It gets us back in touch with the magic of what, as a species, we’ve achieved over the last century.
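As a taste of that low-level work, here is a sketch of framing a sensor reading for a serial link. The frame layout (start byte, big-endian 16-bit reading, one-byte XOR checksum) is invented for illustration and does not correspond to any particular device’s protocol.

```python
import struct

START = 0x7E  # arbitrary start-of-frame marker for this made-up protocol

def encode_frame(reading: int) -> bytes:
    """Pack one unsigned 16-bit sensor reading into a framed packet."""
    payload = struct.pack(">H", reading)   # big-endian uint16
    checksum = payload[0] ^ payload[1]     # 1-byte XOR checksum
    return bytes([START]) + payload + bytes([checksum])

def decode_frame(frame: bytes) -> int:
    """Unpack and verify a frame; raise if it is corrupt."""
    if len(frame) != 4 or frame[0] != START:
        raise ValueError("bad frame")
    if frame[1] ^ frame[2] != frame[3]:
        raise ValueError("checksum mismatch")
    return struct.unpack(">H", frame[1:3])[0]

# Round-trip a reading, as the receiving end of a serial link would.
frame = encode_frame(1023)
print(frame.hex(), decode_frame(frame))
```

It is exactly this kind of byte-level plumbing — invisible for years behind drivers and frameworks — that building your own devices brings back into view.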


* You can put the proprietary Microsoft Network and Rupert Murdoch’s purchase of a dying MySpace in that column.