Why the market for paid-for Windows and OS X upgrades no longer exists

Remember the last time you paid for a desktop OS upgrade? We're half a year into Windows 10 with no announced plan for what happens after the free year of upgrades is over. David Gewirtz thinks paid OS upgrades are a thing of the past.
Written by David Gewirtz, Senior Contributing Editor

I've been following Ed Bott's articles on whether or not Windows 10 will have a paid-for upgrade once the free one-year upgrade program expires. In his articles, Ed is trying to extrapolate Microsoft's future plans from their current situation, historical behavior, and financial reports. Given Ed's excellent track record, his guess that paid-for Windows upgrades are a thing of the past is probably right.

But let's take a moment to first discuss how incredibly odd it is that we're half a year into Windows 10 and there isn't an announced plan for what will happen after the free year of upgrades is over.

Ed thinks Microsoft hasn't decided what it wants to do yet. This certainly fits into the Microsoft we all know, the one I once described as having this unwritten mission statement:

Microsoft builds incredibly deep, powerful, and flexible software products that -- before they see the light of day -- must be infused with a level of unnecessary inconvenience, incomprehensible restrictions, and regressive policies such that all possible joy has been removed prior to customer contact.

But there's more to it than Microsoft's internal culture. To really understand what's happening, you have to look at applications, Android, iOS, the cloud, secretive criminals working for rogue nations and organized crime syndicates -- and overlay all of that on the history of Windows operating system upgrades.

Windows through the ages

Let's start all the way back with Windows 95, released in 1995. A Web browser didn't even ship with Windows 95. Instead, Internet Explorer was originally provided with the Internet Jumpstart Kit, an add-on to Windows 95. Google didn't exist. Facebook didn't exist. Most online connections were via modem. And while there were viruses and malware, they mostly just damaged the machine they were accidentally installed on.

On the Mac side, OS X didn't exist yet. We were still in the days of the classic Mac OS; Windows 95 was introduced while Apple was still shipping System 7.

Windows 98 came a few years later, and it was really Microsoft's first Internet-aware OS. The big fuss back then was channels: on-desktop widgets with push news (think RSS in a widget). But as with Windows 95, the main work people did on Windows 98 was in locally hosted applications. Office was, as always, the Big Kahuna, but we all used desktop-based email clients. Software-as-a-Service was still, mostly, years in the future.

Apple was still finishing the migration from the 68K architecture to the PowerPC, and the Mac was running Mac OS 8. Apple sold a little over a million copies of Mac OS 8 within a week of its release. We were still buying OS upgrades back then.

Windows XP was the next big thing. Yes, I'm skipping over Windows Me and Windows 98 SE, simply because they don't really drive our story forward. XP came out in 2001 and is, despite Microsoft's best efforts, still going relatively strong today. The Internet was most definitely a thing back then, but we still conducted most of our computing on the PC and its desktop. Remember that when Windows XP came out, there still was no Facebook, no Twitter, no YouTube.

By this point, Apple had made the big transition to OS X, and that meant that the die was cast for the era of the modern desktop operating system.

Then there was Vista. Much has been said about Vista, and most of it not very nice. Vista was a 2007 product that introduced the Aero interface, better home networking, and -- in a largely unheralded but huge change for developers -- a solid .NET implementation. Vista had some problems (not Windows 8-level problems, but it was very unpopular and often derided). Even so, back in the Vista days, the PC-based application was still king. Facebook existed, but public access had only just started. Keep that in mind as we continue this tour.

Windows 7 was the juggernaut released in October 2009. It overcame all of Vista's limitations and was truly a modern operating system. I'm writing this on a Windows 7 machine. Windows 7 was also the last major Windows release where Windows was the dominant operating system.

By this point, we were all starting to use Facebook for friends and family communication. We were moving to online email clients and away from Outlook. We were sharing videos via YouTube. We were beginning to move our fundamental presence off of the desktop -- but we still identified ourselves as desktop users. If you wanted to benefit from digital technology in 2009, you obviously still needed a computer.

Decoupling digital technology from the desktop

That would all change as the world moved to mobile and online-based applications.

This idea of the dominant OS is important, because it determines where developers put their time and attention. If you wanted to make money from software, you generally wrote for Windows. At least back then.

And here's where things change. A year before Windows 7 came out, the iPhone App Store winked into existence. When I released my first iPhone app two months after the store opened, there were about 25,000 other apps. Now, there are roughly 1.5 million iOS apps. The Android Market (now part of the Google Play store) also landed on people's phones in 2008, and there are now about 1.6 million apps available on Google Play.

We often discuss the rise of the smartphone and its associated apps, but one killer feature is often ignored. It wasn't until the app stores and one-click installs that regular ol' end users started to feel comfortable installing add-on software. Prior to that, you had to either insert a disk or download an installer. Those extra steps added friction to the process.

There was another side effect of the reduction in friction in app installation: it became possible to drop the price to a buck or less. See, back when you had to ship physical disks and fight for distribution, it cost a LOT to get apps into the market. But once you could just upload an app to a store and take your cut of each sale, the cost of goods sold was essentially zero. This meant consumers got used to getting apps for cheap or free.

This changed the business model. Today, there are few apps that move the needle by charging upfront. Instead, most apps make their money through either advertising or in-app purchases.

But what does this all have to do with charging for Windows upgrades?

Let's add one more factor to the mix: online apps. I've talked about Facebook's evolution as Windows went through its various mid-life changes. That's because end users were starting to become comfortable with using entire applications -- and in this context, Facebook is an application -- that lived online only.

We forget it now, but that was a fundamental cognitive shift. Web sites were, up until the era of Facebook and Gmail, simply Web sites. Online brochures. It wasn't until Facebook and Gmail really took hold that users started to think of online applications with the same level of seriousness as desktop-based applications.

And that brings us back to charging for OS upgrades. Or not charging, as the case mostly is today.

As we all know, we don't have to pay for iOS or Android updates. We get our updates either over-the-air, or by trading in our phones for new ones. The OS is merely a part of the phone or tablet, not a feature you pay money for.

Apple took this trend and applied it to the desktop in 2013 with OS X 10.9, otherwise known as Mavericks. From Mavericks on, Apple said, OS X upgrades would be free. This made sense for Apple because it helped them move their users to a more uniform environment, and gave developers a much more homogeneous market for which to code.

Not by accident, it also devalued Windows from a pay-for-upgrade perspective.

Windows 8 came out in 2012, and it had an aggressive (for the time) pricing program. You could buy Windows 8 upgrades for $39.99. Despite Microsoft's idiotic Start Menu decision, I quite liked Windows 8 (at least once I installed Start8). As someone who used to pay quite a bit for Windows upgrades (and who still, back then, ran mostly a Windows shop), I bought ten upgrade licenses.

But by 2013, Windows 8 was the only major desktop/consumer OS where you had to pay for upgrades. As a result, while the uptake for Mavericks was relatively rapid, the movement off of Windows 7 to Windows 8 and 8.1 was glacial. Windows 8 was almost universally hated -- and it was a paid-for upgrade. That double-whammy was a formula for failure.

To this day, I contend that it wasn't just users' adoption of mobile and cloud technology that helped marginalize desktop computing, it was Microsoft's incredibly poor handling of Windows 8 that shocked a complacent user base out of their desktop computing habit.

Worse, Windows 8 introduced an entirely new API environment for Windows developers -- WinRT, Windows' equivalent of mobile apps. These are now called Windows Store apps, but hey, how many of you have visited the Windows Store? Yeah, I didn't think so.

I teach Windows programming at UC Berkeley, and my students are always confused. Which major environment should they code for? Which Windows programming skillset will get them a job, and which will leave them stuck in time?
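
To make that fork concrete, here's a minimal sketch of the classic desktop side of the split: plain C calling the traditional Win32 API. The Win32 call shown is real; the WinRT class mentioned in the comments (Windows.UI.Popups.MessageDialog) is the Windows Store counterpart, and the framing of the contrast is my own simplification, not anything out of Microsoft's documentation.

    /* A minimal classic Win32 desktop program in plain C -- the
       traditional environment that predates Windows Store apps. */
    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        /* MessageBoxA is decades-old Win32. A Windows Store app
           can't call it; it would use the separate WinRT API
           (Windows.UI.Popups.MessageDialog) instead. That API split
           is exactly the fork in the road my students face. */
        MessageBoxA(NULL, "Hello from the classic Win32 API.",
                    "Win32 vs. Windows Store", MB_OK);
        return 0;
    }

Code like this runs across decades of Windows releases; a Windows Store app, by contrast, runs only on Windows 8 and later. That asymmetry is a big part of why the choice feels so fraught.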

In fact, this is the issue for most developers. While coding for Android is always a challenge because so many old phones are left running ancient releases of Android, most Android developers code just for the most modern release. Most iOS developers do the same. As do most Mac developers.

Another set of developers codes only for Web-based apps. Those developers don't care which OS users are running, just which browser.

The point of all this is that, with the exception of Windows, developers are quite clear on what they're coding for. With Windows, though, developers haven't really been sure what makes for a stable foundation.

Then came Windows 10

Windows 10 solves many of these problems from a technical perspective -- as well as from a market penetration perspective. Windows 10 introduced a free upgrade, very much like Android, iOS, and OS X. Had Microsoft not added the little weasel words of "only for a year," Windows 10 would be the absolute go-to environment for Windows developers to base their code on.

There were probably a few thousand committee meetings inside Microsoft, and the compromise finally agreed upon was the one-year limit. On one hand, some folks theorized that putting a time limit in place would encourage people to upgrade before the deadline; on the other, some most likely wanted to keep their options open for future licensing policies.

Clearly, from a developer perspective, moving users to the more modern OS is critically important. But there's another key factor in play as well: security. All of those Windows XP users still out there, the die-hards who won't upgrade, are at incredible risk. Those of us using Windows 7 gained some security advantages over XP, but each successive Windows release incorporates the security lessons Microsoft has worked hard to learn.

There's a hacking war out there, and given that Microsoft is still the market share leader in desktop operating systems, Windows is heavily targeted by hackers. But the more modern OS implementations are much more secure, so it's very much in both the users' interest and Microsoft's to get everyone moved to the more modern environments.

Is there a market for upgrades?

All of this brings us back to Ed's question: what happens to those free Windows 10 upgrades after July 29, 2016? His contention is that -- one way or another -- free upgrades will continue. That might mean free upgrades to Windows 10, or free upgrades to a dot-release successor. I agree with his assessment.

This, of course, differs when you're talking about enterprise licensing. Microsoft offers many different options, ranging from volume licensing to licensing for IoT devices to VSA and Software Assurance licensing. You can read Microsoft's volume licensing guidelines here. The thing to keep in mind about enterprise licensing is that it always includes an additional layer of client service and support that large-volume customers are often willing to pay for.

There is no longer a market for paid-for OS upgrades. In fact, not only is there no market, it's actively bad business for OS vendors to increase upgrade friction by charging for upgrades. It's far more advantageous for Microsoft to be competitive with the other OSes that offer free upgrades, to give developers a predictable, homogeneous platform, and to be able to roll out the latest security innovations as broadly as possible.

The revenue Microsoft will realize from having a stable, secure platform in as many users' hands as possible vastly exceeds any losses they might incur from not selling some incremental OS upgrades.

That's not to say Microsoft won't make a stupid or boneheaded decision. Lord knows, they've done that often enough. But strategically, it's certainly not in the company's best interest to charge for Windows 10 upgrades. And, fortunately, we've seen Microsoft tend towards smart decision-making of late.

Expect to see all OS upgrades remain free. That's now what it takes to ensure developer loyalty, along with stability and security.

By the way, I'm doing more updates on Twitter and Facebook than ever before. Be sure to follow me on Twitter at @DavidGewirtz and on Facebook at Facebook.com/DavidGewirtz.
