Since the late 1970s, the concept of a personal computer has rested on a few precepts: because it was personal, the computer you used could be opened up and modified, as could the software you installed on it. But recent trends point to a future where such capabilities no longer exist. Starting from the maker movement’s maxim that “if you can’t open it, you don’t own it,” we can extrapolate that the future of computing will not be kind to tinkerers.
For almost four decades, there has been an almost anarchistic streak to the computer world. While consumer electronics were generally closed systems to be disposed of when they failed, the computer world was all about upgradeability and extensibility. Steve Jobs, upon unveiling the Power Mac G3 in 1999, fully understood what that meant:
We don’t think design is just how it looks. We think design is how it works. And we labored a lot on this because our pro customers want accessibility. There’s a lot of great technology inside, but they want access to that technology. To add memory, to add cards, to add drives. And so we think we’ve got the most incredible access story in the business. It’s called a door.
In the late 1990s, that was a truly revolutionary revelation: a computer that didn’t even need a screwdriver to be opened. It forced PC makers to follow suit by making it easier to open up computers running Windows. While none went as far as Apple did in those days, all manufacturers eventually moved to easy-to-remove screws. Those were the best days for accessibility in the computer world, but dark clouds were gathering.
By 2001, Apple was at the beginning of a major transition from being a computer manufacturer to being a consumer electronics one. The launch of the iPod heralded that era. The iPod was a perfect consumer electronics product: fully closed, from its proprietary software stack all the way to its screw-less enclosure. It was a device that screamed “don’t even think about opening me.” A few hackers screamed back, but their screams were drowned out by the crowd’s applause for the new device.
What Apple had tapped into was the fact that openness came at a cost, and that cost was stability and ease of use. In a world where interoperability was key, support for every new type of device imposed a tax on existing systems that made them less stable. So while Microsoft could claim that Windows-based machines would work with whatever a third-party OEM threw at them, it was at the risk of seeing some of those third-party drivers clash on the system, leading to blue screens of death and incomprehensible error messages.
By comparison, devices in the Apple ecosystem had to pass through Apple’s approval process, and since Apple manufactured the hardware and often distributed the software drivers, such instability did not appear in the ecosystem. The cost of that stability was a more limited set of hardware devices one could attach to Apple machines, and with those limits came higher prices, as the few approved vendors did not have to compete with a crowd of cheaper rivals.
Customers voted with their wallets and increasingly voted Apple, siding with the idea of a more Disneyfied experience in exchange for greater stability. And Apple got the message. In 2007, it made clear that it was no longer a computer company by dropping the word “Computer” from its name. The transition to Apple the consumer electronics company was now complete.
The last five years have shown the resurgence of Apple in many ways: the iPhone and the iPad redefined the idea of computing and forced every other electronics manufacturer to play catch-up. A highly integrated hardware and software experience, built around an advantage in supply-chain management, became the story of this new era. And along the way, the perception of what a computing device is changed.
Apple’s influence with the iPhone moved the telecommunications industry to embrace smartphones and effect a substantial transition from what are now called feature phones (and were once just called mobile phones) to Apple or Apple-like smartphones. Phones have traditionally not been an area with much opportunity to tinker, and hardware makers generally have not offered much in the way of extensibility there, but Apple showed again that even a few concessions toward openness were unnecessary. Over the last few years, we’ve seen a decrease in the number of phones that accept micro-SD cards or offer any way other than plugging the device in to get data on and off it. Synchronization with “the cloud” made side-loading (i.e. installing software in some way other than through an approved app store) an increasingly difficult task. And even removing and changing a battery seems to be going the way of the dodo.
Apple’s next post-PC device, the iPad, continued that trend of closing up the system, and customers continued to love it. With only a few exceptions, there were no real calls for being able to change the battery or get information on and off the device. Tablet manufacturers using other operating systems took those lessons and incorporated them into their own designs, making tablets that were as closed as the iPad.
Then Apple brought that approach to the computer business with the MacBook Air, an amazingly light computer with a completely closed hardware architecture (in the interest of full disclosure, I should say that I’m writing this on one of those wonderful machines, and while I have concerns about its philosophical impact, I’d be hard pressed to give it up). The device was light, relatively inexpensive, and completely un-upgradable. With all its components glued or soldered to the motherboard, this was no longer a machine that could be tinkered with or upgraded. The more recent MacBook Pro with Retina display continued down the same path with glued-on and soldered-on components.
Meanwhile, the PC world was still pushing Windows machines that allowed users to change memory and batteries, and even upgrade processors. Sure, this was the kind of thing only geeks and hackers would do, but that upgradeability continued a trend that had existed since the early days of the personal computer.
And then Microsoft gave up.
At the unveiling of the Microsoft Surface, the company’s vision for what the future of computing ought to hold, one thing became clear: this was not the kind of machine meant to be upgraded but the kind one uses for a few years and then discards for a new model.
The introduction of the Surface product line made it clear that the industry as a whole had bought into the idea of closed systems and that every potential competitor to Apple had given up on any other worldview.
With that, the hardware tinkerer era was dead. Sure, there will still be a few holdouts here and there, but the general direction of the computing world is increasingly toward more closed systems. Gone are the days when a father could open up a computer and show his kids how the inside works (the digital equivalent of previous generations of fathers and sons peering into a car engine). Gone are the days when careers in hardware got started because a kid spent years tinkering with the inside of a computer. Gone is the excitement for hardware engineering careers that was sparked by soldering irons and motherboards.
Instead, we will see increasingly docile developers who buy into the requirements set by the large vendors to get their software through this or that app store.
Now, before anyone blames Apple for this, think twice. If it were an Apple-only issue, there would be an alternative. But there isn’t one anymore, because the whole industry has bought into the idea of a closed hardware/software system with a controlled experience: look at your Apple device; look at your Android device; look at your Microsoft device. Do you see something that can be opened up, that can be changed? Can you still personalize its hardware?
Beyond basic hardware tinkering lies a potentially bigger problem: the impact such approaches can have on recycling. Today, standards exist to ensure that electronics can be disassembled and their components separated so they can be recycled.
However, recent reports have shown that components on the new MacBook Pro with Retina display cannot be separated because they are glued together in a way that makes the machine impossible to take apart. Apple recently announced that it was pulling many of its products from the EPEAT registry, the standard registry for environmentally conscious products, because its “design direction was no longer consistent with the requirements.”
This creates an interesting conundrum for the rest of the industry: should it follow Apple’s cue and thus dismantle one of the most advanced green-certification programs in the world, or should it hold the line and risk losing more customers to the Cupertino giant? It’s a tough call, but I wouldn’t put money on the industry staying green unless it can find a way to use that stance to gain market share.
© Tristan Louis 1994-present Some rights reserved.