It has been almost six years since Apple announced and released the iPhone. I still remember Steve Jobs mentioning that his goal for the first year was to ship 10M iPhones; at the time, almost 1% of the global mobile telephony market. The goal seemed totally unrealistic to anyone involved in the industry: millions of units sold for a device that was, in many ways, severely lacking and overpriced at launch. The iPhone came out and, despite having significantly inferior technical specifications in some of the most crucial benchmarks, such as the quality of its camera, the lack of 3G, the extremely slow CPU and the lack of MMS support (a relatively obscure yet somewhat ubiquitous feature of ‘feature’ phones, especially in Europe), managed to exceed the 1% goal that Steve Jobs had set a year earlier. It soon became the reference state-of-the-art device, exemplifying everything Apple had to offer in its nascent post-iPod era, in which mass-market appeal was successfully coupled with premium design and manufacturing and extremely high margins.
At the same time, Google had already bought Android and was preparing the launch of the platform: an open-source, new-generation smartphone OS based on Linux and a slew of open-source libraries and APIs (including Java running on Google’s Dalvik VM), with a large ecosystem of vendors and supporters and Google at its centre. Google originally hoped to create a large ecosystem of OEMs, carriers and application developers all working for it and not against it. I had high hopes for Android in 2007, the same kind of high hopes that developers, engineers and ‘geeks’ worldwide had had for ‘desktop Linux’ around ten years earlier.
Contrary to desktop Linux — and similarly to Microsoft Windows — Android gradually prevailed in the early smartphone wars, now commanding around 80% of the market. But Android did not turn out to be what I (or Google, for totally different reasons) hoped it would; instead it evolved into a sprawling, chaotic platform, in some ways brilliant and in others completely backward, combining the best of new technology and geeky, specification-based computing metrics with the worst of the compromises that have accompanied the technology industry since its early days. Fundamental concepts of mobile computing were butchered: basic navigation, consistency, power management, task management, well-thought-out and stable APIs. This was coupled with mediocre devices, widely varying user experiences and a generally poor roster of applications, as different device manufacturers created their own “skins” — along with their own sets of poorly designed and implemented software to accompany them — in a desperate effort to differentiate their offerings from the stock version of the operating system, resulting in an ever-increasing pool of mediocrity.
The irony, of course, was that the stock operating system was practically nowhere to be found except on Google’s own Nexus series of devices, a showcase of Google’s vision that permeated the developer community and diffused into the wider smartphone-toting populace. Devices cost just a small fraction less than Apple’s ‘closed’ iPhone, but demonstrated horrific deficiencies in performance and quality; the software stack was not optimized, and power efficiency was poor even with batteries much larger than those found in iOS devices. The hardware was also lacking in some cases, such as the responsiveness of the touchscreen, often blamed purely on the sub-par performance of Android but apparently also caused by inferior components. Yet Android was improving.
Within a couple of years, the number of Android devices sold surpassed that of iPhones. Amid the global financial crisis, the iPhone failed to become a commodity device (at least outside the large metropolises of the West, where salaries did not reach, let alone exceed, tens of thousands of $ or €) the way the iPod had succeeded in doing a few years earlier. It was still the leading device, from both the design and the technology perspective, but it was rapidly losing ground in terms of sales as people chose cheaper Android devices. Apple was unfazed: its margins were still high, and it still had the mind share. Above all, it still produced the definitive smartphone, the reference device that everybody else copied in one way or another.
The iPad was initially greeted with hesitation by many — it was, after all, the Netbook era — but was quickly acknowledged as a runaway success once its sales drove the whole industry to go nuts with tablets. More junk devices flooded the market. People were hooked on Apple’s ‘innovation’, and the whole industry was bent on copying it. Even as its competitors hated it, Apple increasingly defined progress in the computing industry. In 2009, Intel gave Apple exclusive access to the then-new Nehalem processors several months ahead of the rest of the industry, something it has repeated several times since with its latest and greatest chips.
People were hooked on Apple’s products, even if it was pretty clear that such innovations don’t happen overnight, that it takes years to actually conceive, design and produce them.
It took a few more years, and a couple more versions of the Android platform, for Google to produce something worthy of comparison to iOS. Apple was in many ways resting on its laurels, and iOS started looking increasingly stale, albeit still much more refined, consistent and well-designed than its Android counterparts. By 2012, Samsung had effectively copied many (but not all) of the innovations found in iOS, introduced some of its own and cluttered the stock Android experience with more useless features than you could ever imagine, let alone use. The flexibility of the Android platform, its rich ecosystem, its flaky APIs and its mediocre experiences closely recalled the conditions of another war Apple had fought a quarter of a century earlier: the PC wars with Microsoft.
Lessons from the past
Apple had lost the PC war not because its software was inferior (for a long time it was years ahead of anything from Redmond — but that is also true of a number of other early personal computing pioneers that did not survive into the 1990s), and most certainly not because its computers were inferior. It lost the PC war because it had lost something much more important than the technology it produced and sold: mind share.
In the early 1990s, Apple was perceived as an arrogant company that milked old technology and produced overpriced, average-performing machines for elitists. Despite the continuous improvements in its line-up, there was some truth to that: its machines were generally too expensive, its operating system too old and flaky, and its compatibility with a world dominated by MS-DOS and, later, Windows machines minimal.
Mind share was the first thing Apple lost in the late 1980s, and the first thing it regained in 1998 with the introduction of the iMac, after Steve Jobs returned to the company. Well before it lost (or recovered) its revenues, mind share was the primary determinant of its fate.
The combination of three things spells the end of Apple’s dominance in this industry: an expensive, closed-ecosystem culture that demands extreme polish and excitement to offset the hit it deals to the company’s appeal with the public; the death of Steve Jobs, a fundamental part of Apple’s product design and communication; and the inevitable maturing of the smartphone business, a key driver of Apple’s growth these past few years, which drains enthusiasm from what is rapidly becoming a commodity. Perhaps it also foretells the beginning of a new era of mediocrity, in which Android reaches a quasi-monopolistic position, dictating terms and pricing and pushing rubbish software and mediocre devices onto the public — similarly to the way Microsoft pushed ugly beige boxes and crappy Windows versions onto people’s desks twenty years earlier.
And it all starts with mind share, which Apple seems to be losing of late.
Mind your mind share!
With iOS 7, Apple is, for the first time in six years, drastically redesigning its mobile operating system, the trademark software that largely defines its products. The redesign is largely uninteresting, in some ways disappointing, and it surely breaks with the Steve Jobs tradition of progressive refinement. Æsthetically, it ‘borrows’ quite a lot from Redmond’s ‘Metro’ interface (now called ‘Modern’, after a trademark dispute over the name ‘Metro’ forced the change about a year ago). It is flat and monochromatic, it has subtle (and annoyingly slow!) animations that guide the user, and it does away with all the skeuomorphism that Steve Jobs and his protégé, Scott Forstall, adorned their user interfaces with. It is cleaner, more spacious and more consistent.
While it is refreshing — and in some cases vastly improved — it is also rushed, buggy and, in some areas, badly thought out. There are deep usability and functional regressions that would not have been there some years ago. Apple, under Steve Jobs, messed up quite a bit, but one thing that stood out, at least to me, was its ability to create a very polished product that might have been lacking in features but nailed the fundamentals, doing the few things it was intended to do extremely well before new and more advanced features were added. And it was that attention to detail that enthused and inspired. This was true of early (Mac) OS X and of iPhone OS 1.x: incomplete pieces of software, certainly flawed in many ways, that made many knowledgeable people happy to be using a computer again after almost a decade of force-fed Microsoft Windows mediocrity. Despite its flaws, iOS 6 remained the most coherent, most well-thought-out (from both a development and a usability perspective) smartphone operating system in the world. Where Android boasted hundreds of devices of all kinds, iOS promised consistency, stability, efficiency and the just-works factor that you will seldom find in Android devices (or apps), even today.
With iOS 7, as with the iPhone 5c and 5s, the problem is not a lack of polish but the lack of enthusiasm they instill in their prospective users. In some ways these are the most complete examples of modern smartphones: powerful, refined, expertly designed. They are much better than the original iPhone or the iPhone 4, devices that thrilled and stood clearly apart from their competition at the time. But they are also much ‘closer’ to their contemporary competition than their predecessors were. The two new phones pack tremendous power and introduce interesting features that may excite those who understand their value, but they cannot create widespread joy among a general population that wouldn’t (or couldn’t) care less about motion co-processors, fingerprint scanners and the like.
Apple seems to be losing its mind share: its edge in creating innovative products that enthuse; products that justify its ridiculously high prices (and similarly high margins) and its vendor-lock-in culture; products that make people mindlessly spend $650 on a phone that offers only a marginally better experience than their previous one. In what increasingly looks like a repeat of the desktop computing wars, Apple is made out to be an arrogant, elitist company creating overpriced, uninteresting devices.
In many ways that might be dismissed as a sweeping generalization, and it is indeed only partly true, but it doesn’t really matter. iOS 7 is still the best mobile operating system in the world. The new iPhones are certainly great devices, and the value of innovations like the fingerprint sensor, the Secure Enclave in the A7 chip and the M7 co-processor will almost certainly become clear to many more people in a few months. Yet Apple is probably at the beginning of a new era, one in which it is no longer universally loved or admired, but scorned; when the mind share is gone, mistakes are not forgiven, new products and services are not eagerly awaited and do not enthuse, and the Reality Distortion Field is all but gone.
And that’s when the decline starts.