Lost priorities.

In a previous article, I wrote about how this year’s WWDC was probably the most uninteresting of all the WWDCs I’ve followed since 2002. In that article, I quoted a small part of an interview Jef Raskin gave to the British newspaper the Guardian: “One only cares about getting something done. Apple has forgotten this key concept. The beautiful packaging is ho-hum and insignificant in the long run.”
This is what this article is about: how the computing industry tends to forget the very concepts that make its core product, the computer, fascinating, exciting, thrilling, useful, a good investment, an amazing tool. That is, its usefulness; its fast-paced evolution, which promises the realisation (and subsequent commoditisation) of concepts, functions and facilities previously confined to the imagination; its ability to bridge, in the purest form, the worlds of mathematics, science and art in one single, ever-evolving yet familiar physical device.

But take a look at an early-1980s machine, thousands of times slower than those we use today, and it very quickly becomes clear that something is wrong. That things are not moving forward. That the very system we have developed and become familiar with has strangled the fundamental motives for using computers in the first place. The experience one gets from all modern operating systems, be they Windows, Mac OS X or Linux based, and from all but a very few hardware systems, is more or less the same — bad. Despite the considerable advantages one may have over the others in specific areas, they all convey the ignorance, misled creativity and lost priorities of their designers.
Taking one step back (or more, for that matter) should be enough for an experienced engineer, a computer enthusiast, or even a (hopefully bright) business manager involved with IT to see that the motives that drove Jobs’ passionate, naïve and harsh management of the Mac team, or Gates’ vision of putting a computer in every home on this planet, are still very much valid, albeit in slightly different form.
The computing industry today is stale. Software evolves at a very slow pace, despite the vast number of libraries, tools and facilities available to developers. Existing software is horribly unstable, unappealing and downright frustrating. Existing hardware is not leveraged adequately, and design decisions, in both hardware and software products, are completely inappropriate, failing to take into account the human user (although they often take into account the human buyer).
From the hardware point of view, consider the noise most computers generate. While in an office environment the noise might be considered insignificant, its effect on concentration, productivity and, ultimately, the well-being of the user cannot be ignored. And while you can get (or build) silent computers, the process typically involves considerable expense in time, money and performance. Computers should be silent.
System designs have settled on the now fifteen-year-old standards that emerged in the early 1990s. And while buses, physical specifications, storage media and so on may have changed, the fundamental design of a personal computer has remained the same. A look at how computers of the 80s were designed reveals a completely different approach: the custom chips of the Amiga; the innovative architecture of the Acorn Archimedes. There was competition. There was change. The design of the platform is crucial to the design of software. At a time when our technology should allow us to interact with our computers in new, exciting ways, we’re stuck with a 30-year-old paradigm. We are happy when a software house releases software with marginally improved æsthetics: gaussian-blurred shadows under windows, desktop compositing, elementary 3D effects. There’s something wrong there.
UI/task responsiveness and human-centric software design is another major issue with modern computer use. We may never have ‘enough’ computational speed in our machines, but there is no reason why computers have to respond slowly to human interaction. Good UI — and, by extension, system and application — design is fundamental to achieving good responsiveness. All popular operating systems in existence today suffer greatly in this respect. In OS X, the Finder (and other, lesser-known applications) is infuriating with its hellish single-threaded responsiveness (or rather, lack thereof). In Linux, the problems are more varied, as the APIs and libraries are many and fragmented and software comes in varying quality, shape and size. Windows, the most popular OS in the world, is hell in the form of software code, a monstrosity that in the name of compatibility boasts the most complex, redundant API, although Microsoft has gone to great lengths to improve on several UI aspects (they still have no class, or sense of æsthetics — Vista looks worse than what a pre-α version of Mac OS would have looked like in 1995). Computers should be responsive irrespective of their computational performance.
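The single-threaded failure mode described above can be made concrete with a minimal sketch (in Python, purely for illustration; the function names and timings are invented here, not drawn from any real Finder or toolkit code). A slow operation, stand-in for enumerating a large folder, is dispatched to worker threads so that the ‘UI loop’ keeps handling events instead of freezing until the work completes:

```python
import queue
import threading
import time

def slow_directory_scan(n):
    # Hypothetical stand-in for a slow operation, such as
    # enumerating a large directory over a slow disk or network.
    time.sleep(0.2)
    return list(range(n))

def responsive_ui(tasks):
    """Run each slow task on a worker thread, while the main
    ('UI') loop keeps spinning and handling events. A single-threaded
    design would instead block on each scan, freezing the interface."""
    results = queue.Queue()
    for n in tasks:
        threading.Thread(
            target=lambda n=n: results.put(slow_directory_scan(n)),
            daemon=True,
        ).start()

    events_handled = 0
    collected = []
    while len(collected) < len(tasks):
        # The UI loop continues to run (i.e. could process user
        # input here) while the scans complete in the background.
        events_handled += 1
        try:
            collected.append(results.get(timeout=0.01))
        except queue.Empty:
            pass
    return events_handled, collected

events, results = responsive_ui([3, 5])
```

The point is not the threading machinery itself but the design choice: long-running work must never share a thread with event handling, or every slow disk or network operation becomes a frozen, beachballing window.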
The new generation of processors and technologies for desktop computers is fascinating: multicore CPUs, physics chips on graphics cards (and motherboards?), greatly increased CPU performance. The power of the average PC of 2010 is frightening, considering its predecessors of 15 years earlier. The accompanying software is frightening too, in its bloat, sluggishness and instability.
The buzz around ‘Web 2.0’ applications is equally worrying: the industry has failed to provide a good rich-client platform and, guided by profit, turns to the ‘web’ in an effort to shift the sales model from product to service, ignoring the fact that the user experience, development process and overall quality of these ‘web applications’ are typically appalling.
Computers are more than just tools. They are more than a utility. They are intrinsically linked to many parts of the daily lives of billions of people. I think the priorities in both hardware and software design are completely wrong. True competition is nonexistent, innovation is marginal (at best), and the experience remains mediocre.
While the renaissance (and marginal success) of the Mac OS (in the form of OS X) and some desktop Linux efforts have helped nudge some people off the slow wagon of mental decay that the Windows hegemony has brought upon its users, the innovation, quality and features of software remain inadequate.
Microsoft and Apple, once the new, fresh, innovative players in a world dominated by mainframes, gray suits and IBM, are now themselves the gray industry leaders, each in its own way, maintaining their decades-old approaches and paradigms. For a while I thought that Linux was the answer, yet its anarchy convinced me that it isn’t — not by a long shot. The sorry state of the computing industry cannot change unless the whole process of designing and selling technology is fundamentally reworked to reflect the need for technical excellence and for genuinely serving human needs. And that will probably require fresh approaches from young, up-and-coming companies.