I’ve been a fan of Lua since the early 2000s, when a friend introduced me to it, even though I never found the time to properly learn it and use it in production. For ages we had discussed using Lua as a scripting language to allow for downloadable bundles that would extend AthensBook/ThessBook functionality, fix bugs, or provide dynamically determined personalised features, but that never happened, due to licensing restrictions by Apple. Codify is an unbelievably cool app that leverages Lua to provide a simple programming environment for the iPad. Combined with the general appeal of the device, the lack of third-party scripting environments for it, the ease of programming in and using Codify, and the excitement of working with such great hardware, I feel that Codify might be the Logo/BASIC equivalent for this generation of children between the ages of 5 and 10: a great introductory platform for programming and an amazing tool for everyone else. And at $7.99 I think it’s a steal. You might want to use a bluetooth keyboard with it though; typing code on the on-screen keyboard seems like a horrible, horrible nuisance.
A lot has been said and written about Ubuntu Unity, the new ‘shell’ that’s replaced the ‘classic’ default GNOME desktop in Ubuntu 11.04. It is despised by many: those who saw Canonical’s break from the ‘open-source’ norm of keeping modifications to upstream platforms to a bare minimum as a threat to the upstream projects’ existence (a valid point, to an extent), and those who found it half-baked, offering little more (if anything) than the classic desktop and a couple of additional programs (e.g. a dock and a launcher) while being much slower and kludgier (a totally valid point, but it’s a 1.0). Despite all that, Unity is here to stay.
It is true that, despite Shuttleworth’s ramblings on his blog, most of Unity is hardly innovative. Most of the useful things in there can already be found in most modern desktop environments (including some linux desktops), while Unity’s implementation of those very features is hardly the best. But there are also some unique offerings, such as lenses and the proposed (but, thankfully, not yet included) windicators. The question is: are those features really useful? Are they well thought out?
I think not. Take desktop search, for example: a hot subject in mid-2000s desktops that has largely been solved in an exemplary way in OS X by Apple’s Spotlight and a number of third-party tools on that platform (LaunchBar and then Quicksilver are prime examples of early game changers), and even in Windows 7 to some extent, through the built-in search field in the start menu. Then, with five years of hindsight, Canonical decided to make things harder for users by exposing the search context in the form of completely separate ‘lenses’, as opposed to keeping the distinction internal (the way OS X does) and presenting filtering options in an unobtrusive way. Put simply: I’d much rather have a single search field, à la Mac OS X’s Spotlight, that searches for my input text across ‘data domains’ and contexts and returns useful, filterable lists of results, than the frustratingly badly designed ‘lens’ concept that forces a clear separation of searches while taking up screen real estate and wasting the user’s time with additional clicks and keystrokes.
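To make the distinction concrete, here’s a minimal sketch (in Scala, with invented names; it has nothing to do with Unity’s or Spotlight’s actual code) of the ‘single search field’ model: one query fans out to internal data domains, and the results come back as a single, filterable list.

```scala
// Hypothetical sketch: the search contexts ('data domains') stay internal.
trait SearchDomain {
  def name: String
  def search(query: String): List[String]
}

object Applications extends SearchDomain {
  val name = "Applications"
  private val items = List("Firefox", "Rhythmbox", "Terminal")
  def search(q: String) = items.filter(_.toLowerCase contains q.toLowerCase)
}

object Files extends SearchDomain {
  val name = "Files"
  private val items = List("notes.txt", "firefox-bookmarks.html")
  def search(q: String) = items.filter(_.toLowerCase contains q.toLowerCase)
}

object UnifiedSearch extends App {
  val domains = List(Applications, Files)

  // One query, every domain searched; the user never picks a 'lens' up front.
  // An optional filter narrows the merged results *after* they appear.
  def search(query: String, only: Option[String] = None): List[(String, String)] =
    domains
      .filter(d => only.forall(_ == d.name))
      .flatMap(d => d.search(query).map(hit => (d.name, hit)))

  println(search("firefox"))                       // hits from all domains at once
  println(search("firefox", Some("Applications"))) // narrowed, not re-searched
}
```

The point is the shape of the interaction: the domain distinction exists in the code, not in the user’s workflow.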
Which raises the question: why on earth did the fine people at Canonical make such a bad design decision, when the stated mission of Unity was to streamline the desktop while taking up less space, and when there are numerous implementations of search/launch applications (even on linux) that work significantly better than Unity’s? Were they afraid of being labelled copycats? Is that worse than being called bad designers?
The same can be said about the new ‘global menu’ and AppIndicators that replace the GNOME panel in Unity. Having only a few replacements for the staple GNOME Panel widgets of yesteryear is fine, given it’s a 1.0. But botching the whole concept of a global menu, through inconsistencies when windows are maximised and in multi-display scenarios, betrays a badly designed (i.e. not just incompletely implemented) system that shouldn’t have shipped in the first place.
Unity has divided the GNOME community by introducing a new shell on the world’s most popular linux distribution. It’s true that the linux desktop has been moving frustratingly slowly for a number of years, and that a quasi-open project funded by a commercial entity with a focus on usability and æsthetics — exactly what Unity is on paper — could help accelerate its development and reach parity with the two main desktops in some of the more difficult areas where linux has been falling behind. Still, Unity is largely incomplete, it’s missing many of the configuration options and functionality that linux users are used to — nay, demand — and, sadly, what’s there betrays a rushed, badly designed feature set that should never have made it past alpha inside Canonical, let alone into the world’s most popular distribution.
Guardian.co.uk is switching from Java to Scala. I’m surprised that it took so long and that other Java shops are not following en masse — though that could be because of how different and esoteric Scala can seem, especially to Java programmers. The linked InfoQ article contains an interesting discussion with the Guardian folks.
Programming enterprise web applications (or anything, for that matter) in Java is painful for anyone mature enough to have experienced the wealth and breadth of tools out there, given how primitive, verbose and unproductive the language is, and how much it caters to the lowest common denominator of programmers. That’s not to say that Scala is the best choice for everyone, let alone for those not starting from scratch, but given the Guardian’s existing infrastructure and systems, I’d guess it’s the best choice they could have made.
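To illustrate the verbosity gap, here’s a toy sketch of my own (hypothetical, nothing to do with the Guardian’s codebase): filtering and transforming a collection, which in pre-Java-8 Java takes an explicit loop, a temporary list and a fair amount of ceremony, is a two-line expression in Scala.

```scala
// A toy illustration, not Guardian code: pick out the titles of long
// technology articles from a list.
case class Article(title: String, section: String, wordCount: Int)

object VerbosityGap extends App {
  val articles = List(
    Article("Scala at the Guardian", "Technology", 1200),
    Article("Budget report", "Politics", 800),
    Article("New browser released", "Technology", 300)
  )

  // The Java equivalent of the era: declare an ArrayList, loop over
  // articles, test each one with an if, add the matches, return the list.
  val longTechTitles = articles
    .filter(a => a.section == "Technology" && a.wordCount > 1000)
    .map(_.title)

  println(longTechTitles) // List(Scala at the Guardian)
}
```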
Ten years ago, on March 24th, 2001, Mac OS X came out: the first publicly available, unpolished 1.0 version of Apple’s ‘next’ (pun intended) operating system. An operating system that Apple had been trying, in one way or another, to create for more than ten years. Remember Pink? Taligent? Copland? Gershwin? Mythical codenames, to those who heard of them in the 1990s, of projects that promised amazing experiences compared to classic Mac OS but were never finished or released as planned, or of spun-off products that died after a few short years. Mac OS X, which finally became a cornerstone of Apple’s platform well beyond the Mac and a catalyst of its success in the 2000s, was a reincarnation of NeXTSTEP in Apple’s colours: NeXTSTEP’s core and frameworks fused with the Mac OS of old into one product that didn’t exactly know itself. A new skin, the same — amazingly advanced for their time — underpinnings.
In this short article I will summarise some experiences with Mac OS X over the past ten years, from the point of view of a software engineer rather than a user: the initial chaos of integration, Apple’s flirtation and dilemma with Java, the modernisation of Objective-C, the eventual coherence of the APIs, and the extension of the system to support touch in a way that had never been achieved before.
NeXTSTEP’s frameworks used Objective-C, a language unknown to 99% of programmers out there in 2001; I had only heard of it while fiddling with GNUstep a few years earlier. In the early 2000s you could still find C++/CORBA programmers in major service companies (as opposed to large software houses or systems development divisions), and Java was only starting, slowly but steadily, to become the preferred platform for enterprise software. I remember meeting amazing and suitably eccentric software engineers — not merely the subpar ‘developers’ that are increasingly common nowadays in service/enterprise environments — who proudly proclaimed ‘Java is for girls!’ and other elitist, sexist jokes like that. Anything less than Alexandrescu- and Sutter-class C++ was unacceptable to them. Knowledge of x86 assembly was standard among their friends. How could a person like that appreciate Objective-C? I felt comfortable hanging out with those people, because I had gone through the rings of fire of learning, liking and using assembly, ‘high-level’ [insert CPU here] programming (irony!) and C/C++, but I also enjoyed the elegance and simplicity of Objective-C and Cocoa for rapid application development.
This draft CSS3 spec defines preliminary support for gradients. Where are diamond and angle gradients, though? They may not be used as much as the others, but I find it strange that they were left out of a newly spec’d standard, given that they are not that hard to implement.
Android 2.3 was announced a few days ago. The previous day, CyanogenMod 6.1, the most popular community mod, was released, based on Froyo (2.2). And today, just a short two weeks after the announcement, the source code for the latest version of Android is being released!
The release marks the end of the 2.x era, with Google most definitely working hard on the 3.x series, aimed for release in the first quarter of 2011 and — hopefully — taking the fight with iOS up a notch. Just an hour ago cyanogen posted this on twitter:
If you need me, I’ll be locked in my room for the next 3 days. #gingerbread
I feel that, right now, this is precisely what makes Android sell; by extension, the popularity and characteristics of such projects offer many clues about the demographics of those buying Android devices.
In other words, the ‘magic’ of the platform is its rapid evolution and, by extension, its community (a community that is largely technology-oriented); that is something not to be found in HTC’s or Samsung’s wanna-be iPhone devices (or their mediocre software), Sony Ericsson’s lifestyle apps, or Motorola’s ‘macho’ Droid phone and its seriously bad Motoblur. These are commercial parts of a nascent platform that — until now — have enthused few outside the technology community.
Projects like CyanogenMod are exciting because they evolve extremely fast while letting your imagination run wild with features that half-baked commercial Android ‘flavours’ could never have — a combination that even the ‘controlled’, sterile in a way, yet amazingly polished environments like iOS lack.
And this is, sadly, something that most major Android device manufacturers don’t get, judging by the effort they put into locking their products down, the amount of crapware they bundle with them and the restrictions they place on their customers.
By the way, if you’re using a supported device, such as the HTC Desire, I recommend you get rid of Sense right now, install CyanogenMod (or another mod if you so prefer), and turn the damn thing into a usable gadget. You won’t regret it*.
*If you do, I won’t be held responsible for any damage you may cause to your device.
This is what Andy Rubin stated in his ‘D: Dive into Mobile’ interview yesterday, and it’s probably the best description of Android I’ve read. Like desktop linux was (and arguably still is, in some respects), like Mac OS X was in its first three years, and like Windows was for a very long period until — arguably — Windows 95 came out in August 1995. It’s hard for ‘normal’ people to get excited about Android, because there’s little in it that appeals to them. Even from a development standpoint it’s clearly a work in progress, with volatile APIs, significant bugs and vastly inferior performance (incl. power management) compared to iOS. As I’ve written before, Android development is moving fast, and I reckon it’ll take a couple of years at most for it to reach maturity.