I don't pretend that this is exhaustive or even accurate (for example I demoted Cloud Computing to ‘honourable mention’ purely because writing the entry made me fall asleep), but it's the time of the year for end-of-year lists, and I wouldn't want to disappoint.
- Distributed Version Control
A few years ago I tried darcs and wondered why all version control systems didn't work like this. The only things that stopped me using it full time were reports of poor performance and the absolute lack of tool support. Today, both of those problems are history. darcs has been joined by git, Bazaar and Mercurial, and while no system has such an obvious technical edge over its competitors that it can be declared a deserving winner, it looks like git has the upper hand in mindshare, thanks largely to GitHub.
Meanwhile, at work, our system administrator lost weeks of productivity and large chunks of his hair trying to solve our cross-Pacific Subversion performance problems.
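The contrast with our Subversion woes comes down to one property of distributed version control: every common operation works against a complete local copy of the history, so there's no cross-Pacific round trip. A minimal sketch, assuming a Unix shell with git installed (the repository path and identity here are invented for the demo):

```shell
# Create a throwaway repository; in a distributed system this local
# clone contains the full history, not just a working copy.
repo="$(mktemp -d)"
cd "$repo"
git init --quiet
git config user.email "demo@example.com"   # local identity, demo only
git config user.name "Demo"

echo "hello" > notes.txt
git add notes.txt
git commit --quiet -m "First commit"   # purely local: no server involved
git log --oneline                      # reads local history, no network
```

Every command above completes without touching the network; only explicit synchronisation (`git push`, `git pull`) goes over the wire, which is exactly what a high-latency link rewards.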
- Apple
Apple just keep putting out good stuff. This time it was the creation of a mobile computing platform that, despite its flaws, has accomplished more in a year in terms of adoption and mindshare than players who have been in the game for most of this decade.
- Twitter
At the beginning of the year, I'd already written off Twitter as a pointless waste of time. So colour me surprised that by December, it's probably my most used web application (well, after Confluence). Twitter is a sea of little snippets of information, much of it banal, but once you've worked out how to swim it's both a fun toy and an incredibly useful exercise in many-to-many communication. Now if only they could come up with a business model.
Honourable mention: Wikis, OSGi, Cloud Computing.
- Ruby
Ruby hasn't taken over the world as promised, and its status as ‘it’ technology is being threatened from one side by an emboldened Python, and from another by JVM technologies such as Grails that offer the same kind of productivity benefits on top of a technology stack that CTOs might be a little more comfortable with. Still, a lot of the things that were wrong with Ruby—performance, wacky Unicode support—are being fixed, and nobody (or at least nobody with a clue) actually believed Ruby was responsible for Twitter's instability.
- Facebook
On the plus side, Facebook seems to be halting the previously inevitable tidal effect of social networking services, where the next generation of ’net users all move somewhere else. A small subset of Facebook applications (mostly those that pull in content from other sites) are actually both useful and sticky, and we're yet to see what OpenSocial will do to enhance the experience. On the other hand, Facebook applications are still annoyingly needy, and they really, really need to sort out their woeful contextual ads.
- Java
The death of Java is kind of like desktop Linux going mainstream. Every year a bunch of people come up with well-thought-out, well-argued opinion pieces as to how this is the year it’s going to happen, and every year they turn out to be wrong.
Honourable mention: Google, OpenID.
- Scala
I really wanted to like Scala. I bought the book. I did the tutorials. Ultimately I ended up in the very uncomfortable position of being forced to agree with Steve Yegge: if Scala is the level of complexity you need in order to provide a strongly typed, object-oriented language, maybe that's a bad sign for type safety. Or maybe the Frankenstein approach needs refinement.
- Erlang
The lessons to be learned from Erlang with regard to concurrency are important ones. For one thing, it teaches us that no VM-based language can truly have thread safety until the VM can guarantee that one thread dying from resource starvation won't bring down the world around it. Unfortunately, Erlang-the-programming-language still looks frighteningly like it was designed in Sweden in the 1980s.
Still, right now, in the absence of anything better, it's possibly the best choice for large parallel systems.
- Microsoft
A comment thread on a blog post I can no longer find a link to saw a rosy future for Microsoft because they spend nine times as much on research and development as Apple. There's the problem. Microsoft pour R&D money into multi-touch interfaces and come up with a table that is relegated to tech demos and gimmicky election coverage. Apple put R&D money into multi-touch and produce the frickin’ iPhone.
Of course, Windows 7 will fix everything. We've never heard that before.
Dishonourable mention: Perl, Web 2.0 pundits, Second Life.