March 2007

Lando Calrissian: Lord Vader, what about Leia and the Wookiee?

Darth Vader: They must never again leave this city.

Lando Calrissian: That was never a condition of our agreement, nor was giving Han to this bounty hunter!

Darth Vader: Perhaps you think you're being treated unfairly?

Lando Calrissian: No.

Darth Vader: Good, it would be unfortunate if I have to leave a garrison here.

The Empire Strikes Back, via WikiQuote.

Apple TV Review

  • 9:30 AM

No huge surprise, I have an Apple TV. Seeing as my existing computer-to-TV setup was a messy combination of an Airport Express and a five metre DVI-HDMI cable, the Apple TV was a chance to get rid of the annoyances of this makeshift setup, such as Front Row's insistence on only running on the primary monitor, and having to point my remote control at right angles to my television when I wanted to skip songs.

Setup

Setup is pretty easy. You plug it in and run through the on-screen prompts to connect it to your wireless network. Then you go to your computer, bring up iTunes, and enter the magic code being displayed by the Apple TV to pair the two.

The advantage of on-screen setup is that it's more direct and obvious for the 90% of users who haven't read the manual. The disadvantage is that entering your 128-bit hexadecimal WEP key using an Apple remote and an on-screen keypad is painfully annoying. My iMac already knows how to connect to my preferred wireless network. Why not have an Airport-style configurator that passes this information on to the Apple TV?

Once connected, iTunes will start synching its library with the Apple TV. Given a reasonably sized library and an 802.11g network, this will take a couple of hours.

Streaming

Luckily for the impatient, I didn't have to wait that long to enjoy my new toy. The Apple TV is just as happy streaming direct from iTunes as it is playing content from its own hard drive. As soon as you've got the unit paired with an iTunes library, you can browse the library using the familiar, Front Row-style interface, and play stuff.

A lot of people are worried that the 40GB hard drive built into the Apple TV is too small, or are rushing to replace the internal drive with a larger unit. These people forget that the Apple TV's internal hard drive is only used for caching and offline use (and if you're nerd enough to swap out your Apple TV's hard drive, you're probably nerd enough not to turn off your computer). Current-generation wireless networks are faster than you think.

Google Calculator: 8.5 GB / 20Mbps = 58 minutes

A quick back-of-the-envelope calculation suggests that if the Apple TV were capable of such a thing, you could stream an entire dual-layer standard definition DVD over a reasonably performing 802.11g (not n!) home network, without any additional compression, in significantly less than the time it would take to watch the movie.
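
For the curious, that back-of-the-envelope figure is easy to reproduce. A minimal sketch in C, assuming a dual-layer DVD of roughly 8.5 GB and a sustained 802.11g throughput of 20 Mbit/s (the exact answer shifts by a few minutes depending on whether you count a gigabyte as 10^9 or 2^30 bytes):

    #include <stdio.h>

    int main(void) {
        const double dvd_bytes = 8.5e9;  /* dual-layer DVD, 1 GB taken as 10^9 bytes */
        const double link_bps  = 20e6;   /* assumed usable 802.11g throughput, bits/sec */
        double seconds = (dvd_bytes * 8.0) / link_bps;
        printf("%.0f minutes to stream the whole disc\n", seconds / 60.0);
        /* prints roughly 57 minutes -- comfortably less than the movie's running time */
        return 0;
    }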

Since you can't just stream a DVD, any video you'll be playing on the Apple TV will already have been re-encoded, and will likely be a lot smaller than 8GB. So unless you've got a lot of high-definition video lying around already, whether it's sitting on your Mac or on the Apple TV's hard drive shouldn't be a problem.

Using

The Apple TV's limitations are pretty well documented. It's not a DVD player. It's not a PVR. It wants your video to be in H.264-encoded QuickTime. Since that's the format I tend to convert all my videos to anyway, that doesn't bother me. Of course, any 320×240 video that was designed to play on the iPod is going to look like pixellated arse on a 100 cm LCD panel. I assume this would also be true for most video podcasts.

A few annoying gripes I came across:

  • As far as I can tell, there's no way to turn it off without pulling the plug. I left the Apple TV running overnight, and when I woke up in the morning it was still happily converting electricity into heat in order to display its screensaver. I'm not going to be using it that often, and the lack of even an obvious way to put it into standby mode is a significant annoyance. Hey Al Gore, Apple TV is killing the planet!
  • Every minute or so when playing a song, the display will flip horizontally, swapping the position of the album art and title text. This large, sudden movement is like your television shouting "Hey, look at me!" at a time it shouldn't be drawing any attention to itself.
  • You can browse popular iTunes Store content, but you only get 30 second previews, and there's no "buy" button if you happen to like it. Try it, Apple. Make the top twenty songs fully streamable. I bet you'll sell more of them that way.
  • The remote control is an inaccurate, sluggish interface for scrolling through long lists of songs and artists.

Those gripes aside, the Apple TV works exactly as advertised. It's a tiny grey box that finds your iTunes library over a wireless network and plays it on your TV. Which is almost exactly what I wanted.

Almost.

Digital Millennium

The iTunes music store piggybacked on the popularity of the mp3. People who had already ripped their CD collections got used to thinking of music as purely digital, to be consumed through the computer and its peripherals. After a while, the idea of juggling pieces of plastic to play music that was otherwise a few mouse-clicks away started to feel a bit silly.

What I really want from the Apple TV, what would make it a killer device, is to be able to do the same thing with my DVD collection. I want to buy a nice big hard drive, rip all my DVDs to it, and have every movie I own be a few clicks of my Apple remote away. I'd watch more movies that way. I'd buy more movies that way. And when the Apple Store started selling movie downloads in Australia, I'd be primed as a ready customer.

Thanks to asinine anti-circumvention, anti-format-shifting copyright laws, though, all the software I could use to this end is dodgy grey-market nerdware, requiring a lot of manual labour to get working properly. I have to copy my DVD to my hard drive (ripping straight from the DVD makes the drive thrash painfully), then I have to mess around with a bunch of manual encoding settings I am entirely unqualified to even guess at. Or I could spend three days waiting for the torrent to download, only to find that it's dubbed into German with Polish subtitles. And at the end of all this, all the menus and commentary tracks are still locked on the plastic disc.

Oh, and I'd be breaking the law either way.

I just want iTunes to ask me "You've just inserted a new DVD. Do you want to add it to your library?"

Of course, now that they're part of the media establishment, Apple would probably much rather we bought new copies of our already legally purchased movies from the iTunes store. (Not that those of us at the arse-end of the earth have that option.) But now, at least, they have the Digital Millennium Copyright Act to blame instead.

To me, a "Rip, Mix, Burn" option for existing DVDs is the real missing link holding back the potential of Apple TV as a platform for digital movies. You don't just try to push people into the future. You build a bridge there from where they are today.

So a year ago, David Maynor and Jon Ellch demonstrated to the Washington Post that they could "Hijack a MacBook in 60 seconds or less". Some people called shenanigans, first citing the differences between the claimed vulnerability and the demonstrated exploit, and later finding evidence that the demonstration may have been entirely manufactured. Many people dismissed these rebuttals as the work of "Macintosh zealots" who "refused to admit their shiny boxes could have a security flaw."

A year later, Maynor attempted to clear his name by demonstrating that he could, in fact, exploit a WiFi vulnerability that was fixed between OS X 10.4.6 and 10.4.. Some people have called shenanigans, pointing out that, with a year to refine his exploit, and a patch from Apple to examine, Maynor has gone from demonstrating a hijack to just triggering a kernel panic. Such people are, of course, obviously biased Macintosh zealots.

Enough of that for now. Moving on, we can vastly over-simplify computer security into three groups of people:

  1. People who build secure systems
  2. People who publish flaws in secure systems
  3. People who exploit flaws in secure systems

The people in group 1 do most of the important work. The people in group 2 get most of the attention. The people in group 3 do most of the damage.

Once again, a vastly over-simplified categorisation. But useful nonetheless.

At the moment, the primary currency of the second group is credit. You gain value as a 'security researcher' based on the potential impact of the flaws you are credited for discovering. There is an implied "you scratch my back, I scratch yours" agreement between vendors and the exploit-research community. Vendors will release timely patches and carefully lend credit in press-releases, release-notes and advisories to anyone who reports a security bug to them. In exchange, the community will pat such vendors on the back on mailing-lists, and continue to give them advance notice of flaws in the future.

(This is the crux of Maynor's accusations against Apple: they fixed a flaw in their WiFi drivers without giving him the credit he felt was due to him for discovering it. Without a great stretch of the imagination, one could link the perception that Apple weren't playing fair with 'the way things are done in the vulnerability disclosure business' to January's disclosure-without-prior-notification month of Apple bugs.)

Being named in a high-profile advisory gives the researcher kudos amongst his peers, gets him invited to speak at conferences, and gives amateurs a leg-up into paid work. It's their equivalent of publishing an academic paper. Exploit or perish.

There's also a certain amount of geek wish-fulfilment involved, of course. All the jargon and silly names make the whole thing look like some bizarre role-playing game taken into real life, where you score experience points for publishing a damning advisory, hopefully to help you level up in the security community.

(Although the most egregious example of "information technology as a role-playing game" I've encountered was back when I used to browse the Usenet net-abuse newsgroups. After a few days watching pseudonymous vigilantes bicker about who could claim the "kill" for having a spammer's account revoked, I was sorely tempted to post "How about you both get XP for the orc, but NightBringer the Mighty gets the +1 dagger.")

The problem with exploit-discovery, though, is that for the most part it's not nearly as exciting as it's made out to be. The vast majority of exploits belong to a small set of common flaws that developers were just too lazy (I mean, "busy with more important things") to prevent. Finding them is an exercise in "volunteer QA". Firewall pioneer Marcus Ranum put it this way last February on the firewall-wizards mailing-list.

A skilled attacker is someone who has internalized a set of failure analysis of past failures, and can forward-project those failures (using imagination) and hypothesize instances of those failures into the future. Put concretely - a skilled attacker understands that there are buffer overruns, and has a good grasp of where they usually occur, and laboriously examines software to see if the usual bugs are in the usual places. This is a process that, if the code was developed under a design discipline, would be replaced trivially with a process of code-review and unit testing (a little design modularization wouldn't hurt, either!).

But it's not actually rocket science or even interesting. What's so skilled about sitting with some commercial app and single-stepping until you get to a place where it does network I/O, then reviewing the surrounding code to see if there's a memory size error? (Hi, David!) Maybe YOU think that's security wizardry but, to me, that's the most boring clunch-work on earth. It's only interesting because right now there's a shockingly huge amount of bad code being sold and the target space for the "hit space bar all night, find a bug, and pimp a vulnerability" crowd to play with.

The discover/patch/fix exploit cycle does not, on its own, make us significantly safer. The reason there is such an industry in the first place is that the software we run on a daily basis is riddled with holes, the result of an industry where the weight of demand and speed of passing trends dictates quickly producing software that is 'good enough'. Gumming up one hole in a colander is just one less hole.

The people who build secure systems are the ones who make us safer. They're the ones who, instead of saying "Buffer overruns are a problem, so we should examine existing software for buffer overruns", say "buffer overruns are a problem, so we should stop teaching students that strcat() even exists" (or if that doesn't work, stop teaching them that C exists). They're the ones who didn't care about the SQL Slammer worm, not because they rushed to patch their servers the moment the exploit was discovered, but because their network was far too smart to deliver untrusted UDP traffic to (or from) a database box in the first place.
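
To make the strcat() example concrete, here's a minimal sketch (the function names are mine, purely illustrative) of the sort of bug that keeps the exploit-discovery business fed, next to the unglamorous fix that makes it a non-event:

    #include <stdio.h>
    #include <string.h>

    /* The classic hazard: strcat() has no idea how big the destination is. */
    void greet_unsafe(const char *name) {
        char buf[16];
        strcpy(buf, "Hello, ");
        strcat(buf, name);   /* any name longer than 8 characters overruns buf */
        puts(buf);
    }

    /* The boring alternative: always pass the destination size, so the worst
       case is a truncated greeting rather than a corrupted stack. */
    void greet_safer(const char *name) {
        char buf[16];
        snprintf(buf, sizeof buf, "Hello, %s", name);
        puts(buf);
    }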

(As an aside: How do you tell if your network administrator is conscientious? Pay attention to what he blocks from going out of his network, not just what he stops coming in.)

The irony, though, is that if it weren't for the frequency and publicity of vulnerability disclosures, few people would bother to listen to advice on building more secure systems. Before the advent of full disclosure, there was barely sufficient incentive for vendors to patch known bugs, let alone fundamentally change their development practices so as to stop introducing new ones.

While the patching of a single vulnerability barely makes us safer, there's value in the publicity the community seeks in finding them. The whole funny role-playing game keeps security in the public eye, and keeps the people who sign the cheques aware that it's worthwhile to expend effort moving, painful inch by painful inch, towards an industry that isn't just an endless progression of patches.

(Or not. Ranum maintains the whole thing is just a distraction from real security, not a spur towards it.)

Ordering a sandwich is a flow control problem.

If you provide too much information at once, you'll overflow the buffer of the person serving you. This will cause an unknown amount of information to be dropped on the floor, and for safety you'll have to start again from scratch to ensure no ingredients are missed.

You can just treat the whole thing as a challenge-response protocol. In fact, this is the best thing to do when approaching a new sandwich server, as there are subtle variations in the order of serving. (Are they going to ask for butter? When are they going to ask about salt and pepper?) But challenge-response wastes time, as you pay double the cost of the latency between you and the person behind the counter.

Once you know the order in which the data is required, though, the trick is to keep the pipeline full without (a) overflowing the buffer, or (b) emptying the buffer and dropping back to challenge-response.

Estimating the buffer capacity of someone serving you your sandwich, however, can be tricky.
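
If you'll forgive the analogy being stretched into actual code, here's a minimal sketch in C (the buffer size and the order itself are invented for illustration) of the pipelining compromise described above: keep sending instructions, but pause before the server's buffer overflows.

    #include <stdio.h>

    #define BUFFER_SLOTS 3   /* assumed: how many instructions the server can hold at once */

    int main(void) {
        const char *order[] = { "wholemeal roll", "ham", "swiss cheese",
                                "tomato", "no butter", "salt and pepper" };
        size_t n = sizeof order / sizeof order[0];
        size_t in_flight = 0;

        for (size_t i = 0; i < n; i++) {
            printf("say: %s\n", order[i]);
            if (++in_flight == BUFFER_SLOTS && i + 1 < n) {
                printf("  (wait for the server to catch up)\n");
                in_flight = 0;   /* buffer drained; safe to keep going */
            }
        }
        return 0;
    }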


Bumper Sticker

  • 11:17 PM

Are we semantic yet?

The Fishbowl. Embarrassedly serving tag soup since 2002.

(Every so often I think I should fix my site's HTML. Then I weigh the amount of effort it would take to fix my templates and archives to even be valid HTML, let alone correct, against the almost zero benefit this would bring myself or my site visitors, and I think "bugger that for a game of soldiers".)

(Alternative sticker candidate: WWTBLD?)

Electrical Storm

  • 11:36 PM

(Photo: Electrical Storm)

I received the following in my most recent opt-in Apple spam:

Should you finally say “hasta la vista” to your old PC and replace it with a computer that’s “easier,” “safer,” “more entertaining” and “better connected”?

Nothing says "for some value of..." like scare quotes.