15
Apr

According to the Wall Street Journal, the top five companies responsible for U.S. peak Internet traffic in 2013 were Netflix (32%), Google (22%), Apple (4.3%), twitch.tv (1.8%), and Hulu (1.7%).

For those of you who haven't been following along at home, Twitch.tv, the company coming in fourth place, the one that edged out the name-brand service that streams TV shows from NBC, Fox and ABC for free, is a platform devoted to streaming video of people playing computer games.

Twitch produces almost no content themselves, instead acting as a portal through which anybody can use readily available (often free) software to broadcast their own gaming shows, live. These shows are monetized through advertising and subscriptions, with Twitch passing a proportion of that revenue back to the streamer.

By their own statistics, Twitch reached a global audience of 45 million unique viewers a month, each of whom averaged 106 minutes of viewing a day. Regardless of what you think of the kind of person who will sit down in front of their computer to watch someone else play League of Legends, it's hard to argue that the numbers aren't astounding.

And for every big eSports organisation getting a million concurrent viewers for their grand finals, there are a dozen popular streamers broadcasting gameplay or doing shows from their bedrooms for thousands of viewers, making a decent living from advertising and subscription revenue. And for each of those, there are several hundred amateurs streaming to their friends, or whoever happens to pass by, hoping maybe to make the big time.

So what is the least you need to take part in this new, booming opportunity? A reasonably fast PC, a webcam and a microphone, and an Internet connection capable of reliably uploading 1080p video (the resolution at which most modern games stop looking blurry).
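As a rough yardstick (my figure, not anything from the Journal's report): a stable 1080p stream needs somewhere in the region of three to five megabits per second of sustained upstream bandwidth. Keep that number in mind.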

Which might be why Australia is famous for having exciting events and world-class players that only the most die-hard fans want to watch, because of the awful video quality. There could be thousands of young, hungry Australians competing in this exciting new market, trying to make money entertaining viewers around the world, but there aren't, because our Internet is shit.

Advances in Internet connectivity are usually measured in download speed, but download speed is only a measure of how efficiently you can consume the Internet. Upload speed is the measure of how you can change from being a consumer to a producer. Upload speed is what allows an Internet user to engage with the network as a peer. And it's not just the video-maker wanting to send their content to Twitch or YouTube, it's every knowledge worker who wants to send the product of their labour to someone else without a prohibitively expensive dedicated connection.

One of the more frustrating things in the early days of Atlassian was how insanely long it took to upload a new version of our software from our office in Sydney to the download servers in the USA, because even a business-grade ADSL line is still an ADSL line.

And this is why I am dismayed by the continually dwindling promise of the Australian National Broadband Network, now down to 25Mb/s down/1Mb/s up even in areas with fibre. It's not the download speed. At least until 4k video becomes the norm, 25Mbit is more than enough bandwidth to stream a high definition movie.

It's the upload speed. Falling from the originally planned 40Mb/s to 1Mb/s is the difference between telling your co-worker in the branch office a few suburbs over to grab a cup of coffee while you send them that file, and telling them it's probably faster if they get in a car, drive over with a USB stick, and pick it up themselves.
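To put rough numbers on that (a back-of-the-envelope example of my own, not anything from the NBN's published figures): a one-gigabyte file is about 8,000 megabits. At 1Mb/s, that's roughly 8,000 seconds, a little over two hours of upload time. At the originally planned 40Mb/s, it's around 200 seconds, or about three and a half minutes.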

The advances in productivity, the opportunities to build the digital economy in Australia that should be the goal of a national infrastructure project like the NBN don't come from giving Foxtel and Netflix a bigger pipe to shovel content into our homes. They come from giving us connectivity that can talk back, that can let knowledge workers be just as productive without having to slog into an office every day, that puts small networked businesses on the same footing as the established heavy-weights, that allows entrepreneurs to build amazing things from home and loose them on the Internet.

26
Feb

Almost every report on the recent Apple SSL security bug has focused on the code. On the failure of developers to notice the pernicious extra goto statement. On the way it could have been picked up by code review, or static analysis, or (my favourite) by making sure you put braces around one-line conditional branches.

Just as much has been made of the almost-too-coincidental fact that within a month of the bug shipping to the public, Apple was added to the NSA's PRISM hitlist of vendors subject to "data collection".

I'm not a conspiracy theorist, but here's how I'm 95% sure they did it, because it's too obvious for them not to be doing it.

Somewhere, in a boring lab in a boring building, an overworked government employee has the job of running a mundane (hopefully automated) test suite against every new release of an OS or web browser. The test suite tries to fool the browser with a collection of malformed or mis-signed SSL certificates and invalid handshakes, and rings a triumphant bell when one is mistakenly accepted as valid.

Focusing on goto or braces misses the point. There are countless ways a bug like this could end up in a codebase. It's not even the first, or the worst, example of an SSL certificate validation bug: back in 2002 an issue was discovered in Internet Explorer (and also, to be fair, KDE) that meant 90% of web users would accept a trivially forged certificate.

The Apple SSL bug existed, and remained undetected for a year and a half, because Apple wasn't testing their SSL implementation against dodgy handshakes. And it made us unsafe because the NSA, presumably alongside an unknown number of other individuals and organisations, government and otherwise, were.

It's a depressingly common blind spot for software developers. We’ve become much better over the years at verifying that our software works for positive assertions (All my valid certificates are accepted! Ship it!), but we're still woefully bad at testing outside the “happy path”.
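To make that concrete, here is a minimal sketch, in Python, of what a negative-path check looks like. This is my illustration, not Apple's test suite or the NSA's; the badssl.com hostnames are public endpoints that deliberately serve broken certificates.

    import socket
    import ssl

    # Public test endpoints that present deliberately broken certificates.
    BAD_HOSTS = [
        "expired.badssl.com",      # certificate past its expiry date
        "self-signed.badssl.com",  # certificate not signed by a trusted CA
        "wrong.host.badssl.com",   # certificate issued for a different hostname
    ]

    def handshake_is_rejected(host, port=443):
        """Return True if a default, strict TLS policy refuses the connection."""
        context = ssl.create_default_context()
        try:
            with socket.create_connection((host, port), timeout=10) as sock:
                with context.wrap_socket(sock, server_hostname=host):
                    return False  # handshake succeeded: validation has failed us
        except ssl.SSLError:
            return True  # the broken certificate was correctly refused

    for host in BAD_HOSTS:
        print(host, "rejected (good)" if handshake_is_rejected(host) else "ACCEPTED (bug!)")

Run against a correctly implemented client, every one of those hosts should be refused. The Apple bug is what happens when nobody on the inside ever bothers to run anything like this.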

What we call hacking is a form of outsourced QA. Hackers understand the potential failure modes of systems that can lead to compromises of integrity, availability or confidentiality, and doggedly test for those failures. Sometimes they succeed because the systems are incredibly complex and the way to exploit the failure incredibly obscure, and there are just more people with more time to look at the problem from outside than from within.

Far more often, they succeed because nobody else was looking in the first place.

1
Jan

Happy New Year!

  • 1:33 AM


3
Dec

Happy birthday to me
Happy birthday to me
Two years ’til I'm forty
Fuck fuck fuck fuck fuck fuck.

1
Sep

Back in 1998, network news administrator Russ Allbery wrote an inspiring rant in reply to a spammer who was invoking “free speech” to defend his dumping of thousands of job advertisements on local newsgroups.

I still go back to it on occasion to remind myself that when you’re writing network software, even when on the surface that software is about sharing code, or tracking bug reports, or answering support queries, or publishing blogs, or producing documentation… you’re writing software about people.

…because the thing that Usenet did, the important thing that Usenet did that put everything else to shame, was that it provided a way for all of the cool people in the world to actually meet each other.

Sure, I've been involved in Usenet politics for years now, involved in newsgroup creation, and I enjoy that sort of thing. If I didn't, I wouldn't be doing it. But I've walked through the countryside of Maine in the snow and seen branches bent to the ground under the weight of it because of Usenet, I've been in a room with fifty people screaming the chorus of "March of Cambreadth" at a Heather Alexander concert in Seattle because of Usenet, I've written some of the best damn stuff I've ever written in my life because of Usenet, I started writing because of Usenet, I understand my life and my purpose and my center because of Usenet, and you know 80% of what Usenet has given me has fuck all to do with computers and everything to do with people. Because none of that was in a post. I didn't read any of that in a newsgroup. And yet it all came out of posts, and the people behind them, and the interaction with them, and the conversations that came later, and the plane trips across the country to meet people I otherwise never would have known existed.

That's what this is all about. That's why I do what I do.

People.

[a few paragraphs…]

And you can talk to me about free speech and applications and the future of communication and the use to which people put such things until you're blue in the face, and when you ask me if there's really such a thing as good speech and bad speech, I'll still say yes. Because there are people talking to other people and there are machines talking to no one as loud as they can to try to make people listen, and damn it, there is a difference, and the first one does deserve to be here more than the second one. And I don't know how to tell the difference reliably either, but that has jack to do with the way I feel about it.

And to all of the spammers and database dumpers and multiposters out there, I say this: You want to read that stuff, fine. You want to create a network for such things, fine. You want to explore the theoretical boundaries of free speech, fine. But when it starts impacting people trying to communicate, then that is where I draw the line. This is not a negotiation and this is not a threat; this is simply a fact. I've been through pain and joy with this network, I've seen communities form and wither and reform, I've met friends and lost friends here, I've learned things and discovered things and created things. I've seen people make a home here when they didn't have any other, not on a newsgroup, not with a bunch of electrons, but with people that they've met and communities that they've found and support that they've received from people who had just the words they needed to hear and would never have known they existed, and by God I KNOW what this network is for, and you can't have it.

26
Aug

Bundled Out

  • 9:32 AM

With the recent announcement of the (probably forced) retirement of Microsoft CEO Steve Ballmer, respectable publications are rolling out another series of “Where Did Microsoft Go Wrong?” pieces that almost all trace the same narrative arc:

In 2000, Ballmer inherited a software juggernaut so powerful that it was known to many as the “Evil Empire”. How could he possibly have mismanaged it to the point that, despite its continuing record of raking in cash, it is now almost the industry’s comedic afterthought?

We like our classic tragedies, where one man’s hubris brings down everyone around him. We like stories where there’s someone we can point at and blame, especially when that villain is easy to dislike. And by all accounts Ballmer wasn’t a good leader for Microsoft; neither great himself, nor the kind of person to inspire those under him to greatness.

Blaming Ballmer for the woes of Microsoft, though, misses the fact that every problem the company is experiencing today was written into its DNA in the 1980s.

Read the rest of this entry…

6
Aug

clepetit:

What should I do for lunch?

Carlfish:

Eat.

clepetit:

Genius.

Carlfish:

That’s why I’m the architect.

I come up with the general solution, it’s up to you to decide how to implement it.

29
Jul

On the surface, Quantum Immortality is an attractive thought.

Under the many-worlds interpretation of quantum mechanics, every chance event leads to the creation of multiple parallel universes. When you roll the die, it doesn't come up 5. It comes up as six different universes, one for each face, and your personal thread of causation just happens to be looking backwards from the perspective of the '5' branch.

As a corollary, if you are ever in a life-threatening situation and there is a possibility you might survive, in at least one universe, you will.

Sheldon: Penny, while I subscribe to the "Many Worlds" theory which posits the existence of an infinite number of Sheldons in an infinite number of universes, I assure you that in none of them am I dancing.

Penny: Are you fun in any of them?

Sheldon: The math would suggest that in a few of them I'm a clown made of candy, but I don't dance.

— The Big Bang Theory, S3 Ep3, “The Gothowitz Deviation”

This leads to the superficially awesome thought that, at least subjectively, you can't die. Your subjective consciousness will always be looking back through that path of causality in which you survived.

This would be great if existence were a binary state between being dead and being perfectly healthy and able.

Except it's far more likely that, as time increases, the number of universes in which you are not a brain in a jar, screaming your insanity into an eternity of nothingness, approaches zero.

7
Jul

Contains Game of Thrones spoilers.

Just after the infamous “Red Wedding” made its way from the pages of his novels onto the HBO TV series Game of Thrones, author George R. R. Martin was interviewed by Entertainment Weekly:

ENTERTAINMENT WEEKLY: How early in the process of writing the book series did you know you were gonna kill off Robb and Catelyn?

George R.R. Martin: I knew it almost from the beginning. Not the first day, but very soon. I've said in many interviews that I like my fiction to be unpredictable. I like there to be considerable suspense. I killed Ned in the first book and it shocked a lot of people. I killed Ned because everybody thinks he's the hero and that, sure, he's going to get into trouble, but then he'll somehow get out of it. The next predictable thing is to think his eldest son is going to rise up and avenge his father. And everybody is going to expect that. So immediately [killing Robb] became the next thing I had to do.

There are a lot of very good reasons to kill a character. Maybe the death of the character is necessary to take the plot in a particular direction. Maybe you want to explore how other characters deal with the death, how it changes them or how the character's absence opens up new possibilities for them. Maybe the character's arc is fundamentally tragic and can only end in death.

“I wanted to surprise the readers” strikes me as a really bad reason. It's soap opera writing: start with how you want your audience to react then work backwards, poking the characters with sharp sticks until they fill in the details.

I’ll be the first to admit that fantasy writing is terribly formulaic, and so laden with predictable plot lines and character archetypes that you can tell from the first page exactly what will happen on the last. David Eddings made a very successful career out of writing exactly the same story over and over again, so much so that over the years he got more and more efficient: condensing the One True Plot from five books down to three, then finally being able to knock the whole thing over in one volume.

This leaves a lot of room for authors to take a machete to the genre's thick undergrowth of tropes, but subverting a trope is a means, not an end. It should be a way to say something new, or at least something newer than “Ha! You weren’t expecting that!”

Also, leaning on your subversion too often can put you in the uncomfortable position where you’re as predictable as Eddings:

So Martin kills off beloved characters in order to subvert reader expectations. But after the first two times, what does the reader actually expect? What if you’ve read Wild Cards and find Martin’s debasement and slaughter of protagonists not only not surprising, but pretty much Martin’s stock in trade? Are reader expectations still being subverted then?

I’m sure Martin has a lot of other reasons to do what he does to his characters. Killing Ned Stark created a power-vacuum that was very important to the plot, and both isolated his children and forced them to learn to live without their father and protector. It’s just telling that when put on the spot, the author picked this reason out of all possible reasons to explain it.

Which is why I think I'll stick to watching it on TV instead of catching up with the novels. If you're going to follow a soap it may as well be a TV soap.

(And it may as well have frequent nudity.)

18
May

On Google Glass

  • 9:09 PM

This is Dr. Martin Cooper, the man generally credited with inventing the cellular telephone. He is holding a prototype of the Motorola DynaTAC, the first handheld mobile phone. The DynaTAC cost $3995 (in 1983 dollars!), was the size of a brick, and weighed one and three-quarter pounds. A full charge would give you 30 minutes talk time or about eight hours standby.

You also looked like a bit of an idiot carrying one around or making a call on it.

For at least a decade after the DynaTAC’s release, mobile phones were stereotypically cast as toys of the wealthy and self-important. Growing up in Australia at the time, it was not uncommon to refer to them as “wankerphones”.


This is a phone stack. Some bright spark came up with an idea where everyone at dinner stacks their phones together on the table, and the first person to grab their phone back from the stack, even if it is ringing, has to cover the bill.

Even thirty years after the release of the DynaTAC, we’re still working out new social mores and tricks to deal with its intrusion into our lives.

I'm pretty bad at predicting the success or failure of new technologies; I just think it's a little too early to write off something as potentially game-changing as Google Glass based on how it looks today, what it costs today, or the fact that we're currently entrusting one of society’s most socially tone-deaf groups (nerds) with the question of when it's appropriate to wear them.

The photograph of Dr Cooper was retrieved from Wikipedia, copyright Rico Shen and made available under a Creative Commons Attribution, Share-Alike licence. The phone stack photograph was retrieved from Flickr, copyright Roo Reynolds and made available under a Creative Commons Attribution, Non-Commercial licence.

6
May

If you look at the widely-retweeted code.org campaign, or the recent petition to add programming to the official Australian curriculum, you see a common theme.

The core of the petition:

The Digital Technologies section of the draft Curriculum for Technologies is a massive step in the right direction. If enacted, it will equip Australian students with the skills they need; not just to become competent consumers of technology, but to design and create our shared technological future.

Or, amongst the quotes from luminaries on code.org, President Bill Clinton:

At a time when people are saying "I want a good job - I got out of college and I couldn't find one," every single year in America there is a standing demand for 120,000 people who are trained in computer science.

Mark Zuckerberg:

There just aren't enough people who are trained and have these skills today.

Ashton Kutcher:

If we want to spur job growth in the US we have to educate ourselves in the disciplines where jobs are available and where economic growth is feasible.

This theme, that we should teach coding because it will lead our children to IT jobs and help our growing software industry, comes across far too strongly from both campaigns, and it's the wrong message.

Sure, it might be the right message for bureaucrats, industry insiders and parents worried that their child's grade seven teacher isn't properly preparing them for a lifelong career as a sysadmin, but it's a really bad reason to set educational policy. General childhood education isn't, and shouldn't be, vocational training.

Luckily, code.org has some more redeeming things to say.

Learning to write programs stretches your mind, and helps you think better, creates a way of thinking about things that I think is helpful in all domains. — Bill Gates.

I think everybody in this country should learn how to program a computer because it teaches you how to think. — Steve Jobs

Programming is probably the greatest, and most criminally untapped, teaching tool we have developed in the last century. At its heart, programming is applied logic, a discipline that requires you:

  • to break a problem into its component parts
  • to construct those parts from a set of logical building-blocks
  • to combine those solved parts into a greater whole

These are powerful, fundamental skills that are worth teaching to anyone. They're not only the building-blocks of a career in computing, they're building-blocks for critical thinking, for scientific thinking, even for creative thinking. Programming teaches all of this in an environment where you can keep students interested by having them use the skills they are learning to build tools, toys and games.

Programming even provides an answer to that question every school kid asks, “What the hell am I learning maths for?” I'm pretty sure my high school trig and calculus would have stuck a whole lot better if I had been asked to build a game with them instead of just solving equations on a piece of paper, or being assessed on my ability to draw a curve neatly on the provided graph paper.
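For what it's worth, here's the kind of toy I mean: a few lines of Python (every number and name below is made up purely for illustration) that use nothing fancier than the sine and cosine from a high school trigonometry class to fly a cannonball across an imaginary game world.

    import math

    angle = math.radians(30.0)   # launch angle
    speed = 40.0                 # launch speed, metres per second
    gravity = 9.8                # metres per second squared
    step = 0.5                   # seconds between updates

    vx = speed * math.cos(angle)  # horizontal velocity: the "adjacent" side
    vy = speed * math.sin(angle)  # vertical velocity: the "opposite" side

    t = 0.0
    while True:
        x = vx * t
        y = vy * t - 0.5 * gravity * t * t
        if y < 0.0:
            break  # the cannonball has hit the ground
        print(f"t={t:4.1f}s  x={x:6.1f}m  y={y:6.1f}m")
        t += step

The output traces exactly the same parabola I was drawing on graph paper, except now it's the arc of something I fired, and getting it right means the shot lands where I aimed.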

Sadly, though, I suspect the problem with programming at school is far more practical than intellectual. All the willingness to add programming to the curriculum isn't worth anything if we don't also have enough teachers qualified to deliver those lessons. And so long as we as a society continue to devalue teachers and teaching, that's a much bigger challenge.

30
Apr

Bad Actors

  • 11:03 AM

Shanley, I Wish I Knew — About Work.

There are also people in [the tech industry] who are dishonest, manipulative, abusive, bullying, mean-spirited, harassing and destructive. Early in my career I was very paranoid about maintaining amicable relationships with these individuals or staying quiet despite my moral qualms about their actions, because I was always told I’d have to work with them again, and that someday they might be on the other side of a hiring board or committee or collective I needed something from. I’ve since realized that these very fears ensure these assholes will have long prosperous careers, where we’re all forced to see them again.

When we moved into the new office a few months ago, Atlassian handed out adjustable standing desks to people who wanted them. Numbers were limited, but they were given out in order of tenure and, well, sticking with the same employer for nine years has its privileges.

I'm totally unqualified to comment on the medical benefits of standing desks. My light reading on the subject suggests that sitting down all day is bad for you, but at the same time (and after years of working eight-hour retail shifts I can attest to this) standing up all day isn't all that good either. So you naturally glide (or in the case of an electronically adjustable desk, buzz loudly) between one position and the next based on what your body tells you to do.

So on Monday morning, tired from one of my regular bouts of insomnia and my body still aching from a weekend game of squash, the first thing I did was return my desk to its natural, sitting position. Being the Internet junkie that I am, I celebrated this action with a tweet.

@carlfish: Standing Desk vs Monday Morning. Monday morning wins

And there it would have ended, but for the Social Media Expert. The Social Media Expert, hired to increase his employer's exposure on The Social Media, decides that the perfect thing to do is butt his product into my life. Seeing that The Social Media Expert is living in a different timezone, this happens fifteen hours later, after I have raised my desk again, lowered it again, spent the late afternoon lying on an office couch, gone home, worked some more, then gone to bed.

Which meant I woke up on Tuesday to this.

Read the rest of this entry…

14
Apr

Quoted from a network security mailing-list I am subscribed to:

Last time [we] sent out a warning email along the lines of:
We never ask for your username and password. If you get an email that looks like:

"There is an issue with your account. Please reply with your username and password and we will rectify it"

You should never reply to these messages with your details.
50 people replied with their usernames and passwords.

8
Dec

Repost: Cruelty

  • 1:42 PM

Something I wrote ten years ago about prank phone calls, suddenly relevant again today: “Cruelty."

The prank phone-call, at its essence, relies on the generosity of the person on the other end of the line. It preys on someone's willingness to do their job to the best of their ability, or it preys on the victim's willingness to give the perpetrator the benefit of the doubt. So in essence, what it's trying to say is “People who take strangers at face value are funny, and it's fine to laugh at someone whose job it is to be nice to you whatever you say.” That, to me, is not funny.

3
Dec

December 3rd

  • 7:44 AM

Happy birthday to me
Happy birthday to me
Three years ’til I’m 40
Fuck fuck fuck fuck fuck fuck.

28
Nov

There’s no reason Nintendo’s ad agency couldn’t show a girl playing Mario; they just didn’t. Apparently girls are too busy taking photos of themselves and playing Style Savvy Trendsetters to play a real game.

In response to criticism, Nintendo removed the words “Boys” and “Girls” from the advertisements’ respective titles.

• • •

#1reasonwhy is the Twitter hashtag women in the computer gaming industry used to provide far more than one reason why there aren’t more women in the gaming industry.

• • •

Meanwhile, if there’s a complementary version of this Sunsuper billboard somewhere in Sydney where the girl dreams of being an astronaut, it’s not in any train station I’ve been through.

8
Oct

A year or two ago, I created a Facebook account for my mother’s cat.

(a) It was funny at the time. (b) My mother could use it to follow me and my brother’s goings-on without the embarrassment of having a Facebook account of her own. (c) Donna and I may have been drunk.

By the time I got around to handing it over, my mother had sensibly decided that she’d probably rather not know what Nick and I were up to on Facebook after all.

Not long after that I forgot the account password. I also forgot the date of birth I used to register the account, which is necessary to recover the password.

As a result of this, every week or so, Facebook sends me a helpful email suggesting other cats I may be interested in pursuing an Internet friendship with.

27
Sep

Troll: v.i. to fish by trailing a lure or baited hook from a moving boat. — The Merriam-Webster Dictionary

Around twenty years ago when I first set foot on Usenet, trolling had a much gentler meaning than it does today. Trolling was the art of saying something wrong, but in such a way that everybody except the target of your trolling could tell you were being deliberately obtuse.

Trolls ranged from the throwaway jokes like the deliberately typo-ridden spelling correction, to elaborate long-term performance art; for example the jokers who completely derailed the Star Trek newsgroups by dragging half the readers into a choreographed argument about whether sound (and, when that got too boring, light) could travel in a vacuum.

On one hand this kind of trolling was elitist and exclusionary, often a way for forum regulars to one-up newbies who didn't know the pecking order. On the other hand it served to discourage the very common nerd trait of wanting to one-up the world by leaping in to correct the most trivial of errors, a defence against the kind of knee-jerk pedantry that can clog otherwise interesting discussion.

“Don’t feed the trolls” was a warning as much as anything else. Don’t jump into a newsgroup discussion before you’ve read enough to know who is who; don’t make it your job to correct every trivial, irrelevant misteak. Learn the ropes first, and you might just avoid being the butt of everyone’s in-joke.

Some people claim that the troll (sense 1) is properly a narrower category than flame bait, that a troll is categorized by containing some assertion that is wrong but not overtly controversial. — The Jargon File

As the early 90s drifted on, the definition of trolling broadened to encompass anyone who acts like an asshole on the Internet just to get attention. By the end of the decade, few people even remembered the original definition.

The reason for the sudden shift? The rise of the consumer Internet and with it, easy anonymity.

Anonymity on the old-school Internet of shell-accounts granted by universities or employers was a rare currency, mostly limited to “anonymous remailers” like anon.penet.fi, addresses that made it obvious that the author was making a deliberate effort to hide their identity. In the 90s, with the rise of dial-up Internet and subscriber online services, throw-away anonymity became the norm rather than the exception. And with anonymity came the ability for anyone to be an asshole without fear of repercussions.

America Online, with its “feature” of granting all users an infinite supply of screen-names, caught a lot of blame at the time.

The accepted wisdom was that the best way to react to the influx of assholes was “don’t feed the trolls”. Starve them of attention and they would get bored and go away.

Looking back from 2012, I can’t see any evidence of that tactic having worked. What happened was the opposite. By propagating the notion that the only way to deal with assholes is to pretend they aren’t there, we made the Internet a safe space for sociopaths.

This is a problem, because anonymity is also an indispensable component of free speech. Without guaranteed anonymity, the oppressed can't speak up, minorities can't find a voice, and unpopular opinions are suppressed. Stifling this vital freedom is unacceptable.

“Don’t feed the trolls” is the wrong approach.

(To be continued…)

24
Sep

[image]