17
Jul

I spend a non-trivial amount of my time talking to students, junior developers and other newcomers to the industry, and one of the war stories I share goes something like this.

At its heart, commercial software development is soul-crushingly depressing.

As a developer, your working day revolves around everything that's wrong with all the hard work you've put in so far. When you come in to work, your primary interface with reality is the infinite list of shit that needs to be done. You have a list of bugs that need fixing. You have a list of feature requests: things your software is deficient for not doing, every one of which somebody can't live without. Your job is to pick something off one of those lists and fix it, but when you come in tomorrow, the infinite list of shit will still be infinitely long.

One memory from the early days of my current job is a developer who got so mired in this mindset that the founders literally ordered him to fly from Sydney to San Francisco where he could talk to some real customers, the people who had bought his product and, while they might have the occasional gripe, mostly wanted to tell him how awesome it was, and how it was helping them.

One thing I've been trying to do in my daily life is to be less critical of other people's software, especially in public places, and especially if I think someone who might have been responsible for that software might be listening. I sometimes let frustration get the better of me, but I'm trying.

There are plenty of official channels to report bugs and request features, but when you go outside them, there's a good chance all you're going to do is ruin someone's day for no real benefit.

8
Jun

Instant messages, last Thursday:

Charles: I’m spending the next two days at a terribly wanky-sounding seminar.

Charles: “Search Inside Yourself”

Vidya: Eek.

Charles: Someone else cancelled and gave me their ticket.

Vidya: You’d better not come out brainwashed and be like “It was great. I never expected it, but it’s mind blowing ladidah…”

Vidya: You’d better return a cynic.

Charles: I searched inside myself, and found a cynical bastard.

The Good

To my great disappointment, a decent chunk of what went on in the day and a half I was holed up in the Sheraton learning to be mindful was not an irredeemable load of wank. Much of the theory around self-awareness and self-improvement was prima facie plausible, and seemed backed by a non-trivial amount of peer-reviewed science.

As anyone will tell you, paying attention to one thing at a time is not my strongest skill. At a recent hack day, one of the company’s CEOs went so far as to tell me I had “one of the biggest cases of attention deficit he’s ever seen” as I obsessively command-tabbed between the many windows that keep me with one leg on each strand of my spider-web of constantly streaming global information.

So as much as I hate to admit it, the “mindfulness meditation” exercises, which involved controlling the focus of your attention, recognising when it was wandering, and drawing it back to whatever you were originally supposed to be doing without falling into a downward spiral of judgement and meta-analysis, were something I could use to good effect in my day-to-day life.

I even found myself paying more attention to the speakers than I normally would. It almost seems like cheating, when you start the day teaching your audience how not to tune out of what you’re saying for the rest of it.

The journalling exercises also reminded me of the power of putting one word after another. Whenever I write, I fight with the way what felt in my head like a linear narrative waiting to be transcribed verbatim turns out to be anything but. Turning ideas into written words forces you to fill in all sorts of gaps you didn’t even know were there until you started writing, and often you learn a great deal of what you really think by making yourself express it.

Other techniques, like the practice of recognising emotions by the physical effects they have on your body, and then distancing yourself from them enough to make rational decisions, seemed promising, although I utterly failed to summon any triggers from memory during meditation.

It’s also worth mentioning that the massive buffet lunch the hotel put on for us was nothing short of amazing.

The Bad

The first thing I wrote down in my notes on day one: “Amazing how you can prime an audience to say what you want them to, by describing something then asking people to describe it as if you hadn't.” If you have spent five minutes telling a room full of people what you think are important qualities for success, don’t pretend to be all surprised when you ask them what they value in a leader and have those points regurgitated back to you.

Despite the presentations being quite grounded in rationality and scientific evidence, the atmosphere in the room often verged on the revival meeting, and even if there was no “woo” on stage, you could hear the crystals’ faint clinking whenever the microphones roamed the audience. In the frequent feedback sessions, attendees competed to be the most earnest, to provide the most insightful reinforcement of the speakers, to gush at how the most recent five minutes of meditation had changed their lives.

By the second day I had started collecting examples of “common phrases that make me dismiss your statement”, including any comment from the audience starting with “Can I acknowledge…” or ending with “I just wanted to share that.”

A note from my end-of-first-day summary: “Boring people should not be given microphones.”

I hope they didn't think I was taking the piss too much when I volunteered that I had attempted to mindfully drink a beer, but wasn't sure I had experienced anything different because “the first beer after work is already a spiritual experience.”

The rare admission that some practice hadn’t been useful, or some enlightenment hadn’t been felt, was met by the presenters with a thoughtful “Well isn’t that interesting”, reminding me so much of the “That is such a great dream” with which Quentin Watts, Triple J’s resident morning show dream interpreter in the 90s, would preface every one of her blatant cold readings.

A note early from the second day, whispered to me from a neighbouring seat during one of the many times comment was being sought from the audience: “I challenge you to think of the most absurd thing you can say and still get [the presenter] to agree with you.”

Even as the presenters pushed the science, they were careful to hedge their bets occasionally by pointing out that the field and much of the research was new and uncorroborated.

During the lunch break on day one, I asked an academic working in the field how you could possibly have a double-blind study when the subjects obviously knew which group they were in by dint of the fact that they either were, or were not taking a mindfulness course. His response: “It’s even worse than that. In [the study he was involved with], the evaluating therapists all knew which group each subject was in within minutes of starting their evaluation, just from the words they used to describe their progress.”

The Ugly

I suffer from a mild social anxiety. It manifests as an irrational but inescapable belief that aside from about half a dozen people—wife, immediate family, closest friend—everyone else in the world would be far happier if I wasn’t around. Amongst other things it’s why at University I was that guy who would always leave parties without saying goodbye, why I’m so bad at conferences and trade-shows, and why performance reviews turn me into an alcoholic.

Over the years I’ve developed effective ways of coping. Simple things like structuring social interaction around some activity like a game, or around alcohol (preferably both), limiting my daily contact with strangers, or making sure I have scripts to deal with common social situations. (When I make a phone call, I’ll map the expected conversation out in my head before I dial the number. Which is why voicemail always throws me for a loop.)

Which brings me to the last four lines in my notebook, written just before lunch on the second day:

“Just like me” practice.
Kindness training.

Staring at someone - awkward as fuck.
Profound?

The exercise… I mean “practice” was to explore the concept of empathy by staring intently at a complete stranger for what felt like an eternity, while the convener quietly encouraged you to recognise this person as another human being just like you, with all the same kinds of thoughts and feelings, memories, triumphs and pain as you, to connect with them.

At times it was hard not to giggle, but we got through it. It certainly created the illusion that somehow just by staring at someone for a while I had made some connection to them.

After the practice was over, as we debriefed in preparation for lunch, I felt this growing feeling of dread.

“Aha!” I thought. “A chance to use that thing we learned yesterday!”

I took a deep breath or two, tuned out the microphone-grabbers describing how their minds had been opened to the beauty of the world and focused my attention on the sensations in my body. Tightness in my chest. Tense muscles. Heightened senses. “Fight or flight” reflex.

It was the very familiar sensation of “human being overload”. Unlike all the previous one-on-one exercises which had been safely covered by my “talking at someone on a harmless predetermined topic” script, this unexpected, unscripted and unfamiliar interaction was something my brain was not prepared to process rationally.

I'd felt this often enough to realise the only reliable way to make it go away was to distance myself from anyone who might want to interact with me. So I gathered up my notepad and bag, avoided the Sheraton’s delicious buffet, ate a large tub of frozen yogurt in blissful anonymity in a food court, then went home.

Everyone’s mind is different. Some people’s minds don’t work quite as you (or they) wish they did. Introducing people cold to exercises designed to have a psychological impact, in an unregulated, highly conformist environment with no safe avenue to opt out, seems… risky.

22
May

Sorry, this blog entry is all images. I fail at accessibility. If it's any consolation, you've only missed a lame Star Wars joke.

(Transcript of IM conversation)

Charles: We need this!

Donna: No.

Charles: Awwww…

Donna: It's fugly.

Charles: Yes, but… but…

Donna: You can put it in your study?

Charles: I don't have room. It would have to go in the living room.

Charles: IT WAS HIS FAULT FOR NOT PAYING ON TIME.

19
May

Laura Hudson, writing in Wired:

Ultimately, online abuse isn’t a technological problem; it’s a social problem that just happens to be powered by technology. The best solutions are going to be those that not only defuse the Internet’s power to amplify abuse but also encourage crucial shifts in social norms, placing bad behavior beyond the pale. When people speak up about online harassment, one of the most common responses is “Well, what did you expect from the Internet?” If we truly want to change our online spaces, the answer from all of us has got to be: more.

16
May

I haven't seen the F8 session or used Flux. I've just vaguely skimmed some of the reddit comments. However…

Once upon a time, there was this thing called the Model View Controller architecture. It was a product of the Smalltalk community, but then so were Design Patterns and Extreme Programming. At its heart, it was the simple idea that Model classes should be responsible for representing the state of your application, View classes should be responsible for drawing your user interfaces and presenting that state to your users, and Controller classes should be responsible for translating actions performed by your user into changes to your model that were then reflected in your view.
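
For the unfamiliar, here is a minimal sketch of the triad (in Java, with invented names; the real thing was Smalltalk). The model owns the state and announces changes, the view redraws in response, and the controller turns user actions into model changes:

    import java.util.ArrayList;
    import java.util.List;

    // The model: owns state, knows nothing about widgets.
    class CounterModel {
        private int value;
        private final List<Runnable> observers = new ArrayList<>();
        void addObserver(Runnable observer) { observers.add(observer); }
        int value() { return value; }
        void increment() {
            value++;
            observers.forEach(Runnable::run); // reflect the change in views
        }
    }

    // The view: redraws whenever the model tells it something changed.
    class CounterView {
        CounterView(CounterModel model) {
            model.addObserver(() -> System.out.println("count = " + model.value()));
        }
    }

    // The controller: translates user input into model changes.
    class CounterController {
        private final CounterModel model;
        CounterController(CounterModel model) { this.model = model; }
        void onIncrementClicked() { model.increment(); }
    }

    public class MvcSketch {
        public static void main(String[] args) {
            CounterModel model = new CounterModel();
            new CounterView(model);
            new CounterController(model).onIncrementClicked(); // prints "count = 1"
        }
    }

Note that nothing in the model knows about the view. That separation is the whole point.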

In a typical screen for an MVC GUI you might have any number of widgets, each drawn by their own independent view objects, backed by different models, connected to multiple controllers. Almost forty years after the MVC pattern was formulated for GUI development it is still out there, albeit in obscure rarely-used frameworks like Cocoa Touch.

Then, twenty years after MVC was introduced to the Smalltalk world, along came the Model 2 Web Application. In an attempt to make Java web development fit into the growing J2EE spec, the Java community seized on MVC and mapped it almost arbitrarily to the web. Every HTTP request would be served by a single controller (servlet), which would collaborate with zero-to-many models (EJBs), and map the resulting data into a DTO (Pay no attention to the man behind the curtain!) that could be passed via RequestDispatcher to a single JSP view.
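
In code, the Model 2 shape looked roughly like this sketch (javax.servlet-era Java; the controller, model method and JSP are all invented for illustration):

    import java.io.IOException;
    import java.util.List;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // One controller (servlet) per request: gather model data into
    // request attributes (the DTO role), then hand the whole response
    // over to a single JSP view.
    public class OrderController extends HttpServlet {
        // Stand-in for the model layer (an EJB, in the J2EE heyday).
        private List<String> ordersFor(String user) {
            return List.of(user + "'s order #1", user + "'s order #2");
        }

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            req.setAttribute("orders", ordersFor(req.getParameter("user")));
            // Exactly one view gets to draw the entire page.
            req.getRequestDispatcher("/WEB-INF/orders.jsp").forward(req, resp);
        }
    }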

The sleight-of-hand was the assumption—baked into the servlet spec—that each HTTP request would map to a single controller which would delegate to a single view. A few frameworks like Tapestry tried to buck the trend, but a Java developer years later could literally replace servlets with actions, EJBs with dependency-injected beans and JSPs with the templating language du jour, and still deliver a not-unreasonable technology choice in most circles. This malaise even infected other undeserving languages like Ruby.

The problem is that even in a very simple web application, a single page contains elements that by rights should be the responsibility of different views, backed by different controllers and their own models. The single controller-per-request, or even the primary controller-per-request, is a broken paradigm, and one that every developer has had to compensate for with some sub-optimal tower of contraptions.

As a result, your average non-trivial Java web application ended up being a mess of controller implementation inheritance, servlet filters, interceptor stacks, view decorators and arbitrary objects placed in the template engine's context to allow the drawing of all the bits of the web page that don't belong to the increasingly impure controller.

Amusingly, the "Model 2" approach to web MVC isn't a bad fit for REST-based systems (because a REST resource usually does have a single responsibility), but most pure REST systems figure it's overkill… because a REST resource just has a single responsibility.

A simple inversion fixes so much. Going back to the GUI paradigm and flipping the process around so the view comes first and then delegates to controllers and models as necessary is already the go-to strategy for single-page applications and frameworks like Backbone. This kind of "view first" is also a perfectly good strategy for server-side page generation, one that large distributed systems have been taking advantage of for years to delegate fragments of page generation to independent external services.
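
A sketch of that inversion (again with invented names; the fragments here are local methods, but they could just as easily be independent remote services):

    import java.util.List;
    import java.util.function.Supplier;

    // View-first composition: the page decides its own structure and
    // delegates each fragment to an independent mini view, which is free
    // to consult its own controller and model.
    public class ViewFirstPage {
        static String navigation() {
            return "<nav>home | orders | account</nav>";
        }

        static String orderList() {
            List<String> orders = List.of("order #1", "order #2"); // its own model
            return "<ul><li>" + String.join("</li><li>", orders) + "</li></ul>";
        }

        static String footer() {
            return "<footer>the bottom of the page</footer>";
        }

        public static void main(String[] args) {
            List<Supplier<String>> fragments = List.of(
                    ViewFirstPage::navigation,
                    ViewFirstPage::orderList,
                    ViewFirstPage::footer);
            StringBuilder page = new StringBuilder("<html><body>\n");
            fragments.forEach(fragment -> page.append(fragment.get()).append('\n'));
            System.out.println(page.append("</body></html>"));
        }
    }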

This blog post was brought to you by the Society for Sending Charles Back In Time Fourteen Years To Slap Himself.

12
May


Help! Stuck in Turbolift!
Reporter: Williams, D. Lieutenant
Priority: Critical
Version: NCC-1701

Description:

The Turbolift is not moving, no matter how many times I state my destination to the computer.

Comment from Bandt, G. Ensign

I am reducing the priority of this issue from Critical to High. While we appreciate the urgency of your situation, Critical priority is reserved for issues that may impact the integrity or spaceworthiness of the vessel.

Older versions of the Enterprise do not have reliable voice control in the Turbolift. Have you tried twisting the cylindrical handle clockwise or anticlockwise?

Comment from Williams, D. Lieutenant

I rotated the handle anticlockwise and I am now in Engineering. Please close the issue, I will find someone here to show me how to get to the Bridge.


From: Cordova, M. Lieutenant Commander
To: enterprise-support@starfleet.gov
Subject: Auxiliary Power

It has come to my attention that many support engineers are reducing their “time to first response” by, on receiving a new case, immediately bouncing it back to the customer with the comment: “Have you tried diverting auxiliary power to the affected system?”

While diverting auxiliary power is often a good short-term fix, suggesting it in every case will only reduce the confidence our customers have in our service, and increase the perception that we are just making guesses instead of taking the time to understand the issue and provide informed advice.

As Starfleet officers, we should take pride in the service we provide, and not resort to cheap shortcuts to artificially boost our metrics.

LtC. Michelle Cordova
Team Lead, Enterprise Support


Starship adrift! Please Escalate!
Reporter: Picard, J-L. Captain
Priority: Blocker
Version: NCC-1701-D

Description:

It has been seven days since I opened my previous ticket re: having completely lost control of my Starship. The Enterprise is currently adrift in space, main power is dead as are the warp and impulse engines. We are locked out of the computer and all diagnostics and overrides have so far been fruitless.

Despite this dire situation we have found ourselves being bounced back and forth by a junior engineer who does not seem to appreciate the gravity of the situation, or even be able to provide us with an idea of how much longer our situation will have gravity!

We have approximately four days of auxiliary power remaining (Thank you for that suggestion, it would SO OBVIOUSLY have not been the very first thing we tried ourselves!) and would appreciate the attention of a more senior engineer before then.

Comment from Garvin, T. Snr Ensign

Hi, I was assigned to help you with your case. I have reviewed the previous tickets and will be working personally with you going forward until we have found a solution.

Have you checked Jeffries Tube 194a? The problem you are experiencing could be caused by a crystalline intelligence having taken refuge in your substructure, using that location as a beach-head to control the core engineering functions of your starship.

As a higher form of life you may not wish to kill it, but short bursts of gamma radiation triggered manually from the deflector dish can make it uncomfortable enough to leave on its own.

Comment from Picard, J-L. Captain

That was exactly right! How did you know? Please close the issue.

Comment from Garvin, T. Snr Ensign

It happens more often than you think.

Comment from System

Thank you for contacting Enterprise Support. Now that your case is closed, we would appreciate it if you took a moment to fill out the following survey:

On a scale of 1 to 10 where 1 is “never” and 10 is “always”, how likely would you be to recommend Enterprise Support to a friend or colleague?


From: Cordova, M. Lieutenant Commander
To: enterprise-support@starfleet.gov
Subject: First and last warning.

Everything in my previous email referring to diverting auxiliary power _also_ applies to reversing the polarity.

LtC. Michelle Cordova
Team Lead, Enterprise Support

Ideas shamelessly stolen from Chris and Conor. Also, Atlassian is hiring an Enterprise Senior Support Engineer, but I can’t guarantee you’d end up fixing warp core breaches if you applied.

15
Apr

According to the Wall Street Journal, the top five companies responsible for U.S. peak Internet traffic in 2013 were Netflix (32%), Google (22%), Apple (4.3%), twitch.tv (1.8%), and Hulu (1.7%).

For those of you who haven't been following along at home, Twitch.tv, the company coming in fourth place, the one that edged out the name-brand service that streams TV shows from NBC, Fox and ABC for free, is a platform devoted to streaming video of people playing computer games.

Twitch produces almost no content themselves, instead acting as a portal through which anybody can use readily available (often free) software to broadcast their own gaming shows, live. These shows are monetized through advertising and subscriptions, with Twitch passing a proportion of that revenue back to the streamer.

By their own statistics, Twitch reached a global audience of 45 million unique viewers a month, each of whom averaged 106 minutes of viewing a day. Regardless of what you think of the kind of person who will sit down in front of their computer to watch someone else play League of Legends, it's hard to argue that those numbers are anything short of astounding.

And for every big eSports organisation getting a million concurrent viewers for their grand finals, there are a dozen popular streamers broadcasting gameplay or doing shows from their bedrooms for thousands of viewers, making a decent living from advertising and subscription revenue. And for each of those, there are several hundred amateurs streaming to their friends, or whoever happens to pass by, hoping maybe to make the big time.

So what is the least you need to take part in this new, booming opportunity? A reasonably fast PC, a webcam and a microphone, and an Internet connection capable of reliably uploading 1080p video (the resolution at which most modern games stop looking blurry).

Which might be why Australia is famous for having exciting events and world-class players that only the most die-hard fan wants to watch, because of the awful video quality. There could be thousands of young, hungry Australians competing in this exciting new market, trying to make money entertaining viewers around the world, but there aren't, because our Internet is shit.

Advances in Internet connectivity are usually measured in download speed, but download speed is only a measure of how efficiently you can consume the Internet. Upload speed is the measure of how you can change from being a consumer to a producer. Upload speed is what allows an Internet user to engage with the network as a peer. And it's not just the video-maker wanting to send their content to Twitch or YouTube, it's every knowledge worker who wants to send the product of their labour to someone else without a prohibitively expensive dedicated connection.

One of the more frustrating things in the early days of Atlassian was how insanely long it took to upload a new version of our software from our office in Sydney to the download servers in the USA, because even a business-grade ADSL line is still an ADSL line.

And this is why I am dismayed by the continually dwindling promise of the Australian National Broadband Network, now down to 25Mb/s down/1Mb/s up even in areas with fibre. It's not the download speed. At least until 4k video becomes the norm, 25Mbit is more than enough bandwidth to stream a high definition movie.

It's the upload speed. Falling from the originally planned 40Mb/s to 1Mb/s is the difference between telling your co-worker in the branch office a few suburbs over to grab a cup of coffee while you send them that file, and telling them it's probably faster if they get in a car, drive over with a USB stick and pick it up themselves.
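
For a sense of scale, a back-of-envelope sketch (in Java, assuming a hypothetical 1GB file):

    // Time to upload a hypothetical 1GB file at the originally planned
    // 40Mb/s, versus the revised 1Mb/s.
    public class UploadTimes {
        public static void main(String[] args) {
            double fileMegabits = 1000 * 8; // 1GB is roughly 8000 megabits
            for (double megabitsPerSecond : new double[] {40.0, 1.0}) {
                double seconds = fileMegabits / megabitsPerSecond;
                System.out.printf("%2.0fMb/s: %5.0f seconds (~%.0f minutes)%n",
                        megabitsPerSecond, seconds, seconds / 60);
            }
            // 40Mb/s: about three and a half minutes. 1Mb/s: over two hours.
        }
    }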

The advances in productivity, the opportunities to build the digital economy in Australia that should be the goal of a national infrastructure project like the NBN don't come from giving Foxtel and Netflix a bigger pipe to shovel content into our homes. They come from giving us connectivity that can talk back, that can let knowledge workers be just as productive without having to slog into an office every day, that puts small networked businesses on the same footing as the established heavy-weights, that allows entrepreneurs to build amazing things from home and loose them on the Internet.

26
Feb

Almost every report on the recent Apple SSL security bug has focused on the code. On the failure of developers to notice the pernicious extra goto statement. On the way it could have been picked up by code review, or static analysis, or (my favourite) by making sure you put braces around one-line conditional branches.

Just as much has been made of the almost-too-coincidental fact that within a month of the bug shipping to the public, Apple was added to the NSA's PRISM hitlist of vendors subject to "data collection".

I'm not a conspiracy theorist, but here's how I am 95% sure the NSA found the bug, because it's too obvious for them not to be doing it.

Somewhere, in a boring lab in a boring building, an overworked government employee has the job of running a mundane (hopefully automated) test suite against every new release of an OS or web browser. The test suite tries to fool the browser with a collection of malformed or mis-signed SSL certificates and invalid handshakes, and rings a triumphant bell when one is mistakenly accepted as valid.

Focusing on goto or braces misses the point. There are an uncountable number of ways a bug like this could end up in a codebase. It's not the first, or even the worst, example of an SSL certificate validation bug: back in 2002 an issue was discovered in Internet Explorer (and also, to be fair, KDE) that meant 90% of web users would accept a trivially forged certificate.

The Apple SSL bug existed, and remained undetected for a year and a half, because Apple wasn't testing their SSL implementation against dodgy handshakes. And it made us unsafe because the NSA, presumably alongside an unknown number of other individuals and organisations, government and otherwise, were.

It's a depressingly common blind spot for software developers. We’ve become much better over the years at verifying that our software works for positive assertions (All my valid certificates are accepted! Ship it!), but we're still woefully bad at testing outside the “happy path”.
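
For illustration, here is a sketch of what testing outside the happy path can look like (in Java, and assuming badssl.com's deliberately broken endpoints are reachable). The check passes only when the handshake fails:

    import java.net.URL;
    import javax.net.ssl.HttpsURLConnection;
    import javax.net.ssl.SSLHandshakeException;

    // Unhappy-path testing: assert that certificate validation *fails*
    // when it should. A bad certificate that gets accepted is the bug.
    public class BadCertCheck {
        static boolean rejectsBadCert(String host) throws Exception {
            HttpsURLConnection connection =
                    (HttpsURLConnection) new URL("https://" + host + "/").openConnection();
            try {
                connection.connect();
                return false; // handshake succeeded: validation is broken
            } catch (SSLHandshakeException expected) {
                return true;  // the invalid certificate was correctly refused
            } finally {
                connection.disconnect();
            }
        }

        public static void main(String[] args) throws Exception {
            // Each of these hosts presents a certificate that should fail
            // validation; printing "false" here would be a red flag.
            for (String host : new String[] {
                    "self-signed.badssl.com", "expired.badssl.com"}) {
                System.out.println(host + " rejected: " + rejectsBadCert(host));
            }
        }
    }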

What we call hacking is a form of outsourced QA. Hackers understand the potential failure modes of systems that can lead to compromises of integrity, availability or confidentiality, and doggedly test for those failures. Sometimes they succeed because the systems are incredibly complex and the way to exploit the failure incredibly obscure, and there are just more people with more time to look at the problem from outside than from within.

Far more often, they succeed because nobody else was looking in the first place.

1
Jan

Happy New Year!

  • 1:33 AM

[photo]

3
Dec

Happy birthday to me
Happy birthday to me
Two years ’til I'm forty
Fuck fuck fuck fuck fuck fuck.

1
Sep

Back in 1998, network news administrator Russ Albery wrote an inspiring rant in reply to a spammer who was invoking “free speech” to defend his dumping of thousands of job advertisements on local newsgroups.

I still go back to it on occasion to remind myself that when you’re writing network software, even when on the surface that software is about sharing code, or tracking bug reports, or answering support queries, or publishing blogs, or producing documentation… you’re writing software about people.

…because the thing that Usenet did, the important thing that Usenet did that put everything else to shame, was that it provided a way for all of the cool people in the world to actually meet each other.

Sure, I've been involved in Usenet politics for years now, involved in newsgroup creation, and I enjoy that sort of thing. If I didn't, I wouldn't be doing it. But I've walked through the countryside of Maine in the snow and seen branches bent to the ground under the weight of it because of Usenet, I've been in a room with fifty people screaming the chorus of "March of Cambreadth" at a Heather Alexander concert in Seattle because of Usenet, I've written some of the best damn stuff I've ever written in my life because of Usenet, I started writing because of Usenet, I understand my life and my purpose and my center because of Usenet, and you know 80% of what Usenet has given me has fuck all to do with computers and everything to do with people. Because none of that was in a post. I didn't read any of that in a newsgroup. And yet it all came out of posts, and the people behind them, and the interaction with them, and the conversations that came later, and the plane trips across the country to meet people I otherwise never would have known existed.

That's what this is all about. That's why I do what I do.

People.

[a few paragraphs…]

And you can talk to me about free speech and applications and the future of communication and the use to which people put such things until you're blue in the face, and when you ask me if there's really such a thing as good speech and bad speech, I'll still say yes. Because there are people talking to other people and there are machines talking to no one as loud as they can to try to make people listen, and damn it, there is a difference, and the first one does deserve to be here more than the second one. And I don't know how to tell the difference reliably either, but that has jack to do with the way I feel about it.

And to all of the spammers and database dumpers and multiposters out there, I say this: You want to read that stuff, fine. You want to create a network for such things, fine. You want to explore the theoretical boundaries of free speech, fine. But when it starts impacting people trying to communicate, then that is where I draw the line. This is not a negotiation and this is not a threat; this is simply a fact. I've been through pain and joy with this network, I've seen communities form and wither and reform, I've met friends and lost friends here, I've learned things and discovered things and created things. I've seen people make a home here when they didn't have any other, not on a newsgroup, not with a bunch of electrons, but with people that they've met and communities that they've found and support that they've received from people who had just the words they needed to hear and would never have known they existed, and by God I KNOW what this network is for, and you can't have it.

26
Aug

Bundled Out

  • 9:32 AM

With the recent announcement of the (probably forced) retirement of Microsoft CEO Steve Ballmer, respectable publications are rolling out another series of “Where Did Microsoft Go Wrong?” pieces that almost all trace the same narrative arc:

In 2000, Ballmer inherited a software juggernaut so powerful that it was known to many as the “Evil Empire”. How could he possibly have mismanaged it to the point that, despite its continuing record of raking in cash, it is now almost the industry’s comedic afterthought?

We like our classic tragedies, where one man’s hubris brings down everyone around him. We like stories where there’s someone we can point at and blame, especially when that villain is easy to dislike. And by all accounts Ballmer wasn’t a good leader for Microsoft; neither great himself, nor the kind of person to inspire those under him to greatness.

Blaming Ballmer for the woes of Microsoft, though, misses the fact that every problem the company is experiencing today was written into its DNA in the 1980s.

Read the rest of this entry…

6
Aug

clepetit:

What should I do for lunch?

Carlfish:

Eat.

clepetit:

Genius.

Carlfish:

That’s why I’m the architect.

I come up with the general solution, it’s up to you to decide how to implement it.

29
Jul

On the surface, Quantum Immortality is an attractive thought.

Under the many-worlds interpretation of quantum mechanics, every chance event leads to the creation of multiple parallel universes. When you roll the die, it doesn't come up 5. It comes up six different universes, and your personal thread of causation just happens to be looking backwards from the perspective of the '5' branch.

As a corollary, if you are ever in a life-threatening situation and there is a possibility you might survive, in at least one universe, you will.

Sheldon: Penny, while I subscribe to the "Many Worlds" theory which posits the existence of an infinite number of Sheldons in an infinite number of universes, I assure you that in none of them am I dancing.

Penny: Are you fun in any of them?

Sheldon: The math would suggest that in a few of them I'm a clown made of candy, but I don't dance.

Big Bang Theory: S3. Ep3. The Gothowitz Deviation

This leads to the superficially awesome thought that, at least subjectively, you can't die. Your subjective consciousness will always be looking back through that path of causality in which you survived.

This would be great if existence were a binary state: either dead, or perfectly healthy and able.

Except it's far more likely that, as time goes on, the proportion of universes in which you are not a brain in a jar, screaming your insanity into an eternity of nothingness, approaches zero.

7
Jul

Contains Game of Thrones spoilers.

Just after the infamous “Red Wedding” made its way from the pages of his novels onto the HBO TV series Game of Thrones, author George R. R. Martin was interviewed by Entertainment Weekly:

ENTERTAINMENT WEEKLY: How early in the process of writing the book series did you know you were gonna kill off Robb and Catelyn?

George R.R. Martin: I knew it almost from the beginning. Not the first day, but very soon. I've said in many interviews that I like my fiction to be unpredictable. I like there to be considerable suspense. I killed Ned in the first book and it shocked a lot of people. I killed Ned because everybody thinks he's the hero and that, sure, he's going to get into trouble, but then he'll somehow get out of it. The next predictable thing is to think his eldest son is going to rise up and avenge his father. And everybody is going to expect that. So immediately [killing Robb] became the next thing I had to do.

There are a lot of very good reasons to kill a character. Maybe the death of the character is necessary to take the plot in a particular direction. Maybe you want to explore how other characters deal with the death, how it changes them or how the character's absence opens up new possibilities for them. Maybe the character's arc is fundamentally tragic and can only end in death.

“I wanted to surprise the readers” strikes me as a really bad reason. It's soap opera writing: start with how you want your audience to react then work backwards, poking the characters with sharp sticks until they fill in the details.

I’ll be the first to admit that fantasy writing is terribly formulaic, and so laden with predictable plot lines and character archetypes that you can tell from the first page exactly what will happen on the last. David Eddings made a very successful career out of writing exactly the same story over and over again, so much so that over the years he got more and more efficient: condensing the One True Plot from five books down to three, then finally being able to knock the whole thing over in one volume.

This leaves a lot of room for authors to take a machete to the genre's thick undergrowth of tropes, but subverting a trope is a means, not an end. It should be a way to say something new, or at least something newer than “Ha! You weren’t expecting that!”

Also, leaning on your subversion too often can put you in the uncomfortable position where you’re as predictable as Eddings:

So Martin kills off beloved characters in order to subvert reader expectations. But after the first two times, what does the reader actually expect? What if you’ve read Wild Cards and find Martin’s debasement and slaughter of protagonists not only not surprising, but pretty much Martin’s stock in trade? Are reader expectations still being subverted then?

I’m sure Martin has a lot of other reasons to do what he does to his characters. Killing Ned Stark created a power-vacuum that was very important to the plot, and both isolated his children and forced them to learn to live without their father and protector. It’s just telling that when put on the spot, the author picked this reason out of all possible reasons to explain it.

Which is why I think I'll stick to watching it on TV instead of catching up with the novels. If you're going to follow a soap it may as well be a TV soap.

(And it may as well have frequent nudity.)

18
May

On Google Glass

  • 9:09 PM

This is Dr. Martin Cooper, the man generally credited with inventing the cellular telephone. He is holding a prototype of the Motorola DynaTAC, the first handheld mobile phone. The DynaTAC cost $3995 (in 1983 dollars!), was the size of a brick, and weighed one and three-quarter pounds. A full charge would give you 30 minutes talk time or about eight hours standby.

You also looked like a bit of an idiot carrying one around or making a call on it.

For at least a decade after the DynaTAC’s release, mobile phones were stereotypically cast as toys of the wealthy and self-important. Growing up in Australia at the time, it was not uncommon to refer to them as “wankerphones”.

[Photo: a phone stack]

This is a phone stack. Some bright spark came up with an idea where everyone at dinner stacks their phones together on the table, and the first person to grab their phone back from the stack, even if it is ringing, has to cover the bill.

Even thirty years after the release of the DynaTAC, we’re still working out new social mores and tricks to deal with its intrusion into our lives.

I'm pretty bad at predicting the success or failure of new technologies. I just think it's a little too early to write off something as potentially game-changing as Google Glass based on how it looks today, what it costs today, or the fact that we're currently entrusting one of society’s most socially tone-deaf groups (nerds) with the question of when it's appropriate to wear them.

The photograph of Dr Cooper was retrieved from Wikipedia, copyright Rico Shen and made available under a Creative Commons Attribution, Share-Alike licence. The phone stack photograph was retrieved from Flickr, copyright Roo Reynolds and made available under a Creative Commons Attribution, Non-Commercial licence.

6
May

If you look at the widely-retweeted code.org campaign, or the recent petition to add programming to the official Australian curriculum, you see a common theme.

The core of the petition:

The Digital Technologies section of the draft Curriculum for Technologies is a massive step in the right direction. If enacted, it will equip Australian students with the skills they need; not just to become competent consumers of technology, but to design and create our shared technological future.

Or, amongst the quotes from luminaries on code.org, President Bill Clinton:

At a time when people are saying "I want a good job - I got out of college and I couldn't find one," every single year in America there is a standing demand for 120,000 people who are trained in computer science.

Mark Zuckerberg:

There just aren't enough people who are trained and have these skills today.

Ashton Kutcher:

If we want to spur job growth in the US we have to educate ourselves in the disciplines where jobs are available and where economic growth is feasible.

This theme, that we should teach coding because it will lead our children to IT jobs and help our growing software industry, comes across far too strongly from both campaigns, and it's the wrong message.

Sure it might be the right message for bureaucrats, industry insiders and parents worried that their child's grade seven teacher isn't properly preparing them for a lifelong career as a sysadmin, but it's a really bad reason to set educational policy. General childhood education isn't, and shouldn't be, vocational training.

Luckily, code.org has some more redeeming things to say.

Learning to write programs stretches your mind, and helps you think better, creates a way of thinking about things that I think is helpful in all domains. — Bill Gates.

I think everybody in this country should learn how to program a computer because it teaches you how to think. — Steve Jobs

Programming is probably the greatest, and most criminally untapped teaching tool we have developed in the last century. At its heart, programming is applied logic, a discipline that requires you:

  • to break a problem into its component parts
  • to construct those parts from a set of logical building-blocks
  • to combine those solved parts into a greater whole

These are powerful, fundamental skills that are worth teaching to anyone. They're not only the building-blocks of a career in computing, they're building-blocks for critical thinking, for scientific thinking, even for creative thinking. Programming teaches all of this in an environment where you can keep students interested by having them use the skills they are learning to build tools, toys and games.

Programming even provides an answer to that question every school kid asks, “What the hell am I learning maths for?” I'm pretty sure my high school trig and calculus would have stuck a whole lot better if I had been asked to build a game with them instead of just solving equations on a piece of paper, or being assessed on my ability to draw a curve neatly on the provided graph paper.
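
A small sketch (in Java, with invented numbers) of the kind of exercise I mean, where the sine and cosine of the trig classroom are exactly what move a game sprite in a circle:

    // Hypothetical classroom exercise: use sine and cosine to animate a
    // sprite circling the point (cx, cy) at a fixed radius.
    public class OrbitDemo {
        static double[] orbitPosition(double cx, double cy,
                                      double radius, double angleDegrees) {
            double angle = Math.toRadians(angleDegrees);
            return new double[] {
                    cx + radius * Math.cos(angle),
                    cy + radius * Math.sin(angle)};
        }

        public static void main(String[] args) {
            for (int frame = 0; frame < 360; frame += 90) {
                double[] point = orbitPosition(160, 120, 50, frame);
                System.out.printf("frame %3d: draw sprite at (%.1f, %.1f)%n",
                        frame, point[0], point[1]);
            }
        }
    }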

Sadly, though, I suspect the problem with programming at school is far more practical than intellectual. All the willingness to add programming to the curriculum isn't worth anything if we don't also have enough teachers qualified to deliver those lessons. And so long as we as a society continue to devalue teachers and teaching, that's a much bigger challenge.

30
Apr

Bad Actors

  • 11:03 AM

Shanley, I Wish I Knew — About Work.

There are also people in [the tech industry] who are dishonest, manipulative, abusive, bullying, mean-spirited, harassing and destructive. Early in my career I was very paranoid about maintaining amicable relationships with these individuals or staying quiet despite my moral qualms about their actions, because I was always told I’d have to work with them again, and that someday they might be on the other side of a hiring board or committee or collective I needed something from. I’ve since realized that these very fears ensure these assholes will have long prosperous careers, where we’re all forced to see them again.

When we moved into the new office a few months ago, Atlassian handed out adjustable standing desks to people who wanted them. Numbers were limited, but they were given out in order of tenure and, well, sticking with the same employer for nine years has its privileges.

I'm totally unqualified to comment on the medical benefits of standing desks. My light reading on the subject suggests that sitting down all day is bad for you, but at the same time (and after years of working eight-hour retail shifts I can attest to this) standing up all day isn't all that good either. So you naturally glide (or in the case of an electronically adjustable desk, buzz loudly) between one position and the next based on what your body tells you to do.

So on Monday morning, tired from one of my regular bouts of insomnia and my body still aching from a weekend game of squash, the first thing I did was return my desk to its natural, sitting position. Being the Internet junkie that I am, I celebrated this action with a tweet.

@carlfish: Standing Desk vs Monday Morning. Monday morning wins

And there it would have ended, but for the Social Media Expert. The Social Media Expert, hired to increase his employer's exposure on The Social Media, decides that the perfect thing to do is butt his product into my life. Since The Social Media Expert lives in a different timezone, this happens fifteen hours later, after I have raised my desk again, lowered it again, spent the late afternoon lying on an office couch, gone home, worked some more then gone to bed.

Which meant I woke up on Tuesday to this.

Read the rest of this entry…

14
Apr

Quoted from a network security mailing-list I am subscribed to:

Last time [we] sent out a warning email along the lines of:
We never ask for your username and password. If you get an email that looks like:

"There is an issue with your account. Please reply with your username and password and we will rectify it"

You should never reply to these messages with your details.
50 people replied with their usernames and passwords.