Java, being rooted firmly in the C tradition, believes that the first number is zero. Arrays and lists begin at index 0. The first character of a String is string.charAt(0). Thanks to the legacy of time.h, Java even believes January is month 0, screwing up the mental landscape of anyone used to thinking of June as month 6 and December as month 12. (Someone once told me this originated with C programmers' desire to map easily between a numerical month and an array of month names without wasting the first... er... zeroth element of the array.)
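You can watch the month confusion happen with nothing but the JDK. A quick sketch (the constants are the real java.util.Calendar ones; the date is just an example I've picked):

```java
import java.util.Calendar;

public class MonthZero {
    public static void main(String[] args) {
        // Calendar.JANUARY is 0, Calendar.DECEMBER is 11
        System.out.println(Calendar.JANUARY);  // prints 0
        System.out.println(Calendar.DECEMBER); // prints 11

        // So to mean "June", you have to write month 5:
        Calendar c = Calendar.getInstance();
        c.set(2007, 5, 15); // 15th June, not 15th May
        System.out.println(c.get(Calendar.MONTH) == Calendar.JUNE); // prints true
    }
}
```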
It therefore came as something of an annoyance to me when, years ago, I discovered that JDBC starts counting at 1. I've lost count of the number of times I've been caught out because the first substitution parameter in a PreparedStatement isn't zero, and neither is the first column of a ResultSet. But I've gradually got used to this, annoying as it is.
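For anyone who hasn't been bitten yet, this is the trap in miniature. A sketch only: the JDBC URL, table and column are made up, but the 1-based indices are exactly what the API demands:

```java
import java.sql.*;

public class OneBasedJdbc {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection and schema, purely for illustration.
        try (Connection conn = DriverManager.getConnection("jdbc:example:db");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT name FROM users WHERE id = ?")) {
            ps.setLong(1, 42); // the first ? is parameter 1, not 0
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    String name = rs.getString(1); // the first column is 1, not 0
                    System.out.println(name);
                }
            }
        }
    }
}
```

Pass 0 to either call and you get an SQLException at runtime, not a compile error, which is exactly why it keeps catching people out.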
Then today, I discovered java.sql.Clob#getSubString(long, int). Let's ignore for a second the fact that the java.lang.String#substring(int, int) method sets a precedent for “substring” being a single word, with no internal capitalisation. Thankfully, Eclipse's auto-complete has set me free from that particular trap. Let's instead look at this one little thing.
The offset parameter in String#substring is zero-based. The offset in Clob#getSubString is one-based. What the hell were they thinking? Every single Java programmer, on seeing that method, will assume that it's zero-based without even thinking. And it isn't. Bastards! I don't care which is closer to the ODBC API. I'm sure 99% of Java programmers don't care either.
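You don't even need a database to see the two conventions side by side: the JDK ships an in-memory Clob implementation, javax.sql.rowset.serial.SerialClob. The string is my example; the APIs are real:

```java
import javax.sql.rowset.serial.SerialClob;

public class OffByOne {
    public static void main(String[] args) throws Exception {
        String s = "Hello";
        SerialClob clob = new SerialClob(s.toCharArray());

        // String#substring: zero-based offset, exclusive end index
        System.out.println(s.substring(0, 3)); // prints Hel

        // Clob#getSubString: ONE-based position, plus a length
        System.out.println(clob.getSubString(1, 3)); // prints Hel

        // The "obvious" zero-based call isn't merely wrong, it's illegal:
        // clob.getSubString(0, 3) throws a SerialException
    }
}
```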
Once more, with feeling. Bastards!