Some disorganised thoughts with no conclusion...
TCP implementations will follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others — RFC 793, section 2.10
You can't go ten feet on the Internet without having Jon Postel's Robustness Principle quoted at you. As a maxim of protocol design, it's not a bad one. If you write your software in such a way that it always generates correct output, you maximise the number of other implementations that can read what you say. If you also do your best to accept input that does not strictly follow the standards, you maximise the number of implementations that can talk to you.
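To make that concrete, here is a minimal sketch in Python for a hypothetical "key: value" line protocol (the format and all the names are illustrative, not from any real spec): the sender emits exactly one canonical form, while the reader tolerates the sloppy variants it is likely to meet in the wild.

```python
# Sketch of both halves of the principle for a hypothetical
# "key: value" line protocol. Nothing here is from a real spec.

def emit_header(key: str, value: str) -> str:
    """Conservative in what you send: one canonical form, always."""
    return f"{key.strip().lower()}: {value.strip()}\r\n"

def parse_header(line: str) -> tuple[str, str]:
    """Liberal in what you accept: tolerate sloppy case, stray
    whitespace, and missing carriage returns from careless peers."""
    key, _, value = line.strip().partition(":")
    return key.strip().lower(), value.strip()

# A strict sender plus a tolerant reader interoperates with the widest
# range of peers, including ones that send "Content-Type :  text/html".
print(parse_header("Content-Type :  text/html \n"))  # ('content-type', 'text/html')
```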
Like most accepted wisdom, it doesn't quite bear close inspection. The problem arises because following the robustness principle does not protect you from people who do not follow it. The results can be quite chaotic.
Counter-intuitively, Postel's robustness principle ("be conservative in what you send, liberal in what you accept") often leads to deployment problems. Why? When a new implementation is initially fielded, it is likely that it will encounter only a subset of existing implementations. If those implementations follow the robustness principle, then errors in the new implementation will likely go undetected. The new implementation then sees some, but not widespread deployment. This process repeats for several new implementations. Eventually, the not-quite-correct implementations run into other implementations that are less liberal than the initial set of implementations. The reader should be able to figure out what happens next. — RFC 3117, section 4.5
A classic example of how this form of ‘robustness’ can let you down is the World Wide Web. Web browsers have always followed the robustness principle, doing their best to render bad HTML through a series of guesses as to what the page author may have intended. The problem was, page authors weren't following their side of the robustness bargain, and suddenly we ended up with a Web where browsers had to be guess-for-guess and bug-for-bug compatible with their predecessors.
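For a feel of what "doing their best" means, Python's standard-library HTMLParser shows the same liberal attitude on a small scale: malformed markup doesn't raise an error, the parser just guesses and carries on. (A sketch only; real browsers apply far more elaborate error-recovery rules than this.)

```python
from html.parser import HTMLParser

class TagLogger(HTMLParser):
    """Print what the parser makes of whatever it is fed."""
    def handle_starttag(self, tag, attrs):
        print("open:", tag)
    def handle_data(self, data):
        print("text:", data)

# Unclosed <p> tags, an unescaped ampersand, a never-closed <b>:
# none of it raises an error; the parser recovers and moves on.
parser = TagLogger()
parser.feed("<p>Hello<p>world & <b>oops")
parser.close()
```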
If you can't make sense of the spec in 10 minutes no one is going to use it so you can safely ignore it. — Dave Winer, Scripting News
Attitudes like this don't help either. Many hackers (in the pejorative sense) will try to make use of a protocol with a ten-minute understanding of it. Very few protocols can really be understood in ten minutes; the closest you can get is a step-by-step recipe (what Winer calls a 'Busy Developer's Guide') that leaves out the why and the how for a strict set of whats. Such implementations, written without understanding what they are really implementing, take advantage of the robustness principle by approximating the protocol and hoping that the implementations they are talking to will accept their approximations. Because one side is doing the right thing and trying to accept bad input, some implementors feel they can ignore robustness entirely.
This is a bad direction to take. It can even bog down protocols and prevent them from advancing: just look at the hassle of introducing XML namespaces to RSS, because a number of implementations never bothered to learn how XML really worked and just did pattern matching instead.
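Here is a sketch of that failure mode: one consumer pattern-matches on the raw text, the other actually parses the XML. The feed content is made up, though the Dublin Core namespace URI is the real one. The moment a publisher picks a different prefix for the same namespace, which is perfectly legal XML, the pattern matcher silently finds nothing.

```python
import re
import xml.etree.ElementTree as ET

feed = """<rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel><item>
    <title>Example</title>
    <dc:creator>Jon Postel</dc:creator>
  </item></channel>
</rss>"""

# Pattern matching "works" until a prefix changes: the same document
# declared with xmlns:dublin="..." and <dublin:creator> matches nothing.
print(re.search(r"<dc:creator>(.*?)</dc:creator>", feed).group(1))

# A namespace-aware parser keys on the namespace URI itself, so any
# prefix the publisher chooses works identically.
tree = ET.fromstring(feed)
print(tree.find(".//{http://purl.org/dc/elements/1.1/}creator").text)
```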
Postel's Robustness Principle is a two-edged sword: Gresham's Law [“The bad currency drives out the good”] often trumps the Robustness Principle. — BEEP: Building Blocks for Application Protocols