If you play a chord slightly wrong, you'll probably get a slightly wrong sound instead of an incredibly wrong sound. On the other hand, you might break the instrument! A string could snap.
If you change a single bit in a calculator's memory, that bit could be part of a single number, and the low-order bit at that, so the result could be just a slightly wrong calculation, like saying the square root of 60 is about 7.745966692414835 (instead of a 4 at the end). Or the single bit could be part of a single boolean, and some action would have the complete opposite effect of what it was supposed to.
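Here's a sketch of that bit-flip scenario in Python (the `flip_bit` helper is my own; it just reinterprets the float's 64 bits and toggles one of them):

```python
import math
import struct

def flip_bit(x, n):
    # Reinterpret the float x as its raw 64-bit pattern, toggle bit n,
    # and reinterpret the result as a float again.
    bits = struct.unpack('<Q', struct.pack('<d', x))[0]
    return struct.unpack('<d', struct.pack('<Q', bits ^ (1 << n)))[0]

r = math.sqrt(60)
print(r)                # 7.745966692414834
print(flip_bit(r, 0))   # low-order mantissa bit: off by one unit in the last place
print(flip_bit(r, 62))  # a high exponent bit instead: not even the right magnitude
```

Flip the low-order mantissa bit and you get the slightly-wrong square root; flip an exponent bit and the answer is wildly wrong, hundreds of orders of magnitude off.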
But the set of ways you can play a chord slightly wrong that results in a broken instrument seems much smaller than the set of ways you can change a random bit that results in a huge problem later on, and potentially far away. If you discretize playing the instrument, there aren't really that many bits involved in determining what happens. The physics of it is so simple that string harmonics can be expressed in a few short and sweet equations, and a program to simulate them can be very short indeed. Meanwhile, we have other programs that are vastly more complicated and rely on sending and receiving messages from other programs that aren't even local! That's the real problem with the programming world--the space of cause and effect is just so much larger, since programmers get to create much of their own physics, their own rules, which don't have to resemble how nature works in the way nature is self-similar, with, for example, its various inverse-square laws or its abhorrence of gradients.
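For what it's worth, that "short and sweet" claim holds up: the standing-wave solution for an ideal string is a sum of harmonics, and a toy simulator fits in a dozen lines. This Python sketch assumes a string plucked at its midpoint, so the Fourier coefficients are the textbook 8/(pi^2 n^2) ones:

```python
import math

def pluck(x, t, L=1.0, c=1.0, harmonics=8):
    # Displacement y(x, t) of an ideal string of length L plucked at its
    # midpoint, as a truncated sum of standing-wave harmonics:
    #   y = sum_n a_n * sin(n*pi*x/L) * cos(n*pi*c*t/L)
    y = 0.0
    for n in range(1, harmonics + 1):
        a_n = (8 / (math.pi ** 2 * n ** 2)) * math.sin(n * math.pi / 2)
        y += a_n * math.sin(n * math.pi * x / L) * math.cos(n * math.pi * c * t / L)
    return y

print(pluck(0.5, 0.0))  # near 1.0: the initial pluck height at the midpoint
print(pluck(0.0, 0.0))  # exactly 0.0: the ends are fixed
```

Eight harmonics already reproduce the triangular pluck shape to within a few percent; that's the whole "physics engine" for the instrument.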
So it seems that if programmers can impose limitations on what their cause/effect space looks like, they have a lot fewer problems to deal with. If your stove top only has 4 settings for heat, not a lot can go wrong. If your programming language doesn't have loops or functions, again not a lot can go wrong (but you just sacrificed a ton of power). What programmers need is a language that limits the cause/effect space without sacrificing Turing-completeness and without limiting expression. Enter Lisp.
YouTube has been having a problem lately where it keeps claiming videos have new comments when they don't, and will show you old comments if you click the "show them" button. How could this happen? I have no idea what their code base looks like. But I suspect it could look something like this:
if (has_not_seen_new_comments) {
    // do stuff
}
// do other stuff
and that somehow in a recent commit the if statement got removed or the boolean has_not_seen_new_comments changed its meaning. Interestingly, if they used Lisp instead, it might look like this:
(if has-not-seen-new-comments
    (do stuff))
;(do other stuff)
And that's using a very procedural-looking functional style. But my point is that in this contrived example, if the if line got removed, that would result in a syntax error, because there's still a closing paren unaccounted for. It won't save you if has-not-seen-new-comments has changed meaning. But it still categorically removed one of the two potential problems, which is pretty impressive when talking about ways the language, rather than convention, can help avoid program errors--just like managed-memory languages categorically remove the huge class of potential errors that come into play in unmanaged-memory languages.
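You can check that claim with a toy s-expression reader (my own minimal sketch, not any particular Lisp's reader): feed it the snippet with the if line deleted, and the leftover closing paren is an immediate error instead of a silently changed program.

```python
import re

def parse(text):
    # Minimal s-expression reader: tokens are parens and whitespace-separated atoms.
    tokens = re.findall(r'[()]|[^\s()]+', text)
    stack = [[]]
    for tok in tokens:
        if tok == '(':
            stack.append([])
        elif tok == ')':
            if len(stack) == 1:
                raise SyntaxError('unexpected )')
            finished = stack.pop()
            stack[-1].append(finished)
        else:
            stack[-1].append(tok)
    if len(stack) > 1:
        raise SyntaxError('missing )')
    return stack[0]

parse("(if has-not-seen-new-comments (do stuff))")  # reads fine
try:
    parse("(do stuff))")  # the if line deleted: leftover close paren
except SyntaxError as e:
    print("refused to read:", e)
```

The mutilated version never even parses, so the bug is caught at read time rather than shipping as a behavior change.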
Here I picked Lisp because its code-as-data, programs-are-their-own-abstract-syntax-trees syntax saves you from many types of small errors, where commenting out a line or removing one and so on results in an immediate error instead of causing a big problem later on. I love that Lisp is about expressions rather than lines or statements. But Lisp has other benefits too, benefits you can get more of with, say, Haskell, or you can just have a lot of self-discipline and get them by convention (convention has its place). This convention (or in Haskell's case, demand) is keeping a functional style in your code, which saves you from other classes of errors too, like those involving side effects.
Anyway, I didn't intend this post to be a well-thought-out manifesto, just a thought to post. A functional style almost by definition protects you from random bit changes, because nothing changes, and it simplifies the search for errors to just looking at the borders where things do change, instead of throwing your hands up and saying "It could be anything!"
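As a tiny illustration of that border idea (all names here are made up, not YouTube's), a functional style pins the comment-tracking state to one border function; the decision logic itself can't be corrupted by stale state because it never holds any:

```python
def new_comments(comments, seen_ids):
    # Pure core: no state, no side effects -- the same inputs
    # can only ever produce the same answer.
    return [c for c in comments if c["id"] not in seen_ids]

def check_for_new(fetch_comments, store):
    # Impure border: the one place state is read and written.
    # If a "phantom new comments" bug exists, it lives here.
    fresh = new_comments(fetch_comments(), store["seen_ids"])
    store["seen_ids"] |= {c["id"] for c in fresh}
    return fresh

store = {"seen_ids": set()}
comments = [{"id": 1, "text": "first"}, {"id": 2, "text": "second"}]
print(check_for_new(lambda: comments, store))  # both comments are new
print(check_for_new(lambda: comments, store))  # [] -- nothing new now
```

When the notification badge lies, there's exactly one function to stare at.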
Posted on 2012-02-19 by Jach