This post was originally published on my blog at this URL, and was later migrated here. There may be some comments at the original URL.
Consider this very simple problem, stated in plain English:
Square two numbers and add them.
This is how it’s usually implemented in mainstream languages:
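The original snippet is not reproduced here; a sketch of the usual mainstream version, in Python (the name `square_sum` is illustrative):

```python
def square_sum(x, y):
    # Name both arguments, square each one separately, then add the results.
    return x * x + y * y
```

Here `square_sum(3, 4)` evaluates to 25.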
And in more functional ones:
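A sketch of the pointful version in Haskell, one of the functional languages the post has in mind (`squareSum` is an illustrative name). Note that each argument still has to be named and squared separately:

```haskell
-- Pointful style: x and y are named explicitly, and (^ 2) appears twice.
squareSum :: Num a => a -> a -> a
squareSum x y = x ^ 2 + y ^ 2
```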
The English problem statement says everything exactly once: “square” - once; “two numbers” - once; “add” - once. Yet the implementations above repeat some of these things.
This implementation could be read in English as:
Let x be a number. Let y be another number. Square x. Square y. Add the above two.
Not very expressive, is it?
Most languages lack the necessary abstractions to express this sort of thing concisely. Let’s have a look at a Factor implementation of this problem:
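A sketch of such a Factor word, reconstructed from the explanation that follows (the word name `square-sum` is mine):

```factor
: square-sum ( x y -- z ) [ sq ] bi@ + ;
```

Calling `3 4 square-sum` leaves 25 on the stack.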
sq is a function that squares the given number. We put it in a block to indicate that we want to treat it as data.
bi@ is a combinator that applies the supplied function - in this case, sq - to the two given values.
+ adds the two values, yielding the expected result. (What you see in parentheses is called a stack effect declaration. It does not bind names to the function’s arguments, and can thus be ignored in this discussion.)
Since Factor is not very popular, I figured it’d be useful to provide an implementation in Haskell, which also enables this sort of programming, although it’s not Haskell’s natural style. Complete implementation:
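The original code is not shown here; one point-free rendering that mirrors the Factor version (my reconstruction, with `squareSum` as an illustrative name) might look like:

```haskell
import Control.Arrow ((***))
import Control.Monad (join)

-- join (***) f applies f to both components of a pair, playing the same
-- role as Factor's bi@; uncurry (+) then adds the two results.
-- No argument is ever named.
squareSum :: Num a => (a, a) -> a
squareSum = uncurry (+) . join (***) (^ 2)
```

For example, `squareSum (3, 4)` evaluates to 25.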
Well, that’s pretty ugly, but gets the point across. I wouldn’t advise you to write this sort of code in Haskell though.
To me, this concise mathematical way of achieving such a high level of expressivity in programming is one of the most beautiful things I have seen in computer science.
Maybe the example I chose wasn’t the best one to illustrate the point, but (hopefully) you get the idea. The arguments made here apply to programming as a whole.
This style of programming is known as function-level programming, and if this idea interests you, you might enjoy John Backus’ Turing Award lecture on the subject.