It is not much of a confession to say, but let this serve as one, that I am of a rather dogmatic disposition regarding the need for strict, clear, and unambiguous formalisms in the endeavors of formal thought. Mathematics is a testament to this, but, somewhat inexplicably, no other human science has gone as far. Computer Science, its little sibling, might have been expected to follow in its footsteps; instead, it diverges more and more from its path.
What's more, the tendency to cast "programming" (in the most extended of its senses: everything that pertains to computing, to use a single umbrella term) in formal languages is often derided as "academicism". This is a feud that has been waged for two and a half thousand years now, and a resolution is nowhere in sight. On the one hand, Platonists continue to peer into the gloom for the shapes that make the shadows (pity they never prefigured the Necker Cube); on the other, Cynics bark their laughter at them while at the same time correctly pointing out that so much is lost in the translation from idea to actuality that they might as well build castles in the sand.
And yet I personally loathe Platonism as a general philosophy, and would rather side with Naturalism. For me, the way out of this inconsistency is not to regard algebra as operating on thought entities but as a mere shorthand for the operations to be actually carried out in the real world. I'm not so much a Platonist as I am a poet: words not only have meaning, that is, representational power in the real world, but grammar is inescapable. And that is a good thing, I think.
This autobiographical irrelevance is prompted by a couple of blog posts that resonated deeply with my convictions, and that I find oddly fitting in a web of thought that has lately come to preoccupy me somewhat. I wrote on Reddit something that summarizes quite pithily what I think in general of the art of programming:
[…] it violates the fundamental heuristic of using a stricter object (in the mathematical sense) even when a laxer one would do. An array is isomorphic to a finite map from an initial natural interval [0..N), so I can't really see a way to justify on stylistic grounds using a partial function (a cascade of ifs) instead of a finite map.
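To make the contrast concrete, here is a small Python sketch of my own (the names and the weekday example are mine, not code from the thread): the cascade of ifs is a partial function with an implicit domain, while the array, a finite map on [0..N), makes the domain explicit.

```python
# A partial function spelled out as a cascade of ifs: the domain
# is implicit, and a forgotten branch silently returns None.
def weekday_name_ifs(n):
    if n == 0:
        return "Monday"
    elif n == 1:
        return "Tuesday"
    elif n == 2:
        return "Wednesday"
    # ...and so on for the remaining days; easy to leave one out.

# The same mapping as a finite map from the interval [0..7):
# the domain is explicit, and lookups outside it fail loudly.
WEEKDAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
            "Friday", "Saturday", "Sunday"]

def weekday_name(n):
    return WEEKDAYS[n]  # raises IndexError outside [0..7)
```

The stricter object is also the shorter one: the data carries the structure that the ifs had to spell out by hand.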
(Emphasis added.) Apart from the specifics I was objecting to, I was merely echoing Dijkstra's emphasis on strengthening the antecedents as an effective way to carve out, and thus reduce, the complexity associated with an unbounded state (problem) space. This was not lost on someone who replied with what I surmise was intended as faint praise:
You and Dijkstra would have gotten along great.
I felt quite honored, but that's beside the point. My conviction in the generative power of restrictions comes not only from the practice of programming, but also from many years of Literary Workshop attendance. I was more apt to thrive with baroquely artificial writing assignments than with "open-ended" ones. This is the Oulipo lesson, too (and it is not fortuitous that it counted mathematicians among its members), and von Trier's Dogme 95's, and I'm sure there must be many other examples of "creating by restricting". The other side of this coin is what I take to be the Romantic imperative of giving free rein to the creative process. The opening paragraph of the Wikipedia article is, for me, revelatory not so much in what it says but in how much, how viscerally, violently I disagree with what it expounds:
Romanticism […] was partly a revolt against aristocratic social and political norms of the Age of Enlightenment and a reaction against the scientific rationalization of nature, and was embodied most strongly in the visual arts, music, and literature.
I cannot think of a better expression of the Romantic mind than this:
Want to know who are the damned creative magicals? Philosophers. I hate their guts.
Analyzing this would take me far out on a limb, and I am probably misrepresenting the quote anyway; suffice it to say that I place myself at the antipode of that thought. And fortunately there are still some thinkers who would accompany me in that lonely spot. This is a wonderful sample of the kind of rational, analytic, articulate thought that I feel is much needed in the discipline of programming:
But the power of languages is to abstract these things away, so that we can concentrate on the algebra while the machine does the arithmetic. Yet, I think we owe the acceptance of null largely to the fact that people are used to sloppy thinking. They're used to treating nothing as if it were a thing, and using concepts of which they only know the approximate meaning.
(again, emphasis mine.) And this conclusion is reached by unwaveringly adhering to such fundamental logical principles as the Law of Identity and the Law of the Excluded Middle. Computers are Aristotelian, whether we like it or not.
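A small Python illustration of my own of the sloppiness the quote describes (the parsing example is mine, chosen for brevity): when "nothing" is treated as a thing, it leaks into later computations and fails far from its cause; making absence explicit forces the two-valued decision, here and now.

```python
from typing import Optional

# Sloppy: None escapes unannounced, and the caller may forget
# that this "price" can be nothing at all.
def parse_price_sloppy(text):
    if text.strip().isdigit():
        return int(text)
    # falls through to None implicitly

# Explicit: the signature admits the absence, and the caller is
# obliged to decide what "no value" means at the point of use.
def parse_price(text: str) -> Optional[int]:
    stripped = text.strip()
    return int(stripped) if stripped.isdigit() else None

def total(texts) -> int:
    # Each item either is a price or it is not; we exclude the
    # middle explicitly instead of letting None flow downstream.
    return sum(p for p in (parse_price(t) for t in texts)
               if p is not None)
```

The Aristotelian point is in the last function: the predicate `p is not None` is decided once, deliberately, rather than deferred to wherever the null happens to explode.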
Following a link from that article, I found another gem of the Euclidean approach to computing, which I leave for you to ponder. Personally, I find it exciting to think about the result of embarking on a path that begins by recognizing that a value is
unique, eternal, immutable, and is not fixed in time or space, and continuing from there.
By the way, it is the Romantics I blame for artificially distinguishing between "art" and "discipline": up to the eighteenth century, "art", as in the Greek techne, sufficed for both, or rather there was no distinction between them.