Languages are languages

Ian Bogost has an interesting argument that computer languages are not languages, but systems.

He starts off arguing that learning a programming language shouldn’t meet a curricular requirement for learning a natural language.  That’s a fair argument, except he does so on the basis that computer languages are not languages at all.

“the ability to translate natural languages doesn’t really translate (as it were) to computer languages”

It clearly does translate.  You can either translate literally from C to Perl (but not really vice versa), or idiomatically.  It’s straightforward to translate from C to English, but difficult to translate from English to C.  But then, it’s difficult to translate a joke between sign and spoken language; that doesn’t mean that sign language isn’t a language.  Indeed, sign languages are just as rich as spoken ones.  The experience of signing is different from speaking, and so self-referential jokes don’t translate well.
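
For example, here is a small made-up C fragment (mine, not Bogost’s); translating it into Perl, whether literally or idiomatically, is routine, while going from its English reading back to C is the hard direction:

    #include <stdio.h>

    /* In English: "count how many of the guests have arrived, then
       announce the total".  That reading falls out of the C directly;
       going the other way, from the English sentence back to C, forces
       decisions the sentence never makes (types, storage, order of
       evaluation). */
    int main(void) {
        int arrived[] = { 1, 0, 1, 1, 0 };   /* 1 = arrived, 0 = not yet */
        int total = 0;
        for (int i = 0; i < (int)(sizeof arrived / sizeof arrived[0]); i++)
            total += arrived[i];
        printf("%d guests have arrived\n", total);
        return 0;
    }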

We can approach translating from English to C in different ways, though.  We can model the world described in a narrative in an object-oriented or declarative fashion.  A human can get the sense of what is written in this language either by reading it, or perhaps by using it as an API to generate works of art based on the encoded situation.  Or we could try to capture a sense of expectation in the narrative within temporal code structure, and output it as music.
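
As a toy illustration of the first approach (my own sketch, with invented names): a one-sentence narrative modelled as C data, which another program could read back as prose, render as an image, or use to drive a piece of music:

    #include <stdio.h>

    /* A hypothetical, minimal model of the narrative "the fox waits
       outside the henhouse at dusk".  Another program could traverse
       this structure to draw a scene or to shape a musical phrase. */
    struct actor   { const char *name;  const char *intent;      };
    struct setting { const char *place; const char *time_of_day; };
    struct scene   { struct actor actor; struct setting setting; };

    int main(void) {
        struct scene s = {
            { "fox", "waits" },
            { "outside the henhouse", "dusk" }
        };
        /* Reading the model back out as a sentence. */
        printf("The %s %s %s at %s.\n",
               s.actor.name, s.actor.intent,
               s.setting.place, s.setting.time_of_day);
        return 0;
    }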

From the comments:

“If we allow computer languages, we should allow recipes. Computer codes are specialized algorithms. So are recipes.”

This seems to be confusing utterances with languages.  Recipes are written in e.g. English.  Computer programs are written in e.g. C.

“[programming code is] done IN language, but it ISN’T language”

You could say the same of poetry, surely?  Poetry is done in language, but part of its power is to reach beyond language in new directions.  Likewise, code is done in language, but you can also do language in code, by defining new functions or parsing other languages.
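
A minimal sketch of both moves in C (the toy command language and all the names here are invented for illustration):

    #include <stdio.h>
    #include <string.h>

    /* Doing language in code, part one: extending the vocabulary.
       "greet" is a new word, defined in terms of older ones. */
    void greet(const char *name) { printf("hello, %s\n", name); }

    /* Part two: parsing another (toy) language whose sentences
       all take the form "greet <name>". */
    void interpret(char *sentence) {
        char *verb   = strtok(sentence, " ");
        char *object = strtok(NULL, " ");
        if (verb && object && strcmp(verb, "greet") == 0)
            greet(object);
        else
            printf("parse error\n");
    }

    int main(void) {
        char utterance[] = "greet world";
        interpret(utterance);   /* prints "hello, world" */
        return 0;
    }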

The thing is that natural languages develop in close relationship with their speakers, words being grounded in the human experience of body and environment, and of movements and relationships within it.  Computer languages aren’t based around these words, but we can still use the same symbolic references by using those words in the secondary notation of function names and variables, or even by working with an encoded lexicon such as WordNet as data.  In doing so we are borrowing from a natural language, but we could just as easily have used an invented language such as Esperanto.  Finally, the language is grounded in the outside world when it is executed, through whatever modality or modalities its actuators allow, usually vision, sound and/or movement.
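
A small illustration of that borrowing (my own; the Esperanto is only there to make the point): the compiler treats the words purely as symbols, so an English-derived lexicon and an Esperanto-derived one behave identically, and the grounding only happens at the output:

    #include <stdio.h>

    /* The machine is indifferent to which natural language the names are
       borrowed from; the words only carry meaning for the human reader. */
    double heartbeats_per_minute(double beats, double seconds) {
        return beats * 60.0 / seconds;
    }
    double korbatoj_en_minuto(double batoj, double sekundoj) { /* Esperanto */
        return batoj * 60.0 / sekundoj;
    }

    int main(void) {
        printf("%.1f\n", heartbeats_per_minute(24, 20)); /* 72.0 */
        printf("%.1f\n", korbatoj_en_minuto(24, 20));    /* 72.0 */
        return 0;
    }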

“… replacing a natural language like French with a software language like C is a mixed metaphor.”

Discussing computer language as if it were natural language surely isn’t a mixed metaphor; if anything it’s just a plain metaphor.  But the two bear strong family resemblances, because both are languages.

The claim that computer languages are not languages reads as an attempt to portray computer languages as somehow not human.  Get over it: digital computation is something that humans do with or without electronic hardware; we can do it in ways that engage all of our senses, and we can do it with language.  Someone (whom I keep anonymous, just in case) said this on a mailing list recently:

“Having done a little bit of reading in Software Studies, I was surprised by just how many claims are invalidated with a single simple example of livecoding.”

I think that this is one of them.

10 Comments

  1. Funny thing is, in order to demonstrate that a programming language is a language, all you have to do is prove that it meets a formal definition of a language: a set of sentences generated from some grammar (a minimal sketch of such a grammar and recognizer follows the comments). Of course, since many programming languages really are defined in terms of a grammar, they are unambiguously languages. The same can’t be said of natural languages.

  2. Computer languages seem to evolve on a much slower scale, since their semantics are firmly grounded in their interpretation by silicon, but if you look at the evolution of programming languages over the years you will see the same pattern of development that occurs when pidgins become creolized, when natural language moves from cuneiform record-keeping to the age of mass publication, or when children learn language in general: a movement away from concatenative sorts of structure and towards term-based structures, as well as the development of pervasive structural moods and morphologies which are holistically integrated and intuitively understandable rather than ad-hoc and inexplicable.

    I’d be hesitant to call the old languages like assembler, Fortran, or Cobol “languages” in anything but a technical or metaphorical sense. But more modern languages like Haskell and Perl resemble natural language far more than they’re given credit for. One of the problematic issues is that there are some very old discourses at stake here, both for and against the idea of programming/computer/mathematical languages as languages; and unfortunately, I think some of the past advocates for the idea have done more damage to its reputation than the critics have (because those advocates are typically more interested in mathematizing natural language than in languagifying mathematical formalisms).

  3. Dan, that just means that the Chomskyite definition of a language is wrong wrong wrong. Definitions have to fit reality, not the other way around.

  4. Wren, you are kidding, right? Programming languages are (currently) evolving many orders of magnitude faster than natural languages.

    Cobol is all but dead for new development, and it’s not even 60 yet. And Java, which is currently the world’s most popular language by many measures, isn’t even 20 years old. (Not even 16, depending on how you count…)

    I mean, I understand (and share) your impatience, but computer languages haven’t even got to the point of staying strong for more than one human lifetime yet. Fortran and Lisp will probably be the first, though the modern versions of those languages are far removed from the originals.

    English, on the other hand, has changed much more slowly over centuries. If you could go back 500 years to 16th century England, you could still communicate pretty effectively, though it would require some effort at first.

    It’s not clear to me what the future of PLs holds; on the one hand, languages are now expected to be much bigger and to tackle much harder problems, but on the other, we have better hardware, better techniques, and better software tools to help offset the difficulty. Certainly the rate of new language formation doesn’t seem to have gone down any.

    A key difference is that natural language is primarily for bidirectional communication, whereas programming language tends towards unidirectional communication. I mean, it’s much easier to write code than it is to read it, and a very significant quantity of code never _really_ gets read by another human, even if it’s open source. (And, most code that does get read only gets read a small number of times.) And I think that enables programming languages to evolve relatively quickly.

    1. Hi Leon,
      Nice points, but just to reply on development speed: natural languages do emerge at the same order of magnitude of speed, in the form of sign languages, e.g. when Deaf communities suddenly emerge for genetic reasons.

  5. @Leon:

    The important thing to consider is not the lifespan of specific programming languages, but rather the evolution of the linguistic structure involved in those languages. From a PL perspective of looking at type systems etc., sure, progress is quite rapid. But if you look at the linguistic structure involved, progress is much slower. When it comes to linguistic structure, all those C-like languages are the same, all those Java-like languages are the same, all those ML-like languages are the same. The nature of the grammatical primitives and the shape of how those primitives can be combined: these things change very slowly.

    In exactly the same way, we don’t look at the lifespans of individual speakers, nor even the absolute duration in which some language has a living speaker. Languages evolve at different rates. With a small community that’s entrenched in the “right” way to do things, evolution is slow. With a large community that’s more willing to adapt, evolution can be quite rapid. The Great Vowel Shift happened in the span of one human lifetime! But rapid change like that is rather rare in general. A pidgin can remain a pidgin for generations so long as no one teaches it to their children as a primary language of communication. Absolute duration means very little when it comes to the evolution of languages.

    It’s important to be clear here. I am not talking about the evolution of individual languages and their dispersion into language families. I am talking about the evolution of the *linguistic structure* which forms the foundation of what it is that’s evolving and dispersing. In the course of a generation, pidgins which lack most of the crucial linguistic structures required to be called a “real language” can be converted into creoles which possess all those structures. In that same timespan we don’t see individual programming language families evolving from string-based scripting languages to term-based programming languages, nor individual language families evolving from procedural imperativism to applicative functionalism. Instead we see language families dying out and being replaced by new ones which incorporate these new forms of linguistic structure. But even so, the whole linguistic community is very slow to change. In the physics-simulation community, Fortran is the lingua franca and C++ is still considered something of an upstart. The mainstream software industry is still using languages like C++, Ada, Java, and Python, which are all effectively the same when it comes to linguistic structure.

    C-style languages which lack applicative structure, lack any coherent namespace resolution system, lack linguistic support for polymorphism, and so on: these kinds of languages have been around for quite a bit longer than a human generation. All the greats who started computer science off before there were computers used languages like this. C++ namespaces, which are rather ad-hoc and incoherent as namespaces go, weren’t even accepted in the original versions of the language spec. By the time Java came around we started to see modern versions of namespaces, but how many years passed between those old languages and the first version of Java? Hoare logic was already 25 years old when Java was released, and Hoare logic wasn’t exactly breaking new ground in the syntax of its object language.

  6. To go back to the original Bogostian distinction between natural languages (NLs) and programming languages (PLs): NLs are the stuff that emerges when two conscious entities try to communicate with one another. Or as ape language researcher Savage-Rumbaugh says, “language is what you use when you tell something new to somebody else” (that is not actually a direct quote, but how I remember it). Or to put it another, slightly different way: two language-capable agents will always find a way to communicate even when there is not a shared protocol between them. Try programming a computer without knowing a language the computer already knows. In that sense a PL is a system, not a language.

    On the other hand, we are so accustomed to taking writing as the highest form of language that it seems insanely obscure and confusing to call a PL not a language, because PLs do obviously have all sorts of things we expect from a language: words, punctuation, rules, etc.

  7. Talk about a non-issue. Has nobody here ever heard of the doctrine of family resemblances? “Language” is a polythetic class; there’s no essential, defining feature.

  8. Indeed I agree with Xhevahir. Words in and of themselves do not possess a meaning. Their meaning comes about through the context in which they are used, and we learn the meaning of words by filtering out what they don’t mean (given external instruction by people who “know” what words mean).
    For example, does the word ‘ten’ mean the digits “1” and “0”, or does it mean the number which is twice five? There didn’t use to be enough of a distinction for society to bother treating them separately, but now we commonly use phrases like “ten in binary”, which can mean “10” (the number 2 in base 2) or “1010” (the number 10 in base 10, expressed in base 2); see the sketch below.
    Thus words like “language” don’t have an exact meaning; they only have things which they do not mean. Unfamiliar distinctions, such as the relatively recent one between computer language and natural language, take a while for society to “learn” (i.e. bother defining one way or the other as covered by “language”). We’re in a time where there is such a distinction but no such filter.
    In other words, computer languages are languages because society hasn’t bothered to define them as separate from natural languages, hence the need for the qualifier “computer” or “natural”.
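
On the “ten in binary” example in the last comment, here is a quick sketch of the two readings (my own illustration, in C):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* Reading the digits "10" as a base-2 numeral gives the number two... */
        long two = strtol("10", NULL, 2);
        /* ...while the number ten, written out in base 2, is "1010". */
        long ten = strtol("1010", NULL, 2);
        printf("\"10\" read in base 2 = %ld\n", two);   /* prints 2  */
        printf("\"1010\" read in base 2 = %ld\n", ten); /* prints 10 */
        return 0;
    }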
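
And to make the first comment’s point concrete, here is a toy grammar and a minimal recognizer for it (again my own sketch, nothing from the comment): the grammar generates a set of sentences, and the program decides membership, which is the formal sense in which a grammar defines a language.

    #include <stdio.h>

    /* Toy grammar, for illustration only:
           S -> 'a' S 'b' | (empty)
       i.e. the language { "", "ab", "aabb", "aaabbb", ... }.  */
    static const char *p;  /* cursor into the sentence being checked */

    static int parse_S(void) {
        if (*p == 'a') {            /* S -> 'a' S 'b' */
            p++;
            if (!parse_S()) return 0;
            if (*p != 'b') return 0;
            p++;
        }
        return 1;                   /* S -> (empty) */
    }

    static int in_language(const char *sentence) {
        p = sentence;
        return parse_S() && *p == '\0';
    }

    int main(void) {
        printf("%d %d %d\n",
               in_language("aabb"),  /* 1: generated by the grammar       */
               in_language("ab"),    /* 1 */
               in_language("abb"));  /* 0: not a sentence of the language */
        return 0;
    }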
