word itself and what it stands for—are what call the tune for us here today,
and what we therefore have to articulate in order to avoid being drowned out by
them completely. They are the language of our time precisely because code—the
word and the signified fact—is so much older, as I propose to show in a brief
historical digression. But have no fear—I’ll arrive back at the present soon
enough. Codes originate in processes of encryption, which, according to Wolfgang Coy’s
elegant definition, “is, from a mathematical perspective, a representation of
a finite quantity of characters of an alphabet in an appropriate sequence of
signs.”1 This statement already makes two things clear. First of all, codes,
despite widespread opinion to the contrary, are by no means a peculiarity of
computer technology or, for heaven’s sake, genetics; as sequences of signals
over time, they are an integral part of every telecommunications technology and
every medium of transmission. Furthermore, there is much evidence supporting the
view that codes have only been conceivable and feasible for the encryption of
natural languages since true alphabets—in contradistinction to ideograms and
logograms—have existed. These are, as previously indicated, systems containing
a finite number of identically recurring signs that represent—more or less
clearly and as completely as possible—verbalized sounds in the form of letters.
A vowel-based alphabet like that unique Greek invention2 that with good reason
has been credited with being the “first total analysis of a language”3
therefore does indeed seem to be a necessary precondition for the development of
codes—but, nevertheless, not a completely sufficient one.
After all, what the Greeks lacked—notwithstanding sporadic allusions to the
utilization of ciphers and secret writings in the works of Aischylos, Aeneas
Tacticus and Plutarch4—was that second precondition of all forms of encryption—namely,
highly developed telecommunications technology.
It thus seems to me to be no mere happenstance that the accounts that have come
down to us of secret message systems neatly coincide with the emergence of the
Roman Empire. In his Lives of the Caesars, Suetonius (who himself held the post of
confidential scribe in the service of a great emperor) reports that he himself
discovered encoded letters among the papers in the estates of the divine Caesar
as well as the divine Augustus.
Caesar made do by shifting all the letters of the Latin alphabet by three places—i.e.
A became D, B became E, etc. His adopted son Augustus, on the other hand, is
said to have simply skipped to the next letter, whereby a lack of mathematical
insight led him to replace X, the alphabet’s final letter, with a double A.5
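As a sketch, the two schemes Suetonius describes look like this; for simplicity the modern 26-letter alphabet stands in for the shorter Latin one, so Z plays the role that X, as final letter, played for Augustus:

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"  # modern alphabet as a stand-in for the Latin one

def caesar_shift(text: str, shift: int = 3) -> str:
    """Caesar's cipher: shift every letter a fixed number of places, wrapping around."""
    return "".join(
        ALPHABET[(ALPHABET.index(c) + shift) % len(ALPHABET)] if c in ALPHABET else c
        for c in text.upper()
    )

def augustus_shift(text: str) -> str:
    """Augustus' variant: skip to the next letter; the final letter, lacking a
    successor, becomes a double A instead of wrapping around."""
    out = []
    for c in text.upper():
        if c == ALPHABET[-1]:
            out.append("AA")          # no wraparound: the last letter becomes AA
        elif c in ALPHABET:
            out.append(ALPHABET[ALPHABET.index(c) + 1])
        else:
            out.append(c)
    return "".join(out)
```

So `caesar_shift("ABC")` yields `DEF`, exactly the A-to-D, B-to-E substitution described above.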
The reason for this is obvious: when read by the many uninitiated (and literary
acuity was not exactly widespread among the Romans) the result was a hodgepodge
of consonants. And as if such innovations in encryption weren’t enough,
Suetonius ascribes to Caesar another invention made shortly before—that of
having composed the reports to the Roman Senate on his Gaul campaign in several
columns or even separate book-size pages. An even higher honor is laid at the
feet of Augustus, who is said to have stationed riders and set up relay stations
in such a way as to establish Europe’s first strictly military express message service.
In other words, it was the Empire as such—as opposed to the Roman Republic or
individual composers of shorthand communiqués like Cicero—that provided the
basis upon which command, code, and telecommunications technology finally came
together. Imperium therefore means both the order and its effect—world dominion. Thus,
until recently, the Pentagon’s imperial watchwords were command, control,
communications and intelligence; now, with the coming together of
telecommunications technology and Turing-machines, the battle cry is C^4—command,
control, communication and computers—and it resounds from the Orontes to the
coast of Scotland, from Baghdad to Kabul.
The Romans used the word imperia to mean the orders of the emperor, as well
as codicilla to refer to small blocks of wood that were stripped of bark
and coated with a layer of wax that could be inscribed with writing. In the
Early Imperial Age, the etymon codex (Old Latin: caudex; distantly
related to the German word hauen meaning to chop), on the other hand, came to
mean books, the pages of which—in contrast to papyrus scrolls—could, for
the first time, be leafed through. And this is ultimately how that word came
into circulation whose circuitous path to French and English has brought us
together in Linz today. Code, from Emperor Theodosius to Empereur Napoleon,
simply meant the bound book of laws and thus the codification of the
judicial-bureaucratic files containing
torrents of imperial dispatches and orders that had surged for centuries along
the expressways of the Empire and then were frozen into a single collection of
laws. Data transmission became data storage7; pure events became serial order. And to this
extent, the Codex Theodosius and Codex Iustinianus are still the
bearers today (wherever Anglo-American common—literally ordinary and
widespread—law does not presently hold sway) of an ancient European code of
rights and obligations. After all, in the corpus iuris, copyrights and
trademarks—whether on a codex or a code—are, to say the least, simply
preposterous, impossible constructions.
The only question that remains is how the technical sense of the word code was able
to so completely overshadow the term’s legal sense. It is well known that
today’s legal systems fail on a regular basis to even comprehend codes and,
consequently, to protect them—whether from pirates and their subsequent buyers
or, conversely, from their discoverers and writers. The answer seems to be
simply this: whatever we categorize as code—from the secret writings of Roman
emperors to the arcana imperii of the Modern Age—has, since the Late Medieval
Period, been referred to as a cipher. The term code was long used to refer to
very different sorts of encryption processes whereby pronounceability was
preserved but obscure or seemingly innocuous words simply replaced the secret
ones. Cipher, on the other hand, was just another name for the zero that made
its way in those times from India via Baghdad to Europe, and helped the sifr (Arabic:
empty) to achieve mathematical-technical prowess. Since then, there have been (in
contrast to the way of the Greeks) different sets of characters for letters and
numbers—here the alphabet of the people, there the figures of the secret
strategists (whereby the German word Ziffer, meaning numeral, is yet another
derivative of the spelling of the Arabic word sifr).
These separate sets of characters have been an extremely fruitful development—in
concert, they have engendered wondrous creatures that would never even have
occurred to the Greeks or Romans. Without modern algebra, no encryption; without
Gutenberg’s letterpress printing, no modern cryptology.
In 1462 or ’63, Leon Battista Alberti, the inventor of linear perspective, came
to realize two simple facts. First of all, sounds and letters occur with
differing frequency in each and every language, which, according to Alberti, is
proven by the design of the compartmentalized typesetter’s case in which
Gutenberg stored his characters. Thus, cryptographers were able to decipher the
encoded messages hidden among the letter shifts devised by Caesar and Augustus
from the very frequency of the letters used. And that is why, second of all, an
encryption system that employs a shift of a fixed number of letters is no longer
sufficient. And from then on until the time of World War II, cryptologists
followed Alberti’s suggestion that with each new letter of the original
text, the cipher alphabet itself was also to be shifted along.8 A century after Alberti,
François Viète, the founder of modern algebra as well as a cryptologist in the
service of Henri IV, interwove numbers and letters even closer. It is only since
Viète that there have been equations with unknowns and general coefficients
whereby numbers have been encoded as letters.9 And that is the way it has
remained to this day for everyone who writes in an advanced programming language
that likewise (mathematically more or less correctly) attributes variables to
one another like in equations. On this seemingly unspectacular basis—Alberti’s
polyalphabetic code, Viète’s algebra and Leibniz’ differential calculus—
the nation-states of the Modern Age have progressed in the direction of
the technological present. Modernity began with Napoleon. In place of messengers on horseback, there
appeared in 1794 an optical telegraph that remote-controlled France’s armies
with secret codes.
The laws and privileges that had been handed down from days of yore were replaced in
1806 by the Code Napoléon forming an integrated whole. Samuel Morse is
said to have inspected a New York print shop in 1838 in order to ascertain based
on the size of the respective compartments in the typesetter’s case—as
Alberti had done—which letters occurred most frequently and thus ought to be
made easiest to send with Morse’s dots and dashes.10 For the first time, an
alphabet had been optimized according to technical criteria and thus without
regard to semantics, but the upshot nevertheless was not yet called Morse code.
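A small sample of International Morse Code (the later standardized form of Morse’s scheme) shows this frequency optimization in miniature: the most common English letters get the shortest signal sequences.

```python
# Frequent letters get short signals; rare letters get long ones.
MORSE = {
    "E": ".", "T": "-",                                   # most frequent: one signal
    "A": ".-", "I": "..", "N": "-.", "M": "--",
    "S": "...", "O": "---", "R": ".-.", "D": "-..",
    "Q": "--.-", "J": ".---", "Z": "--..", "X": "-..-",   # rare: four signals
}

def encode(text: str) -> str:
    """Encode a word as Morse signals, letters separated by spaces."""
    return " ".join(MORSE[c] for c in text.upper())
```

Thus `encode("SOS")` produces `... --- ...`: a letter’s cost on the wire reflects its frequency in the typesetter’s case, not its meaning.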
This was the result of the publication of certain books—so-called universal
code condensers—that listed the abbreviations in generally accepted use in
international cable traffic for the purpose of shortening the text and reducing
transmission costs, whereby the message sent by the operator was thus encoded a
second time. What previously had been termed deciphering and enciphering was referred to as
decoding and encoding. All codes processed by computer nowadays are thus subject
to the Kolmogorov Test: it is bad when the input itself is longer than its
output; in the case of white noise, both are equally long; consequently, code is
elegant when its output is much longer than itself. Thus, the 20th century
turned supremely capitalistic economizing in the form of the “code condenser”
into consummate mathematical stringency.
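The Kolmogorov test described above can be sketched with a compressor standing in for the shortest program; zlib is only a crude, computable proxy here, since Kolmogorov complexity itself is uncomputable in general.

```python
import os
import zlib

def description_length(data: bytes) -> int:
    """Length of a compressed description of the data: a rough, computable
    stand-in for the length of the shortest program producing it."""
    return len(zlib.compress(data, 9))

regular = b"AB" * 500     # highly regular: a short code yields a long output
noise = os.urandom(1000)  # white noise: no description much shorter than itself

compact = description_length(regular)         # far below the 1000-byte output
incompressible = description_length(noise)    # about as long as the output itself
```

Elegant code, in these terms, is the first case: its description is much shorter than what it produces.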
With this, I have just about reached the current state of affairs. But it remains to
be asked how it came to pass—that is, how mathematics and encryption entered
into that indissoluble union that determines how things work today. And the fact
that the answer is named Alan Turing is something that many of you have no doubt
already heard. After all, the Turing machine of 1936 as the theoretical switch
at the basis of any conceivable computer solved a fundamental problem of the
Modern Age: how the real—and in common parlance infinitely long—numbers upon
which technology and engineering have been based since Viète’s day can
nevertheless be ascribed with finite—and thus ultimately whole— numbers.
Turing’s machine proved that this is, indeed, not possible for all real
numbers but it is for a considerable subset of them, which he termed computable
numbers.11 An infinite number of signs made up of a finite alphabet that, as
we all know, can be reduced down to zero and one, has since then banished the
endlessness of numbers.
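Turing’s computable numbers can be illustrated in miniature: a finite program that emits as many digits as one likes of an infinitely long real number, here the square root of two.

```python
from math import isqrt

def sqrt2_digits(n: int) -> str:
    """First n decimal digits of sqrt(2) after the decimal point: a finite
    text that stands for an infinite, but computable, real number."""
    # Scale 2 by 10^(2n); the integer square root then carries n digits
    # of sqrt(2) past the decimal point.
    s = str(isqrt(2 * 10 ** (2 * n)))
    return s[0] + "." + s[1:]
```

Calling `sqrt2_digits(5)` gives `1.41421`; the endless number is banished into a few finite lines.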
No sooner had Turing succeeded in establishing this than it came time for the real
thing—namely, cryptological application. In Britain’s Code and Cipher
School, Turing’s protocomputer began in spring 1941 to successfully break the
secret code of the German Wehrmacht (which, disastrously, had remained true to
the principles of Alberti) and, in doing so, made a decisive contribution to the
outcome of the war. Today, at a time when the computer has practically decoded
the weather and the genome—and, thus, physical and, increasingly, biological
secrets—we all too often forget that this was not its prime mission. Turing
himself raised the question of what the computer had actually been created to
do, and initially cited the deciphering of our simple human language. The learning of
languages would be the most impressive of the [...] possible applications
mentioned because it is the most human of these activities, although
this area seems to be too highly dependent upon sensory organs and the capability of
locomotion. Cryptography would perhaps be the most rewarding field of
application. There are remarkably close parallels between the problems of the
physicist and those of the cryptographer. The system according to which a
message is decoded corresponds to the laws of the universe, intercepted messages
resemble the achievable level of proof, and the key that is valid for a single
day or a single message is the equivalent of important (natural) constants that
have to be determined. The correspondence is quite rigorous, although
cryptography can very easily rely on discrete machines to carry out its analysis,
whereas this is not so simple in physics.12
Translated into telegraphese, that must mean: whether everything in the world can be
encoded is written in the stars. The only thing that seems guaranteed from the
outset is that since computers themselves operate on the basis of codes, they
are capable of decoding strange, new codes. For three and a half millennia,
alphabets have been the prototype for everything discrete. The question of
whether physics, despite its quantum theory, is to be accounted for solely as a
quantity of particles and not as a layering of waves has by no means been
definitively answered. And whether, ultimately, all languages— which are
precisely what make human beings human—from which, once upon a time in the
land of the Greeks, our alphabet was derived, are to be modeled entirely—including
their syntax and semantics—as codes is another question that must remain open.
Everything indicated thus far means that the concept of code is as inflationary as it is
questionable. If every historical epoch has its leading philosophy, then ours is
the philosophy of code, which therefore—in an odd recurrence of the term’s
original sense of “codex”—has the last word on legality in all cases and
thus strives to do precisely what only Aphrodite could in the first philosophy
of the Greeks.13 Wherever possible, though, code means—as codex once did as
well—only the law of precisely that empire that keeps us in subjugation and
forbids even this sentence to be spoken. In any case, it is with triumphal
certainty that the major research institutions (that have the most to gain from
such findings) pronounce that there is nothing in outer space that is not code—from
the virus to the Big Bang. One should therefore—as Lily Kay did in the case of
biotech— be on one’s guard against metaphors that water down the legitimate
sense of the term code when—using DNA as an example—there is no discoverable
one-to-one correspondence between material elements and units of information.
Because of the very fact that this word over the course of its long history has
meant “shift” and “transference” in the sense of one letter to the next,
and from numbers to letters or vice versa, it is most susceptible of all to
false application or attribution. Today, in the reflected luster of the word
code, there are sciences giving off a brilliant appearance that have not even
mastered their ABCs and elementary multiplication tables, let alone being able
to turn something into something else other than in the case of metaphors in
which things are ascribed different names. Codes should therefore mean
exclusively alphabets in the sense of modern mathematics, clear and finite ones
and thus sequences of symbols that are as brief as possible and which, thanks to
a system of grammar, nevertheless possess the phenomenal capability to
infinitely reproduce themselves: semi-Thue systems, Markov chains,14 Backus-Naur
forms, etc. This and this alone is the difference between such modern alphabets
and the familiar one that elaborated our various languages and gave us the gift
of Homer’s verse,15 but is incapable of getting a world of technology running
the way computer codes do nowadays. After all, whereas Turing’s machine could
take whole numbers and merely produce real numbers out of them, their successors—named
after Turing’s great word—have asserted total dominion.16 Technology today
implements code in reality, and thus encodes the world.
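Such a modern alphabet can be sketched in a few lines: a toy grammar in the spirit of Backus-Naur forms that, from the finite alphabet {0, 1} and a single recursive rule S → "0" S "1" | "01", generates infinitely many well-formed strings.

```python
from itertools import islice

def generated_language():
    """Yield 01, 0011, 000111, ...: the infinite language {0^n 1^n, n >= 1},
    produced from a finite alphabet and one recursive rewriting rule."""
    s = "01"
    while True:
        yield s
        s = "0" + s + "1"  # apply the rule S -> 0 S 1 once more

words = list(islice(generated_language(), 3))  # ['01', '0011', '000111']
```

A finite stock of symbols plus a grammar thus possesses exactly the capability named above: it reproduces itself without end.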
Whether this means that language as the House of Existence has been forsaken is
something I cannot say. Turing himself, when he scrutinized the technical
possibility of a machine learning a language, proceeded under the assumption
that it would not be computers but rather robots equipped with sensors and
effectors and thus possessing knowledge of their environment that would be able
to master this high art of speech. But it was precisely the robots’ new,
customizable knowledge of the environment that remained dark and elusive for the
programmers that had launched them with the early versions of code. The so-called
hidden layers of today’s neural networks provide a good though still trivial
example of the great extent to which the computations of the constructors
themselves can go astray even when the result itself turns out all right. Thus,
either we write code that, like natural constants, specifies the way things are
and, in turn, expend millions of lines of code and billions of dollars for
digital hardware, or we leave it all up to machines that derive code from their
own environment, though this ends up being code that we are unable to either
read or pronounce. The dilemma between code and language ultimately seems to be
undecidable. Those who have written code even once—be it in the most elaborate of
languages or even in assembler—know two simple things on the basis of their
own experience. One is that all the words from which the program has of
necessity been derived and developed just lead to a bunch of errors or bugs; the
other is that the program will suddenly run on its own as soon as its head is
emptied of words. And, as far as communication with others is concerned, that
means you can hardly go on interacting by means of the code you have written
yourself. And, with respect to these remarks, may this fate not befall you and
Wolfgang Coy, Aufbau und Arbeitsweise von Rechenanlagen. Eine Einführung in
Rechnerarchitektur und Rechnerorganisation für das Grundstudium der Informatik,
2nd revised and expanded edition. Braunschweig / Wiesbaden 1992, p. 5.
On the current state of research, see Barry P.
Cf. Wolfgang Riepl, Das Nachrichtenwesen des Altertums. Mit besonderer Rücksicht
auf die Römer . Reprint, Darmstadt 1972.
Cf. Caius Suetonius Tranquillus, Vitae Caesarum, I 56, 6 and II 86.
Cf. Suetonius, I 56, 6 and II 49, 3. On the cursus publicus, in which
Augustus himself recorded passes, orders and letters dated with the exact time
of day and night (Suetonius, II 50), see Bernhard Siegert, “Der Untergang des
römischen Reiches.” In: Paradoxien, Dissonanzen, Zusammenbrüche.
Situationen offener Epistemologie. Eds. Hans Ulrich Gumbrecht and K. Ludwig
Pfeiffer. Frankfurt/M. 1991, pp. 495–514.
On temporal and spatial media and the process of transition from empire to the
monastic Early Middle Ages, see Harold A. Innis, Empire and Communications. 2nd
ed., Toronto 1972, pp. 104–120.
On the subject of Alberti, see David Kahn, The Codebreakers. The Story of
Secret Writing, 9th edition, New York 1979. On the German Wehrmacht’s
Enigma, see Andrew Hodges, Alan Turing: The Enigma, New York 1983, pp.
Viète himself chose vowels for unknowns and consonants for coefficients. Since
Descartes’ Géométrie (1637), the coefficients proceed from the
beginning of the alphabet and the unknowns from the end (a, b, c..., x, y, z).
Since then, x^n + y^n = z^n has been the classic example of a mathematical
equation without any numbers at all, and thus one that would have been
inconceivable for the Greeks, Indians and Arabs.
Cf. Coy, Aufbau, p. 6.
Cf. Alan M. Turing, Intelligence Service. Schriften, edited by Bernhard
Dotzler and Friedrich Kittler. Berlin 1987, pp. 19–60.
Turing, Intelligence Service, p. 98.
“daímohn hê pánta kubernâi” (God, who controls all) is what Aphrodite is
called in Parmenides (DK 8, B 12, 3).
On Markov chains, see Claude E. Shannon, Ein/Aus. Ausgewählte Schriften zur
Kommunikationsund Nachrichtentheorie, edited by Friedrich Kittler et al.
Berlin 2000, pp. 21–25.
On the subject of Homer and the vowel-based alphabet, see Barry B. Powell, Homer
and the Origin of the Greek Alphabet, Cambridge 1991.
Cf. Turing, Intelligence Service, p. 15.