The Arts of Concealing, Revealing, Communication, and Code (part 1)

Some wise words hiding in plain sight, Bacon style

The intertwined history of mathematics, writing, and cryptography dates back millennia, but a good starting point for discussing how these ideas evolved into modern usage is the 9th century CE, with Muhammad ibn Musa al-Khwarizmi (circa 780–850), a member of the Baghdad House of Wisdom (Bayt al-Hikma). His "Book on Calculation by Completion and Balancing" (Kitab al-Jabr wa'l-Muqabala) gave rise to the word "algebra" (from al-Jabr), and the word "algorithm" comes from the Latinization of his name. This book established the method of using placeholders, or shay, meaning a "thing", or what we now call variables in math and computing: the classic x to solve for.

Al-Khwarizmi and those who followed him produced a wealth of mathematical and linguistic insights that underlie a good deal of modern computing and cryptography, including the decimal system and statistical analysis, and they are credited with the earliest known descriptions of cryptanalysis as we understand it today. The scholars of the House of Wisdom studied their own language intimately, gained much insight into combinatorics and permutations as principles, and applied these insights to deciphering Egyptian hieroglyphs (with partial success). They also borrowed heavily from the mathematics and numeral systems of India, the root of the Hindu-Arabic numerals we use today. We can attribute far more of our modern math and computing to this culture and era than is widely known, and it's a fascinating topic I'd encourage further reading on if it sparks your interest.

Language is Cryptographic, Context is Key

In semiotics, the study of signs and symbols, a distinction is made between the signifier (the symbol, word, or sound) and the signified (the concept it represents, or the thing-in-itself). Variables are generic signifiers, or named references to potential "things", which are often mutable and prone to change over time - as opposed to constants, which are well-defined, named, immutable things such as π, e, etc. Systems like the lambda calculus rely heavily on variables with systematic methods of evaluation and reduction, sometimes with no literals whatsoever. Typed functional languages like Haskell (inspired by category theory - aka "abstract nonsense") use pattern matching on the kind of thing at hand to select the correct set of operations to perform on it, reflecting an emphasis on the functionality of the thing, categorically speaking, over the specific value it holds.
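
To make that dispatch-on-kind idea concrete, here's a minimal sketch. The paragraph above mentions Haskell, but Python's structural pattern matching (3.10+) is used here as a rough, more widely readable analogue; the Circle and Rectangle types are illustrative inventions, not anything from a particular library.

```python
from dataclasses import dataclass

# Two kinds of "thing"; which operation applies depends on the kind,
# not on the specific values it holds.
@dataclass
class Circle:
    radius: float

@dataclass
class Rectangle:
    width: float
    height: float

def area(shape) -> float:
    # Dispatch by matching on the shape's structure (Python 3.10+).
    match shape:
        case Circle(radius=r):
            return 3.141592653589793 * r * r
        case Rectangle(width=w, height=h):
            return w * h
        case _:
            raise TypeError(f"unknown shape: {shape!r}")

print(area(Circle(1.0)))          # ~3.14159
print(area(Rectangle(2.0, 3.0)))  # 6.0
```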

One could argue that, only slightly abstractly speaking, language is a form of algebra, where the surrounding equation (the sentence) provides the context needed to determine the value of a given symbol or x. Have a go at some Mad Libs for a simple illustration, though the underlying principles - contextual factors (the full equation), grammars and syntax (rules and algorithms), and variable interpolation (variable resolution or substitution) - go well beyond this. By considering surrounding words and context cues we can decipher whether the word "bank" refers to a financial institution or the edge of some water, for instance, and once discerned, we can mentally substitute the intended meaning for the ambiguous symbol. All language, however, is strongly subject to interpretation through both personal and proximal surrounding context, and is itself a form of cipher or code in a very real sense - one solved by interpolation of signifiers, with our individual experiences of the signified as the key.
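
Here's a toy sketch of that substitution process, in which the surrounding words act as the "equation" that resolves the ambiguous symbol. The cue-word lists are illustrative assumptions, not a real lexicon.

```python
# A toy word-sense disambiguator: the surrounding sentence supplies
# the context that resolves the ambiguous symbol "bank".
SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "deposit", "account"},
        "edge of a river": {"river", "water", "fishing", "shore"},
    }
}

def resolve(word: str, sentence: str) -> str:
    context = set(sentence.lower().split())
    senses = SENSES.get(word, {})
    # Pick the sense whose cue words overlap the context the most.
    return max(senses, key=lambda s: len(senses[s] & context), default=word)

print(resolve("bank", "she opened an account at the bank to deposit money"))
# -> financial institution
print(resolve("bank", "we went fishing on the bank of the river"))
# -> edge of a river
```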

Language as a function evolved and is organic, but the specifics of any given language are not - they're taught, and are generally not something one can naturally intuit otherwise. If a child is not taught at least one language (by exposure or direct instruction), they do not otherwise attain any, regardless of intellectual capacity. Once the functional aspect of language is learned, it's far easier to learn additional languages, but the specifics still must be gleaned from source material or existing speakers, on top of building the functional neural pathways and processing abilities (a whole other topic - or several) that underlie its use.

The specific symbolic mappings of any given language aren't something we can extrapolate from our environments organically or innately; they must be preserved and passed on to new generations, or we eventually end up with a "dead language" - some of which we've yet to decipher despite all our technology and advanced techniques. Conveying the meanings of individual symbols is not unlike handing over the key to a cryptographic cipher, which lets you decode and make sense of an encoded message - just as foreign languages might as well be encrypted for all we can make sense of them without knowing the symbolic mappings and algorithms (grammar) they employ.
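
A minimal substitution cipher makes the analogy literal: without the mapping table (the key), the output is just noise, much like an undeciphered language. The scrambled alphabet below is an arbitrary example.

```python
import string

# A monoalphabetic substitution cipher: the "language" is unreadable
# without the symbol mapping, which acts as the key.
KEY = dict(zip(string.ascii_lowercase, "qwertyuiopasdfghjklzxcvbnm"))
REVERSE = {v: k for k, v in KEY.items()}

def encode(text: str) -> str:
    return "".join(KEY.get(c, c) for c in text.lower())

def decode(text: str) -> str:
    return "".join(REVERSE.get(c, c) for c in text.lower())

ciphertext = encode("a dead language is an unbroken cipher")
print(ciphertext)          # gibberish without the mapping
print(decode(ciphertext))  # readable again, given the key
```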

High Contexts and Implication Spaces

From this arises the realization that there are other forms of language besides ordinary spoken words - the languages of mathematics, various scientific domains, and programming code, among others - which are also commonly high context languages. All language, as mentioned, is contextually interpreted, sometimes with the aid of elaborate concordances that explicitly map broad contextual word usage for translation and comprehension of older or obscure literature (such as ancient biblical texts). In addition to word use, there are always myriad sociological contexts assumed to be "common knowledge" within a given culture and/or time period, which are necessary to usefully interpret spoken or written language - as any archaeologist will attest.

Once a social and cultural context is lost (or when higher context was utilized but not documented), it becomes far harder to accurately or usefully interpret old writings. Even if we decipher the words and syntax, and manage to map the signifiers to the signified, the missing context will substantially affect how the words are read and understood, unless the reader's cultural context closely resembles the author's. Beyond the lexical and functional encoding is the conveying of meaning, which is not fully contained in the language itself but in the convergence of symbols and the effective contexts at play. We might think of these as something like environment variables that differ depending on conditions at runtime, even for the same code. When writings reference not-so-common knowledge, we get higher context language and the "I guess you had to be there" phenomenon, along with allusions and inside jokes, for example.
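
Stretching the analogy into actual code, here's a toy sketch in which the same "text" produces different readings depending on an environment variable set at runtime; READER_CONTEXT and the sample readings are purely illustrative inventions.

```python
import os

# The same input interpreted under different runtime environments
# (cultural contexts) yields different readings.
def interpret(line: str) -> str:
    context = os.environ.get("READER_CONTEXT", "modern")
    if context == "contemporary-of-the-author":
        return f"{line!r} reads as pointed topical satire"
    return f"{line!r} reads as a quaint period detail"

# Same code, same input, different output depending on the environment:
os.environ["READER_CONTEXT"] = "contemporary-of-the-author"
print(interpret("a jest about the admiral's new fleet"))

os.environ["READER_CONTEXT"] = "modern"
print(interpret("a jest about the admiral's new fleet"))
```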

Clever writers have, throughout history, made use of techniques based on high context language, so that the same text can convey very different things to different people. Just as different keys can extract different plaintexts from the same ciphertext, high context language can convey different messages depending on the reader's context and potentially special knowledge - such as double meanings of words (puns) that are not commonly known - thereby using the reader themselves as a cryptographic "key" of sorts. This is functionally identical to sharing an inside joke with friends, but can be made quite elaborate if desired.
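
A one-time pad makes this literal: a single ciphertext can decrypt to entirely different plaintexts under different keys. The messages and key bytes below are arbitrary examples.

```python
# With XOR (one-time-pad style), one ciphertext can carry two messages,
# much as one passage can say different things to differently "keyed" readers.
def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

msg_public = b"meet for lunch"
msg_hidden = b"burn the files"   # same length as msg_public

key_a = bytes([7, 42, 13, 99, 5, 88, 21, 200, 33, 17, 91, 2, 64, 150])
ciphertext = xor(msg_public, key_a)

# Derive a second key that decrypts the SAME ciphertext to the hidden message:
key_b = xor(ciphertext, msg_hidden)

print(xor(ciphertext, key_a))  # b'meet for lunch'
print(xor(ciphertext, key_b))  # b'burn the files'
```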

This practice can also act as effective steganography, since the words make sense when read in the common-knowledge sense as well as the high context sense, leaving no hint of further hidden messages to search for. Notable (and more publicly known and innocent) examples of this sort of linguistic shenanigans include Shakespeare's plays and Homer's Odyssey. Shakespeare's plays often contained double entendres and wordplay with both innocent and more risqué meanings (much like a number of modern "kids'" shows and cartoons), while Homer's Odyssey has long been studied and analyzed for its layered meanings, where words can denote both a literal event and a symbolic or allegorical one.
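
Fittingly, given this article's subtitle, here's a minimal sketch of Francis Bacon's biliteral cipher, a classic example of such steganography. This simplified version uses all 26 letters and letter case as the carrier (Bacon's original used a 24-letter alphabet and two subtly different typefaces); the cover sentence is an arbitrary example.

```python
import string

# Bacon's biliteral cipher, simplified: each secret letter becomes five
# A/B "bits", hidden in how an innocent cover text is set.

def bacon_bits(secret: str) -> str:
    bits = ""
    for ch in secret.lower():
        if ch in string.ascii_lowercase:
            bits += format(ord(ch) - ord("a"), "05b").replace("0", "A").replace("1", "B")
    return bits

def hide(bits: str, cover: str) -> str:
    # Lowercase carries an A, uppercase carries a B.
    out, i = [], 0
    for ch in cover:
        if ch.isalpha() and i < len(bits):
            out.append(ch.upper() if bits[i] == "B" else ch.lower())
            i += 1
        else:
            out.append(ch)
    return "".join(out)

def reveal(stego: str, length: int) -> str:
    bits = "".join("B" if c.isupper() else "A" for c in stego if c.isalpha())
    return "".join(
        chr(int(bits[j:j + 5].replace("A", "0").replace("B", "1"), 2) + ord("a"))
        for j in range(0, 5 * length, 5)
    )

stego = hide(bacon_bits("power"), "some wise words hiding in plain sight here today")
print(stego)             # innocent-looking text with odd capitalization
print(reveal(stego, 5))  # -> power
```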

The Bible, particularly in the original Hebrew, is likewise notorious for linguistic play that has fascinated both lay readers and kabbalists for centuries. Shakespeare's works and the Bible share a propensity for inviting people to read into them a bit too deeply, perhaps, leading to "Bible code" theories and the enduring conspiracy theory naming Francis Bacon as the true author of Shakespeare's plays, for instance.

What constitutes "common knowledge", however, can vary greatly depending on the domain or niche in question. Some math is largely common knowledge, like knowing π (pi) is approximately 3.14 and relates to circles, but e (Euler's number), ħ (the reduced Planck constant), and countless other symbols and notations are far less known outside specific fields of study and practice. To a graduate student of higher math or physics, these symbols are all quite familiar, as are their purposes and uses, and this knowledge is considered common knowledge among their peers.

This is what it means to be initiated into a field or system, ultimately, whether academic or a secret society - the first step is to speak the language and gain enough context to comprehend and communicate about the topic at hand. Some high context jargon lives largely in notation, as in math, physics, and chemistry. Other jargon may be intentional "security through obscurity", maintaining an in-group by initiation and keeping prying eyes away without worrying about leaks. Some of these symbols are not just notation, but carry substantial additional context in "implication space" - what is unsaid, yet implied by context. Sometimes this is wielded more deliberately as negative space, where what is left unsaid stands out more loudly than what was actually said, highlighting through obvious omission.

Human experience is fundamentally composed of a few basic senses, which make use of light and sound waves, tactile pressure, heat and cold, and particulate sampling through taste and smell, as well as internal perceptions that are sometimes harder to quantify. Despite this relatively small set of source stimulus types, the combinations and experiences of them are essentially infinite, and everything we can conceptualize is built from them - typically as compound symbols. Red, round, crisp, and sweet are basic qualities that might describe an apple; "apple" is thus a compound symbol implying the particular qualities of experience one associates with apple-ness.

In programming, this is reflected in Objects and their attributes, or in Type definitions; in physics, by molecules composed from sets of atoms; and in biology, by DNA defining cells that combine into a body and a living thing. Each illustrates a macro, or gestalt, composed of micro parts in some particular (if varied within limits) arrangement, often many levels deep in complexity. As Types suggest, the composition of a thing also implies its available set of functions, so this latent information is not trivial, though it is often taken for granted outside of specialized fields and usages.
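
In code, that might look like the following toy sketch: a compound symbol as a type whose composition implies what can be done with it. The Apple type and its fields are illustrative assumptions.

```python
from dataclasses import dataclass

# "Apple" as a compound symbol: a bundle of basic sensory qualities.
@dataclass
class Apple:
    color: str = "red"
    shape: str = "round"
    texture: str = "crisp"
    taste: str = "sweet"

    # The composition implies the available operations: something
    # describable in sensory terms affords being described.
    def describe(self) -> str:
        return f"a {self.color}, {self.shape}, {self.texture}, {self.taste} fruit"

print(Apple().describe())  # -> a red, round, crisp, sweet fruit
```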

The human mind strives to categorize things in this hierarchical style in order to comprehend them, so we largely see the world this way, and our languages reflect it with increasingly complex terms representing levels of abstraction and more complex composition. Within each complex term is the full context of its composition: if we say "atom" we imply particles, but omit the elaboration of parts to maintain coherence and succinctness. Acronyms go a step further, losing information much like one-way hashes and shared-key cryptography (where we compare inputs and outputs to verify, but cannot reverse-engineer the input from the result), yet they are still generally understood by those they're intended for, or who hold the right key. This suggests language is not just cryptographic, but can contain several layers of encryption, relying on levels of context and prior knowledge to be deciphered, or interpreted in some intended way.
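
As a concrete sketch of that verify-but-can't-reverse property, here's a one-way hash in action; "laser" is just a familiar example of an acronym whose source phrase is discarded in everyday use.

```python
import hashlib

# Like an acronym, a one-way hash discards the original expansion: we
# can check a guessed input against the stored digest, but we cannot
# recover the input from the digest itself.
def digest(text: str) -> str:
    return hashlib.sha256(text.encode()).hexdigest()

stored = digest("light amplification by stimulated emission of radiation")

# Verification compares outputs; it never inverts the hash:
print(digest("light amplification by stimulated emission of radiation") == stored)  # True
print(digest("long array of silly electronic rays") == stored)                      # False
```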

I hope you've enjoyed the article so far; you can find the second half in Part 2. Till then, I'll leave you with a relevant quote that speaks to the difference between knowing and understanding, and to how our otherwise useful symbolic representations can obscure it:

“If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.” – Plato, Phaedrus

Decoded: "Knowledge Is Power"