The Arts of Concealing, Revealing, Communication, and Code (part 2)

This article continues from Part 1.

Architecture and Design, aka It's All Mental Constructs - So Construct Them Well

In technology and programming, all of these concepts come to fruition in profound and nuanced ways, though they are still commonly overlooked, even by practitioners. Programming languages are cousins of spoken languages, but with sophisticated formal grammars, focused on consistent and well-defined syntax, logic, and structure. Much like Latin, coding languages employ complex and strict grammars in an attempt to remove ambiguity from interpretation and to provide consistent, repeatable results.
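As a small illustration of that strictness, consider that Python's own parser admits exactly one parse tree for any expression - grouping is decided by the grammar, not by the reader. A minimal sketch using the standard ast module:

```python
import ast

# A formal grammar admits exactly one parse for any given program text:
# "1 + 2 * 3" groups as 1 + (2 * 3) because the grammar says so,
# not because a reader judged it that way.
tree = ast.parse("1 + 2 * 3", mode="eval")
print(ast.dump(tree.body))  # a nested BinOp tree: the Mult sits inside the Add
```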

To continue improving on existing development and design practices, it is necessary to step back and re-assess: see where we are, where we came from, and what we learned - and lost - along the way. There's a principle in evolutionary processes where a system approaches or reaches a local maximum for its current heuristic and trajectory. At that point, the best approach is generally to step back to a "lesser" prior state, while retaining the information gained so far, in order to take a different path forward. The first time up a mountain there is little information, and we set out largely blind, leaping into new territory. We learn a lot on the way up, however, and in later, better-informed iterations we can make better choices and improve our strategies and heuristics.
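The optimization literature calls this strategy hill climbing with random restarts. Here's a toy sketch in Python - the landscape, step size, and restart count are all illustrative assumptions, not anything canonical:

```python
import math
import random

def score(x):
    # A bumpy landscape: many local maxima, with higher peaks further out.
    return math.sin(x) * abs(x)

def neighbors(x):
    return [x - 0.1, x + 0.1]

def hill_climb(start, steps=10_000):
    # Greedy ascent: stop when no neighbor improves (a local maximum).
    current = start
    for _ in range(steps):
        best = max(neighbors(current), key=score)
        if score(best) <= score(current):
            break
        current = best
    return current

def restarting_climb(restarts=25):
    # "Step back" to a fresh start each time, keeping the best peak found so far.
    climbs = (hill_climb(random.uniform(-30, 30)) for _ in range(restarts))
    return max(climbs, key=score)

peak = restarting_climb()
print(peak, score(peak))
```

Each individual climb gets stuck on whatever peak is nearest; it's the willingness to step back and start over, while keeping the best result found so far, that finds the higher ground.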

Right now, we are at a point where the greatest single leap in AI we've seen has emerged from combining linguistic, philosophical, and mathematical concepts that were noted, if not yet usable as we can and do use them now, as much as two thousand years ago. ChatGPT appears almost human by consuming massive amounts of symbols and applying statistical analysis and permutations (in the spirit of Al-Khwarizmi), vector spaces (abstract geometry), classic gematria (translating letters and words into numeric values), and a bit of "black magic" to generate insightful and useful responses in our own language. Even cutting-edge quantum computers rely on principles of physics established nearly a century ago.
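Gematria, in its simplest ordinal form, is just a function from words to numbers - a crude but ancient ancestor of the numeric representations modern models depend on. A toy sketch:

```python
def gematria(word):
    # Ordinal gematria over the Latin alphabet: a=1, b=2, ..., z=26.
    return sum(ord(c) - ord("a") + 1 for c in word.lower() if c.isalpha())

print(gematria("attention"))  # 118: a whole word reduced to a single number
```

One number per word collapses far too much, of course; embeddings keep the same basic move - words become numbers - but spend hundreds of coordinates on each word instead of one.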

It's time, I'd say, to step back and take another look at the vast storehouse of aggregate human knowledge and experience - and tools like Large Language Models can even assist in that venture. Analog computers are making a sudden comeback as we realize that, as many ardent audiophiles will tell you about vinyl being replaced by digital media, we lost something subtle but important in the process.

Even AI itself will benefit immensely from a more analog, fully immersive approach to learning from and interacting with a physical environment (even simulations are showing highly promising results). Our physical existence itself, our presence in the world and the laws of physics, generates a huge amount of "common knowledge" that we're rarely even aware of. Newton may have been the first to write about gravity, but everyone before him knew, from simple, direct experience of natural law, that they didn't just float off into space, but remained firmly anchored to the earth.

We take this for granted, but this common knowledge (and the common sense that emerges from it - in humorously uneven amounts) is exactly what AI lacks, and that absence lies behind many of its ongoing shortcomings. It has no understanding of what it's talking about, even when it makes perfect sense and appears sentient. That is largely because it has no experience except through signifiers, and entirely lacks direct interaction with the signified. Whether AI will ever be truly "conscious" as humans are (and what that even means, really) is debatable, but if it is possible, direct experience of and interaction with the world is certainly a major prerequisite. The ability to simply "go outside and touch some grass" is vastly underrated.

There now exists an urgent need to focus on how we design, structure, and utilize information, returning to our roots and early intuitions - which came from direct observation and natural philosophy. We need to stop and intimately observe the many things we typically take for granted, things that have, as a result, become essentially invisible to us. Even the language we speak - why it evolved as it did, how that evolution reflects our own cognitive processes, and the underlying structure it conveys - can inform and inspire us. Looking at concepts like gematria, where math and language converge, and rethinking how to utilize them, can give rise to processes like embedding words in vector spaces to extract semantic contexts and relationships. Nature waits to reveal her secrets, but we have to look, and ask new and unusual questions!
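As a sketch of that embedding idea: the three-dimensional vectors below are hand-picked for illustration (real models learn hundreds of dimensions from usage statistics), but the measurement - the cosine of the angle between vectors - is the standard one:

```python
import math

# Hand-picked toy "embeddings" (an illustrative assumption; real models
# learn their dimensions from co-occurrence statistics in huge corpora).
vectors = {
    "king":  [0.9, 0.7, 0.1],
    "queen": [0.8, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

print(cosine(vectors["king"], vectors["queen"]))  # high: related meanings
print(cosine(vectors["king"], vectors["apple"]))  # low: unrelated meanings
```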

The technology we have now could be leveraged far more effectively. It's often noted that classic gaming consoles and computers did amazing things with the minuscule resources they had available, even compared to today's wristwatches and cellphones. As engineers, we're generally spoiled and lazy, and we let our bloated constructs fill insanely powerful machines to their limits because nothing prevents us - and because it's far faster and easier than crafting elegant solutions. We neglect design and structure because we can get away with it. We ignore many amazing insights of the past because we can fake our way past needing them. Imagine, though, what we could do if we better utilized what we have available - right now. "Invention is the child of necessity," as the saying attributed to Plato goes, and in an age of such abundant resources to exploit, we sometimes have to use self-discipline to create constraints, build better habits, and spur on innovation ourselves, on purpose.

Sir Francis Bacon - an early champion of the scientific method and empirical science - was largely driven by the belief that he could "decode the secrets of nature" if he could figure out the symbols it used. This led him to remarkable insights in cryptography and information theory, and his work in steganography paved the way for its vast usage in digital media. Bacon's core principles are still used to hide information in text, images, audio, video, and other innocent-seeming files online to this day. Sometimes simple-seeming insights lead to unexpectedly transformational results. As the pivotal AI transformer paper puts it, "Attention Is All You Need" - and wherever we focus attention, we're shown whatever its object has to offer. We might rephrase this as "seek, and ye shall find," if so inclined.
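Bacon's biliteral cipher shows just how simple the core insight is: each letter becomes five binary symbols, which can then hide in any two-state feature of an innocent carrier. A toy sketch using letter case as the carrier - Bacon himself used two subtly different typefaces:

```python
# Each letter becomes 5 binary symbols (the modern 26-letter variant),
# hidden here in a two-state feature of the carrier: lower vs. UPPER case.

def to_bits(message):
    return "".join(format(ord(c) - ord("a"), "05b")
                   for c in message.lower() if c.isalpha())

def hide(bits, cover):
    out, i = [], 0
    for ch in cover:
        if ch.isalpha() and i < len(bits):
            out.append(ch.upper() if bits[i] == "1" else ch.lower())
            i += 1
        else:
            out.append(ch)
    return "".join(out)

def reveal(stego, length):
    bits = "".join("1" if c.isupper() else "0" for c in stego if c.isalpha())
    return "".join(chr(int(bits[i:i + 5], 2) + ord("a"))
                   for i in range(0, 5 * length, 5))

stego = hide(to_bits("hi"), "the quick brown fox jumps over the lazy dog")
print(stego)             # innocent-looking text with odd capitalization
print(reveal(stego, 2))  # -> "hi"
```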

And nature has an infinite fountain of insights and inspiration to offer, relating directly to how our world works and what we can do with it. Someone built a functional computer out of fungi. Others are working to use DNA for large and efficient data storage, and common quartz crystal has been tested off and on in recent decades as an "immortal" storage medium. Weird, novel, and promising experimentation is going on all over, and everyone is invited to stand on the shoulders of giants, using our technical advances and this lofty vantage point to find both novel and heavily inspired approaches and solutions to new and old problems alike. Once you complete the necessary initiations, of course. :-)
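The core trick behind DNA storage is similarly simple at heart: four bases can carry two bits each. A toy sketch - real schemes add error correction, avoid long single-base runs, and balance base composition, all of which this skips:

```python
ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {v: k for k, v in ENCODE.items()}

def to_dna(data: bytes) -> str:
    # Pack each byte as 8 bits, then map every 2 bits to one base.
    bits = "".join(format(b, "08b") for b in data)
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand: str) -> bytes:
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = to_dna(b"hi")
print(strand)            # CGGACGGC
print(from_dna(strand))  # b'hi'
```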