Find the Alien

Assembly Theory (AT) (original paper) is a new body of theoretical chemistry that tries to assess the relative complexity of the molecular underpinnings of life, even when the chemistry might be completely alien. For instance, if we send a probe to a Jovian moon and there are new microscopic creatures in its ocean, how will we figure that out? AT assumes that all living organisms require a certain threshold of molecular complexity in order to function, since that appears to be a minimal requirement for life on Earth. The chemists behind AT experimentally confirmed that mass spectrometry is a fairly reliable way of differentiating the complexity of living things and their byproducts from other substances. Of course, they only have Earthly living things to test, but they had no false positives in their comparison set of samples, though some substances like beer tended to score unusually high in their spectral analysis. The theory is that when a mass spec ionizes a sample and routes it through magnetic and electric fields, the complexity of the original molecules is reflected in the complexity of the spray of molecular masses recorded by the detectors.

But what is “complexity” exactly? There are a great number of candidates, as Seth Lloyd notes in this little round-up paper that I linked to previously. Complexity intuitively involves something like a trade-off between randomness and uniformity, but also reflects internal repetition with variety. There is a mathematical formalism that in full attribution is “Solomonoff-Chaitin-Kolmogorov Complexity”—but we can just call it algorithmic complexity (AC) for short—that has always been an idealized way to think about complexity: take the smallest algorithm (in terms of bits) that can produce a pattern, and the length of that algorithm in bits is the complexity. So a uniform sequence will have low complexity because the algorithm is something like “10 print ‘a’; goto 10;” but a random sequence with no internal repetition will only be representable by an algorithm as long as the sequence itself: “10 print ‘aiofgeyfgequegatwnvurpynrqildw’.” Complex things have small generating algorithms whose outputs are much larger than the algorithms themselves and contain high levels of repetition with variation. Indeed, Ray Solomonoff in his original formulation hypothesized that the minimal generating algorithm among the sea of possible algorithms for a given sequence would also be an encoding of an ideal inference engine for that pattern, leading to a new way of thinking about inference and prediction from a mathematical perspective.
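True AC is uncomputable, but a standard computable stand-in is compressed size: the compressor plays the role of a (non-minimal) generating algorithm, giving an upper bound on complexity. A minimal sketch, with the three cases from the paragraph above (the strings and the zlib proxy are my illustration, not anything from AT):

```python
import random
import zlib

def ac_upper_bound(s: str) -> int:
    """Compressed size in bytes: a crude, computable upper bound on
    algorithmic complexity (true AC is uncomputable)."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# Uniform: the "10 print 'a'; goto 10;" case.
uniform = "a" * 1_000

# Random: no internal repetition for the compressor to exploit.
random.seed(0)
rand = "".join(random.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(1_000))

# Repetition with variation: a short motif that grows and resets.
patterned = "".join("abcde"[: i % 5 + 1] for i in range(400))

for name, s in [("uniform", uniform), ("patterned", patterned), ("random", rand)]:
    print(f"{name:9s} length={len(s):4d} compressed={ac_upper_bound(s)} bytes")
```

The uniform string compresses to a handful of bytes, the patterned one to slightly more, and the random one to something near its original size—the same ordering AC would assign.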

But why do the AT folks think that AC is insufficient for characterizing the complexity of living vs. nonliving things? In part it is because Assembly Theory requires that chemical structures be contingent on what is chemically possible, in the same way that evolutionary change is contingent on past changes and physical limitations. The general mathematical theory treats all sequences (or, more abstractly, structures) as possible. Is there a way to convert AC into something constrained by an evolutionary history (or a “pre-living” one, to invent a term)? Could we, for instance, assume an alphabet and a set of legal combining rules that limit the language to what is expressible by combinations of its terms, then apply AC to the possible productions of that language? Of course. In fact, a language like this is abstractly accepted by a Turing machine (depending on the rules), and the length of the minimal Turing machine that can generate any given production is exactly its AC.
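The constrained-language idea can be sketched with a toy example. Everything here is invented for illustration: a two-letter alphabet of “monomers” and a single combining rule (no “BB” adjacency) standing in for chemical constraints. Enumerating the legal productions shows how the rule shrinks the space of structures that AC would then range over:

```python
from itertools import product

ALPHABET = "AB"

def legal(s: str) -> bool:
    """Toy combining rule (an invented stand-in for chemical constraints):
    a 'B' monomer may never bond to another 'B'."""
    return "BB" not in s

def productions(n: int) -> list[str]:
    """All legal strings of length n under the rule."""
    return ["".join(p) for p in product(ALPHABET, repeat=n) if legal("".join(p))]

# The unconstrained space grows as 2^n; the legal language grows much slower.
for n in range(1, 8):
    print(f"n={n}: {2**n:3d} possible strings, {len(productions(n)):3d} legal")
```

The count of legal strings here follows the Fibonacci sequence (2, 3, 5, 8, …) rather than 2^n, so the constraint prunes an exponentially growing share of the space—a small-scale analogue of how chemical and evolutionary contingency restricts which structures can ever be assembled.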

But I don’t see why that criticism undermines AT as a practical tool for the exploratory examination of molecular forms on other planets. The mass spec approach uses existing technology already deployed in deep space exploration, and it is a physical method for characterizing the complexity of candidate life. AC brings little to the table.
