We Are Weak Chaos

Recent work in deep learning has been largely driven by the capacity of modern computing systems to run gradient descent over very large networks. We use gaming cards with GPUs, which excel at parallel processing, to perform the matrix multiplications and summations that are the primitive operations central to artificial neural network formalisms. Conceptually, another primary advance is the pre-training of networks as autocorrelators, which helps smooth out later “fine-tuning” training over other data. Some additional contributions, notable in impact, reintroduce the rather old idea of recurrent neural networks: networks with outputs attached back to inputs that create resonant kinds of running states within the network. The original motivation of such architectures was to emulate the vast interconnectivity of real neural systems and to capture a more temporal appreciation of data, where past states affect ongoing processing, rather than a pure feed-through architecture. Neural networks are already nonlinear systems, so adding recurrence just ups the complexity of figuring out how to train them. Treating them as black boxes and using evolutionary algorithms was fashionable for me in the 90s, though the computing capabilities just weren’t up to anything other than small systems, as I found out when chastised for overusing a Cray at Los Alamos.
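The feedback idea is simple enough to sketch in a few lines. Here is a toy recurrent step, with made-up weights rather than anything trained, just to show how a past input keeps echoing through the running state after the input itself goes quiet:

```python
import math

# Toy recurrent update: the hidden state h feeds back into the next step,
# so past inputs keep influencing the present. The weights are illustrative
# values chosen for the demo, not a trained network.

def rnn_step(h, x, w_hh=0.8, w_xh=0.5):
    """One recurrent step: the new state mixes the old state with the input."""
    return math.tanh(w_hh * h + w_xh * x)

h = 0.0
inputs = [1.0, 0.0, 0.0, 0.0, 0.0]  # a single impulse, then silence
states = []
for x in inputs:
    h = rnn_step(h, x)
    states.append(h)

# The impulse decays but persists in later states, long after the input
# stopped -- the "resonant running state" a feed-through network lacks.
print(states)
```

With the feedback weight below 1 the echo fades geometrically; push it higher, or add nonlinear interactions between many such units, and the running state can sustain itself or go chaotic, which is exactly what makes these networks hard to train.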

But does any of this have anything to do with real brain systems? Perhaps. Here’s Toker et al., “Consciousness is supported by near-critical slow cortical electrodynamics,” in Proceedings of the National Academy of Sciences (with the unenviable acronym PNAS). The researchers and clinicians studied the electrical activity of macaque and human brains in a wide variety of states: epileptics undergoing seizures, sleeping macaque monkeys, people on LSD, people under anesthesia, and people with disorders of consciousness. The electrical signals they captured were mathematically analyzed to see how regular versus chaotic they were. To do this they relied on a complex comparison between models of electrical activity and mathematical measures related to Kolmogorov complexity, like Lempel-Ziv complexity. These concepts are fairly simple at heart and reflect the size of the smallest generator algorithm for whatever data pattern is observed. A simple, repeated cycle in brain activity would have low complexity, but a completely random pattern would not be representable by any generator that isn’t as large as the pattern itself. The measures express the compressibility of data, with some edge that exists between regularity and runaway chaotic dynamics in the behaviors of the systems.
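The Lempel-Ziv idea can be made concrete in a few lines. The sketch below counts distinct phrases in the classic LZ76 parsing of a binary string (a simplified stand-in for the normalized variants used in the actual analyses): a repetitive signal parses into a handful of phrases, while an irregular one keeps producing new phrases, so the count tracks compressibility:

```python
def lz76_complexity(s):
    """Count distinct phrases in the LZ76 parsing of string s.

    Each phrase is extended for as long as it has already appeared
    earlier in the string; regular data reuses old phrases, irregular
    data keeps minting new ones.
    """
    i, c = 0, 0
    n = len(s)
    while i < n:
        l = 1
        # grow the phrase while it can be copied from earlier material
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

print(lz76_complexity("0" * 32))    # constant: minimal phrase count
print(lz76_complexity("01" * 16))   # simple cycle: still tiny
print(lz76_complexity("01101011100100010111010011001010"))  # irregular: larger
```

A pure periodic “brain rhythm” scores near the floor no matter how long it runs, while a noisy signal’s count grows with its length; the interesting regimes the paper cares about sit between those extremes.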

And the conclusion, which also follows for artificial recurrent neural networks, is that conscious states appear to be poised very near the edge of chaos. My underlying babble machine that I try to harness for writing poetry and prose is not at all surprised by this outcome. The work also helps explain the historical suggestion of madness and creativity conjoined in both constructive and destructive patterns. There are some especially interesting notes in the work, like the focus on interneurons as mediators of the regularity of firing across neural systems. So, when we sleep, we pulse; then we wake into a storm of weak chaos.
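For a feel of what “edge of chaos” means, the logistic map is the standard toy (my illustration, not the cortical model the paper actually fits): below roughly r ≈ 3.57 its orbits settle into regular cycles, and above it they wander chaotically, with the transition crowded into a narrow band of the parameter:

```python
# Logistic map x -> r * x * (1 - x): regular cycles for small r,
# chaos for large r, with the "edge" near r ~= 3.57.
# Parameter choices here are illustrative, not from the paper.

def orbit(r, x=0.2, burn=500, keep=64):
    """Iterate past the transient, then record `keep` rounded values."""
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 6))  # round so repeated values are detectable
    return out

regular = orbit(3.2)   # settles onto a 2-cycle: only two distinct values
chaotic = orbit(3.9)   # keeps visiting new values, never repeating
print(len(set(regular)), len(set(chaotic)))
```

A system parked deep in the regular regime just pulses; one deep in the chaotic regime is pure storm. The paper’s claim is that waking cortex hovers near the boundary between the two.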
