The Universe’s Hidden Drive: A New Law of Increasing Complexity
By Apirate Monk
In the summer of 1950, over lunch at Los Alamos National Laboratory, the physicist Enrico Fermi posed a question that would echo through decades of scientific inquiry: If intelligent alien civilizations exist, why haven’t we seen them? The cosmos is vast and ancient—13.8 billion years old, with countless stars and planets. Surely, Fermi reasoned, some advanced societies should have had time to spread across the galaxy. So, where are they? This query, now known as the Fermi Paradox, has haunted scientists and philosophers alike, spawning answers ranging from the grim (civilizations self-destruct) to the humbling (intelligent life is vanishingly rare).
But what if the universe itself is wired to make complexity—not just life, but intricate systems of all kinds—inevitable? A bold new hypothesis, proposed by an interdisciplinary team led by mineralogist Robert Hazen and astrobiologist Michael Wong of the Carnegie Institution for Science, suggests exactly that. Published in the Proceedings of the National Academy of Sciences in October 2023, their work posits a “missing law” of nature: a universal principle that drives systems, from stars to cells to societies, toward ever-greater complexity over time. This law of increasing functional information, they argue, could reshape our understanding of evolution, time, and the very fabric of the cosmos.
The Arrow of Complexity
The universe, at its core, is a story of transformation. Moments after the Big Bang, it was a searing soup of undifferentiated energy. As it cooled, quarks coalesced into protons and neutrons, which fused into the nuclei of hydrogen and helium. Stars ignited, forging heavier elements like carbon and oxygen in their fiery cores. Supernovae scattered these elements across space, seeding the raw materials for planets, minerals, and, eventually, life. Each step seems to build on the last, creating systems that are not just different but more intricate, more organized, more capable of doing something remarkable.
This progression feels intuitive, yet science has struggled to explain it. The second law of thermodynamics—the unyielding rule that entropy, or disorder, increases in closed systems—seems to pull in the opposite direction. Eggs crack, ice melts, stars burn out. So why do we see galaxies, ecosystems, and civilizations emerge? “The second law alone doesn’t account for the richness we observe,” Wong told me. “It describes a universe marching toward equilibrium, but we see systems that defy that trend, becoming more ordered, more functional.”
Hazen and Wong’s hypothesis seeks to bridge this gap. They propose that alongside the second law, another principle operates: a law of increasing functional information. This law suggests that systems evolve by accumulating configurations that perform specific functions—whether that’s a mineral crystallizing in a volcanic vent, a protein folding to catalyze a reaction, or a society developing language. These functions, selected by environmental pressures, drive systems toward greater complexity. “It’s not that entropy is wrong,” Hazen said. “Our law works in harmony with it, capturing a different kind of order.”
The Law of Increasing Functional Information
Core Concept
The law posits that natural systems evolve to states of greater complexity through selection for function. Functional information measures how rare the configurations of a system that can perform a specific task are: the fewer configurations that can do the job, the higher the functional information, and the more specialized and complex the system.
Key Examples
Minerals: Earth’s mineral diversity has grown from a handful of simple crystals 4.5 billion years ago to over 5,000 distinct types today, driven by geological processes selecting for stable or persistent forms.
Elements: The universe progressed from hydrogen and helium to heavier elements via stellar nucleosynthesis, increasing nuclear complexity.
Biology: Life evolves through natural selection, with organisms developing intricate structures (e.g., multicellularity, nervous systems) that enhance survival and reproduction.
Implications
Cosmic Evolution: Evolution isn’t exclusive to biology but applies to stars, planets, and even artificial systems like AI.
Time’s Arrow: The law suggests a second arrow of time, parallel to entropy’s, along which complexity increases.
Astrobiology: Signs of selection for function (e.g., unexpected molecular distributions) could be biosignatures on other worlds.
Challenges
Quantification: Functional information is contextual and hard to measure precisely, especially for complex systems like cells.
Testability: Critics argue the law’s predictions are too broad to be rigorously tested in controlled experiments.
A New Kind of Information
The idea of functional information, first articulated by biologist Jack Szostak in 2003, is central to this hypothesis. Unlike description-length measures of complexity, which ask how concisely a sequence can be specified, functional information focuses on what a system does. Take an RNA molecule: its functional information depends on how many other RNA molecules of the same length can perform the same task, like binding to a target. If only a few can do it, the molecule has high functional information—it’s specialized, complex, and rare.
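Szostak’s measure can be stated in a few lines. Functional information is the negative log of the fraction of all possible configurations that achieve the function; the helper name and the toy numbers below are mine, chosen only to illustrate the arithmetic:

```python
import math

def functional_information(n_functional, n_total):
    """Szostak's measure: I = -log2(F), where F is the fraction of all
    possible configurations that achieve the function."""
    return -math.log2(n_functional / n_total)

# Toy example: among all RNA sequences of length 10 (4**10 = 1,048,576),
# suppose only 16 bind the target. The function is then worth 16 bits.
print(functional_information(16, 4 ** 10))  # → 16.0
```

The rarer the working configurations, the more bits of functional information they carry—which is why a highly specialized molecule scores high even if it looks no more elaborate than its neighbors.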
Hazen stumbled upon Szostak’s concept while pondering the origin of life. As a mineralogist, he was fascinated by how chemical reactions on mineral surfaces might have sparked the first biomolecules. “I realized that separating life from nonlife was a false dichotomy,” he said. “There’s a continuum, a drive toward complexity that applies to both.” In 2007, he and Szostak ran computer simulations showing that algorithms evolving to perform computational tasks increased in functional information over time. The idea lay dormant until Wong joined Hazen’s team in 2021, bringing fresh perspectives from planetary science and astrobiology.
Together, they assembled a diverse crew—philosophers, physicists, data scientists—to tackle the problem. “We needed to check each other’s biases,” Wong said. “This isn’t just a question for one field.” Their 2023 paper argues that functional information increases through three universal mechanisms: static persistence (stable configurations endure), dynamic persistence (self-reinforcing cycles maintain themselves), and novelty generation (new configurations arise and are selected). These processes, they claim, govern everything from the formation of quartz crystals to the emergence of human culture.
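The interplay of persistence and novelty generation is easy to see in a toy model in the spirit of the 2007 Hazen–Szostak simulations (the details below—bit strings, a count-the-ones “function,” the mutation rate—are my own simplifications, not their setup). Configurations that work persist; mutated copies supply novelty; selection does the rest:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

L, POP, GENS = 20, 100, 40

def fitness(s):
    # Toy "function": count of 1-bits, standing in for any selectable task.
    return sum(s)

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
initial_best = max(fitness(s) for s in pop)

for _ in range(GENS):
    # Persistence: the better-functioning half of the population endures.
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]
    # Novelty generation: mutated copies supply new configurations.
    children = [[bit ^ (random.random() < 0.02) for bit in s]
                for s in survivors]
    pop = survivors + children

best = max(fitness(s) for s in pop)
print(initial_best, "->", best)  # function improves under selection
```

Because survivors are carried over unchanged, the best score never declines, and over the run it climbs toward the maximum—the population accumulates configurations that perform the function, which is precisely the trend the proposed law describes.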
Evolution Everywhere
The implications are staggering. If Hazen and Wong are right, evolution isn’t a quirk of biology but a cosmic principle. Stars evolve, becoming chemically richer as they forge heavier elements. Minerals evolve, with Earth’s mineral diversity ballooning over billions of years. Even human-made systems, like economies or AI, might follow this law, selecting for functions that enhance efficiency or adaptability. “Evolution is everywhere,” Wong said. “Life is just one vivid case.”
This view challenges the traditional narrative of biological evolution as a one-off phenomenon. Charles Darwin described natural selection as a mechanism for life’s diversification, but Hazen and Wong see it as a special instance of a broader rule. “Darwinian evolution is a subset,” Hazen said. “Selection for function applies to stars, atoms, minerals—anything where configurations compete and persist.”
Consider minerals: Earth’s early crust hosted a few dozen simple crystals. Today, there are over 5,000 mineral species, shaped by geological processes like volcanism and plate tectonics. Some minerals, like quartz, are stable and abundant; others, less stable, persist in specific niches. This selective process mirrors biological evolution, where traits are favored by environmental pressures. Similarly, the universe’s chemical complexity grew from hydrogen and helium to the 118 known elements, each step driven by stellar processes that “selected” for stable nuclei.
The Critics’ Case
Not everyone is convinced. Critics argue that the law of increasing functional information is too vague to be a true law of nature. “It’s an interesting idea, but I’m not sure it clears the bar,” said astronomer Martin Rees in an interview with The Guardian. “The complexity of the inanimate world emerges from physics and chemistry over vast timescales, not a new principle.”
One major sticking point is measurement. Functional information is contextual—what a system does depends on its environment. A protein’s function in a cell differs from its role in a test tube. Calculating functional information for a single cell, let alone a mineral or a star, is currently impossible. “I’d love to see an experiment that tests this objectively,” said Sara Walker, a physicist at Arizona State University who studies complexity through her assembly theory. “Without that, it’s hard to say if it’s right or wrong.”
Others question whether the law aligns with the second law of thermodynamics. “The second law is inviolable,” wrote Philip Ball in Quanta Magazine. “Proposing a law that seems to counter it, even if it claims harmony, invites skepticism.” Critics like Ball point out that while local systems can become more ordered (like a crystal forming), the overall entropy of the universe still increases. Hazen counters that their law doesn’t violate thermodynamics but describes a parallel process where selection for function creates pockets of complexity.
A Cosmic Perspective
Despite the doubts, the hypothesis is sparking excitement. Stuart Kauffman, a complexity theorist at the University of Pennsylvania, called it a “legitimate” step toward a grand narrative of nature. “They’re asking the right questions,” he said. “Physics alone can’t predict the novelties evolution introduces.”
The law also offers practical applications. In astrobiology, Wong suggests looking for signs of selection—say, an overabundance of certain molecules on an exoplanet—as evidence of lifelike processes. In oncology, researchers like Frédéric Thomas see parallels in how cancer cells evolve, selecting for functions that enhance survival. Even AI, with its rapidly evolving algorithms, might be governed by this principle, raising questions about how artificial systems could shape our future.
Perhaps the most profound implication is philosophical. If complexity is inevitable, Fermi’s Paradox takes on new light. Intelligent life might not be a fluke but a natural outcome of the universe’s drive toward complexity. “If our law holds,” Wong said, “complex life should be common, even expected.” This doesn’t mean aliens are knocking, but it suggests the cosmos is primed to produce intricate systems, from minds to machines.
Echoes of Thermodynamics
The debate over this new law feels like a replay of the early days of thermodynamics, when scientists grappled with heat, work, and the arrow of time. Back then, questions about steam engines led to profound insights about the universe’s fate. Today, questions about complexity could do the same. “There’s a sense that something big is afoot,” said Ball. “We’re converging on ideas about information, evolution, and purpose that could redefine how we see reality.”
Hazen and Wong’s work is just a beginning. They’re planning studies on mineral evolution, nucleosynthesis, and computational models to test their ideas. Whether their law holds up or not, it’s forcing us to rethink the universe as a place not just of decay but of creation. The cosmos, it seems, doesn’t just wind down—it builds up, crafting ever more intricate tapestries of matter, energy, and information.
As I spoke with Hazen, he leaned forward, his eyes bright. “We’re not saying we’ve cracked it,” he said. “But we’re pointing to a path—a way to understand why the universe looks the way it does.” In a world obsessed with entropy’s relentless march, that’s a hopeful thought: that the universe, in its deepest workings, might be conspiring to make things not just possible, but extraordinary.
This story draws on research published in the Proceedings of the National Academy of Sciences and reporting from Wired and Quanta Magazine.