
Exploring the Concept of Reality as Information

[Image: Reflection of North Dome in Merced River, Yosemite]

What lies at the core of reality? Can everything we perceive be reduced to a singular concept? This question has intrigued philosophers since ancient times, leading to three predominant theories:

  • Materialism: This view posits that everything consists of matter. Einstein's revelation, E=mc², indicates that matter and energy are interchangeable, prompting a refined definition: all is made of matter-energy. A more nuanced version is naturalism, which asserts that the universe comprises solely matter, energy, and the laws governing them.
  • Idealism: This perspective claims that matter is merely an illusion; the true essence of existence lies in ideas, consciousness, or spirit. Variants include pantheism, which suggests that God permeates all, and panpsychism, which posits that consciousness is fundamental to everything.
  • Dualism: This belief holds that both matter and mind exist independently. For example, Christianity views God as the creator of the physical world, distinct from it, thus rejecting pantheism.

I propose an alternative viewpoint: everything that exists can be distilled down to information. This concept can be termed the Information Paradigm.

The Information Paradigm

With the rise of technology and the information era, the notion of information has become ubiquitous. We acquire it through music, movies, books, and education. Money itself can be viewed as information, a collection of numbers within banking systems, further emphasized by the emergence of cryptocurrencies.

While the Information Paradigm may resemble Idealism, it diverges significantly; unlike abstract concepts, information is quantifiable. The challenge of reconciling mind with matter is avoided when both are viewed as forms of information, which is just as tangible as matter or energy. Various physical entities, such as genes and entropy, can be interpreted as forms of information.

Materialism struggles to explain the existence of laws governing matter and energy since these laws are fundamentally informational. Conversely, dualism grapples with the challenge of how two disparate realms—mind and matter—interact. If both are seen as information, the interaction becomes coherent.

Moreover, non-material entities like books, songs, and software can also be classified as information. This idea was notably discussed in "The Self and Its Brain," co-authored by neuroscientist Sir John Eccles and philosopher Karl Popper, who defended dualism. Today, we recognize that these non-material forms fundamentally consist of information.

The Challenge of Defining Information

Before we can fully embrace the Information Paradigm, we must establish a cohesive definition of information that unifies all existence. Various definitions exist within disciplines like computer science and thermodynamics. The challenge lies in demonstrating that these diverse concepts converge on a singular understanding.

Traditionally, information was perceived as a human construct. However, contemporary scientific views regard it as an independent entity, much like matter and energy. We can quantify information in DNA and analyze its fate within black holes. We interact with information much as we do with matter and energy: it is something to be generated, stored, and used.

Defining information as a fundamental aspect of reality is fraught with difficulty. If we claim that X constitutes the essence of all existence, defining X invariably leads to circular reasoning. This predicament is similarly encountered in other philosophical frameworks, such as defining spirit in Idealism or God in Pantheism.

Materialism sidesteps this issue with clear definitions of matter and energy; however, it falters when addressing the nature of physical laws, which exist independently of matter and energy.

Some philosophers have shifted to Naturalism, proposing that ultimate reality consists of matter, energy, and natural laws. Ironically, this stance nudges us toward a form of dualism since these laws are distinct from matter and energy. Yet, if both can be reframed as information, we return to a unified framework.

Properties of Information

I anticipate that physics will soon yield a rigorous definition of information applicable across various scientific domains. In the meantime, I will outline several key properties of information:

  • It is measurable.
  • It defines connections or interactions among entities.
  • In a closed system, the amount of information can increase, not by emerging from nothing but through processes akin to algorithms, which operate under established rules.
  • Destroying information requires energy.
  • In thermodynamics, information equates to entropy.
  • Natural laws are forms of information, many of which are emergent and crafted by algorithms based on pre-existing laws.

[Slide: the origin of natural laws]
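The first property, measurability, can be made concrete with Shannon's entropy, which assigns a number of bits per symbol to a message based on how predictable its symbols are. A minimal sketch in Python (the messages are illustrative):

```python
from collections import Counter
from math import log2

def shannon_bits(message):
    """Average information per symbol, in bits (Shannon entropy)."""
    counts = Counter(message)
    n = len(message)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(shannon_bits("AAAA"))  # a fully predictable message carries 0 bits/symbol
print(shannon_bits("ABAB"))  # two equally likely symbols: 1 bit/symbol
print(shannon_bits("ABCD"))  # four equally likely symbols: 2 bits/symbol
```

The more varied and unpredictable the message, the more bits it takes to describe, which is the same intuition behind the entropy discussion later in this article.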

Physics: Information Transfer

The quantifiable nature of information suggests that a closed system cannot harbor infinite information. The Universe, being closed, likely contains a finite amount of information, which may clarify certain enigmas within Quantum Mechanics. As we delve into subatomic realms, we encounter systems with minimal information.

Heisenberg’s Uncertainty Principle might be interpreted as a consequence of a particle's finite information. When measuring a quantum system, the transfer of information occurs from a macroscopic measuring apparatus—rich in information—to a quantum system, which possesses so little information that its state remains undefined. This perspective alleviates the need for convoluted explanations involving multiple universes or consciousness's role at the quantum level.

Physicist John Archibald Wheeler first articulated the notion that Quantum Mechanics could be framed in terms of information with his phrase "it from bit":

> “It from bit. In other words, every particle, every force field, even the space-time continuum itself derives its function, meaning, and existence from binary choices, bits. This symbolizes the idea that everything physical has an immaterial source; reality arises from yes-no questions and the responses elicited by devices.”

Precise measurement of a point in space-time is impossible, since specifying a point exactly would require infinite information. Space-time must therefore be quantized, with the Planck length and Planck time setting the limits for distance and duration. Information cannot travel at infinite speed; it can traverse at most one Planck length per Planck time, which is precisely the speed of light.

In the framework of relativity, it isn't solely light or matter that must adhere to this limit; information itself is bound by it.
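The claim that one Planck length per Planck time equals the speed of light follows directly from the standard definitions of the Planck units. A quick numerical check in Python:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 s

# Their ratio is the speed of light: sqrt(c^2) = c by construction
print(planck_length / planck_time / c)
```

The algebra guarantees the result: dividing the two square roots cancels ħ and G and leaves exactly c.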

Thermodynamics: Information Equals Entropy

Consider a box filled with black and white balls. If the black balls are segregated to one side, the system conveys little information. In contrast, if the balls are mixed, we require extensive information to describe their arrangement. Thus, low information correlates with low entropy, while high information aligns with high entropy.

In thermodynamics, information and entropy are synonymous. To store information within a system, it should maintain low entropy. In a well-organized state, we can convey information by strategically placing balls. However, in a mixed state, conveying information necessitates considerable energy to rearrange them.

The limit of information storage in a system is dictated by its entropy. A high-entropy system cannot effectively store additional information. This principle parallels how we manage storage on computer hard drives; increasing available space requires the deletion of existing data, which demands energy.
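The box-of-balls picture can be quantified: the information needed to single out one arrangement is the logarithm of the number of microstates consistent with the description. A small Python sketch (the slot and ball counts are arbitrary):

```python
from math import comb, log2

def description_bits(n_slots, n_black):
    """Bits needed to specify one arrangement of n_black black balls
    among n_slots positions: log2 of the number of possible microstates."""
    return log2(comb(n_slots, n_black))

# Segregated: all 10 black balls confined to the 10 left-hand slots
print(description_bits(10, 10))  # 0 bits: only one microstate, fully determined
# Mixed: the same 10 black balls anywhere among all 20 slots
print(description_bits(20, 10))  # ~17.5 bits to specify which arrangement
```

The segregated (low-entropy) state needs almost no description, while the mixed (high-entropy) state needs many bits, mirroring the correspondence between information and entropy described above.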

Once systems become complex enough, they can act as algorithms that generate additional information. This process might account for the arrow of time: time's apparent one-way direction manifests only in macroscopic systems, since quantum systems lack this asymmetry.

Molecular Biology: Life as Information Processing

In my article "The Secret of Life," I emphasize that homeostasis—the stable chemical balance in living organisms—is crucial to understanding life. Unlike non-living entities, living organisms continuously react with their environment, maintaining a state far from chemical equilibrium.

Homeostasis is upheld by signaling networks, primarily through negative feedback, which correct deviations from balance. Additionally, the information in DNA guides cellular responses to environmental changes. Life can thus be conceptualized as a complex algorithm.

In 1943, physicist Erwin Schrödinger made groundbreaking observations regarding life during a lecture at Trinity College, later published in "What Is Life?". He proposed two revolutionary concepts:

  • Living beings store information in "aperiodic crystals," molecules where atoms are organized in varied sequences.
  • Life can be understood through the lens of entropy; organisms maintain their own low entropy by taking in energy and exporting entropy to their surroundings.

These ideas were experimentally validated: DNA was identified as the aperiodic crystal, and Ilya Prigogine later developed the notion of living beings as "dissipative structures," work that earned him the Nobel Prize in Chemistry in 1977.

Subsequent discoveries unveiled the intricacies of genetic algorithms. DNA is transcribed into mRNA, which is translated into protein chains according to genetic codes. Proteins function as nanomachines facilitating metabolic processes, and ongoing research continues to unravel the complexities of cellular signaling pathways.
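The transcription-and-translation pipeline described above can be sketched in a few lines of Python. The codon table here is a tiny excerpt chosen just for this example; the real genetic code has 64 entries:

```python
# Minimal codon table covering only this example (the full code has 64 codons)
CODON_TABLE = {
    "AUG": "Met", "UUC": "Phe", "GGU": "Gly", "UAA": "STOP",
}

def transcribe(dna):
    """DNA coding strand -> mRNA: replace thymine (T) with uracil (U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE[mrna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return "-".join(protein)

mrna = transcribe("ATGTTCGGTTAA")
print(translate(mrna))  # Met-Phe-Gly
```

The cell, of course, does this with molecular machinery rather than dictionaries, but the informational structure — a symbol stream decoded by a fixed lookup table — is the same.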

Evolutionary Biology: Evolution as an Algorithm

The concept of information processing can also elucidate evolution as an algorithm. This algorithm entails:

  1. Generating mutations in DNA.
  2. Translating mutations into proteins and behaviors.
  3. Testing these changes against environmental challenges.
  4. Discarding non-viable mutations.
  5. Retaining advantageous mutations.
  6. Repeating the process.
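The six steps above can be sketched as a toy genetic algorithm. The bitstring genome and the "environment" (a fixed target pattern) are illustrative stand-ins, not a model of real selection:

```python
import random

random.seed(0)  # reproducible toy run

GENOME_LEN = 20
TARGET = [1] * GENOME_LEN  # stand-in environment: fitness = bits matching it

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Step 1: generate mutations (random bit flips)
    return [1 - g if random.random() < rate else g for g in genome]

# Random starting population of 30 genomes
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]

for generation in range(200):
    # Steps 2-3: express each genome and test it against the environment
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    # Steps 4-5: discard the least fit, keep the best as parents
    parents = population[:10]
    # Step 6: repeat with a mutated next generation
    population = [mutate(random.choice(parents)) for _ in range(30)]

print(generation, fitness(population[0]))
```

Even this crude loop reliably climbs toward the target: random variation plus selective retention is enough to accumulate information about the environment in the genome.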

Stuart Kauffman created computer models to illustrate evolutionary algorithms, positing that organisms evolve by exploring the state space of compatible forms for survival. Such algorithms emerge as life diversifies through tighter metabolic control, leading to reproduction and the transmission of information beyond an organism's life.

Over time, organisms transitioned from random to directed mutations. While radiation and chemistry introduced randomness into DNA, organisms that allowed for more mutations ultimately outcompeted their more protective counterparts. A balance arose between conserving DNA and embracing beneficial mutations.

Some organisms discovered a clever strategy: maintaining paired sets of DNA, allowing mutations in one to be compensated by the other. This led to the development of sexual reproduction, enabling genetic diversity and adaptation.

Neuroscience: Mind as an Algorithm

The mutation-natural selection algorithm sustained life for eons, allowing for information extraction from the environment. However, this method proved inefficient. Certain species evolved sensory systems to swiftly adapt to environmental changes without relying on genetic translation.

As specialized organs emerged, such as the endocrine and nervous systems, information storage expanded beyond DNA to include neuronal memory. Evolution continued to explore the potential of larger brains and enhanced processing abilities.

Culture as an Algorithm

A limitation of brain-stored information is its inability to be inherited; it perishes with the individual. However, one species with large brains developed language, facilitating information sharing among its members. This advancement accelerated learning and allowed knowledge to accumulate across generations.

As society evolved, more efficient information storage and processing methods emerged—writing, books, mathematics, science, and computers.

A Nested Hierarchy of Information Systems

From the perspective of the Information Paradigm, existence can be understood as a nested hierarchy of information systems:

  • The physical realm encompasses particles, forces, planets, stars, and galaxies.
  • The chemical realm consists of atoms and molecules.
  • The biological domain includes cells, viruses, plants, and animals.
  • The psychological sphere pertains to minds.
  • The societal level encompasses cultures, economics, arts, and sciences.

This hierarchical structure adds complexity at each tier, with each level building upon and relying on its predecessor, much like layers of an onion. The properties of upper levels cannot be predicted solely from the lower levels; the characteristics of living systems cannot be deduced from physics and chemistry, nor can cultural properties be predicted from biology.

The Information Paradigm offers a comprehensive framework to understand existence. It circumvents the challenges of naturalism and dualism while presenting a scientifically definable concept of information, unlike panpsychism's ambiguous universal consciousness.

However, a lingering question remains: does information exist as a separate entity alongside matter and energy, or can matter and energy be defined as forms of information?