A model of the Universe

Published by Catherine Toulsaly

The naturalist can never be its concept. Once it has become so, it is most likely an idealist. Sentiment, backward looking, attempting to collect what a constant collision has distributed to history; the natural idealist is what it means to be living-torn-apart.

Adam Staley Groves

 

Writing is for me an irrepressible and fragmented journey, the purpose of which is still a mystery. The process may involve picking up breadcrumbs dropped along the way in the form of ideas or images. The trail won't bring me back to where I started; where it leads, I can't yet tell, to places I don't expect. The intertwined, curvy branches of the dead oak tree whisper a pareidolic message in my ear. I see seeds of truth in a passing cloud, in mirroring rays of sunshine, in petals of a blossoming redbud. Could a computational model of the Universe recreate the language spoken by nature, by a massive siphonophore deep in the blue waters off the coast of Australia, by bottlenose dolphins swimming downstream in the Chesapeake Bay, and by the rainbows blown by whales?

Cygnus Loop Nebula (NASA/JPL-Caltech)

 

Visual encounters nurture my mind. From the blue Cygnus Loop Nebula to Wolfram's spatial hypergraph, I see web-like monster structures drawn with three strokes crisscrossing each other endlessly, creating a causal chain of mazes and labyrinths in a seamless flow of information. Through that flow, causation drives the seen and unseen physical motion of an architectonic structure built on the principles of space, time, and gravity.

 

Structures in the Universe are entangled and organized from large scales to small ones. The distribution of matter in space includes superclusters, roughly spherical galaxy clusters, elongated structures of matter named filaments, planar structures dubbed walls, which are more extended than filaments and have overall lower densities (though some shorter filaments are denser and brighter than longer ones), and the voids in between.

 

Computational representations could, in theory, decipher the language used by the Universe. The spatial hypergraph may be described as "lines" connecting any number of points, not just one point to another. In the constant flow of images and narratives running through my head, it reminds me of the structure of a symmetry group that repeats itself. The points of intersection are sequences in the film of events in spacetime. They embody the primary ontology of the Universe the same way those images and narratives embody the stream of my consciousness.

Spatial Hypergraph
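To make the hypergraph idea a bit more concrete, here is a minimal Python sketch, not the Wolfram Physics Project's actual rules or code: hyperedges are tuples that may join any number of nodes, and a toy rewriting rule (my own illustrative choice) repeatedly keeps each two-node edge and adds a new edge to a freshly created node, so the structure grows step by step the way updating events grow the spatial hypergraph.

```python
# Toy spatial hypergraph: hyperedges are tuples that may join any number of
# nodes, not just pairs as in an ordinary graph.
hypergraph = [(1, 2, 3), (3, 4), (4, 5, 1)]

def rewrite(edges, next_node):
    """Toy rewriting rule (illustrative only): every 2-node edge (x, y) is
    kept and a new edge (y, z) is added, where z is a freshly created node."""
    new_edges = []
    for edge in edges:
        new_edges.append(edge)
        if len(edge) == 2:
            _, y = edge
            new_edges.append((y, next_node))
            next_node += 1
    return new_edges, next_node

edges, fresh = hypergraph, 6
for step in range(3):  # a few update steps; the hypergraph grows each time
    edges, fresh = rewrite(edges, fresh)
    print(f"step {step + 1}: {edges}")
```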

 

The Wolfram Physics Project is meant to add a new substrate in an attempt to fit together all existing knowledge about physics. It is intended to be an underlying theory of the whole Universe, in perfect detail. The issue is precisely how challenging it would be to mirror every detail of the cosmos. A plethora of theories have scattered seeds of truth in the past, seeds that computer scientists, mathematicians, and theoretical physicists are attempting to collect. Truth is a participative goal, a collaborative journey. Such an undertaking would benefit from the inputs of previous and current simulations and computer models.

 

Given that most of the Universe is invisible and 95% of its contents consist of dark matter and dark energy, which we do not yet understand, I imagine that other projects, such as EDECS (Exploring Dark Energy through Cosmic Structures), could be of interest. 3-D models for sections of the Universe, such as the model of the Huygens Region of the Orion Nebula, could be combined to create a puzzle-like computational outline of the Universe. Other inputs to this collaboration could include what we have learned from simulations of early planetary migration in our own solar system, or from models of the magnetic fields of the Earth and the Sun.

 

We are learning that metal-enriched material, ejected by supernovae and stellar feedback during galaxy formation, most vigorously roughly 10 billion years ago and at a decreasing rate toward the present day, is deposited into the circumgalactic medium. In a paper last year entitled Voyage through the Hidden Physics of the Cosmic Web, Aurora Simionescu presented a new mission concept for a Cosmic Web Explorer expected to reach unprecedented X-ray sensitivity limits in order to gather data on the variability of gas and plasma conditions between, around, and within galaxies, on specific measurements of metallicity, and on the traceability of light and heavy elements. By measuring how far metals are spread, how many metals escape the halo of their host galaxy, and when this process occurs, and by determining the relative chemical composition of various light and heavy elements, we could in theory map out the chemical evolution of the Universe as a whole.

Cosmic microwave background (JPL/ESA and the Planck Collaboration)

 

Computational simulations bring me back to Franco Vazza and how he used comparisons to describe the evolution of gas in the cosmic web and estimated the total statistical complexity within the observable Universe. As I mentioned before, he stated that the combination of Information Theory and modern cosmological simulations makes it possible to tackle a question as challenging as the complexity of the Universe we live in. But the Universe, wrote Carlo Rovelli in his book The Order of Time, is like a superposition of strata that influence and overlap each other. It is not just the flow of things -- gas and plasmas -- but the things that make those flows: their intrinsic complexity and wide range of densities, the granularity that allows them to manifest in the form of elementary particles or quanta.
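As a toy illustration of what pairing information theory with a simulation can mean (this is not Vazza's actual pipeline; the mock density cube, bin count, and random seed below are my own assumptions), one can bin the densities of a simulated gas field and compute the Shannon entropy of the resulting distribution:

```python
import numpy as np

# Toy illustration (not Vazza's actual method): bin the densities of a mock
# gas field and compute the Shannon entropy of the resulting distribution.
rng = np.random.default_rng(42)
density = rng.lognormal(mean=0.0, sigma=1.5, size=(64, 64, 64))  # mock density cube

counts, _ = np.histogram(np.log10(density), bins=128)
p = counts / counts.sum()
p = p[p > 0]                                # ignore empty bins
entropy_bits = -(p * np.log2(p)).sum()      # Shannon entropy in bits per cell

print(f"about {entropy_bits:.2f} bits per cell in this mock density field")
```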

 

Those multiple flows within the flow -- streams of electrons, baryons, photons, and neutrinos -- make for the irregular, complex, and dynamically evolving structure of spacetime. Deep in the interaction between gravitational and electromagnetic fields, within the texture of time and space and the web of a complex geometry, are the things that all those superposed layers are made of. A complete picture could be achieved only if all the variables are fed, bit by bit, into such a project.

 

A 2017 paper also highlighted the need for magnetohydrodynamical cosmological simulations and ultra-high-energy cosmic-ray simulations in order to better understand turbulence, magnetic fields, and cosmic-ray evolution. Future radio observations will offer the chance to measure the magnetization at the outskirts of clusters and in the filaments that connect them. Another paper, published in 2018 on the current status of astrophysical and cosmological simulations of magnetic field amplification in structure formation, addressed the successes and limitations of numerical models for predicting extragalactic magnetic fields.

Randomness stands in different, antipodal relationships to determinism, computability, and compressibility. Factoring it into computational data remains extremely complex. Nevertheless, some support the idea that randomness is a result of our own ignorance and of the limitations of our human condition. Based on the principle of computational equivalence and a deterministic approach to a theory of everything, a model of the Universe could in theory predict layers and layers of seemingly random events. If one could run the model long enough, it is intended to reproduce everything about the Universe. Computing power, including quantum computing in the future, could crack a code that has been eluding us all along, but only if the necessary amount of computation can be carried out. Indeterminacy is where the mystery still lies.
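A minimal sketch of how a purely deterministic rule can generate apparent randomness is Wolfram's Rule 30 cellular automaton, whose center column looks random enough to have been used as a pseudorandom sequence; the short Python below (the width and number of steps are arbitrary choices of mine) prints its triangular pattern from a single seed cell:

```python
# Rule 30: a one-dimensional cellular automaton whose deterministic update
# rule nonetheless produces a center column that appears random.
def rule30_step(cells):
    n = len(cells)
    # new cell = left XOR (center OR right), with wraparound at the edges
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def run_rule30(width=81, steps=40):
    row = [0] * width
    row[width // 2] = 1          # start from a single black cell in the middle
    history = [row]
    for _ in range(steps):
        row = rule30_step(row)
        history.append(row)
    return history

if __name__ == "__main__":
    for row in run_rule30():
        print("".join("#" if c else "." for c in row))
```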

 
