Category: Cosmology

Gravity and the Dark Star

Totality in Nebraska

I began at 5 AM from the Broomfield Aloft hotel, strategically situated in a sterile “new urban” office-park-cum-apartment-complex along the connecting freeway between Denver and Boulder. The whole weekend was fucked in a way: colleges across Colorado were moving in for a Monday start, half of Texas was here already, and most of Colorado planned to head north to the zone of totality. I split off I-25 around Loveland and had success using US 85 northbound through Cheyenne. Continuing up 85 was the original plan, but that fell apart when 85 came to a crawl in the vast prairie lands of Wyoming. I dodged south and east, then (dodging will be a continuing theme), and entered Nebraska’s panhandle with middling traffic.

I achieved totality on schedule north of Scottsbluff. And it was spectacular. A few fellow adventurers were hanging out along the outflow lane of an RV dump at a state recreation area. One guy flew his drone around a bit. Maybe he wanted B-roll for other purposes. I got out fast, but not fast enough, and dodged my way through lane closures designed to provide access from feeder roads. The Nebraska troopers were great, I should add, always willing to wave to us science and spectacle immigrants. Meanwhile, SiriusXM spewed various Sibelius pieces that had “sun” in their names, while the Grateful Dead channel gave us a half dozen versions of Dark Star, the quintessential jam song dating to the band’s early, psychedelic era.

Was it worth it? I think so, though one failed dodge that left me in a ten-mile bumper-to-bumper crawl in rural Nebraska with a full bladder tested my faith in the stellar predictability of gravity. Gravity remains an enigma in many ways, though the perfection of watching the corona flare around the black hole sun shows just how unenigmatic it can be in the macroscopic sphere.

But reconciling gravity with quantum-scale phenomena remains remarkably elusive and was the beginning of the decades-long detour through string theory, which, admittedly, some have characterized as “fake science” due to our inability to find testable aspects of the theory. Yet there are some interesting recent developments that, though they are not directly string theoretic, have a relationship to the quantum symmetries that, in turn, led to stringiness.

So I give you Juan Maldacena and Leonard Susskind’s suggestion that ER = EPR, that Einstein-Rosen bridges (wormholes) and Einstein-Podolsky-Rosen entanglement are two descriptions of the same phenomenon. This is a rather remarkable conclusion that unites quantum and relativistic realities, but it is based on a careful look at the symmetry between two theoretical outcomes at two different scales. So how does it work? In a nutshell, the claim is that quantum entanglement is identical to relativistic entanglement. Just like the science fiction idea of wormholes connecting distant things together to facilitate faster-than-light travel, ER bridges connect singularities like black holes together. And the correlations that occur between black holes are just like the correlations between entangled quanta. Neither is amenable to FTL travel or signaling, due to Lorentzian traversability issues (the former) or Bell’s Inequality (the latter).
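The Bell-type correlations mentioned here can be seen with a little back-of-envelope arithmetic. A minimal sketch (my own illustration, not from Maldacena and Susskind): for a singlet pair of entangled quanta, quantum mechanics predicts measurement correlations E(a, b) = -cos(a - b) for detector angles a and b, and with the standard CHSH angle choices the combined statistic exceeds the classical bound of 2 without permitting any signaling.

```python
import math

# CHSH statistic for an entangled singlet pair.
# Quantum prediction for the correlation at detector angles a and b:
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices for the two observers.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# Classical (local hidden variable) theories bound this combination by 2;
# the quantum value reaches 2*sqrt(2), the Tsirelson bound.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2) > 2
```

Each observer’s marginal statistics stay 50/50 regardless of the other’s settings, which is why the correlation, however striking, carries no usable signal.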

Today was just a shadow, classically projected, maybe just slightly twisted by the gravity wells, not some wormhole wending its way through space and time. It is worth remembering, though, that the greatest realization of 20th-century physics is that reality really isn’t in accord with our everyday experiences. Suns and moons kind of are, briefly and ignoring fusion in the sun, but reality is almost mystically entangled with itself: a collection of vibrating potentialities that extend out everywhere and that, unexpectedly, are connected together in another way that defies these standard hypothetical representations and the very notion of spatial connectivity.

Quantum Field Is-Oughts

Sean Carroll’s Oxford lecture on Poetic Naturalism is worth watching (below). In many ways it just reiterates several common themes. First, it reinforces the is-ought barrier between values and observations about the natural world. It does so with particular depth, though, by identifying how coarse-grained theories at different levels of explanation can be equally compatible with quantum field theory. Second, and relatedly, he shows how entropy is an emergent property of atomic theory and the interactions of quantum fields (which we think of as particles much of the time) and, importantly, that we can project the same notion of boundary conditions that gives rise to entropy into the future, resulting in a kind of effective teleology. That is, there can be boundary conditions for the evolution of large-scale particle systems that form into configurations we can label purposeful or purposeful-like. I still like the term “teleonomy” to describe this alternative notion, but the language largely doesn’t matter except as an educational tool for distinguishing it from the semantic embeddings of old scholastic monks.

Finally, the poetry aspect resolves in value theories of the world. Many are compatible with descriptive theories, and our resolution of them is through opinion, reason, communications, and, yes, violence and war. There is no monopoly of policy theories, religious claims, or idealizations that hold sway. Instead we have interests and collective movements, and the above, all working together to define our moral frontiers.


Entanglement and Information

Research can flow into interesting little eddies that cohere into larger circulations that become transformative phase shifts. That happened to me this morning between a morning drive in the Northern California hills and departing for lunch at one of our favorite restaurants in Danville.

The topic I’ve been working on since my retirement is whether there are preferential representations for optimal automated inference methods. We have this grab-bag of machine learning techniques that use differing data structures but that all implement some variation on fitting functions to data exemplars; at the most general they all look like some kind of gradient descent on an error surface. Getting the right mix of parameters, nodes, etc. falls to some kind of statistical regularization or bottlenecking for the algorithms. Or maybe you perform a grid search in the hyperparameter space, narrowing down the right mix. Or you can throw up your hands and try to evolve your way to a solution, suspecting that there may be local optima that are distracting the algorithms from global success.
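The “gradient descent on an error surface” picture that all these techniques share can be sketched in a few lines. This is a minimal illustration with made-up data and a made-up learning rate, fitting a single slope parameter by stepping downhill on the mean squared error:

```python
# Sketch: fitting y = w*x to exemplars by gradient descent on squared error.
# Data points and learning rate are invented for illustration.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) exemplars

def loss_grad(w):
    # Derivative of mean squared error: d/dw of sum((w*x - y)^2)/N
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

w = 0.0
for _ in range(200):
    w -= 0.05 * loss_grad(w)  # step downhill on the error surface

print(round(w, 2))  # converges to the least-squares slope (about 2.04)
```

Grid search and evolutionary methods differ mainly in how they explore this same surface when gradients are unavailable or riddled with local optima.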

Yet algorithmic information theory (AIT) gives us, via Solomonoff, a framework for balancing the parameterization of an inference algorithm against the error rate on the training set. But, first, it’s all uncomputable and, second, the AIT framework just uses binary strings as the coded Turing machines, so I would have to enumerate 2^N bit strings and test each representation to get anywhere with the theory. Still, I and many others have had incremental success using variations on this framework, whether via Minimum Description Length (MDL) principles, its first cousin Minimum Message Length (MML), or other statistical regularization approaches that are somewhat proxies for these techniques. But we almost always choose a model (ANNs, compression lexicons, etc.) and then optimize the parameters around that framework. Can we do better? Is there a preferential model for time series versus static data? How about for discrete versus continuous?
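The two-part MDL trade-off can be made concrete with a toy scoring function. This is my own simplified illustration, not Solomonoff’s uncomputable ideal: total cost is the bits needed to state the model plus the bits needed to encode the data’s residual errors given the model, with the per-parameter cost and residual code invented for the example.

```python
import math

# Toy two-part MDL score: model bits + data bits given the model.

def model_bits(n_params, bits_per_param=16):
    # Crude assumption: each parameter costs a fixed number of bits to state.
    return n_params * bits_per_param

def data_bits(residuals, precision=0.1):
    # Idealized code length for residuals quantized at a fixed precision.
    return sum(math.log2(1 + abs(r) / precision) for r in residuals)

def mdl_score(n_params, residuals):
    return model_bits(n_params) + data_bits(residuals)

# A simple model with modest errors can beat an overfit one that fits exactly:
simple = mdl_score(2, [0.3, -0.2, 0.4, -0.1])
overfit = mdl_score(20, [0.0, 0.0, 0.0, 0.0])
print(simple < overfit)  # True: 20 parameters cost more bits than the errors saved
```

Regularization penalties in everyday machine learning play roughly the role of the `model_bits` term here.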

So while researching model selection in this framework, I came upon a mention of Shannon’s information theory and its application to quantum decoherence. Of course I had to investigate. And here is the most interesting thing I’ve seen in months, from the always interesting Max Tegmark at MIT:

Particles entangle, and then quantum decoherence causes them to shed entropy into one another during interaction. Most interesting, though, is the quantum Bayes’ theorem section around 00:35:00, where Shannon entropy as a classical measure of improbability gets applied to quantum indeterminacy through this decoherence process.
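Shannon entropy itself is a one-liner, and the probabilities below are invented purely to illustrate the decoherence intuition: a maximally uncertain two-outcome measurement carries one full bit, while a distribution sharpened toward near-certainty carries almost none.

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits: the classical measure of improbability.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 50/50 two-outcome measurement carries one full bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A near-certain outcome carries almost none:
print(shannon_entropy([0.99, 0.01]))  # about 0.081
```

The appeal of the quantum Bayes treatment is that this same classical measure tracks how indeterminacy drains away as decoherence proceeds.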

I’m pretty sure it sheds no particular light on the problem of model selection, but when cosmology and machine learning issues converge it gives me mild shivers of joy.

A, B, C time!

This might get technical, despite the vaguely Sesame Street quality of the title. You see, philosophers have long worried over time and causality, and rightly so, going back to Greeks like Heraclitus and Parmenides, as well as their documenters many years later. Is time a series of events one after another, or is that a perceptual mistake? For if everything comes from some cascade of events that precede it, it is illogical to presume that something might emerge from nothing (Parmenides). And, contra, perhaps all things are in a state of permanent change and all such perceptions are confused (Heraclitus). The latter finds an opaque echo in the Einsteinian relativistic move of combining space and time together while still preserving the symmetry of time in the basic equations, allowing the space-time picture to be rolled forward and backward without much in the way of consequences.

So Lee Smolin’s re-injection of time as a real phenomenon in Time Reborn takes us from the A and B theories of time to something slightly new, which might be called a C theory. This theory builds on Smolin’s previous work, where he proposed an evolutionary model of cosmology to explain how the precarious constants of our observed universe might have come into being. In Smolin’s super-cosmology, many universes come to be and not be at an alarming rate. Indeed, perhaps in every little black hole is another one. But many of these universes are not very viable because they lack the physical constants needed to last a long time and for entities like us to evolve to try to comprehend them. This does away with any mysteries about the Anthropic Principle: we are just survivors.

Smolin’s new work has some other rather interesting temporal consequences that are buried behind a wall of revisited thermodynamic reasoning: there are actually only a few basic particles that become other instances as they evolve over time. Because they are still connected together at a fundamental level, these particles are entangled in a collection of what we call forces, but as time unwinds, they become increasingly differentiated. Time piles on history, and the disparate trajectories are distinct enough that they become the arrow of time that thermodynamic evolution dictates.

Interestingly, in Smolin’s universe, time piles up into consistencies that we interpret as physical law. Physical law does not pre-exist per se, but is a consequence of the mighty machinations that emerge from a universe in a state of change, perhaps like that of Heraclitus, but also like that of Leibniz and Husserl. The pervasiveness of the notion of randomization and selection as an alternative to static views of the universe is interesting because it also raises the question of what else could possibly explain what we observe. Is there a post-Darwin “crane” that can lift the universe, or are we at the end of big science?

Cosmologies and Theories of Everything

Zach, fictional though he is, is not the only one interested in cosmological theories. But what form do these theories take? A Theory of Everything or TOE is a theory that intends to explain the entire observable universe using a compact specification of equations and the conceptual arguments that support them. In the modern sense, a TOE is a physical explanation of the large-scale structure of the universe. Later, we can start to expand the TOE to look for “bridging laws” that help justify other phenomena that approach the human scale.

What are our alternatives? The previous post mentioned the Catholic Church’s embrace of Big Bang cosmology as justifying Genesis. Apologist and philosopher of religion William Lane Craig also elaborately evaluates Big Bang theories as substantiating theism by supporting creation at the singularity event.

But do these notions change the underlying TOEs? No, in general. The best that they can do is accept the TOE as an input and make a deductive argument based on assumptions that are not excluded by the TOE. For apologists, that means that the singularity event provides a divide between a non-temporal pre-universe and the current universe–effectively between non-existence and existence. But that is not the only TOE available to us. There is a range of TOEs that have been devised. The following is derived from Marcus Hutter’s A Complete Theory of Everything (Will Be Subjective):

  1. (G) Geocentric model: Ancient notion that the Earth is at the center of the known universe.
  2. (H) Heliocentric model: Evolution of the model to centralize on the Sun.
  3. (E) Effective theories: General relativity, quantum electrodynamics, and Newtonian mechanics, but without a unifying architecture.
  4. (P) Standard model of particle physics: Beginning of unification that contains numerous arbitrary parameters and has yet to unify gravity.
  5. (S) String theory: new theoretical framework that unifies gravitation and P.
  6. (C) Cosmological models: Standard inflationary Big Bang stuff.
  7. (M) Multiverse theories: The notion that there are many possible universes and that they might overlay one another through black holes or just evolve in parallel with one another.
  8. (U) Universal ToE: We’ll get back to this in a future post, but this is just an extension of M where we live in one of the multiverses and that the multiverse is “computable” in that it can be characterized in a specific way that lets us argue about its properties.
  9. (R) Random universe: This is essentially the same argument made about irrational numbers like pi or e: because their digit sequences appear infinite and random, they should contain all the known works of Shakespeare. Likewise, an infinite and random universe would contain low-entropy regions that might look like our universe and, perhaps, contain local information sufficient to deceive us about the properties of the universe.
  10. (A) À-la-carte models: This is like buffet-style religion, but we can simply claim that the universe is a subset of a random string of specifications and achieve similar results to R.

Do any of these theories have anything to do with religious notions, whether Western, abstractly New Age, or Eastern? I find no similarities. The defining difference is between an epistemological approach that reifies mystical abstractions derived from pure speculation versus one that attempts to harmonize empirical results with theorization.

Zach is justified in his enthusiasm for the latter.