
Gravity and the Dark Star

Totality in Nebraska

I began at 5 AM from the Broomfield Aloft hotel, strategically situated in a sterile “new urban” office park cum apartment complex along the connecting freeway between Denver and Boulder. The whole weekend was fucked in a way: college students across Colorado were moving in for a Monday start, half of Texas was here already, and most of Colorado planned to head north to the zone of totality. I split off I-25 around Loveland and had success using US 85 northbound through Cheyenne. Continuing up 85 was the original plan, but that fell apart when 85 came to a crawl in the vast prairie lands of Wyoming. So I dodged south and east (dodging will be a continuing theme) and entered Nebraska’s panhandle with middling traffic.

I achieved totality on schedule north of Scottsbluff. And it was spectacular. A few fellow adventurers were hanging out along the outflow lane of an RV dump at a state recreation area. One guy flew his drone around a bit. Maybe he wanted B roll for other purposes. I got out fast, but not fast enough, and dodged my way through lane closures designed to provide access from feeder roads. The Nebraska troopers were great, I should add, always willing to wave to us science and spectacle immigrants. Meanwhile, SiriusXM spewed various Sibelius pieces that had “sun” in their names, while the Grateful Dead channel gave us a half dozen versions of Dark Star, the quintessential jam song dating to the band’s early, psychedelic era.

Was it worth it? I think so, though one failed dodge that left me in a ten-mile bumper-to-bumper crawl in rural Nebraska with a full bladder tested my faith in the stellar predictability of gravity. Gravity remains an enigma in many ways, though the perfection of watching the corona flare around the black hole sun shows just how unenigmatic it can be at the macroscopic scale.

But reconciling gravity with quantum-scale phenomena remains remarkably elusive, and it lies at the root of the decades-long detour through string theory which, admittedly, some have characterized as “fake science” because of our inability to find testable aspects of the theory. Yet there are some interesting recent developments that, though not directly string theoretic, are related to the quantum symmetries that led to stringiness in the first place.

So I give you Juan Maldacena and Leonard Susskind’s suggestion that ER = EPR. This is a rather remarkable conjecture that unites quantum and relativistic realities, based on a careful look at the symmetry between two theoretical outcomes at two very different scales. So how does it work? In a nutshell, the claim is that quantum entanglement (EPR, for Einstein-Podolsky-Rosen pairs) and Einstein-Rosen bridges (ER) are two descriptions of the same underlying thing. Like the science fiction idea of wormholes connecting distant places to facilitate faster-than-light travel, an ER bridge connects singularities like black holes together. And the correlations that occur between entangled black holes are just like the correlations between entangled quanta. Neither is amenable to FTL travel or signaling, due to Lorentzian traversability issues (the former) and the quantum no-signaling theorem (the latter).
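To make the EPR half concrete, here is the textbook example (not from Maldacena and Susskind’s paper specifically, just the standard entangled pair): two qubits prepared in the Bell state

$$ |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\left( |0\rangle_A |0\rangle_B + |1\rangle_A |1\rangle_B \right) $$

always give matching results when measured in the same basis, yet each side alone sees 0 or 1 with probability 1/2 no matter what the other side does. Perfect correlation, zero signaling, which is the EPR analog of a wormhole you can verify exists but cannot fly a ship through.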

Today was just a shadow, classically projected, maybe just slightly twisted by the gravity wells, not some wormhole wending its way through space and time. It is worth remembering, though, that the greatest realization of 20th-century physics is that reality really isn’t in accord with our everyday experiences. Suns and moons kind of are, briefly and ignoring fusion in the sun, but reality is almost mystically entangled with itself, a collection of vibrating potentialities extending out everywhere that are then, unexpectedly, connected together in another way that defies these standard hypothetical representations and the very notion of spatial connectivity.

Quantum Field Is-Oughts

Sean Carroll’s Oxford lecture on Poetic Naturalism is worth watching. In many ways it just reiterates several common themes. First, it reinforces the is-ought barrier between values and observations about the natural world. It does so with particular depth, though, by identifying how coarse-grained theories at different levels of explanation can be equally compatible with quantum field theory. Second, and related, he shows how entropy is an emergent property of atomic theory and the interactions of quantum fields (which we think of as particles much of the time). Importantly, we can project the same notion of boundary conditions that gives rise to entropy into the future, resulting in a kind of effective teleology. That is, there can be boundary conditions for the evolution of large-scale particle systems that form into configurations we can label purposeful or purposeful-like. I still like the term “teleonomy” to describe this alternative notion, but the language largely doesn’t matter except as an educational and distinguishing tool against the semantic embeddings of old scholastic monks.
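To put one small formula behind the coarse-graining point (this is just the standard statistical-mechanics relation, nothing specific to Carroll’s lecture): the Boltzmann entropy of a macrostate is

$$ S = k_B \ln \Omega $$

where $\Omega$ counts the microstates compatible with the coarse-grained description. Entropy is thus a property of the level of description, which is exactly what lets different higher-level theories sit compatibly on top of the same underlying quantum fields.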

Finally, the poetry aspect resolves in value theories of the world. Many are compatible with descriptive theories, and our resolution of them comes through opinion, reason, communication, and, yes, violence and war. No policy theory, religious claim, or idealization holds a monopoly. Instead we have interests and collective movements, all of the above working together to define our moral frontiers.

 

On Woo-Woo and Schrödinger’s Cat

Michael Shermer and Sam Harris got together with an audience at Caltech to beat up on Deepak Chopra and a “storyteller” named Jean Houston in The Future of God debate hosted by ABC News. And Deepak got uncharacteristically angry behind his crystal-embellished eyewear, especially at Shermer’s assertion that Deepak is just talking “woo-woo.”

But is there any basis for the woo-woo that Deepak is weaving? As it turns out, he is building on some fairly impressive work by Stuart Hameroff, MD, of the University of Arizona and Sir Roger Penrose of Oxford University. Under development for more than 25 years, this work has most recently been summed up in their 2014 paper, “Consciousness in the universe: A review of the ‘Orch OR’ theory,” available for free (but not the commentaries, alas). Deepak was even invited to comment on the paper in Physics of Life Reviews, though the content of his commentary was challenged as being somewhat orthogonal or contradictory to the main argument.

To start somewhere near the beginning, Penrose became obsessed with the limits of computation in the late 80s. The Halting Problem sums up his concerns about the idea that human minds could possibly be isomorphic with computational devices. To Penrose, there seems to be something that allows minds to break free of the limits of “mere” Turing-complete computation. Whatever that something is, it should be physical and reside within the structure of the brain itself. Hameroff and Penrose would also like that something to explain consciousness and all of its confusing manifestations, for surely consciousness is part of that brain operation.

Now, getting at some necessary and sufficient sorts of explanations for this new model requires looking at Hameroff’s medical specialty: anesthesiology. Anyone who has had surgery has had the experience of consciousness going away while body function continues on, still mediated by brain activity. So certain drugs like halothane erase consciousness through some very targeted action. Next, consider that certain single-celled organisms have internally coordinated behaviors without the presence of a nervous system. Finally, consider that it looks like most neurons do not integrate and fire like the classic model (and the model that artificial neural networks emulate), but instead show some very strange and seemingly random activation behaviors in the presence of the same stimuli.

How do these relate? Hameroff has been very focused on one particular component of the internal architecture of neural cells: microtubules, or MTs. These are very small compared to cellular scale, and there are enormous numbers of them in neurons (on the order of 10^9). They are just cylindrical polymers with some specific chemical properties. They are also small enough (25 nm in diameter) that it might be possible for quantum effects to be present in their architecture. There is some very recent evidence to this effect based on strange reactions of MTs to tiny currents of varying frequencies used to probe them. Anesthetics also appear to bind to MTs. Indeed, MTs could also provide a memory substrate with a capacity orders of magnitude greater than the traditional interneuron concept of how memories form.

But what does this have to do with consciousness, beyond the idea that MTs get interfered with by anesthetics and therefore might be around or part of the machinery that we label conscious? (They also appear to be related to Alzheimer’s disease, but that may just reflect their role in the same machinery.) Well, this is where we get woo-woo-ey. If consciousness is not just an epiphenomenon arising from standard brain function as a molecular computer, and it is also not some kind of dualistic soul overlay, then maybe it is something that is there but is not a classical computer. Hence quantum effects.

So Sir Roger has been promoting a rather wild conjecture called the Diósi-Penrose theory that puts an upper limit on the amount of time a quantum superposition can survive. It does this based on some arguments I don’t fully understand, but they integrate gravity with quantum phenomena to suggest that the mass displaced by the superposed wave functions snaps the superposition into collapse. So Schrödinger’s cat dies or lives very quickly even without an observer, because there are a lot of superposed quantum particles in a big old cat and therefore very rapid resolution of the wave function evolution (around 10^-24 s). Single particles can live in superposition for much longer because the mass difference between their wave functions is very small.
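The usual shorthand for the Diósi-Penrose estimate, as I read it out of the Orch OR review, is

$$ \tau \approx \frac{\hbar}{E_G} $$

where $E_G$ is the gravitational self-energy of the difference between the two superposed mass distributions. A cat-sized displacement makes $E_G$ enormous and $\tau$ vanishingly small, on the order of the 10^-24 s quoted above, while a lone electron’s displacement keeps $\tau$ astronomically long.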

Hence the OR in “Orch OR” stands for Objective Reduction: wave functions are subject to collapse by probing, but they also collapse just because they are unstable in that state. The reduction is objective, not subjective. The “Orch” stands for “Orchestrated.” And there is the seat of consciousness in the Hameroff-Penrose theory. In MTs, little wave function collapses are constantly occurring, and the presence of superposition means quantum computing can occur. And the presence of quantum computing means that non-classical computation can take place and maybe even be more than Turing Complete.

Now the authors are careful to suggest that these are actually proto-conscious events and that only their large-scale orchestration leads to what we associate with consciousness per se. Otherwise they are just quantum superpositions that collapse, maybe with 1 qubit of resolution under the right circumstances.

At least we know the cat has a fate now. That fate is due to an objective event, too, and not some added woo-woo from the strange world of quantum phenomena. And the cat’s curiosity is part of the same conscious machinery.

Entanglement and Information

Research can flow into interesting little eddies that cohere into larger circulations and become transformative phase shifts. That happened to me this morning, between a drive in the Northern California hills and departing for lunch at one of our favorite restaurants in Danville.

The topic I’ve been working on since my retirement is whether there are preferential representations for optimal automated inference methods. We have this grab-bag of machine learning techniques that use differing data structures but that all implement some variation on fitting functions to data exemplars; at the most general they all look like some kind of gradient descent on an error surface. Getting the right mix of parameters, nodes, etc. falls to some kind of statistical regularization or bottlenecking for the algorithms. Or maybe you perform a grid search in the hyperparameter space, narrowing down the right mix. Or you can throw up your hands and try to evolve your way to a solution, suspecting that there may be local optima that are distracting the algorithms from global success.
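As a minimal sketch of what I mean, here is gradient descent on a regularized squared-error surface plus a grid search over its one hyperparameter; the toy data, the ridge penalty values, and the learning rate are all made up purely for illustration:

```python
import numpy as np

# Toy data: y = 3x + noise (made up purely for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] + rng.normal(0, 0.1, size=100)

def fit_ridge_gd(X, y, lam, lr=0.1, steps=500):
    """Gradient descent on the regularized squared-error surface."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        resid = X @ w - y
        grad = (X.T @ resid) / n + lam * w  # gradient of MSE/2 + (lam/2)*||w||^2
        w -= lr * grad
    return w

def val_error(w, X, y):
    """Mean squared error on a held-out slice."""
    return np.mean((X @ w - y) ** 2)

# Grid search over the regularization strength -- the "hyperparameter space"
train, val = slice(0, 80), slice(80, 100)
best = min(
    ((lam, val_error(fit_ridge_gd(X[train], y[train], lam), X[val], y[val]))
     for lam in [0.0, 0.01, 0.1, 1.0]),
    key=lambda t: t[1],
)
print("best lambda and validation MSE:", best)
```

Pretty much every technique in the grab-bag is some elaboration of this loop: pick a model family, descend its error surface, and search over its knobs.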

Yet algorithmic information theory (AIT) gives us, via Solomonoff, a framework for balancing the parameterization of an inference algorithm against its error rate on the training set. But, first, it’s all uncomputable and, second, the AIT framework just uses binary strings as the coded Turing machines, so I would have to enumerate and test all 2^N candidate programs of length N to get anywhere with the theory. Still, I and many others have had incremental success using variations on this framework, whether via Minimum Description Length (MDL) principles, its first cousin Minimum Message Length (MML), or other statistical regularization approaches that act as proxies for these techniques. But we almost always choose a model class (ANNs, compression lexicons, etc.) and then optimize the parameters within that framework. Can we do better? Is there a preferential model for time series versus static data? How about for discrete versus continuous?
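The two-part code version of that balancing act, roughly as MDL frames it (my shorthand, not a quotation from anyone):

$$ \hat{H} = \arg\min_{H} \big[ L(H) + L(D \mid H) \big] $$

where $L(H)$ is the number of bits needed to describe the model itself and $L(D \mid H)$ the bits needed to describe the data once the model has been used to encode it. Solomonoff does the same thing over all programs at once, weighting each hypothesis by $2^{-K}$ with $K$ the program length, and that universality is exactly where the uncomputability comes from.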

So while researching model selection in this framework, I came upon a mention of Shannon’s information theory and its application to quantum decoherence. Of course I had to investigate. And here is the most interesting thing I’ve seen in months, from the always interesting Max Tegmark at MIT:

Particles entangle, and then quantum decoherence causes them to shed entropy into one another during interaction. But most interesting is the quantum Bayes’ theorem section around 00:35:00, where Shannon entropy, as a classical measure of improbability, gets applied to quantum indeterminacy through this decoherence process.
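For reference, the two entropies being bridged here are just the standard definitions (nothing specific to Tegmark’s talk):

$$ H(p) = -\sum_i p_i \log_2 p_i, \qquad S(\rho) = -\operatorname{Tr}\left(\rho \log_2 \rho\right) $$

Decoherence drives a system’s reduced density matrix $\rho$ toward a diagonal mixture in the pointer basis, and at that point the von Neumann entropy is just the Shannon entropy of the diagonal probabilities, which is how a classical measure of improbability ends up attached to quantum indeterminacy.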

I’m pretty sure it sheds no particular light on the problem of model selection but when cosmology and machine learning issues converge it gives me mild shivers of joy.

A, B, C time!

This might get technical, despite the vaguely Sesame Street quality of the title. You see, philosophers have long worried over time and causality, and rightly so, going back to Greeks like Heraclitus and Parmenides, as well as their documenters many years later. Is time a series of events one after another, or is that a perceptual mistake? For if everything comes from some cascade of events that precede it, it is illogical to presume that something might emerge from nothing (Parmenides). And, contra, perhaps all things are in a state of permanent change and all such perceptions are confused (Heraclitus). The latter finds an opaque echo in the Einsteinian relativistic move of combining space and time together while still preserving the symmetry of time in the basic equations, allowing the space-time picture to be rolled forward and backward without much in the way of consequences.

So Lee Smolin’s re-injection of time as a real phenomenon in Time Reborn takes us from the A and B theories of time to something slightly new, which might be called a C theory. This theory builds on Smolin’s previous work, where he proposed an evolutionary model of cosmology to explain how the precarious constants of our observed universe might have come into being. In Smolin’s super-cosmology, many universes come to be and not be at an alarming rate. Indeed, perhaps inside every little black hole is another universe. But many of these universes are not very viable, because they lack the physical constants needed to last a long time and for entities like us to evolve to try to comprehend them. This does away with any mysteries about the Anthropic Principle: we are just survivors.

Smolin’s new work has some other rather interesting temporal consequences that are buried behind a wall of revisited thermodynamic reasoning: there are actually only a few basic particles that become other instances as they evolve over time. Because they are still connected together at a fundamental level, these particles are entangled in a collection of what we call forces, but as time unwinds, they become increasingly differentiated. Time piles on history, and the disparate trajectories are distinct enough that they become the arrow of time that thermodynamic evolution dictates.

Interestingly, in Smolin’s universe, time piles up into consistencies that we interpret as physical law. Physical law does not pre-exist per se but is a consequence of the mighty machinations that emerge from a universe in a state of change, perhaps like that of Heraclitus, but also like that of Leibniz and Husserl. The pervasiveness of randomization and selection as an alternative to static views of the universe is interesting because it also raises the question: what else could possibly explain what we observe? Is there a post-Darwin “crane” that can lift the universe, or are we at the end of big science?

Cosmologies and Theories of Everything

Zach, fictional though he is, is not the only one interested in cosmological theories. But what form do these theories take? A Theory of Everything, or TOE, is a theory intended to explain the entire observable universe using a compact specification of equations and the conceptual arguments that support them. In the modern sense, a TOE is a physical explanation of the large-scale structure of the universe. Later, we can start to expand the TOE to look for “bridging laws” that help account for other phenomena as they approach the human scale.

What are our alternatives? The previous post mentioned the Catholic Church’s embrace of Big Bang cosmology as justifying Genesis. Apologist and philosopher of religion William Lane Craig also elaborately evaluates Big Bang theories as substantiating theism by supporting creation at the singularity event.

But do these notions change the underlying TOEs? In general, no. The best they can do is accept the TOE as an input and make a deductive argument based on assumptions that are not excluded by the TOE. For apologists, that means the singularity event provides a divide between a non-temporal pre-universe and the current universe, effectively between non-existence and existence. But that is not the only TOE available to us. A range of TOEs has been devised. The following is derived from Marcus Hutter’s A Complete Theory of Everything (Will Be Subjective):

  1. (G) Geocentric model: Ancient notion that the Earth is at the center of the known universe.
  2. (H) Heliocentric model: Evolution of the model to centralize on the Sun.
  3. (E) Effective theories: General relativity, quantum electrodynamics, and Newtonian mechanics, but without a unifying architecture.
  4. (P) Standard model of particle physics: Beginning of unification that contains numerous arbitrary parameters and has yet to unify gravity.
  5. (S) String theory: A theoretical framework that aims to unify gravitation and P.
  6. (C) Cosmological models: Standard inflationary Big Bang stuff.
  7. (M) Multiverse theories: The notion that there are many possible universes and that they might overlay one another through black holes or just evolve in parallel with one another.
  8. (U) Universal ToE: We’ll get back to this in a future post, but this is just an extension of M where we live in one of the multiverses and that the multiverse is “computable” in that it can be characterized in a specific way that lets us argue about its properties.
  9. (R) Random universe: This is essentially the same argument as the claim that irrational numbers like Pi or e, because their digits run on forever without pattern (assuming they are “normal” in the number-theoretic sense), contain all the known works of Shakespeare somewhere in their expansions. Likewise, an infinite and random universe would contain low-entropy regions that might look like our universe and, perhaps, contain local information sufficient to deceive us about the properties of the universe as a whole.
  10. (A) À-la-carte models: This is like buffet-style religion; we simply claim that the universe is a subset of a random string of specifications and achieve similar results to R.

Do any of these theories have anything to do with religious notions, whether Western, abstractly New Age, or Eastern? I find no similarities. The defining difference is between an epistemological approach that reifies mystical abstractions derived from pure speculation versus one that attempts to harmonize empirical results with theorization.

Zach is justified in his enthusiasm for the latter.