The Comets of Literary Cohesion

Every few years, with the hyperbolic regularity of Kohoutek’s orbit, I return to B.R. Myers’ 2001 Atlantic essay, A Reader’s Manifesto, where he plays the enfant terrible against the titans of serious literature. With savagery Myers tears out the elliptical heart of Annie Proulx and then beats regular holes in Cormac McCarthy and Don DeLillo in a conscious mockery of the strained repetitiveness of their sentences.

I return to Myers because I currently have four novels in process. I return because I hope to be saved from the delirium of the postmodern novel that wants to be written merely because there is nothing really left to write about, at least not without a self-conscious wink:

But today’s Serious Writers fail even on their own postmodern terms. They urge us to move beyond our old-fashioned preoccupation with content and plot, to focus on form instead—and then they subject us to the least-expressive form, the least-expressive sentences, in the history of the American novel. Time wasted on these books is time that could be spent reading something fun.

Myers’ essay hints at what he sees as good writing, quoting Nabokov, referencing T.S. Eliot, and analyzing the controlled lyricism of Saul Bellow. Evaporating the boundaries between the various “brows” and accepting that action, plot, and invention are acceptable literary conceits also marks Myers’ approach to literary analysis.

It is largely an atheoretic analysis but there is a hint at something more beneath the surface when Myers describes the disdain of European peasants for the transition away from the inscrutable Latin masses and benedictions and into the language of the common man: “Our parson…is a plain honest man… But…he is no Latiner.” Myers counts the fascination with arabesque prose, with labeling it as great even when it lacks content, as derived from the same fascination that gripped the peasants: majesty is inherent in obscurity.… Read the rest

The Unreasonable Success of Reason

Math and natural philosophy were discovered several times in human history: Classical Greece, Medieval Islam, Renaissance Europe. Arguably, the latter two were strongly influenced by the former, but even so they built additional explanatory frameworks. Moreover, the explosion that arose from Europe became the Enlightenment and the modern edifice of science and technology.

So, on the eve of an eclipse that sufficiently darkened the skies of Northern California, it is worth noting the unreasonable success of reason. The gods are not angry. The spirits are not threatening us over a failure to properly propitiate their symbolic requirements. Instead, the mathematics worked predictively and perfectly to explain a wholly natural phenomenon.

But why should the mathematics work so exceptionally well? It could be otherwise, as Eugene Wigner’s marvelous 1960 paper, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, points out:

All the laws of nature are conditional statements which permit a prediction of some future events on the basis of the knowledge of the present, except that some aspects of the present state of the world, in practice the overwhelming majority of the determinants of the present state of the world, are irrelevant from the point of view of the prediction.

A possible explanation of the physicist’s use of mathematics to formulate his laws of nature is that he is a somewhat irresponsible person. As a result, when he finds a connection between two quantities which resembles a connection well-known from mathematics, he will jump at the conclusion that the connection is that discussed in mathematics simply because he does not know of any other similar connection.

Galileo’s rocks fall at the same rates but only provided that they are not unduly flat and light.… Read the rest

From Smith to Darwin

The notion that all the contingencies of human history can be rendered down into law-like principles is the greatest reflection of the human desire for order and understanding. Adam Smith appears in that mirrored pool alongside Karl Marx and, in his original form, even Charles Darwin. That’s only the beginning: Freud, Machiavelli, Rousseau, Hegel, and a host of others are reflected there in varying and transitory clarity.

Adam Smith is an iconic case, as I discovered reading Adam Smith’s View of History: Consistent or Paradoxical? by James Alvey. The paradoxical component arises from a merger of a belief in the inevitability of commercial society and, at various points in Smith’s intellectual development, a cynicism about the probability of forward progress towards that goal. Ever behind the curtain, however, was the invisible hand, represented by a kind of teleological divine presence moving history and economics forward.

The paper uncovers some of the idiosyncrasies of Smith’s economic history:

[T]he burghers felt secure enough to import ‘improved manufactures and expensive luxuries’. The lords now had something beside hospitality for which they could exchange the whole of their agricultural surplus. Previously they had to share, but ‘frivolous and useless’ things, such as ‘a pair of diamond [shoe] buckles’, and ‘trinkets and baubles’, could be consumed by the lords alone. The lords were fascinated with such finely crafted items and wanted to own and vainly display them. As the lords ‘eagerly purchased’ these luxury items they were forced to reduce the number of their dependents and eventually dismiss them entirely.

The lords ultimately trade away the economic freedom of the artisans in exchange for more diamond shoe buckles. Odd, but perhaps reflective of the excesses of the wealthy in Smith’s era, something that needed explanation.… Read the rest

Lies, Damn Lies, and the Statistics of Mortality

I spent a few minutes calculating my expected date of death today since I just turned 45. It turns out that I am beyond the halfway point in this journey. I’ve created a spreadsheet that you can use to calculate your own demise and produce charts like the one above. The current spreadsheet uses the US Male/Female mortality numbers from the World Bank dataset and then a linear regression for future gains in life expectancy. Other country sets can be easily incorporated.

Enter your year of birth in the yellow box and it will create a plot of how long you can expect to live, as well as the transition point from green to red in your, eh, lifecycle. Don’t forget to get your trusts, wills, organ donations, and directives in order.
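For the curious, the spreadsheet’s logic is simple enough to sketch in a few lines of Python: fit a line to historical life expectancy at birth, extrapolate it to your birth cohort, and add the result to your birth year. The figures below are illustrative placeholders, not the actual World Bank series the spreadsheet uses.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

# Hypothetical US life expectancy at birth by year (illustrative only;
# the spreadsheet pulls the real numbers from the World Bank dataset).
history = {1960: 66.6, 1980: 70.0, 2000: 74.1, 2015: 76.3}

slope, intercept = fit_line(list(history), list(history.values()))

def expected_death_year(birth_year):
    """Extrapolate life expectancy for the cohort and add it to birth year."""
    return birth_year + (slope * birth_year + intercept)

print(round(expected_death_year(1970)))
```

The linear extrapolation is, of course, the statistical lie of the piece: nothing guarantees that past gains in life expectancy continue at the same rate.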

Of course, there are no more compelling lies than statistical ones, and yet there are no other ways to guess the future than to extrapolate from the past.… Read the rest

Trust in What?

Slate’s Ankita Rao reports that Americans’ trust in science has remained largely unchanged for liberals and moderates over the past 40 years, while that same trust has eroded among political conservatives from 63% to 35% during the same period. Rick Santorum epitomized this attitude when he suggested that Obama was a snob for thinking college education was an inherent good.

Rao’s article blames the interactions between science and policy for driving this distrust, where snobbery is interchangeable with an educated elite that is overwhelmingly politically liberal and therefore the enemy. The “reality-based community” (quoting Karl Rove) must be tied to science and therefore untied from pure oppositional ideology. Science and technocracy are polarized against populism and manipulation.

What are we left with? What do we trust in? We can choose raw religious feeling, but then there is the problem of reconciling those feelings with religious freedom, religious pluralism, and the vast secular reality that we confront on a daily basis. We can pick and choose ideologies, tying our fate to Ayn Rand, to Code Pink, or to Cato.

Better still, we should simply not engage in trust. That is the secret. Epistemological doubt is the critical initial step that leads, in turn, to the dissolution of the expectation that trust is intrinsically valuable. American democracy, developed from Enlightenment ideals, was conceived as opposed to trust in individuals by juxtaposing aspects of government against one another. This was unprecedented, of course, and was coincident with the growth of science as an explanatory framework that drained the authoritative institutions of their power.

Similarly, we might be able to reestablish trust in science by educating the anti-elitists about the inherently contingent nature of scientific reasoning.… Read the rest

Magical Blood Typing

Since I’m in Japan, it’s raining, and I slept too much on the plane to get back to sleep, I thought I’d post on the Japanese blood type craze. Full disclosure: as AB+ I can be arty and mysterious.

I’m reminded of the suggestion, which permeates some of the more extreme fundamentalist communities, that UFOs are demons. In the case of the Japanese interest, it likely started from origins in the “scientific racism” that permeated pre-WWII Japan and Germany. Quoting Mussolini, “The best blood will at some time get into a fool or a mosquito,” I have to wonder how the Axis powers managed their racial antipathy towards one another well enough to remain allies.… Read the rest

Welcome to Ex Uno Plura!

Ex Uno Plura is a blog with the goal of exploring arguments and ideas that are new and a bit cutting edge. Some of the topics that are explored include recent developments in cognitive science, artificial intelligence, transhumanism, machine learning, evolutionary adaptation, atheism (and its discontents), technology and the humanities, literature and literary theory, and a raft of other topics that register as experimental. The goal is to always try to do justice to the topic and contextualize it socially and as part of intellectual history.

Note that the title might be a translation mistake. I have conflicting information on this; maybe it should be Ex Uno Plures but others concur with Ex Uno Plura. At least it’s not as bad as some of the other guesses floating around, including a title by David Foster Wallace. Update via Giacomo Miceli:

I can confirm you it’s “Ex uno, plura”. Plura => Neutral noun (more things, many things) vs Plures => Numeral adjective (more, several).

My name is Mark Davis and I am an entrepreneur, computer scientist, and author. I split my time between homes in New Mexico and Oregon. My background ranges from an unusual upbringing in New Mexico, to teaching in the US Peace Corps, to running a performance art group, to working for Microsoft and Xerox PARC, and, recently, to startups in computational linguistics, big data analytics, and machine learning. I’ve got a graduate degree in the latter. I retired in 2015 from a role as a CTO and Distinguished Engineer at Dell Software Group following the sale of my startup company, Kitenga, one of the first big data analytics companies. I just resurrected a cognitive computing and deep learning startup focused on immersive entertainment and intelligent assistants.… Read the rest