Towards an Epistemology of Uncertainty (the “I Don’t Know” club)

Today there was an acute overlay of reinforcing ideas when I encountered Sylvia McLain’s piece in Occam’s Corner on The Guardian calling out Niall Ferguson for deriving Keynesianism from Keynes’ gayness. And it landed just as I was digesting Lee Smolin’s new book, Time Reborn: From the Crisis in Physics to the Future of the Universe.

The intersection was a tutorial in the limits of expansive scientism and in conclusions that lead to unexpected outcomes. We get to euthanasia and forced sterilization down that path, or, in Ferguson’s case, merely a perception of senility. The fix for this kind of programme is fairly simple: doubt. I doubt that there is any coherent model connecting sexual orientation to economic theory. I doubt that selective breeding and euthanasia can do anything more than produce inbreeding depression. And, with Smolin, I doubt that the scientific conclusions we have reached so far are the end of the road.

That wasn’t too hard, was it?

The I Don’t Know club is pretty easy to join. All one needs is intellectual honesty and earnestness.… Read the rest

A Paradigm of Guessing

The most interesting thing I’ve read this week comes from Jürgen Schmidhuber’s paper, Algorithmic Theories of Everything, which should be provocative enough to pique the most jaded of interests. The quote comes from deep in the paper:

The first number is 2, the second is 4, the third is 6, the fourth is 8. What is the fifth? The correct answer is “250,” because the nth number is n^5 − 5n^4 − 15n^3 + 125n^2 − 224n + 120. In certain IQ tests, however, the answer “250” will not yield maximal score, because it does not seem to be the “simplest” answer consistent with the data (compare [73]). And physicists and others favor “simple” explanations of observations.
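The polynomial is easy to check. A few lines of Python confirm that it reproduces the given sequence exactly and then departs from the “simple” answer of 10 at the fifth term:

```python
def p(n):
    # Schmidhuber's degree-5 polynomial through (1,2), (2,4), (3,6), (4,8)
    return n**5 - 5*n**4 - 15*n**3 + 125*n**2 - 224*n + 120

values = [p(n) for n in range(1, 6)]
print(values)  # [2, 4, 6, 8, 250]
```

Any finite prefix of data is consistent with infinitely many such curves, which is exactly the problem the quote is pointing at.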

And this is the beginning and the end of logical positivism. How can we assign truth to inductive judgments without crossing from fact to value, and what should that value system be?… Read the rest

The Churches of Evil

The New York Times continues to mine the dark territory between religious belief and atheism in a series of articles in the opinion section, with the most recent being Gary Gutting’s thoughtful meditation on agnosticism, ways of knowing, and the contributions of religion to individual lives and society. In response, Penn Jillette and others discuss atheism as a religion-like venture.

We can dissect Gutting’s argument while still being generous to his overall thrust. It is certainly true that, aside from the specific knowledge claims of religious people, there are traditions of practice that result in positive outcomes for religious folk. But when we drill into the knowledge dimension, Gutting props up Alvin Plantinga and Richard Swinburne as representing “the role of evidence and argument” in advanced religious argument. He would have done better to restrict the statement to “argument,” because both philosophers focus primarily on argument in their philosophical works; the evidence remains elusively private in the eyes of the believer.

Interestingly, many of the arguments of both are simply arguments against a counter-assumption that anticipates a secular universe. For instance, Plantinga shows that the Logical Problem of Evil is not decisive, concluding that evil (neglect “natural evil” for the moment) is not logically incompatible with omnibenevolence, omnipotence, and omniscience. But, and here we get back to Gutting, that does nothing to persuade us that the rapacious cruelty of Yahweh, much less the moral evil expressed in the new concept of Hell in the New Testament, is anything more than logically possible. The human dimension and the appropriate moral outrage are unabated, and we loop back to Gutting’s generosity toward the religious: shouldn’t we extend equal generosity to the scriptural problem of evil as expressed in everything from the Hebrew Bible through to the Book of Mormon?… Read the rest

Randomness and Meaning

The impossibility of the Chinese Room has implications across the board for understanding what meaning means. Mark Walker’s paper “On the Intertranslatability of all Natural Languages” describes how the translation of words and phrases may be achieved:

  1. Through a simple correspondence scheme (word for word)
  2. Through “syntactic” expansion of the languages to accommodate concepts that have no obvious equivalence (“optometrist” => “doctor for eye problems”, etc.)
  3. Through incorporation of foreign words and phrases as “loan words”
  4. Through “semantic” expansion where the foreign word is defined through its coherence within a larger knowledge network.

An example for (4) is the word “lepton” where many languages do not have a corresponding concept and, in fact, the concept is dependent on a bulwark of advanced concepts from particle physics. There may be no way to create a superposition of the meanings of other words using (2) to adequately handle “lepton.”

These problems arise again in trying to understand how children acquire meaning while learning a language. As Walker points out, learning a second language must involve the same kinds of steps as learning translations, so any simple correspondence theory has to be supplemented.

So how do we make adequate judgments about meanings and so rapidly learn words, often initially with a coarse granularity but later with increasingly sharp levels of focus? What procedure is required for expanding correspondence theories to operate in larger networks? Methods like Latent Semantic Analysis and Random Indexing show how this can be achieved in ways that are illuminating about human cognition. In each case, the methods provide insights into how relatively simple transformations of terms and their occurrence contexts can be viewed as providing a form of “triangulation” about the meaning of words.… Read the rest
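Random Indexing in particular is easy to sketch. Each word gets a fixed sparse random “index vector,” and a word’s meaning vector is the running sum of the index vectors of its neighbors; words that occur in similar contexts end up pointing in similar directions. The toy corpus and parameters below are illustrative assumptions of mine, not drawn from Walker’s paper:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, NONZERO = 512, 8  # a few hundred dims; a handful of ±1 entries each

def index_vector():
    """Sparse ternary random vector: mostly zeros, a few +1/-1 entries."""
    v = np.zeros(DIM)
    slots = rng.choice(DIM, size=NONZERO, replace=False)
    v[slots] = rng.choice([-1.0, 1.0], size=NONZERO)
    return v

# Hypothetical toy corpus for illustration only
corpus = [
    "the physicist studied the lepton in the collider",
    "the physicist studied the electron in the collider",
    "the chef cooked the soup in the kitchen",
]

index, context = {}, {}
for sentence in corpus:
    for w in sentence.split():
        index.setdefault(w, index_vector())
        context.setdefault(w, np.zeros(DIM))

# Accumulate each word's context vector from a +/-2 word window
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 2), min(len(words), i + 3)):
            if j != i:
                context[w] += index[words[j]]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "lepton" and "electron" share contexts, so their vectors sit closer
# together than "lepton" and "soup" do.
print(cosine(context["lepton"], context["electron"]),
      cosine(context["lepton"], context["soup"]))
```

The appeal of the method is that the random index vectors are nearly orthogonal in high dimensions, so the summed context vectors behave like a cheap, incremental approximation of the co-occurrence statistics that LSA extracts with a full matrix factorization.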

The Unreasonable Success of Reason

Math and natural philosophy were discovered several times in human history: Classical Greece, Medieval Islam, Renaissance Europe. Arguably, the latter two were strongly influenced by the former, but even so they built additional explanatory frameworks. Moreover, the explosion that arose from Europe became the Enlightenment and the modern edifice of science and technology.

So, on the eve of an eclipse that noticeably darkened the skies of Northern California, it is worth noting the unreasonable success of reason. The gods are not angry. The spirits are not threatening us over a failure to properly propitiate their symbolic requirements. Instead, the mathematics worked predictively and perfectly to explain a wholly natural phenomenon.

But why should the mathematics work so exceptionally well? It could be otherwise, as Eugene Wigner’s marvelous 1960 paper, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, points out:

All the laws of nature are conditional statements which permit a prediction of some future events on the basis of the knowledge of the present, except that some aspects of the present state of the world, in practice the overwhelming majority of the determinants of the present state of the world, are irrelevant from the point of view of the prediction.

A possible explanation of the physicist’s use of mathematics to formulate his laws of nature is that he is a somewhat irresponsible person. As a result, when he finds a connection between two quantities which resembles a connection well-known from mathematics, he will jump at the conclusion that the connection is that discussed in mathematics simply because he does not know of any other similar connection.

Galileo’s rocks fall at the same rates but only provided that they are not unduly flat and light.… Read the rest
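Galileo’s caveat can be made quantitative. A minimal sketch (with toy masses and areas of my own choosing, not Galileo’s) integrates a fall with quadratic air drag, m dv/dt = mg − (1/2)ρC_d A v², and shows the flat, light body lagging far behind the dense one:

```python
def fall_time(mass, area, height, cd=1.0, rho=1.2, g=9.81, dt=1e-3):
    """Euler integration of a body falling from rest with quadratic drag."""
    t = v = y = 0.0
    while y < height:
        a = g - 0.5 * rho * cd * area * v * v / mass  # drag opposes motion
        v += a * dt
        y += v * dt
        t += dt
    return t

# A dense rock barely notices the air; a light, flat sheet hits terminal
# velocity almost immediately and drifts down over tens of seconds.
t_rock = fall_time(mass=1.0, area=0.01, height=50.0)    # ~3.3 s vs 3.2 s in vacuum
t_sheet = fall_time(mass=0.005, area=0.06, height=50.0)
```

The "same rate" law is exact only in vacuum; for compact, dense bodies the drag correction is a few percent, which is why Galileo's demonstration worked.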