Matter’s possibilities

William James’ third son, Herman, died of whooping cough at the age of 18 months. Some 22 years later, James wrote in Pragmatism (he was 65):

To anyone who has ever looked on the face of a dead child or parent, the mere fact that matter could have taken for a time that precious form, ought to make matter sacred for ever after. It makes no difference what the principle of life may be, material or immaterial, matter at any rate co-operates, lends itself to all life’s purposes. That beloved incarnation was among matter’s possibilities.

William James: A religious man for our times

I call your attention to the introductory post for a new series in the Guardian’s “How to believe” feature: William James, part 1: A religious man for our times ; I look forward to the future posts.

I also wanted to mention a relatively recent (2006) biography, William James: In the Maelstrom of American Modernism. I picked up a hardcover copy from a bargain table in Minneapolis a year ago, and have been working my way through it ever since. It’s a reflection on my reading habits, not the quality of the writing, that a year later I’m only halfway through.

It’s by Robert Richardson, who won the 2007 Bancroft Prize for his work.

According to the Bancroft jury, “William James is simultaneously an intellectual biography, and a biography tout court, of the James family, including William James’s father, Henry James, Sr., and his brother Henry.” The book “is a virtual intellectual genealogy of American liberalism and, indeed, of American intellectual life in general, through and beyond the twentieth century…the story Richardson tells is engaging, his research deep, his writing graceful and appealing.”

No argument from me. I’d put it at the top of the heap, along with Ray Monk’s most excellent Wittgenstein bio.

Magical voting

Is voting magical? | Andrew Brown

… So as a rational, self-interested actor, it makes no sense for me to vote. There is a reason why it’s important to tell us on election day that our votes will make a difference: thinking about it will lead the economically rational to conclude it’s not true. Nor did it make any sense for me to hold my nose. It was even more absurd than the enthusiasm of the football supporters in the pub last night, shouting in disappointment when their team missed a goal on television. They at least were taking part in a collective ritual with their friends; I was quite alone and unobserved.

One way of interpreting all these actions is as a form of sympathetic magic. While my rational mind knows perfectly well that neither my vote nor my pantomime will have any effect, they are both behaviours that make sense only if on some level I do expect them to be effective. Similarly, the football fans surely believe that their support helps their team along – they behave as if they do, and still more as if the team was damaged by a lack of belief.

A more radical explanation is that the belief that we believe in magic is itself a rationalisation. Holding my nose while voting or shouting at an invisible football team is entirely instinctive behaviour, and can be triggered even when it has no purpose at all, any more than giggling when I am tickled does, or sneezing when exposed to bright light. This is actually quite an important point in some theories of religion, like Pascal Boyer’s, and one that is hard to answer: almost all our accounts of ritual behaviour are based on the idea that it is a conscious attempt to manipulate the world but maybe it is done entirely for its own sake. …
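Brown’s “rational, self-interested actor” argument is the standard expected-value story from the economics of voting: your vote pays off only if it is decisive, and in a large electorate that probability is vanishingly small. Here’s a back-of-envelope sketch; the numbers are my own illustrative assumptions, not Brown’s.

```python
# Expected net payoff of voting: E = p * B - C, where p is the
# probability your single vote decides the election, B the benefit
# to you if your side wins, and C the cost of bothering to vote.

def expected_value_of_voting(p_decisive, benefit, cost):
    """Expected net payoff of casting one vote."""
    return p_decisive * benefit - cost

# Illustrative guesses: one-in-ten-million chance of being decisive,
# a win worth 10,000 (in whatever units), and a modest cost of 5.
ev = expected_value_of_voting(p_decisive=1e-7, benefit=10_000.0, cost=5.0)
print(f"{ev:.4f}")  # negative: on this accounting, voting doesn't pay
```

However you tune the guesses, the cost term swamps the expected benefit unless the electorate is tiny, which is exactly the conclusion Brown says the “economically rational” will reach.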


R G Collingwood

Andrew Brown points us to Simon Blackburn’s review in The New Republic of a new biography of R G Collingwood, quoting along the way this admiring passage.

“Although art as magic is not art proper, Collingwood accords it the greatest respect. He dismisses more brutally and contemptuously even than Wittgenstein the patronizing view, held by Frazer, Lévy-Bruhl, and other anthropologists of his time, that religion and magic simply amount to bad science, so that the “savage mind” is one lacking the most elementary knowledge of cause and effect. He also dismisses the ludicrous Freudian view that magic is a kind of neurosis in which the patient supposes that by wishing for a thing he can bring it about. Instead, Collingwood insists, surely correctly, that the end of magic is the raising and channeling of emotion: ‘magical activity is a kind of dynamo supplying the mechanism of practical life with the current that drives it.’ Its true purpose is not, say, to avert natural catastrophes, but to ‘produce in men an emotional state of willingness to bear them with fortitude and hope.’

“This attitude gave Collingwood an uncommon sympathy with religious ritual and practice, and a much more realistic understanding of its ongoing place in human life. He also enables us to see why the majority of people, including those like myself who have no religious attachments, are nevertheless embarrassed at the dogmatic contempt poured on religious practice by our more militant atheists. Every sane person recognizes at some level that dance, music, poetry, and ritual may be just what you need as you prepare to face a battle, or desolation, failure, grief, or death.”

Blackburn isn’t uncritical, though.

If Collingwood is as acute and interesting as I have suggested, how does it happen that he is largely a minority interest? He has his devotees, certainly; but I doubt if he is more than a ghost in the footnotes to syllabi across the Western world. The comparison to Wittgenstein might help. It is difficult to pick up a page of Wittgenstein without being seduced: whether you understand it or not, the sense is overwhelming that something of the highest importance is being addressed with a rare detachment and intelligence. With Collingwood, there is assertion and bravado instead of seduction. Wittgenstein shows that he is a wonderfully and originally reflective thinker; Collingwood cannot help telling you that he is. Wittgenstein is silent about his being capable of other things as well; Collingwood boasts of it. You can read all of Wittgenstein without knowing of his genuine heroism during World War I. One cannot help feeling that had Collingwood done anything like that, it would have cropped up on every other page. All this is off-putting, and Collingwood’s readers have to learn to shake their heads with a smile rather than toss the whole thing into the bin.

How to believe

New to me: How to believe is a series in the Guardian (“Join our experts as they blog great works of religion and philosophy”). Here’s a summary of the topics so far (follow the link for more detail); looks good to me. Next up: Giles Fraser on Wittgenstein.

How to believe

Franklin Lewis: Rumi’s influence has long been felt throughout the Muslim world. Will his recent success in the west prove as long lasting?

Paul Helm: Calvin’s influence is still being felt today. But the reformer was a complex man, with a dark side.

Mark Vernon: Plato increasingly looks not just like a generator of footnotes, but a philosopher whose time is coming again.

Simon Critchley: The most important continental philosopher of the last century was also a Nazi. How did he get there? What can we learn from him?

Mary Midgley: Thomas Hobbes invented, in Leviathan, the modern idea of the individual. It has been hugely politically liberating. But is it realistic?

Julian Baggini: The most pressing and telling critiques of religion not only cannot, but should not, attempt to deliver any fatal blows.

Jane Williams: Acts tells the story of a disparate group of men who, against the odds, came to spearhead an international movement.

Giles Fraser: Nietzsche thought religion in general, and Christianity in particular, was a corruption of the human spirit.

via Andrew Brown

Black Friday Godblogging

From Fr Marc, via Sr Juliann.

For our money is the Lord’s, however we may have gathered it. If we provide for those in need, we shall obtain great plenty. That is why God has allowed you to have more: not for you to waste on prostitutes, drink, fancy food, expensive clothes, and all other kinds of indolence, but for you to distribute to those in need. Just as an official in the government treasury who neglects to distribute wealth as he is ordered (spending instead on his own indolence) pays the penalty and is put to death, so also the Rich Man is a kind of steward of the money which is owed for distribution to the poor. He is directed to distribute it to his fellow servants who are in want. So if he spends more on himself than his need requires, he will pay the harshest penalty hereafter. For his own goods are not his own, but belong to his fellow servants.

— St John Chrysostom, on Lazarus and the Rich Man

Midweek Godblogging

Following up on my earlier Psalm 109 post with a bit of help from Fr Marc & Sr Juliann…

Juli reminded me of the term of art prooftexting. Borrowing from the Wikipedia article,

Prooftexting is the practice of using decontextualized quotations from a document (often, but not always, a book of the Bible) to establish a proposition. Critics of the technique note that often the document, when read as a whole, may not in fact support the proposition.

While the Ps 109 coffee mugs are not precisely “establishing a proposition”, they’re a good example of decontextualization and the hazards of prooftexting.

Fr Marc offers Matthew 12:36–37:

I tell you, on the day of judgment men will render account for every careless word they utter; for by your words you will be justified, and by your words you will be condemned.

…which loses nothing by being shorn of its context, though you’ll find the context interesting. It’s a warning that will fall on deaf ears, of course.

In one of those nice coincidences, Fr Marc also answers a question I hadn’t asked, aloud anyway. I wrote, “condemned out of their own mouths,” and then blew several minutes trying to find out where the hell the phrase came from. “Out of their own mouths shall they be condemned,” it turns out, comes from a piece of Reconstruction-era Republican campaign literature (q.G.), but obviously Matthew 12 was what I was looking for.

Sunday Godblogging: Thank God for Atheists

Christopher Lydon hosted a conversation with Harvey Cox, Mary Gordon and Cornel West at the Boston Book Festival last month, and the conversation showed up on Lydon’s Open Source podcast. A little Lydon goes a long way, and there’s never just a little West, so I don’t necessarily recommend that you go listen to it. But toward the end (around 38:07), Cox quoted Tillich so:

Atheism is always the shadow of some understanding of God.

I don’t recall exactly what point Cox was trying to make; you can go listen for yourself, I guess. But I was curious about the context of the Tillich line, and I couldn’t track it down online—perhaps Cox was paraphrasing enough that Google couldn’t make the connection. While I was searching, I found this from Sri Aurobindo (Thoughts and Aphorisms, Bhakti, 538):

Atheism is the shadow or dark side of the highest perception of God. Every formula we frame about God, though always true as a symbol, becomes false when we accept it as a sufficient formula. The Atheist and Agnostic come to remind us of our error.

(Sri Aurobindo and Paul Tillich were more or less contemporaries. Is there more of a connection between them than that? I have no idea, but it seems likely. Frederic Spiegelberg is a likely bridge, and Google gives us some 1700 hits on the two names together.)

Andrew Brown: The Queen of Fairies caught me

So, it’s Hallowe’en (and Samhain). Let’s give Andrew Brown the floor.

The Queen of Fairies caught me

Halloween was once a night of real fright, when the dead and the fairies walked close to us. How did that work?

And pleasant is the fairy land,
But, an eerie tale to tell,
Ay at the end of seven years,
We pay a tiend to hell,
I am sae fair and fu o flesh,
I’m feard it be mysel.

But the night is Halloween, lady,
The morn is Hallowday,
Then win me, win me, an ye will,
For weel I wat ye may.

Just at the mirk and midnight hour
The fairy folk will ride,
And they that wad their true-love win,
At Miles Cross they maun bide.

This from the ballad of Tam Lin, which, if you don’t know, you should go and listen to now. Now while the song is running, there is no trouble believing the story, or at least in suspending disbelief. The defiance of Janet to her father is more vivid to me than almost anything any living woman has said. But at the same time I find that modern hallowe’en, the children’s festival with dressing up and sweets, not all of them poisoned, is wholly impossible to take seriously.

So why are witches and fairies real within the confines of the song, and absurd when children play at them? It seems to be an example of a more general question: why is the absurdity of other people’s beliefs immediately apparent to us and yet entirely invisible to them? (We ourselves, of course, hold no absurd beliefs, whoever we are. Anyone who thinks otherwise is dangerously deranged.)

The best answer that occurs to me is that the difference is made by participation – if you like, by playing along. Children believe in the particular game they happen to be playing. Of course, they understand, as we do, that the world could be otherwise, and the game might stop. Hence the delicious thrill of a game that breaks that rule, and becomes real. But the point that the world might be otherwise, and that the game might end, actually testifies to its reality while it lasts.

Giles Fraser once said to me, in an entirely different context, that all sorts of people who can’t bring themselves to say the creeds will sing them happily enough. He’s right. The two activities are profoundly different. The song is not the same as the lyrics read out loud, and this is true even if it has no accompaniment. Choral or just collective singing is different again – a point that’s obvious if we look at the completely secular activity of football chanting: on Saturdays the terraces of North London are full of otherwise respectable men singing things about opposing players that they would find literally unspeakable at work on Monday morning.

So the way to understand the spread of Halloween is not as a spread of beliefs, but of a set of games, or little dramas, if you will. To get hung up on the apparent content of the game is to make a kind of category mistake: year after year, a certain kind of evangelical will announce that Halloween is a festival of evil; year after year, they fail to understand that the child who plays at being a witch is much closer to becoming a Christian or to understanding any kind of religion than the one who never plays at anything at all.

But it’s not just evangelicals who get this kind of thing wrong. I do it myself all the time, most recently when mocking the Anglo-Catholics; for the answer to the question “How can they believe these ludicrous things?” is that they act them out. They feel their beliefs are true because they are embedded in a structure of ritual, both inside and outside church. Their words are given content by their actions. Without the actions, the words mean nothing. This sounds like a vaguely moral exhortation but it is just a plain fact. Without action, we couldn’t understand the meaning of any words at all.

When the Christian says they believe in order to understand, this sounds to the atheist like an abdication of responsibility. But in fact it is a recognition of necessity. There is a sense in which we can’t understand the beliefs we don’t act on. That’s why playing is so important. By pretending to act, we gain a sort of understanding — which is why I believe that Queen of Fairies will look at Tam Lin tonight and say “Had I known, Tam Lin, what this night I did see. I would have plucked out both your ey’en and put in two of tree” — at least I will believe it while the music plays.

(Tiend is tithe.)

St Peter and the miserable worms

Andrew Brown.

St Peter and the miserable worms | Andrew Brown

I think now McClatchey was right, and I was wrong to say that the Anglican Communion ended this week. The Anglican Communion actually ended at least 20 years ago, almost as soon as I started to write about it. There might be a federation of churches, more or less united by affection and common ancestry, but there would not be a single body with a common understanding of who was a priest, or a bishop, or what these titles meant. That is why the Pope has parked his tanks on the lawn that was once Runcie’s. But at the same time, I wonder if Runcie, too, was not right all along, and that one day, despite all the best efforts of Pope John Paul II, a woman will not walk through the doors of St Peter’s and be received as a priest. After all, gay people have been doing that for centuries.


Precognition | Andrew Brown

Just sometimes, science fiction comes out right; in 1928 a philosophy lecturer saw the 21st century clearly.

My daughter picked up a book on Greek philosophy in a second hand bookshop and when we turned to the section on Epicurus one passage leaped out. The author, a lecturer at Queen’s University, Belfast, is trying to explain the philosopher’s thought-world since “the words epicure and epicurean have a bad sound”.

But, he says, “The man after whom they are named lived … in an age when the falsity of the orthodox religion had become apparent to intelligent men and when with the coming of great kingdoms the independence of the Greek states was lost forever. It was for that age as it would be for us if Christianity had become a discredited myth and if Britain had become a subject state in an American Empire.”

That was published in 1928.

Ockham’s broom

Mark Liberman. I have mixed feelings about the concept. Sometimes it’s the critical clue that ends up under the rug.

Ockham’s broom

Yesterday in the Journal of Biology, the editor introduced a new series (Miranda Robertson, “Ockham’s broom”):

Although it is increasingly difficult to gauge what people can be expected to know, it is probably safe to assume that most readers are familiar with Ockham’s razor – roughly, the principle whereby gratuitous suppositions are shaved from the interpretation of facts – enunciated by a Franciscan monk, William of Ockham, in the fourteenth century. Ockham’s broom is a somewhat more recent conceit, attributable to Sydney Brenner, and embodies the principle whereby inconvenient facts are swept under the carpet in the interests of a clear interpretation of a messy reality. (Or, some – possibly including Sydney Brenner – might say, in order to generate a publishable paper.)

Robertson points out that sweeping things under the rug is often a necessary condition for scientific progress:

While Ockham’s razor clearly has an established important and honourable place in the philosophy and practice of science, there is, despite its somewhat pejorative connotations, an honourable place for the broom as well. Biology, as many have pointed out, is untidy and accidental, and it is arguably unlikely that all the facts can be accounted for early in the investigation of any given biological phenomenon. For example, if only Charles Darwin had swept under the carpet the variation he faithfully recorded in the ratios of inherited traits in his primulas, as Mendel did with his peas, we might be talking of Darwinian inheritance and not Mendelian (see [3]). Clearly, though, it takes some special sophistication, or intuition, to judge what to ignore.

In praise of idleness

Weekend reading from Bertrand Russell, 1932. Here’s his first paragraph; I hope you’re not so idle that you don’t read the whole thing.

Like most of my generation, I was brought up on the saying: ‘Satan finds some mischief for idle hands to do.’ Being a highly virtuous child, I believed all that I was told, and acquired a conscience which has kept me working hard down to the present moment. But although my conscience has controlled my actions, my opinions have undergone a revolution. I think that there is far too much work done in the world, that immense harm is caused by the belief that work is virtuous, and that what needs to be preached in modern industrial countries is quite different from what always has been preached. Everyone knows the story of the traveler in Naples who saw twelve beggars lying in the sun (it was before the days of Mussolini), and offered a lira to the laziest of them. Eleven of them jumped up to claim it, so he gave it to the twelfth. This traveler was on the right lines. But in countries which do not enjoy Mediterranean sunshine idleness is more difficult, and a great public propaganda will be required to inaugurate it. I hope that, after reading the following pages, the leaders of the YMCA will start a campaign to induce good young men to do nothing. If so, I shall not have lived in vain. …

Mill on voter qualification

John Stuart Mill was an early and strong advocate of universal suffrage, at a time when it was taken for granted that women, for example, did not vote. In Considerations on Representative Government, published in 1861, when the modern suffrage movement was just getting started, he wrote:

…it is a personal injustice to withhold from any one, unless for the prevention of greater evils, the ordinary privilege of having his voice reckoned in the disposal of affairs in which he has the same interest as other people. If he is compelled to pay, if he may be compelled to fight, if he is required implicitly to obey, he should be legally entitled to be told what for; to have his consent asked, and his opinion counted at its worth, though not at more than its worth. There ought to be no pariahs in a full-grown and civilised nation; no persons disqualified, except through their own default. Every one is degraded, whether aware of it or not, when other people, without consulting him, take upon themselves unlimited power to regulate his destiny. And even in a much more improved state than the human mind has ever yet reached, it is not in nature that they who are thus disposed of should meet with as fair play as those who have a voice. Rulers and ruling classes are under a necessity of considering the interests and wishes of those who have the suffrage; but of those who are excluded, it is in their option whether they will do so or not, and, however honestly disposed, they are in general too fully occupied with things which they must attend to, to have much room in their thoughts for anything which they can with impunity disregard. No arrangement of the suffrage, therefore, can be permanently satisfactory in which any person or class is peremptorily excluded; in which the electoral privilege is not open to all persons of full age who desire to obtain it.

Clear and unequivocal? Not quite as unequivocal as we, reading this with 21st-century eyes, might think. I don’t mean the “full age” qualification; we might argue over the age in question, but we generally accept an age qualification for voting. The catch shows up in the phrase “no persons disqualified, except through their own default” and again, “open to all persons of full age who desire to obtain it.”

Mill isn’t being coy. He explains in the very next paragraph how his vision of universal suffrage differs from our default idea of automatic universal suffrage.

There are, however, certain exclusions, required by positive reasons, which do not conflict with this principle, and which, though an evil in themselves, are only to be got rid of by the cessation of the state of things which requires them. I regard it as wholly inadmissible that any person should participate in the suffrage without being able to read, write, and, I will add, perform the common operations of arithmetic. Justice demands, even when the suffrage does not depend on it, that the means of attaining these elementary acquirements should be within the reach of every person, either gratuitously, or at an expense not exceeding what the poorest who earn their own living can afford. If this were really the case, people would no more think of giving the suffrage to a man who could not read, than of giving it to a child who could not speak; and it would not be society that would exclude him, but his own laziness. When society has not performed its duty, by rendering this amount of instruction accessible to all, there is some hardship in the case, but it is a hardship that ought to be borne. If society has neglected to discharge two solemn obligations, the more important and more fundamental of the two must be fulfilled first: universal teaching must precede universal enfranchisement. No one but those in whom an à priori theory has silenced common sense will maintain that power over others, over the whole community, should be imparted to people who have not acquired the commonest and most essential requisites for taking care of themselves; for pursuing intelligently their own interests, and those of the persons most nearly allied to them. This argument, doubtless, might be pressed further, and made to prove much more. 
It would be eminently desirable that other things besides reading, writing, and arithmetic could be made necessary to the suffrage; that some knowledge of the conformation of the earth, its natural and political divisions, the elements of general history, and of the history and institutions of their own country, could be required from all electors. But these kinds of knowledge, however indispensable to an intelligent use of the suffrage, are not, in this country, nor probably anywhere save in the Northern United States, accessible to the whole people; nor does there exist any trustworthy machinery for ascertaining whether they have been acquired or not. The attempt, at present, would lead to partiality, chicanery, and every kind of fraud. It is better that the suffrage should be conferred indiscriminately, or even withheld indiscriminately, than that it should be given to one and withheld from another at the discretion of a public officer. In regard, however, to reading, writing, and calculating, there need be no difficulty. It would be easy to require from every one who presented himself for registry that he should, in the presence of the registrar, copy a sentence from an English book, and perform a sum in the rule of three; and to secure, by fixed rules and complete publicity, the honest application of so very simple a test. This condition, therefore, should in all cases accompany universal suffrage; and it would, after a few years, exclude none but those who cared so little for the privilege, that their vote, if given, would not in general be an indication of any real political opinion.
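Mill’s “sum in the rule of three” is the old schoolroom proportion problem: given that a corresponds to b, find what c corresponds to, i.e. solve a/b = c/x for x. A minimal sketch of the test his registrar would set (the example figures are mine, not Mill’s):

```python
# The rule of three: if a corresponds to b, then c corresponds to
# x = b * c / a (solving the proportion a : b :: c : x).

def rule_of_three(a, b, c):
    """If a corresponds to b, what does c correspond to?"""
    return b * c / a

# A typical Victorian schoolroom sum: if 3 yards of cloth cost
# 12 shillings, what do 7 yards cost?
print(rule_of_three(3, 12, 7))  # 28.0
```

That, plus copying a sentence from an English book, is the whole of the “very simple test” Mill thought should gate the franchise.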

We find (or at least I found, at first reading) this to be just a bit shocking, reading it through the lens of history, in particular Reconstruction and the Jim Crow South, aware as Mill was not of the pernicious use of literacy tests and poll taxes to prevent whole classes and races from voting, even after legal suffrage was granted.

On the other hand, it’s hard to see how a democracy can work if its citizens are throwing darts at their ballots or, worse, voting based on systematically bad information, something demagogues and soundbites make their living on.

To the extent that we try to address Mill’s concerns these days, it’s through voter education: media reporting, candidate debates and the like. But these can generate more heat than light, with he-said/she-said and horserace reporting, and “debates” that end up being extended exercises in message control.

This is, I believe, the central problem of the democratic project, and I’m completely at a loss for a solution.

Sorites in the comics

Dinosaur Comics are my favorite comics. Today, anyway. If you’re not familiar with the DC conventions, go browse a few dozen; you won’t regret it.

Tomorrow my favorite may be Red Meat; it was, back on October 27, 1997, and it could be again. Don’t say I didn’t warn you.

(If you want to see the reference from the missing panel 7 mentioned in the tooltip, it’s here.)


via Mark Liberman

Happy π Day!

Because, of course, you can’t spell πράγματος without π.


Pi Day is also Einstein’s birthday, so we’ll take as our text for today, “I want to know God’s thoughts; the rest are details.” I’ve been saving up a couple of links, so pay attention.

First, via Brad DeLong, Boltzmann’s Universe at Cosmic Variance.

Here’s how it goes. Forget that we are “typical” or any such thing. Take for granted that we are exactly who we are — in other words, that the macrostate of the universe is exactly what it appears to be, with all the stars and galaxies etc. By the “macrostate of the universe,” we mean everything we can observe about it, but not the precise position and momentum of every atom and photon. Now, you might be tempted to think that you reliably know something about the past history of our local universe — your first kiss, the French Revolution, the formation of the cosmic microwave background, etc. But you don’t really know those things — you reconstruct them from your records and memories right here and now, using some basic rules of thumb and your belief in certain laws of physics.

The point is that, within this hypothetical thermal equilibrium universe from which we are purportedly a fluctuation, there are many fluctuations that reach exactly this macrostate — one with a hundred billion galaxies, a Solar System just like ours, and a person just like you with exactly the memories you have. And in the hugely overwhelming majority of them, all of your memories and reconstructions of the past are false. In almost every fluctuation that creates universes like the ones we see, both the past and the future have a higher entropy than the present — downward fluctuations in entropy are unlikely, and the larger the fluctuation the more unlikely it is, so the vast majority of fluctuations to any particular low-entropy configuration never go lower than that.

Therefore, this hypothesis — that our universe, complete with all of our records and memories, is a thermal fluctuation around a thermal equilibrium state — makes a very strong prediction: that our past is nothing like what we reconstruct it to be, but rather that all of our memories and records are simply statistical flukes created by an unlikely conspiracy of random motions. In this view, the photograph you see before you used to be yellow and wrinkled, and before that was just a dispersed collection of dust, before miraculously forming itself out of the chaos.

Note that this scenario makes no assumptions about our typicality — it assumes, to the contrary, that we are exactly who we (presently) perceive ourselves to be, no more and no less. But in this scenario, we have absolutely no right to trust any of our memories or reconstructions of the past; they are all just a mirage. And the assumptions that we make to derive that conclusion are exactly the assumptions we really do make to do conventional statistical mechanics! Boltzmann taught us long ago that it’s possible for heat to flow from cold objects to hot ones, or for cream to spontaneously segregate itself away from a surrounding cup of coffee — it’s just very unlikely. But when we say “unlikely” we have in mind some measure on the space of possibilities. And it’s exactly that assumed measure that would lead us to conclude, in this crazy fluctuation-world, that all of our notions of the past are chimeric.
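Carroll’s claim that “downward fluctuations in entropy are unlikely, and the larger the fluctuation the more unlikely it is” can be made concrete with the simplest toy model there is: N two-state particles at equilibrium, half on each side of a box. The probability of fluctuating to a macrostate with N/2 + k particles on one side is just a binomial count, and it falls off brutally in k. (This toy model is my illustration, not Carroll’s.)

```python
# Probability that exactly n/2 + k of n fair two-state particles
# sit on one side of the box: C(n, n/2 + k) / 2^n. Larger k means
# a bigger entropy dip, and a much smaller probability.

from math import comb

def fluctuation_probability(n, k):
    """P(a fluctuation of size k in an n-particle two-state system)."""
    return comb(n, n // 2 + k) / 2**n

n = 100
for k in (0, 10, 20, 30):
    print(k, fluctuation_probability(n, k))
```

Even at n = 100 the k = 30 fluctuation is already below one in a billion; scale n up toward Avogadro’s number and you have Carroll’s point about why the smallest fluctuation compatible with your present macrostate is overwhelmingly the most probable one.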

Our spoilsport author goes on, “Now, just like Boltzmann’s Brain, nobody believes this is true,” but, just for Pi Day, let’s suspend our disbelief until tomorrow.

Speaking of Sean Carroll, we also have Michael Bérubé’s review of his forthcoming From Eternity to Here: The Origin of the Universe and the Arrow of Time.

Time just isn’t what it used to be. And space has gotten to be a bit of a problem, as well. When I was a lad, physicists told me that they had these things pretty well figured out: they had discovered material evidence of the Big Bang, they had adjusted their conception of the age and evolution of the universe accordingly, and, having recalculated the universe’s rate of expansion (after Hubble’s disastrous miscalculations threw the field into disarray), they were working on the problem of trying to figure out whether the whole thing would keep expanding forever or would eventually slow down and snap back in a Big Crunch. The key, they said, lay in finding all the “missing mass” that would enable a Big Crunch to occur, because at the time it looked as if we only had two or three percent of the stuff it would take to bring it all back home. When I asked them why a Big Crunch, and a cyclical universe, should be preferable to a universe that just keeps going and going, they told me that the idea of a cyclical eternity was more pleasing and comfortable than the idea of a one-off event; and when I asked them what came before the Big Bang, they patted my head and told me that because the Big Bang initiated all space and time, there was no such thing as “before the Big Bang.”

But now they tell me that most of that account of the world is wrong. For one thing, the expansion of the universe seems to be accelerating, which puts a crimp in the plans of everyone who’d been counting on its eventual collapse; worse still, no one can explain why it is that the universe is different now than it was, say, 14 billion years ago, or why it will be different 14 billion years from now. For the simple and stupefying fact remains that the laws of physics are reversible; nothing in those laws prevents time from running backwards, and it’s entirely possible to have universes in which conscious entities remember the future and remark offhandedly to each other that you can’t get some eggs without breaking an omelet. And yet, our universe obeys those reversible laws of physics even though effects follow causes, old age follows youth, and systems move from states of low entropy to states of high entropy. How can this be? How might it be otherwise?

It’s above my pay grade, this much I know. But thanks in part to local fluctuations in my corner of the universe that allow me to read books before they are written (these are known technically as Borges-Boltzmann Waveforms, or more colloquially, “wrinkles in time”), I can reveal that Caltech physicist Sean Carroll will have addressed—if not quite “answered”—these questions in his new book, From Eternity to Here: The Origin of the Universe and the Arrow of Time. (Not to be confused with this superficially similar book, which has been published in parallel universe XGH0046, where Frank Viola gave up a promising baseball career in order to become a Christian writer.)

Good stuff. Go get your pie (you’ll need two slices), follow the link (not neglecting the comments thread), settle back, and enjoy the day.

Ludwig and Bertie

Background: In a NY Times review of Alexander Waugh’s The House of Wittgenstein, Jim Holt writes that Ludwig “was the greatest philosopher of the 20th century.” This inspired Brian Leiter to run a poll to “settle this once and for all” (answer: Wittgenstein by a narrow plurality). Harry Brighouse, at Crooked Timber, linked to the poll, and a long list of comments ensued.

All well and good, but not the point of this post. Tom Hurka, in comments, gives us this:

At the Edinburgh Festival in 1977 I saw a wonderful play called ‘Ludwig and Bertie.’ It was about Wittgenstein and Russell … and Bertie Wooster. You see, Russell and Wittgenstein have agreed to meet, for the first time, in the Trinity College, Cambridge library, which happens to be where Bertie Wooster is going to meet this new man he’s hired, called Jeeves. (He’s going to the library to find an ethics book and read about this ‘categorical aperitif.’) Well, various misidentifications follow, with Russell thinking Bertie is Wittgenstein (and utterly unsuited to philosophy) while Wittgenstein thinks Bertie is Russell (and the stupidest man he’s ever met). It all reaches its climax when Russell encounters Jeeves, who’s of course been the Wittgenstein family butler in Vienna and taught Ludwig everything he knows. How, Russell asks him, can the sentence ‘The present king of France is bald’ be meaningful if there’s no present king of France? ‘May I venture to suggest, sir,’ Jeeves replies, ‘that we can analyze this sentence as saying that there is one and only one x such that x is the present king of France and x is bald?’ Fantastic!

Google doesn’t yield much, but I did find this from “the cover blurb on a published version of the play” in a lit-ideas post by David Ritchie:

Bertie Wooster has become betrothed to Honoria Russell, daughter of the famous philosopher and Hefeweizen expert, Bertrand Russell. Bertie’s Aunt Dahlia, finding herself once again short of funds for her magazine, “Milady’s Untenable Propositions,” asks Bertie to break into Ludwig Wittgenstein’s bedroom in the dead of night and steal his priceless, gold-plated poker, a souvenir of the famous encounter with Professor Popper. Bertie bungles the burglary, escapes with the aid of Jeeves and goes to ground underneath a ladder in the library. The action begins with Honoria discovering what Bertie has not yet understood: that the ladder, underneath which he pretends to busy himself with the works of Spinoza, is not only not unoccupied, it is festooned with yards of Hildegard Wittgenstein, daughter of Ludwig and a Brownie leader of ferocious aspect. Honoria announces that the engagement is at an end. Hildegard announces that she has been compromised and must therefore marry Bertie. The fathers square off to debate the proposition. Jeeves saves the day and puts both of them right on minor but important points.

Please, God, I would dearly love to have a copy of the play.