Friday, March 16, 2007


From: Al-Qaeda mastermind says he beheaded U.S. reporter (Katherine Shrader, Associated Press, March 15th, 2007)

Suspected 9/11 mastermind Khalid Sheikh Mohammed confessed to the beheading of American journalist Daniel Pearl and was central to 30 other attacks and plots in the United States and worldwide that killed thousands of victims, said a revised transcript released Thursday by the U.S. military.

“I decapitated with my blessed right hand the head of the American Jew, Daniel Pearl, in the city of Karachi, Pakistan,” Mr. Mohammed is quoted as saying in a transcript of a military hearing at Guantanamo Bay, Cuba, released by the Pentagon.[...]

Sealing a legacy of historical notoriety, Mr. Mohammed portrayed himself as al-Qaeda's most ambitious operational planner in a confession to a U.S. military tribunal that said he planned and supported a series of terrorist attacks, topped by 9/11. The gruesome attacks range from the suicide hijackings of Sept. 11, 2001 — which killed nearly 3,000 — to a 2002 shooting on an island off Kuwait that killed a U.S. Marine, according to an account released by the Pentagon.

Many plots, including a previously undisclosed plan to kill several former U.S. presidents, were never carried out or were foiled by international counter-terrorism authorities.

“I was responsible for the 9/11 operation from A to Z,” Mr. Mohammed said in a statement read on Saturday during a Combatant Status Review Tribunal at the U.S. detention facility at Guantanamo Bay, Cuba. Mr. Mohammed's confession was read by a member of the U.S. military who is serving as his personal representative.[...]

Kenneth Roth, executive director of Human Rights Watch, questioned the legality of the closed-door sessions and whether the confession was actually the result of torture.

“We won't know that unless there is an independent hearing,” he said. “We need to know if this purported confession would be enough to convict him at a fair trial or would it have to be suppressed as the fruit of torture?”

In listing the 28 attacks he planned and another three he supported, Mr. Mohammed said he tried to kill international leaders including Pope John Paul II, President Bill Clinton and Pakistani President Pervez Musharraf.

He said he planned the 2002 bombing of a Kenya beach resort frequented by Israelis and the failed missile attack on an Israeli passenger jet after it took off from Mombasa, Kenya.

He also said he was responsible for the bombing of a nightclub in Bali, Indonesia. In 2002, 202 people were killed when two nightclubs there were bombed.

Other plots he said he was responsible for included planned attacks against the Sears Tower in Chicago, the Empire State Building and New York Stock Exchange in New York, the Panama Canal, and Big Ben and Heathrow Airport in London – none of which happened.

What word is missing from this lengthy account of the outrages of this despicable creature? Give up? This may help.

If the mainstream media and the tranzi human rights brigade had been around at Nuremberg, surely they would have challenged every verdict on procedural grounds and only described the defendants using ambiguous phrases like “senior German Government official” or “suspected Jewish resettlement mastermind”.

Wednesday, March 14, 2007


From: Is your baby playing with its toes yet? If not the government wants to know why (Lucy Ward, The Guardian, March 14th, 2007)

Babies will be assessed on their gurgling, babbling and toe-playing abilities when they are a few months old under a legally enforced national curriculum for children from birth to five published by the government yesterday.

Every nursery, childminder and reception class in Britain will have to monitor children's progress towards a set of 69 government-set "early learning goals", recording them against more than 500 development milestones as they go.

At five, each child will be assessed against 13 scales based on the learning goals and their score, called an early years profile, must be passed to the Department for Education and Skills.

When children enter compulsory schooling, they should be able to read simple sentences using a phonics-based approach, count reliably up to 10 and sing simple songs from memory, as well as respecting others' beliefs and learning to share and take turns.

We can’t shake this horrific vision of a whole generation of British five-year-olds being trained in secret to drop whatever they are doing the second the Government inspector arrives and all start playing with their toes while belting out a lusty rendition of You’ll Never Walk Alone.

Tuesday, March 13, 2007


From: Beyond Stones & Bones (Newsweek, March 13th, 2007)

Now the contentious part. In 2001, a team digging in Chad unearthed what it claimed was the oldest fossil of an ancestor of humans but not chimps. If so, it must have lived after the two lineages split. Trouble was, Sahelanthropus tchadensis (nicknamed Toumai, the local word for "child") lived close to 7 million years ago. The genetic data, pointing to a human-chimp split at least 1 million years later, suggest that Toumai is not the ur-hominid—the first creature ancestral only to human and not our chimp cousins—after all.

If Toumai is not our ancestor, what is he doing with such a humanlike face and teeth, which look like those of species 5 million years his junior? "A 7 million-year-old hominid should be just starting to look like a hominid, not have a trait you see so much later in the fossil record," says paleoanthropologist Bernard Wood of George Washington University. Even if he is not our ancestor, Toumai is valuable because he undermines the "begat" model of human evolution—that Toumai begat Australopithecus who begat Homo habilis who begat Homo erectus who begat Homo sapiens. That model assumes that each biological innovation, whether bipedality or a large brain or any other, evolved only once and stuck.

Instead, evolution played Mr. Potato Head, putting different combinations of features on ancient hominids then letting them vanish until a later species evolved them. "Similar traits evolved more than once, which means you can't use them as gold-plated evidence that one fossil is descended from another or that having an advanced trait means a fossil was a direct ancestor of modern humans," says Wood. "Lots of branches in the human family tree don't make it to the surface."


From: The great unread: DBC Pierre, Harry Potter ... oh yes, and David Blunkett (Paul Lewis and John Ezard, The Guardian, March 13th, 2007)

It's the literary club no author wants to belong to, but boasts the likes of Salman Rushdie, Bill Clinton, Paulo Coelho and Fyodor Dostoyevsky. A survey out today of the books Britons own but do not finish shows a surprising lack of appetite for many of the nation's most popular titles.

The bestselling book that topped the poll, DBC Pierre's Vernon God Little, has been lauded the world over - ironically, for its explosive denouement. But 35% of respondents who bought or borrowed the Man Booker-winning satire about a Texan schoolboy in a death row reality TV show failed to get to the end.

And while few can dispute the crazed popularity of JK Rowling's books amid the under 16s, the survey of 4,000 adults found 32% were not particularly fussed about the fourth in the series. Harry Potter and the Goblet of Fire beat James Joyce's 1922 novel Ulysses - running to more than 1,000 notoriously laborious pages - into second place.

Confession being good for the soul, you are invited to reveal a well-known book you only finished half of at most, but which you saw no harm in talking about as if you had finished it. For us, Lord of the Rings.

Sunday, March 11, 2007


From: Bishop demands 'better theology' of sex (Michael Valpy, Globe and Mail, March 8th, 2007)

The Christian church has a deeply flawed understanding of sex that has led to morally groundless objections to masturbation, birth control, abortion and homosexuality, says a leading Canadian Anglican bishop.

In particular, the church has been wrong for centuries on the notion that sex exists only for the purpose of procreation, Right Rev. Michael Ingham, bishop of the Greater Vancouver Diocese of New Westminster, told a conference in Ottawa last night.

"Christianity as a religion stands in need of a better theology of sexuality," he said, "a better understanding of the complex role sexuality plays in our human nature and of the purposes of God in creating us as sexual beings."

He said the church has misunderstood references to homosexuality in the Bible, wasted energy in persecuting individuals who have argued for a new understanding of sexuality, and failed to comprehend how much the Bible and church doctrines have been shaped through the lens of male experience.

Bishop Ingham's call for a new theology of sex will be felt as a shock throughout the 77-million-member Anglican Communion, Christianity's third largest denomination.

If so, they must be an easily shocked lot because this little play is into its umpteenth revival, although each time the houses are emptier.

A keen-eyed modern Rip Van Winkle awakening from a slumber of several decades might perceive a certain dislocation between the rhetoric and the action that attends many public issues in the West. For example, he might notice how many environmental activists are forever claiming to ground their terrifying predictions in science while at the same time declaring the scientific debate to be over and shouting down any further inquiry. He might wonder why many religious folks in the West, characterized widely as slaves to intolerant, absolute dogmas that brook no dissent, are getting their hands dirty trying to distinguish between Muslims who threaten them and Muslims who don’t, while many secularists proclaiming tolerance and freedom can’t wait for the glorious day Islamic culture and faith are completely eradicated. And he might notice that certain church leaders whose dawn-to-dusk calling seems to lie in preaching or defending sexual amorality are forever accusing their opponents of being fixated by sex.

Theological libertines like Ingham love to drop phrases like “the complex role sexuality plays in our human nature”, when they mean the exact opposite. What they mean is that it is complex only for those who believe in restraint and objective morality and suffer all manner of warping complexes and hang-ups as a consequence. Their sub-text is that, when the sexual apocalypse arrives, we will understand that sex stands in glorious isolation from the rest of our material, psychological and spiritual lives and none of us will give a hoot what we or anyone else does. We will go wherever the itch leads us because that’s the Divine Will. When that happens, there will be no further need to talk about it, but until then, do he and his cheerleaders ever seem to have a lot to say.


From: The Brain on the Stand (Jeffrey Rosen, New York Times, March 11th, 2007)

“To a neuroscientist, you are your brain; nothing causes your behavior other than the operations of your brain,” Greene says. “If that’s right, it radically changes the way we think about the law. The official line in the law is all that matters is whether you’re rational, but you can have someone who is totally rational but whose strings are being pulled by something beyond his control.” In other words, even someone who has the illusion of making a free and rational choice between soup and salad may be deluding himself, since the choice of salad over soup is ultimately predestined by forces hard-wired in his brain. Greene insists that this insight means that the criminal-justice system should abandon the idea of retribution — the idea that bad people should be punished because they have freely chosen to act immorally — which has been the focus of American criminal law since the 1970s, when rehabilitation went out of fashion. Instead, Greene says, the law should focus on deterring future harms. In some cases, he supposes, this might mean lighter punishments. “If it’s really true that we don’t get any prevention bang from our punishment buck when we punish that person, then it’s not worth punishing that person,” he says. (On the other hand, Carter Snead, the Notre Dame scholar, maintains that capital defendants who are not considered fully blameworthy under current rules could be executed more readily under a system that focused on preventing future harms.)

Others agree with Greene and Cohen that the legal system should be radically refocused on deterrence rather than on retribution. Since the celebrated M’Naughten case in 1843, involving a paranoid British assassin, English and American courts have recognized an insanity defense only for those who are unable to appreciate the difference between right and wrong. (This is consistent with the idea that only rational people can be held criminally responsible for their actions.) According to some neuroscientists, that rule makes no sense in light of recent brain-imaging studies. “You can have a horrendously damaged brain where someone knows the difference between right and wrong but nonetheless can’t control their behavior,” says Robert Sapolsky, a neurobiologist at Stanford. “At that point, you’re dealing with a broken machine, and concepts like punishment and evil and sin become utterly irrelevant. Does that mean the person should be dumped back on the street? Absolutely not. You have a car with the brakes not working, and it shouldn’t be allowed to be near anyone it can hurt.”

Even as these debates continue, some skeptics contend that both the hopes and fears attached to neurolaw are overblown. “There’s nothing new about the neuroscience ideas of responsibility; it’s just another material, causal explanation of human behavior,” says Stephen J. Morse, professor of law and psychiatry at the University of Pennsylvania. “How is this different than the Chicago school of sociology,” which tried to explain human behavior in terms of environment and social structures? “How is it different from genetic explanations or psychological explanations? The only thing different about neuroscience is that we have prettier pictures and it appears more scientific.”

Morse insists that “brains do not commit crimes; people commit crimes” — a conclusion he suggests has been ignored by advocates who, “infected and inflamed by stunning advances in our understanding of the brain . . . all too often make moral and legal claims that the new neuroscience . . . cannot sustain.” He calls this “brain overclaim syndrome” and cites as an example the neuroscience briefs filed in the Supreme Court case Roper v. Simmons to question the juvenile death penalty. “What did the neuroscience add?” he asks. If adolescent brains caused all adolescent behavior, “we would expect the rates of homicide to be the same for 16- and 17-year-olds everywhere in the world — their brains are alike — but in fact, the homicide rates of Danish and Finnish youths are very different than American youths.” Morse agrees that our brains bring about our behavior — “I’m a thoroughgoing materialist, who believes that all mental and behavioral activity is the causal product of physical events in the brain” — but he disagrees that the law should excuse certain kinds of criminal conduct as a result. “It’s a total non sequitur,” he says. “So what if there’s biological causation? Causation can’t be an excuse for someone who believes that responsibility is possible. Since all behavior is caused, this would mean all behavior has to be excused.” Morse cites the case of Charles Whitman, a man who, in 1966, killed his wife and his mother, then climbed up a tower at the University of Texas and shot and killed 13 more people before being shot by police officers. Whitman was discovered after an autopsy to have a tumor that was putting pressure on his amygdala. “Even if his amygdala made him more angry and volatile, since when are anger and volatility excusing conditions?” Morse asks. “Some people are angry because they had bad mommies and daddies and others because their amygdalas are mucked up. The question is: When should anger be an excusing condition?”

Still, Morse concedes that there are circumstances under which new discoveries from neuroscience could challenge the legal system at its core. “Suppose neuroscience could reveal that reason actually plays no role in determining human behavior,” he suggests tantalizingly. “Suppose I could show you that your intentions and your reasons for your actions are post hoc rationalizations that somehow your brain generates to explain to you what your brain has already done” without your conscious participation. If neuroscience could reveal us to be automatons in this respect, Morse is prepared to agree with Greene and Cohen that criminal law would have to abandon its current ideas about responsibility and seek other ways of protecting society.

Fortunately, there is no shortage of historical precedents to guide and inspire us.

Although ambiguous about where this all leads, Professor Morse is correct that there is nothing particularly original here. Each new wave of determinist thinking tends to arrive with a splash, claiming that the idea that our behaviours are influenced by genes, brains, nature, nurture, the stars, the climate or whatever is brand new and a counterpoint to a supposed universal historical belief that humans are independent actors in full control of their lives and equally capable of choosing from an infinite number of possible actions. In fact, the opposite is the case. Almost nobody believes that or ever did. Free will, moral agency and individual responsibility are gifts of monotheism, which holds that we have the capacity to rise above our largely determined natures, but not without struggle and not unaided. That belief is the historical exception to the rule and the grounding of the most prosperous, culturally rich and successful civilization in history.

Determinism is the default belief in human history. It defines paganism, which explains why aboriginal peoples and so many African communities cannot break out of endless cycles of poverty and pathology. It defined much of Asia until Asians consciously and expressly rejected their traditions to adopt Western ways. Beginning about fifty years after the Enlightenment, it came largely to define secularism. Not unlike medieval astrologers, Marx, Freud, Darwin and a host of minor others all argued that man is in the grip of forces of which he is unaware and which absolve him of responsibility for his actions and fate. Their popularity was instant and widespread, demonstrating what every lawyer knows: that people will go to the most extreme lengths to find exculpatory explanations for their actions, no matter how heinous or injurious. It is the man who genuinely admits responsibility that is the rare exception.

Neuroscience appears to be the current exciting cutting edge of determinism, which unfortunately means we will once again probably spend decades watching judges grapple with the implications of considering defendants and witnesses as mindless automatons in the controlling grip of independent cerebral forces only judges and neuroscientists can escape. One can only hope this latest attack on free will and ultimate individual responsibility, the plinth of Western civilization, does limited damage before the inevitable reaction sets in. At an intellectual level, the reaction will come when our learned scientific sages finally admit that, while their theories work great on chimps and slugs, there are too many aspects of human nature and human behaviour that simply cannot be explained by them. At a popular level, it will occur when a collective revulsion wells up from within at the gut realization that the idea we cannot control our destinies and are not responsible for our choices means there is no particular reason to move forward in life or even go on living.

Even soccer hooligans know that.

Saturday, March 10, 2007


From: Gerald Owen (National Post, March 8th, 2007)

Anna Nicole Smith's body lies a-mouldering in her Bahamian grave, but her media presence will go marching on, decelerating slowly. My favourite moment in her life-after-death so far was an interview by Larry King of Barbara Walters (the occasion for which I have forgotten), in which he asked her to explain the enormous attention being paid to Ms. Smith. Ms. Walters said she wasn't following the story and fittingly asked him to explain it himself, since he had been covering it for days on end, with little interruption. Mr. King said he didn't understand it, as if he were just an unpiloted boat being swept along by a tidal wave of popular demand and ratings (possibly, he is). But Ms. Walters defended the attention to the misfortunes of Britney Spears, on the questionable ground that she has talent, unkindly contrasting her to the newly departed soul of Ms. Smith.

There is a good case to be made that this has been the purest instance yet of celebrity culture, because it is so hard to say who Ms. Smith was. She was a kind of Platonic ideal of the phenomenon pointed out in 1961 by the historian Daniel Boorstin in his brilliant book The Image: A Guide to Pseudo-Events in America: "The celebrity is a person who is known for well-knownness."

She does not lend herself to any of the customary front-end-loaded descriptions: "Bahamas-based reality TV actress Anna Nicole Smith," "Texas-born former fried-chicken waitress," "former Playboy model" or "litigant and alleged gold digger Anna Nicole Smith" -- none of these is adequate to her. Reporting on her death, the New York Times made a brave attempt in its lead sentence, and refrained from front-end-loading: "Miami, Feb. 8 -- Anna Nicole Smith, a former Playboy centerfold, actress and television personality who was famous, above all, for being famous" -- the usual restatement of Professor Boorstin's wise saying -- "but also for being sporadically rich and chronically litigious."

Let's see now. Beauty? Nope. Talent? Nope. Achievement? Nope. Inspiration? Nope. Tragedy? Nope.

Can anyone offer an explanation for this? We can’t even craft a good theory of decline out of it.


From: A French intellectual--in the worst sense of the term (Robert Fulford, National Post, March 10th, 2007)

Jean Baudrillard, who died on Tuesday in Paris at the age of 77, was a French intellectual in the most sinister meaning of that term.

He was intoxicated by hastily concocted theories and drunk on incomprehensible explanations of world affairs. He could make any subject more obscure just by briefly visiting it. Many of his readers eventually discovered that his work, some 50 books in all, usually wasn't about what it claimed to be about. His real concern was always Baudrillard and the passionate drama of his daydreams.

His way of thinking involved intense snobbery on his part and great tolerance on the reader's. To the public and his students he said, in effect: "You poor fools are deluded by all your ideals, your dreams, your accomplishments. You think that's reality? It's a fraud, all of it. I know better."

Strange as it seems, in the 1970s much of the Western world was ready to embrace him. He and Jacques Derrida were among the most prominent members of the platoon of French imperialist intellectuals who landed on the shores of North America and conquered a whole continent.

They set up base camps on elite college campuses and soon began enlisting local recruits for their army of postmodernists, post-structuralists, post-Marxists and full-time professional obscurantists. They became an all-consuming vogue. Soon it was impossible to get through Yale without encountering them, and by the early 1990s their thoughts had penetrated Western Canada, where you could hear professors talking the ugly and mostly incomprehensible language of critical theory while students struggled pathetically to keep up. In some circles, those who didn't imitate the French stars were considered eccentric.

What is it about North American (and Australian) life that makes generation after generation of progressives, artists and intellectuals defer so slavishly to the putative superiority of European culture and thought? From early twentieth-century American expats in Paris to Swedes like Myrdal and Bergman to existentialists like Sartre and de Beauvoir to artistic weirdos like Dali to radical darlings of the sixties like Marcuse through to the cerebral pathologies of French deconstructionists, our intellectual history is marked by a repeated self-abnegating embrace of European philosophical fads we hold to far beyond their sell-by dates.

The themes are always the same: America is rough, unlettered, materialist, exploitative and (let’s face it) stupid. By contrast, Europe is learned, wise, subtle, sharing, reflective and aesthetically rich. Their toilets may not work, they may be self-immolating demographically, their economies may be in reverse, there may be riots in their streets and they may even be going through one of their periodic internecine slaughters, but my goodness, these people know how to live!

Certainly we want our children to tap into the formative richness of European cultures. There is an admittedly limited return from cathedral tours of Wyoming and Ontario or post-doctoral work on the social philosophy of Daniel Boone. But almost all of that cultural treasure-trove long pre-dates the twentieth century and has been renounced repeatedly and comprehensively by European elites for generations, sometimes with words, sometimes with guns. Yet still they come with their gobbledegook celebrating despair and decline and still we welcome them speechlessly with feigned deferential awe, secretly praying it’s all a bad dream our children will grow out of someday.

Thursday, March 8, 2007


From: Song of the Week #45 (Mark Steyn, SteynOnline, February 26th, 2007)

What do these five songs have in common?

“The Way You Look Tonight”, “Thanks For The Memory”, “Over The Rainbow”, “When You Wish Upon A Star” and “White Christmas”.

Answer: They were all Academy Award-winning songs from the Best Song Oscar’s first decade.

And what do these five songs have in common?

“When You Believe”, “You’ll Be In My Heart”, “Into The West”, “Al Otro Lado del Rio” and “It’s Hard Out Here For A Pimp”.

Answer: They were all Academy Award-winning songs from the last decade.

We’ll spare you the curmudgeon’s obligatory rant on the putridness of modern music and simply ask whether anyone can think of a song from film or Broadway from, say, the last twenty years that he or she thinks is likely to endure in the repertoire of popular, memorable favourites.


From: Canada told not to use term 'visible minorities' (Steven Edwards, National Post, March 8th, 2007)

Canada's use of the term "visible minorities" to identify people it considers susceptible to racial discrimination came under fire at the United Nations yesterday -- for being racist.

In a report on Ottawa's efforts to eliminate racial discrimination in Canada, the world body's anti-racism watchdog said the words might contravene an international treaty aimed at combatting racism.

Members of the Geneva-based Committee on the Elimination of Racial Discrimination also questioned other terms used by the federal government, among them "ethnocultural communities."[...]

"The committee is concerned that the use of the term may not be in accordance with the aims and objectives of the Convention," the report says.

It adds that Canada should "reflect further on the implications of the use of the term," but offers no suggestions about what words would be acceptable.