Sunday, December 25, 2011

The Woodmans - Portrait of the Artist as a Suicide



The Woodmans (2010)
Self-Portrait of the Artist as a Suicide
Family dynamics examined through the prism of art: “The Woodmans,” C. Scott Willis’s compelling documentary study of an artistic clan whose comfortable life was shattered by the suicide of its youngest member, asks profound questions to which there really are no answers.

Brilliant piece of work

Asks and explores, yet ultimately does not answer, several important questions


What caused Francesca Woodman, a prodigiously gifted 22-year-old photographer, to throw herself out of a window in 1981?

Friday, December 23, 2011

My Future & Possibly 1967 Mercedes-Benz 230 SL




I know some of you are having a hard time 'gifting' me this year




Might I casually suggest....??




Merry XMas to all

Thursday, December 22, 2011

While Your Heart Ran



While your heart ran from the twisted thread of truth
And the souls of my lanterns were torn into tiny fragments
All over the countryside
And the children's eyes of my family were covered
With the black blindfold of faith



When life came forth from the anxious depths of desire
When my love was not anything other than the chiming of a clock
I sensed that I must love
That I must love insanely.

Wednesday, December 21, 2011

Patricia Highsmith BIO - author of STRANGERS ON A TRAIN and THE TALENTED MR RIPLEY


Just started this - ohhhhh all of the complicated people!

Patricia Highsmith - author of STRANGERS ON A TRAIN and THE TALENTED MR RIPLEY - had more than her fair share of secrets. During her life, she felt uncomfortable about discussing the source of her fiction and refused to answer questions about her private life. Yet after her death in February 1995, Highsmith left behind a vast archive of personal documents - diaries, notebooks and letters - which detail the links between her life and her work. Drawing on these intimate papers, together with material gleaned from her closest friends and lovers, Andrew Wilson has written the first biography of an author described by Graham Greene as the 'poet of apprehension'. Wilson illuminates the dark corners of Highsmith's life, casts light on mysteries of the creative process and reveals the secrets that the writer chose to keep hidden until after her death.

Sunday, December 18, 2011

Nicolette Scorsese - National Lampoon's Christmas Vacation


She started her career as a model, which is where she met her former boyfriend, actor/model Antonio Sabato Jr.
The grand-daddy of them all - she still rocks my world every XMas when this film is shown.
Kinda like a treat for the big boys - still gorgeous BTW

Bran Van 3000. The electronica collective from Montreal




I am obsessing over this when I need to be reading Sartre - Being and Nothingness

Bran Van 3000 Drinking in LA Video





I am oddly obsessed with this video




Can anyone suggest some potential cures??





I sooooo need to be reading Sartre - Being and Nothingness right now





Oh dear!







Saturday, December 17, 2011

The Oldest Name for Truth






The oldest name for truth is ALETHEIA
The root: LETHE: hiddenness, concealment, coveredness, veiledness.
The A has a privative function.
The whole word essentially can be rendered as:
un-hiddenness
un-concealment
dis-closure
dis-covery
re-velation
The elemental Greeks viewed truth as violent and uncanny spoliation
Things wrenched from hiddenness and exposed to the light
Show me what they really are
Our words for Truth are much more banal...
but the original Greek insight still lingers in our truth
No less than the Greek ALETHEIA

Sunday, December 11, 2011

Rimbaud - Une Saison en Enfer Part Deux - A Mouthful of Poison


Painting by Brett Whiteley - Oh Pen and Chose!

I come again to my favorite poet: Arthur Rimbaud
"You have to pass an exam, and the jobs that you get are either shining shoes, or herding cows, or tend to pigs. Thank God, I don't want any of that! Damn it! And besides that they smack you for a reward; they call you an animal and it's not true, a little kid, etc.... Oh! Damn Damn Damn Damn Damn!"

I am working on a new translation of Une Saison en Enfer - I know... Pretentious much??

An amazing work - completed in his LATE TEENS! What were you doing when you were 18?? lol

It reports his suffering and descent into madness... passing close to, but not through, the gates of death... a grand and glorious failure to become a seer. He speaks of his delusions... his doubts... his fears... but also... his hopes and desires.

Let us pass a “Night in Hell”. The opening lines are powerful:

I have just swallowed a terrific mouthful of poison.
-- Blessed, blessed, blessed the advice I was given! --
My guts are on fire.
The power of the poison twists my arms and legs, cripples me, drives me to the ground.
I die of thirst, I suffocate, I cannot cry.
This is Hell, eternal torment!
See how the flames rise! I burn as I ought to.
Go on, Devil!


The poison represents the drugs and other methods he used to derange his senses in order to become a visionary. Despite being tortured by hallucinations, Rimbaud vaunts his talent and his imagination:

I will tear the veils from every mystery: mysteries of religion or of nature, death, birth, the future, the past, cosmogony, and nothingness.
I am a master of phantasmagoria.


The night in hell is followed by two “Deliriums.” The first, “Delirium I” or “The Foolish Virgin -- The Infernal Bridegroom”, is a confession spoken by the Foolish Virgin (Verlaine), describing the traits of the Infernal Bridegroom (Rimbaud). We get a hard look at the relationship of the two men, and Rimbaud does not spare himself. The Virgin at one point says “I go where he goes, I have to. And lots of times he gets mad at me, at me, poor sinner. That Devil! He really is a Devil, you know, and not a man.”

Existentialism Unbound - Sartre's Bad Faith Waiter - Que Sais-Je??


What is one to make of Sartre's treatment of his waiter in one of his famous analyses of bad faith?
The example is supposed to be an obvious one, but the more we examine it, the less obvious it becomes.
Let us remind ourselves of Sartre's example: Let us consider this waiter in the cafe. His movement is quick and forward, a little too precise, a little too rapid.

He comes toward the patrons with a step a little too quick. He bends forward a little too eagerly; his voice, his eyes express an interest a little too solicitous for the order of the customer.
Finally there he returns, trying to imitate in his walk the inflexible stiffness of some kind of automaton while carrying his tray with the recklessness of a tight-rope walker, putting it in a perpetually unstable, perpetually broken equilibrium which he perpetually re-establishes by a light movement of the arm and hand.

What is to be our reaction to the actions of this waiter? The sentences immediately following those I have quoted may mislead us about the answer Sartre would have us give to this question:
All his behaviour seems to us a game.

He applies himself to chaining his movements as if they were mechanisms, the one regulating the other; his gestures and even his voice seem to be mechanisms; he gives himself the quickness and pitiless rapidity of things. He is playing, he is amusing himself.
But what is he playing? We need not watch long before we can explain it: he is playing at being a waiter in a cafe. There is nothing there to surprise us.

If we were to follow the suggestion which seems to be in these lines, Sartre's waiter would be seen as one who could be contrasted with a waiter of a more natural disposition. Some men, while waiting on tables in a cafe, play at being waiters. Others are waiters.

Sartre's description would not need much extending in order to be a depiction of a Uriah Heep-like waiter. Whether we say that he is consciously play-acting or not, we can say that Sartre's waiter is in danger of becoming a caricature of a waiter.

Thursday, December 1, 2011

Shakespeare - SONNET 129 - The expense of spirit in a waste of shame


SONNET 129
The expense of spirit in a waste of shame
Is lust in action; and till action, lust
Is perjured, murderous, bloody, full of blame,
Savage, extreme, rude, cruel, not to trust,
Enjoy'd no sooner but despised straight,
Past reason hunted, and no sooner had
Past reason hated, as a swallow'd bait
On purpose laid to make the taker mad;
Mad in pursuit and in possession so;
Had, having, and in quest to have, extreme;
A bliss in proof, and proved, a very woe;
Before, a joy proposed; behind, a dream.
All this the world well knows; yet none knows well
To shun the heaven that leads men to this hell.

This poem, as incisive an anatomy of erotic compulsion as exists in English, begins by evoking “the expense of spirit in a waste of shame” and cycles through the rages and frustrations of lust before collapsing in exhausted fatalism:

All this the world well knows; yet none knows well
to shun the heaven that leads men to this hell.


This, one of the most famous sonnets, explores the reaction of the human psyche to the promptings of sexual urges. The folk wisdom of omne animal post coitum triste est, which is often quoted in connection with this sonnet, is banal in comparison to the ideas developed here. One has to look back to the ancient Greek world, and to the plays of Euripides, especially The Bacchae and Hippolytus, to find an equivalent. Particularly striking is the torrent of adjectives describing the build up of desire, and the imagery of the hooked fish which portrays the victim of lust as a frenzied animal expending its last vital energies in paroxysms of rage and futile struggle, even though it is inevitably doomed.

In relation to the sonnet sequence as a whole, it is worth noting that nothing like this is found in the series to the young man. The profound hatred of sexuality does not occur within that context, where the passions expressed are undying and lofty, although often intermingled with sexual humour and puns.

Tuesday, November 29, 2011

Elvis Presley: Young Man With the Big Beat - When He Was Still Dangerous and Sexy


Elvis Presley: Young Man With the Big Beat

RCA/Legacy, five CDs, $85. In the span of one pivotal year, a Southern rockabilly singer became the biggest star on the planet. This set, titled after a poster campaign for his first album, chronicles Presley’s explosive rise in 1956 with his complete RCA studio master recordings, live cuts, outtakes from New York, Nashville and Memphis, interviews and an 80-page book with rare photos, documents and a richly detailed timeline.

Don’t Be Cruel, Love Me Tender, Hound Dog, Blue Suede Shoes and other hits are here, as well as a previously unreleased concert in Shreveport, La., remasters of four songs performed at the Frontier Hotel in Las Vegas, and segments of his candid “The Truth About Me” monologue.

Monday, November 28, 2011

The Asteroids Galaxy Tour - "The Golden Age".


The Asteroids Galaxy Tour - "The Golden Age".
How can you not love this spiffy and shiny group??
A Danish pop band consisting of vocalist Mette Lindberg and producer Lars Iversen.
As seen on Gossip Girl... and most importantly the Heinie adverts from the Heineken Channel.

Sunday, November 27, 2011

Neruda Poem XIV - My words rained over you, stroking you.


Every day you play with the light of the universe.
Subtle visitor, you arrive in the flower and the water.
You are more than this white head that I hold tightly
as a cluster of fruit, every day, between my hands.

You are like nobody since I love you.
Let me spread you out among yellow garlands.
Who writes your name in letters of smoke among the stars of the south?
Oh let me remember you as you were before you existed.

Suddenly the wind howls and bangs at my shut window.
The sky is a net crammed with shadowy fish.
Here all the winds let go sooner or later, all of them.
The rain takes off her clothes.

The birds go by, fleeing.
The wind. The wind.
I can contend only against the power of men.
The storm whirls dark leaves
and turns loose all the boats that were moored last night to the sky

Sunday, November 20, 2011

Martha Argerich - Magician on the Piano and Chopin No less!


Caught Kiss Me Deadly - apocalyptic, nihilistic, science-fiction film noir - what more is there to say??


Caught Kiss Me Deadly today
Amazing film said to be: the definitive, apocalyptic, nihilistic, science-fiction film noir of all time – at the close of the classic noir period.
Amazing for the time - and filled with a vision of what was to come via Sergio among others.

Shot in 22 days and released in May of that paradigm-shift year 1955, Kiss Me Deadly was directed by Robert Aldrich from a screenplay by proletarian poet A.I. 'Buzz' Bezzerides, taking plot incident and little else from Mickey Spillane's sixth novel starring his hugely popular skull-cracking hero Mike Hammer. The narrator of a DVD documentary supplement about crew-cut Spillane, a "blue-collar writer of comic books", calls it "a left-wing attempt by Aldrich to undermine and criticise the conservative Spillane". In a recorded introduction, Alex Cox reveals that "disguised as a tough-guy detective picture", Kiss Me Deadly "is actually an anti-nuclear parable with classical allusions".

"Actually," Kiss Me Deadly is both things - it wouldn't have any shelf life if it weren't a pluperfect genre film in which every investigative episode has its own kook rhythm. The mongrel authorship contributes to the feeling of a movie that coalesced out of the atmosphere rather than being storyboarded. Gross contrasts define this film, not least that between the rotting filigree of the previous century, in the gingerbread boarding houses and faded gentility of LA's Bunker Hill neighbourhood where Hammer does his snooping, and the streamlined Atomic Age that he's the avatar of, with his clean-machine Jaguar XK and neat-as-a-pin mid-century-modern Wilshire Boulevard bachelor pad. The concluding white-light apocalypse is Armageddon as the epitome of 1950s tidiness: the Big One as Big Disinfectant.

Saturday, November 19, 2011

I don't know quite what to say?? !! Yngwie Malmsteen - Concerto Suite in Eb minor For Electric Guitar

I am equal parts appalled and amazed
Those Japanese are truly crazy folks
Allow me to let M. de Montaigne express my feelings:
~ We are, I know not how, double in ourselves, so that what we believe we disbelieve, and cannot rid ourselves of what we condemn.

Essay Today - Nietzsche The Gay Science - Amor Fati and Fatalism


Aphorism 312
My Dog - I have named my pain and called it 'dog'.

That is it - my complete essay in one sentence... wild.... neh??
Ohhhh Yeahhhhh - I did soooo use a Shogun reference

Fatalism (Amor Fati)
Taking risks requires accepting the consequences, and this sort of fatalism appeals to Nietzsche. I think that in this he is very much in league with Sartre, who may not have believed in fate but certainly did insist that one must accept the consequences of his or her actions. Fatalism appealed to Nietzsche in his analysis of the ancient Greeks, their acceptance of life and their fate in the face of absurdity and suffering. It appealed to him in considering his own miserable life, the triumph of his genius in the face of his own absurd suffering. “Not just to accept fate,” he exclaims, toward the end of his life, “but to love it, amor fati!”  I will have a good deal more to say about Nietzsche’s “classical” concept of fatalism in the next chapter. But taking fatalism as a virtue, it is of a piece with his overall insistence on “life affirmation” and his rejection of Schopenhauer’s pessimism. To accept joyfully rather than bitterly curse one’s fate—and Nietzsche surely had a good deal in his life to bitterly curse—is one of life’s greatest virtues.

The question then becomes whether Nietzsche’s many comments and occasional arguments in favor of “the love of fate” (amor fati) and against “free will” undermine any interpretation of his philosophy in existentialist and “self-making” terms. I take it that some such conception of self-making or self-creation is central to both Kierkegaard and Sartre, at least, and as such I take it to be the definitive core of that exquisite sensibility called “Existentialism.” I want to argue that Nietzsche’s fatalism and his “self-making” are ultimately two sides of the same coin and not at odds or contradictory. Nietzsche embraces the notions of responsibility—in particular, the responsibility for one’s character and “who one is”—but without invoking “free will.”

The Epic of Gilgamesh - Re-read many times! ;-)


Such an amazing work - and such a great historical tale.
You can see all of the Bible, Homer and Plato starting to be reflected in these pages.
Do the Andrew George version - my personal fav and I have read six different ones!

Gilgamesh was king in Uruk, the great walled city. On the sheepfold of Uruk he lifts his gaze high.
But Gilgamesh oppressed his people.
Day and night he oppresses the weak
Gilgamesh does not let the young woman go to her mother,
The girl to the warrior, the bride to the young groom
So the people cried out to the gods and the mother, the great lady heard them, and formed Enkidu in the wilderness from a pinch of clay.
She gave birth in darkness and silence to one like the war god
He knew neither people nor homeland
He fed with the gazelles on grass
With wild animals he drank at the waterholes
With hurrying animals his heart grew light.
He came into the city and fought with Gilgamesh. They fought in the market. They fought in the temples. They fought on the walls. All day they fought until dusk, at the gates to the city,
Gilgamesh at last threw down Enkidu. Gilgamesh shaped his mouth to speak and praised Enkidu, saying “who is like you, you have no match. Let me cease from oppressing these people, and let us be friends.”
Enkidu stood there, listening to his words.
They caused him to grow pale.
He sat down, weeping;
His eyes filled with tears,
His arms went slack, his strength left him.
And they seized one another, embracing,
Took one another’s hands like brothers
Enkidu spoke words to Gilgamesh: friend.

Wednesday, November 16, 2011

Boltzmann’s Entropy Formula - A Very Beautiful Equation



Boltzmann’s Entropy Formula

Nature loves chaos when it pushes systems toward equilibrium, and geeks call this universal property entropy.

Austrian physicist Ludwig Boltzmann laid entropy’s statistical foundations; his work was so important that the great physicist Max Planck suggested that his version of Boltzmann’s formula be engraved on Boltzmann’s tombstone in Vienna.

The equation reads S = k log W. It describes the tight relationship between entropy (S) and the number of ways the particles in a system can be arranged (k log W). The last part is tricky: k is Boltzmann's constant, and W is the number of distinct microscopic states (e.g., the positions and momenta of individual gas atoms) available to a macroscopic system in a given state of balance (e.g., gas sealed in a bottle).
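To make the counting concrete, here is a small Python sketch of my own (an illustrative addition, not from the original post) that evaluates S = k ln W for a toy system of 100 two-state particles, where W is simply the number of ways to arrange a given number of 'up' states:

import math

K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(n_particles, n_up):
    # W = number of microscopic arrangements consistent with the macrostate
    # "exactly n_up of the n_particles are in the 'up' state"
    w = math.comb(n_particles, n_up)
    return K_B * math.log(w)  # S = k ln W

for n_up in (0, 10, 50):
    print(n_up, math.comb(100, n_up), boltzmann_entropy(100, n_up))

The evenly mixed macrostate (n_up = 50) has roughly 1e29 microstates while the perfectly ordered one (n_up = 0) has exactly one, so the mixed state carries far more entropy; that, in a nutshell, is why systems left to themselves drift toward it.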

Lifetime of Listening - THE DOORS


The best piece of advice I’ve heard someone give an aspiring rock critic is this: For God’s sake, don’t try to write like Greil Marcus.

THE DOORS
A Lifetime of Listening to Five Mean Years
By Greil Marcus

It was meant as a compliment. Mr. Marcus’s style — brainy but fevered, as if the fate of Western society hung on a chord progression — is nearly impossible to mimic without sounding portentous and flatulent. This voice is so hard to pull off that 15 percent of the time even Mr. Marcus can’t do it. He takes a pratfall in the attempt.

But, oh my, that other 85 percent. Reading Mr. Marcus at his best — on Bob Dylan, Van Morrison, Sly Stone, the Band, Sleater-Kinney, Dock Boggs or Randy Newman, to name just a few of his obsessions over the years — is like watching a surfer glide shakily down the wall of an 80-foot wave, disappear under a curl for a deathly eternity, then soar out the other end. You practically feel like applauding. He makes you run to your iPod with an ungodly itch in your cranium. You want to hear what he hears. It’s as if he were daring you to get as much out of the music as he does.

Mr. Marcus’s acute and ardent new book, “The Doors: A Lifetime of Listening to Five Mean Years,” is his 13th and among his best. I say this as someone who has never cared deeply or even shallowly about the Doors, a band that to my ears (I was 6 in 1971, the year Jim Morrison died in Paris) has always been classic-rock sonic wallpaper. “The End” sounded ruinous and sublime in “Apocalypse Now.” But please don’t make me listen to “Hello, I Love You” or “Touch Me” again. I’m pretty sure Jose Feliciano will be singing “Light My Fire” in hell.

Mr. Marcus’s achievement in “The Doors” is to isolate and resurrect this band’s best music and set it adrift in a swirling and literate cultural context. He catches “the sweep, the grandeur, the calmness” of their songs. He underscores Jim Morrison’s otherworldly appeal: “Unlike any rock ’n’ roll singer since ‘Heartbreak Hotel’ devoured the world’s airwaves, he had Elvis’s Greek-god looks, his seductive vampire’s hooded eyes; like Elvis he communicated the disdain of the beautiful for the ordinary world.”

He captures, excellently, how Morrison unnerved, during the Charles Manson years, everyone who saw the band. “Here’s this nice-looking person on the stage all but threatening you with a spiritual death penalty,” Mr. Marcus writes, “and turning you into a jury that convicts yourself.”

Don’t come to “The Doors” looking for a history of the band; Mr. Marcus dispenses with that in one short paragraph in front. Don’t arrive looking for another overview of Morrison’s childhood, or a fresh account of his arrest in Miami in 1969 for exposing himself onstage.

Don’t come looking for hagiography, either. He’s fully aware that some of the band’s music “carries the smell of falsity, pretension, bad poetry.” Some of the Doors’ music was so fetid, Mr. Marcus writes, that “Morrison sounded as if he had a bag over his face, so no one would know who was singing it.”

This book is broken down into short chapters, most of them named for songs Mr. Marcus either admires or has something to say about: “Roadhouse Blues,” “L.A. Woman,” “The End,” “When the Music’s Over,” “Soul Kitchen.” He spends a lot of time listening to bootleg live performances of these songs, as well as concert moments like the band’s 11-minute cover of the Elvis song “Mystery Train.” He is a close reader of the band’s “language of dread.”

Mr. Marcus is old enough — he was born in 1945 — to have seen the Doors perform live a dozen or so times. He’s old enough, too, that those of us who’ve kept up with his work over the years worry that he might lose a step someday. He hasn’t. And he can still surprise you. In this book he professes his admiration for Oliver Stone’s oft criticized 1991 movie “The Doors,” which starred Val Kilmer as Morrison. “The movie should have been awful,” Mr. Marcus writes. “Instead it was terrifying.”

He surprises in other ways. Who’d have thought Mr. Marcus would have loved Lady Gaga’s song “Bad Romance,” or (it hurts me a bit to type this) Train’s “Hey, Soul Sister”? Or that he’s a fan of the 1990 Christian Slater movie “Pump Up the Volume”?

Along the way, in “The Doors,” Mr. Marcus bounces his assessments of the band’s music against the work of other musicians, writers and artists, from Thomas Pynchon to the blues singer Robert Johnson, to the dramatist Dennis Potter and the Firesign Theater.

He quotes others shrewdly. Here’s a small X-ray of the Oliver Stone movie from the writer Eve Babitz: “Val Kilmer is supposed to have gotten Jim’s looks exactly right, but what can Val Kilmer know of having been fat all his life and suddenly one summer taking so much LSD and waking up a prince? Val Kilmer has always been a prince, so he can’t have the glow.”

Mr. Marcus dilates at length, too, on a quotation from Kim Gordon, the bassist for the band Sonic Youth. “People pay,” Ms. Gordon wrote in 1983 about the best rock musicians, “to see others believe in themselves.” The author follows this tangent a long way, and follows others too, including the way FM radio has compressed so many careers into a few songs and the idiocies of ’60s nostalgia.

He is pointed about why bands need hit songs. “If ‘Light My Fire’ hadn’t made the Doors into stars,” Mr. Marcus suggests, “you can hear how their music could have curdled into artiness, everything self-referential, post-modern, each note a parody of something else, not a word needing to mean what it said, the group more popular in Paris or Milan.”

The Doors will never mean to me what they mean to Mr. Marcus. But this book means more to me than most rock books. Thankfully there’s little history, little reporting and few facts in it. As Jim Morrison said in a 1967 interview, in a line Mr. Marcus happily reprints, “Critical essays are really where it’s at.”

Monday, November 14, 2011

Goya's Black Paintings - Witches' Sabbath


It really must be admitted that things seen in sleep are, as it were, painted images, which could have been produced only in the likeness of true things. - Rene Descartes, Discourse on Method and Meditations on First Philosophy

Fowl take flight as an unearthly entourage of powerful male nudes, infant victims, an old crone, goats, and strange skeletal yet animate creatures rush along to the sound of a horn, perhaps also wind and wailing. A solitary witch squats amid the hindquarters of a Leviathan-like skeleton, directly above a crouching male nude, the bottoms of his feet thrust toward the viewer (not to mention his buttocks), whereas four fully extended nudes, three of them so youthful as to be yet unbearded, propel themselves forward through the pictorial space like stallions at a canter.

Lo stregozzo, or the procession to a witches' Sabbath, is a large engraving (almost 12 by about 25 inches) of debatable but early sixteenth-century date, whose inventor and engraver are uncertain. The existing scholarly consensus that the print be classified as essentially Roman and basically High Renaissance in design is reconsidered here. Such a conception of this work, at the very least, fails to do justice to the complexity of its reference. But I hope to establish more than a dismantling of this engraving's usual classification: I believe this object documents a very rare intersection between heretical and artistic instances of fantasia. By addressing a pressing dispute about the manifestations of demonic power, which hinged on determining the boundary between imagination and fact, this work of art, relatively little known today, was intended to play an unusually critical role in molding opinion about extra-artistic matters in early modern northern and central Italy. The crux of Lo stregozzo lies less in simply recognizing the subject, which is not learned, than in answering the following question: does it represent this nocturnal cavalcade as fact or fiction? Even the learned held differing opinions about this question.

Is Neuroscience the Death of Free Will? The Meaning of Self


Is free will an illusion? Some leading scientists think so. For instance, in 2002 the psychologist Daniel Wegner wrote, “It seems we are agents. It seems we cause what we do… It is sobering and ultimately accurate to call all this an illusion.” More recently, the neuroscientist Patrick Haggard declared, “We certainly don’t have free will. Not in the sense we think.” And in June, the neuroscientist Sam Harris claimed, “You seem to be an agent acting of your own free will. The problem, however, is that this point of view cannot be reconciled with what we know about the human brain.”

Such proclamations make the news; after all, if free will is dead, then moral and legal responsibility may be close behind. As the legal analyst Jeffrey Rosen wrote in The New York Times Magazine, “Since all behavior is caused by our brains, wouldn’t this mean all behavior could potentially be excused? … The death of free will, or its exposure as a convenient illusion, some worry, could wreak havoc on our sense of moral and legal responsibility.”

Indeed, free will matters in part because it is a precondition for deserving blame for bad acts and deserving credit for achievements. It also turns out that simply exposing people to scientific claims that free will is an illusion can lead them to misbehave, for instance, cheating more or helping others less. [1] So, it matters whether these scientists are justified in concluding that free will is an illusion.

Here, I’ll explain why neuroscience is not the death of free will and does not “wreak havoc on our sense of moral and legal responsibility,” extending a discussion begun in Gary Gutting’s recent Stone column. I’ll argue that the neuroscientific evidence does not undermine free will. But first, I’ll explain the central problem: these scientists are employing a flawed notion of free will. Once a better notion of free will is in place, the argument can be turned on its head. Instead of showing that free will is an illusion, neuroscience and psychology can actually help us understand how it works.

When Haggard concludes that we do not have free will “in the sense we think,” he reveals how this conclusion depends on a particular definition of free will. Scientists’ arguments that free will is an illusion typically begin by assuming that free will, by definition, requires an immaterial soul or non-physical mind, and they take neuroscience to provide evidence that our minds are physical. Haggard mentions free will “in the spiritual sense … a ghost in the machine.” The neuroscientist Read Montague defines free will as “the idea that we make choices and have thoughts independent of anything remotely resembling a physical process. Free will is the close cousin to the idea of the soul” (Current Biology 18, 2008).[2] They use a definition of free will that they take to be demanded by ordinary thinking and philosophical theory. But they are mistaken on both counts.

We should be wary of defining things out of existence. Define Earth as the planet at the center of the universe and it turns out there is no Earth. Define what’s moral as whatever your God mandates and suddenly most people become immoral. Define marriage as a union only for procreation, and you thereby annul many marriages.

The sciences of the mind do give us good reasons to think that our minds are made of matter. But to conclude that consciousness or free will is thereby an illusion is too quick. It is like inferring from discoveries in organic chemistry that life is an illusion just because living organisms are made up of non-living stuff. Much of the progress in science comes precisely from understanding wholes in terms of their parts, without this suggesting the disappearance of the wholes. There’s no reason to define the mind or free will in a way that begins by cutting off this possibility for progress.

Our brains are the most complexly organized things in the known universe, just the sort of thing that could eventually make sense of why each of us is unique, why we are conscious creatures and why humans have abilities to comprehend, converse, and create that go well beyond the precursors of these abilities in other animals. Neuroscientific discoveries over the next century will uncover how consciousness and thinking work the way they do because our complex brains work the way they do.

These discoveries about how our brains work can also explain how free will works rather than explaining it away. But first, we need to define free will in a more reasonable and useful way. Many philosophers, including me, understand free will as a set of capacities for imagining future courses of action, deliberating about one’s reasons for choosing them, planning one’s actions in light of this deliberation and controlling actions in the face of competing desires. We act of our own free will to the extent that we have the opportunity to exercise these capacities, without unreasonable external or internal pressure. We are responsible for our actions roughly to the extent that we possess these capacities and we have opportunities to exercise them.

These capacities for conscious deliberation, rational thinking and self-control are not magical abilities. They need not belong to immaterial souls outside the realm of scientific understanding (indeed, since we don’t know how souls are supposed to work, souls would not help to explain these capacities). Rather, these are the sorts of cognitive capacities that psychologists and neuroscientists are well positioned to study.

This conception of free will represents a longstanding and dominant view in philosophy, though it is typically ignored by scientists who conclude that free will is an illusion. It also turns out that most non-philosophers have intuitions about free and responsible action that track this conception of free will. Researchers in the new field of experimental philosophy study what “the folk” think about philosophical issues and why. For instance, my collaborators and I have found that most people think that free will and responsibility are compatible with determinism, the thesis that all events are part of a law-like chain of events such that earlier events necessitate later events.[3] That is, most people judge that you can have free will and be responsible for your actions even if all of your decisions and actions are entirely caused by earlier events in accord with natural laws.

Our studies suggest that people sometimes misunderstand determinism to mean that we are somehow cut out of this causal chain leading to our actions. People are threatened by a possibility I call “bypassing” — the idea that our actions are caused in ways that bypass our conscious deliberations and decisions. So, if people mistakenly take causal determinism to mean that everything that happens is inevitable no matter what you think or try to do, then they conclude that we have no free will. Or if determinism is presented in a way that suggests all our decisions are just chemical reactions, they take that to mean that our conscious thinking is bypassed in such a way that we lack free will.

Even if neuroscience and psychology were in a position to establish the truth of determinism — a job better left for physics — this would not establish bypassing. As long as people understand that discoveries about how our brains work do not mean that what we think or try to do makes no difference to what happens, then their belief in free will is preserved. What matters to people is that we have the capacities for conscious deliberation and self-control that I’ve suggested we identify with free will.

But what about neuroscientific evidence that seems to suggest that these capacities are cut out of the causal chains leading to our decisions and actions? For instance, doesn’t neuroscience show that our brains make decisions before we are conscious of them such that our conscious decisions are bypassed? With these questions, we can move past the debates about whether free will requires souls or indeterminism — debates that neuroscience does not settle — and examine actual neuroscientific evidence. Consider, for instance, research by neuroscientists suggesting that non-conscious processes in our brain cause our actions, while conscious awareness of what we are doing occurs later, too late to influence our behavior. Some interpret this research as showing that consciousness is merely an observer of the output of non-conscious mechanisms. Extending the paradigm developed by Benjamin Libet, John-Dylan Haynes and his collaborators used fMRI research to find patterns of neural activity in people’s brains that correlated with their decision to press either a right or left button up to seven seconds before they were aware of deciding which button to press. Haynes concludes: “How can I call a will ‘mine’ if I don’t even know when it occurred and what it has decided to do?”

However, the existing evidence does not support the conclusion that free will is an illusion. First of all, it does not show that a decision has been made before people are aware of having made it. It simply finds discernible patterns of neural activity that precede decisions. If we assume that conscious decisions have neural correlates, then we should expect to find early signs of those correlates “ramping up” to the moment of consciousness. It would be miraculous if the brain did nothing at all until the moment when people became aware of a decision to move. These experiments all involve quick, repetitive decisions, and people are told not to plan their decisions but just to wait for an urge to come upon them. The early neural activity measured in the experiments likely represents these urges or other preparations for movement that precede conscious awareness.

This is what we should expect with simple decisions. Indeed, we are lucky that conscious thinking plays little or no role in quick or habitual decisions and actions. If we had to consciously consider our every move, we’d be bumbling fools. We’d be like perpetual beginners at tennis, overthinking every stroke. We’d be unable to speak fluently, much less dance or drive. Often we initially attend consciously to what we are doing precisely to reach the point where we act without consciously attending to the component decisions and actions in our complex endeavors. When we type, tango, or talk, we don’t want conscious thinking to precede every move we make, though we do want to be aware of what we’re doing and correct any mistakes we’re making. Conscious attention is relatively slow and effortful. We must use it wisely.

We need conscious deliberation to make a difference when it matters — when we have important decisions and plans to make. The evidence from neuroscience and psychology has not shown that consciousness doesn’t matter in those sorts of decisions — in fact, some evidence suggests the opposite. We should not begin by assuming that free will requires a conscious self that exists beyond the brain (where?), and then conclude that any evidence that shows brain processes precede action thereby demonstrates that consciousness is bypassed. Rather, we should consider the role of consciousness in action on the assumption that our conscious deliberation and rational thinking are carried out by complex brain processes, and then we can examine whether those very brain processes play a causal role in action.

For example: suppose I am trying to decide whether to give $1,000 to charity or buy a new TV. I consciously consider the reasons for each choice — e.g., how it fits with my goals and values. I gather information about each option. Perhaps I struggle to overcome my more selfish motivations. I decide based on this conscious reasoning (it certainly would not help if I could magically decide on no basis at all), and I act accordingly. Now, let’s suppose each part of this process is carried out by processes in my brain. If so, then to show that consciousness is bypassed would require evidence showing that those very brain processes underlying my conscious reasoning are dead-ends. It would have to show that those brain processes do not connect up with the processes that lead to my typing my credit card number into the Best Buy Web site (I may then regret my selfish decision and re-evaluate my reasons for my future decisions).

None of the evidence marshaled by neuroscientists and psychologists suggests that those neural processes involved in the conscious aspects of such complex, temporally extended decision-making are in fact causal dead ends. It would be almost unbelievable if such evidence turned up. It would mean that whatever processes in the brain are involved in conscious deliberation and self-control — and the substantial energy these processes use — were as useless as our appendix, that they evolved only to observe what we do after the fact, rather than to improve our decision-making and behavior. No doubt these conscious brain processes move too slowly to be involved in each finger flex as I type, but as long as they play their part in what I do down the road — such as considering what ideas to type up — then my conscious self is not a dead end, and it is a mistake to say my free will is bypassed by what my brain does.

So, does neuroscience mean the death of free will? Well, it could if it somehow demonstrated that conscious deliberation and rational self-control did not really exist or that they worked in a sheltered corner of the brain that has no influence on our actions. But neither of these possibilities is likely. True, the mind sciences will continue to show that consciousness does not work in just the ways we thought, and they already suggest significant limitations on the extent of our rationality, self-knowledge, and self-control. Such discoveries suggest that most of us possess less free will than we tend to think, and they may inform debates about our degrees of responsibility. But they do not show that free will is an illusion.

If we put aside the misleading idea that free will depends on supernatural souls rather than our quite miraculous brains, and if we put aside the mistaken idea that our conscious thinking matters most in the milliseconds before movement, then neuroscience does not kill free will. Rather, it can help to explain our capacities to control our actions in such a way that we are responsible for them. It can help us rediscover free will.

Sunday, November 13, 2011

Heidegger in Five Bullet Points - Being and Time


While you really should try and get through all 500 pages of Being and Time - the Macquarrie Edition please!
Here is a short cheat sheet that will make you sound like you know what you are talking about... as long as you can keep your mouth shut afterwards!
1. Humans are Beings that care about what they are - we care about our lives, our surroundings.
2. Humans always take some perspective or active stand on their life. We are actually the sum of our stands.
3. In addition to #2 - Humans are simply what we do - there is no magical essence given to us by our parents or by a mysterious force ahead of time.
4. We are defined by how we live our life - all of our life - we are non-essentialist beings
5. You knew this was coming based on the title of the book - Our own existence will always have a Temporal structure that is unique to us: Thrownness, Futurity and lastly Being toward the end.
Thank you verrrry much.... Elvis Heidegger has left the building.

Aristotelian Baked Broccoli Rabe With Parmesan - very yummy


Baked Broccoli Rabe With Parmesan
Yield 6 servings

Time 25 minutes

Ingredients
Salt
3 pounds broccoli rabe, washed and trimmed
2 to 4 tablespoons olive oil, plus some for greasing the pan
10 garlic cloves, peeled and sliced
1 cup grated Parmigiano-Reggiano.
Method
1. Bring a large pot of salted water to boil. Preheat the oven to 350 degrees. Drop broccoli rabe into the water, and cook until bright green and tender, about 3 minutes. Remove, and plunge into ice water. Drain.
2. Place 2 tablespoons olive oil in a skillet, and turn heat to medium-high. Toast garlic in oil until golden. Chop broccoli rabe into pieces, about an inch or two long, and add to skillet. Toss, then turn off heat.
3. Add broccoli rabe mixture to a baking dish. Sprinkle with grated cheese, and bake until cheese melts, about 10 minutes. Serve hot.

Saturday, November 12, 2011

Brown Rice Magic - Pilaf, Stew, Salad, Cakes


1. PILAF
Garlic and Parsley

Cook 1 tablespoon minced garlic in 2 tablespoons butter for 2 minutes. Add 1 1/2 cups brown basmati (or other) rice and cook, stirring, about 3 minutes. Add 3 cups stock, bring to a boil, lower the heat and cover. Cook until the liquid is absorbed and the rice tender, 40 to 50 minutes. Stir in 1/2 cup chopped parsley, cover and let rest for 10 minutes. Garnish: Chopped parsley.

Sausage, Red Peppers and Onions

Use olive oil instead of butter. Cook 1 sliced onion, 1 sliced red bell pepper and 8 ounces sliced or chunked Italian sausage in the oil before adding the garlic. Substitute basil for the parsley.

Shrimp, Scallions and Snow Peas

Use neutral oil (like corn) instead of butter. Cook 1/2 cup chopped scallions in oil before adding garlic. Add 8 ounces peeled shrimp (chopped, if large), 1 cup snow peas, 1 tablespoon soy sauce and 1 teaspoon sesame oil for the last 5 minutes of cooking. Garnish: Chopped cilantro.

2. STEW
Fried Egg and Chives

Combine 1 1/2 cups brown rice with 3 cups water over high heat. Bring to a boil, lower the heat and partly cover. Cook, stirring occasionally and adding more water if necessary, until the rice is tender and thick, 45 to 60 minutes. Stir in 1/2 cup chopped chives. Meanwhile, fry 4 eggs in 1 tablespoon sesame oil. Serve the eggs over the rice. Garnish: Chopped chives.

Jerk Chicken

Skip the chives and eggs. First, sear 4 bone-in chicken thighs in 3 tablespoons olive oil. Add 1 chopped onion and cook for 5 minutes; add 1 tablespoon minced ginger, 1 teaspoon thyme, 1/2 teaspoon minced habanero chili, 1 teaspoon allspice, 1/2 teaspoon cinnamon, rice and the water; proceed as above. Garnish: Thyme leaves.

Coconut and Molasses

Skip the chives and eggs. Substitute 1 can coconut milk for 1 1/2 cups of the water and add 1/2 cup shredded unsweetened coconut to the rice. Serve drizzled with molasses.

3. SALAD
White Bean, Lemon and Tomato

Combine 1 1/2 cups brown rice with 2 1/2 cups water over high heat. Bring to a boil, lower the heat and cook until the liquid is absorbed and the rice is tender, 40 to 45 minutes. Chill if time allows. Toss with 1 cup cooked white beans, 1 cup halved cherry tomatoes, 1/4 cup chopped dill, 1/4 cup olive oil, 2 tablespoons lemon juice and 1 tablespoon minced garlic. Garnish: Chopped dill.

Grape and Ricotta

Substitute ricotta cheese for the white beans, grapes for the cherry tomatoes, basil for the dill and 1 chopped small shallot for the garlic.

Broccoli, Pine Nut and Sage

Steam 2 cups broccoli florets until just tender, about 5 minutes; shock in ice water, then drain and chop. Substitute 1/2 cup toasted pine nuts for the white beans, the broccoli for the cherry tomatoes and 1 tablespoon chopped sage for the dill.

4. CAKES
Parmesan and Scallions

Combine 1 1/2 cups brown rice with 3 cups water over high heat. Bring to a boil, then lower the heat. Cook, stirring occasionally and adding more water if needed, until the rice is starchy and soft, about 1 hour. Chill for at least 1 hour. Stir in 1 cup grated Parmesan, 1/2 cup chopped scallions and 1/4 cup chopped parsley. Form into patties and cook in olive oil over medium-high heat until browned on both sides. Garnish: Grated Parmesan.

Carrots and Parsnips

Skip the cheese, scallions and parsley. Instead, stir 1 cup shredded carrots, 1 shredded small onion, 1/2 cup shredded parsnips and 1 tablespoon minced sage into the rice; proceed as above. Garnish: Chopped parsley.

Caramelized Leeks and Spinach

Skip the cheese, scallions and parsley. Cook 2 chopped leeks in 2 tablespoons olive oil until very soft and brown, 15 to 20 minutes. Add 3 cups chopped spinach and cook just until wilted. Stir the leeks and spinach into the rice; proceed as above. Serve with lemon wedges.

Thursday, November 10, 2011

Charlotte Rampling, 65 years old and un-retouched


There aren’t many film stars who, entering their golden years, endure as icons of sensuality. Charlotte Rampling, 65 years old and un-retouched (both on screen and in real life), is one of them. The daughter of a British colonel and NATO commander, Rampling attended tony girls’ schools in France and England.

She modeled briefly, before trying films, and burst onto the scene in “Georgy Girl” (1966), in which she played a flirt who is the embodiment of Swinging London. Today, after five decades on screen (including a star turn in Visconti’s classic, “The Damned”), and modeling for everyone from Helmut Newton to Juergen Teller, Rampling’s career shows no signs of abating.

In “The Mill and the Cross” (currently in theaters), the Polish director Lech Majewski’s visually ravishing examination of a Bruegel painting, she takes on the Virgin Mary; in “Melancholia” (opening Friday), the bad-boy Danish director Lars von Trier’s latest, her character is based upon von Trier’s despised mother. And in the fashion world, Rampling continues to inspire: Marc Jacobs designed much of his fall 2011 collection for Louis Vuitton after her role in “The Night Porter” (1974), Liliana Cavani’s controversial, S&M-infused drama about a death-camp survivor’s affair with a Nazi officer (think: leather caps and suspenders).

Age cannot Love destroy,
But perfidy can blast the flower,
Even when in most unwary hour
It blooms in Fancy’s bower.
Age cannot Love destroy,
But perfidy can rend the shrine
In which its vermeil splendours shine.
~Untitled (1810); titled "Love's Rose" by William Michael Rossetti in Complete Poetical Works of Percy Bysshe Shelley (1870)

Wednesday, November 9, 2011

Real Chili Uses Beans! Bleahhhhhh


Ingredients
1 pound beef round, trimmed and cut into 1/2-inch chunks
Salt & freshly ground pepper, to taste
1 1/2 tablespoons canola oil, divided
3 onions, chopped
1 green bell pepper, seeded and chopped
1 red bell pepper, seeded and chopped
6 cloves garlic, minced
2 jalapeno peppers, seeded and finely chopped
2 tablespoons ground cumin
2 tablespoons chili powder
1 tablespoon paprika
2 teaspoons dried oregano
12 ounces dark or light beer
1 28-ounce can diced tomatoes
8 sun-dried tomatoes, (not packed in oil), snipped into small pieces
2 bay leaves
3 19-ounce cans dark kidney beans, rinsed
1/4 cup chopped fresh cilantro
2 tablespoons lime juice

Preparation
1. Season beef with salt and pepper. Heat 1 1/2 teaspoons oil in a Dutch oven over medium-high heat. Add half the beef and cook, stirring occasionally, until browned on all sides, 2 to 5 minutes. Transfer to a plate lined with paper towels. Repeat with another 1 1/2 teaspoons oil and remaining beef.
2. Reduce heat to medium and add remaining 1 1/2 teaspoons oil to the pot. Add onions and bell peppers; cook, stirring frequently, until onions are golden brown, 10 to 20 minutes. Add garlic, jalapenos, cumin, chili powder, paprika and oregano. Stir until aromatic, about 2 minutes.
3. Add beer and simmer, scraping up any browned bits, for about 3 minutes. Add diced tomatoes, sun-dried tomatoes, bay leaves and reserved beef. Cover and simmer, stirring occasionally, until beef is very tender, 1 1/2 to 2 hours.
4. Add beans; cook, covered, stirring occasionally, until chili has thickened, 30 to 45 minutes. Remove bay leaves. Stir in cilantro and lime juice. Adjust seasoning with salt and pepper.

Venus in Furs - Ohhhhh Yeahhhhh



New revival in NY - Why am I here??



The play opens with a thunderclap, as Ms. Arianda’s Vanda staggers into the generic room where Mr. Dancy’s Thomas is getting ready to wrap up his day, having failed to find the actress he was hoping for. In a tempest of irritation and mortification, Vanda rails at the subway, the rain and the fates, spewing expletives about the creepy guy who was feeling her up on the train.
She is hours late and knows she’s probably blown her chances, but proceeds to cajole, apologize and charm Thomas into hearing her out, flickering between abjection and steely insistence, modes that will prove to be unusually appropriate for the role she’s come to try out for.
Ms. Arianda is the best physical comedian the stage has produced in some time, as affirmed by her physically exuberant performance in the Broadway revival of “Born Yesterday” last spring. Here her klutzy ballet of desperation as Vanda wrestles with a recalcitrant umbrella, roots around in her bag for a costume and then suddenly strips down to her saucy black lingerie, is a dazzling comic set piece.



http://video.nytimes.com/video/2011/11/08/theater/100000001156486/excerpt-venus-in-fur-on-broadway.html

Sunday, November 6, 2011

Nietzsche's Amor Fati and Becoming What One Is - Some surprising conclusions to my investigations!




Fatalistic themes recur throughout Ecce Homo. Explaining why he returned to Rome while writing Zarathustra, Nietzsche comments that “some fatality was at work” (EH III: Z-4).


He declares that “amor fati” is the mark of “greatness”: that one does not merely “bear what is necessary . . . but love[s] it” (EH II: 10). Later he remarks (not surprisingly) that “amor fati is my inmost nature” (EH III: CW-4).


The depth of Nietzsche’s fatalism regarding his own life becomes most apparent in a long passage from the second chapter of Ecce Homo. Nietzsche is here discussing his development as a philosopher, after noting that, “To become what one is, one must not have the faintest notion what one is” (EH II: 9).

Saturday, November 5, 2011

The MOST Beautiful Equation: Euler’s Identity


e^(iπ) + 1 = 0
where e is Euler's number, the base of natural logarithms,
i is the imaginary unit, which satisfies i² = −1,
and π is pi, the ratio of the circumference of a circle to its diameter.

I am totally convinced that this is what Plato was pursuing in the Republic.
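If you want to see the identity hold numerically, here is a tiny Python check of my own (an illustrative addition, not part of the original post):

import cmath

# Euler's identity: e^(i*pi) + 1 should equal 0, up to floating-point rounding.
z = cmath.exp(1j * cmath.pi) + 1
print(z)               # about 1.2e-16j, i.e. zero to machine precision
print(abs(z) < 1e-12)  # True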

Mathematics for the win!

Philosophers are guys who failed Intro to Calculus






Thursday, November 3, 2011

Marie Howe - Practicing - In What the Living Do

In WHAT THE LIVING DO,

Marie Howe explores the emotional impact of incest and death on a woman from childhood to adulthood.
In the first section, Howe illustrates the tests that the young girl, Marie, experiences. “Sixth Grade” shows her brothers tying Marie and a friend to the garage door and torturing them with a dried deer’s leg. “Practicing” shows the girls’ first explorations of romance, kissing each other in the basement to get ready for the real thing. And in “The Mother,” readers see the father, drunk and abusive, forcing himself on the girl. In all, readers feel the girl’s detachment as she tries to stay outside of the pain.


PRACTICING

I want to write a love poem for the girls I kissed in seventh grade,
a song for what we did on the floor in the basement
of somebody’s parents’ house, a hymn for what we didn’t say but thought:
That feels good or I like that, when we learned how to open each other’s mouths
how to move our tongues to make somebody moan. We called it practicing, and
one was the boy, and we paired off—maybe six or eight girls—and turned out
the lights and kissed and kissed until we were stoned on kisses, and lifted our
nightgowns or let the straps drop, and, Now you be the boy:
concrete floor, sleeping bag or couch, playroom, game room, train room, laundry.
Linda’s basement was like a boat with booths and portholes
instead of windows. Gloria’s father had a bar downstairs with stools that spun,
plush carpeting. We kissed each other’s throats.
We sucked each other’s breasts, and we left marks, and never spoke of it upstairs
outdoors, in daylight, not once. We did it, and it was
practicing, and slept, sprawled so our legs still locked or crossed, a hand still lost
in someone’s hair . . . and we grew up and hardly mentioned who
the first kiss really was—a girl like us, still sticky with moisturizer we’d
shared in the bathroom. I want to write a song
for that thick silence in the dark, and the first pure thrill of unreluctant desire,
just before we’d made ourselves stop.

Science, Faith and First Principles contra Goats Piggy-Backing other Goats

This is a common, recurring thought in our culture. But its very persistence can seem a bit mysterious. After all, taken one way, it is easy to answer. “Science” isn’t a name for a collection of beliefs. It names a collection of methods for acquiring beliefs — methods that involve logic, observation and experiment. It is these methods that distinguish science, not doctrine. So in that sense, science is clearly not a faith — it isn’t a religion.
Nonetheless, the thought that science may still be based on something like faith remains. And there is a reason it hangs around. Like so many nagging questions, the idea that science is not free from faith contains a grain of truth.
In an earlier post, I noted that even science has its first principles. These principles — call them epistemic principles — tell us what methods and sources to trust. They are fundamental (“first”) precisely because you can’t defend them without relying on them. (Try giving a good argument for why logic is reliable that doesn’t use logic.)


This reaction is understandable. But it rests on a mistake. It is right that we can’t give epistemic reasons — evidence — for those fundamental principles that tell us what evidence to trust. But that doesn’t mean we can’t give reasons for those principles at all. Indeed, we had better. As I argued in the first post, dogmatism, or conviction without reason, is the enemy of the democratic enterprise. But the reasons we give can’t involve further appeals to methods for belief. We can’t give epistemic reasons for epistemic first principles. We have to give reasons of a different sort.
But what sort of reasons? A number of readers of that post suggested that the methods and principles of science are superior simply because they are more useful. It is only by relying on them that we can build bridges, cure diseases and so on. And of course that is correct. But this point alone won’t answer the skeptics about reason. First, skeptics about scientific reason are rarely if ever skeptical about it across the board. Their quarrel is with its use in certain domains. They aren’t going to say that we should never use observation, logic and experiment to figure things out. What they will argue is that these methods have a lower priority in some subjects than others. In some domains, other methods — such as consultation of sacred texts — trump.
Nonetheless, appealing to the utility of science is a good start: it is the right sort of reason, even if it is, by itself, insufficient. Even if we can’t give epistemic reasons for epistemic first principles, we can give what philosophers sometimes call practical reasons for employing the methods of science, and therefore for committing to the principles that give them more weight than others.

Here’s one example of what I have in mind. Scientific principles of rationality have certain democratic virtues that many of their rivals lack. One of the virtues of scientific rationality is that it privileges principles that — as we in fact just noted — everyone appeals to most of the time — just because we are built that way. Of course the fact that people can’t help but use methods like observation and logic doesn’t prove that those methods are always more reliable than others, or even reliable at all. (Just as the fact that people thought the earth was flat doesn’t mean it was). But it does mean that principles which privilege these methods — which give them more weight than others, no matter what the question — have an obvious virtue: they recommend methods that aren’t secret or the province of a few. They recommend methods that everyone can and does use. Indeed, it is this very virtue of scientific methods that was so celebrated in the Enlightenment. Prioritizing scientific methods is liberating precisely because it frees one from appeals to authority, from the thought that something is true because some person, religious tradition, or political party, says so.
But isn’t it just a matter of opinion that scientific methods are open and democratic in this way? Do we have any real reasons — other than personal preference — for thinking this is the case? I think we do. Here’s a brief thought experiment that makes the point. Imagine playing a game the point of which is to figure out, together with other players, what epistemic principles and methods to privilege on another world (call it Parallel Earth). Principles are privileged on Parallel Earth, let’s imagine, when they are taught in the schools, used to make decisions about grants and the like. In playing the game, you know that Parallel Earth will appear to be just like our planet. And you also know you will live on Parallel Earth after the game ends. But you don’t know two important facts: when playing the game, you don’t know what social and educational position you’ll occupy on Parallel Earth. And you don’t know what methods are going to be reliable on Parallel Earth. But you have to decide which methods to privilege on Parallel Earth anyway.

So how are you going to decide? Since, in playing the game, you don’t know which methods are actually going to be reliable (or if any will be), you can’t base your decisions on which methods you think will produce the truth. That’s out of bounds as far as the game is concerned. And yet since you also don’t know your future social position, it is unlikely you’ll decide on methods that would only be available to a few, or that would allow some people to have secret knowledge no one else could ever obtain. After all, you might not turn out to be a member of the inner circle. Given the rules, it will make sense to endorse methods that build on abilities everyone — just by being human — can appeal to. Methods that build on common experience are by nature non-secret, open to public revision and capable of being used repeatedly. That alone gives us a practical reason to privilege them, to give them more weight — independently of the question of whether they are reliable.


Of course, it would take more work to show in detail that scientific principles and methods would turn out to have more of these democratic virtues than other, competing principles. But it seems very likely that they do. And if so, we have an objective reason for favoring scientific principles of rationality over others — a reason that could be accepted no matter what your prior epistemic commitments. The first principles of science give weight to methods like observation and experiment. Because of their open public nature, they are the sorts of principles we would commit to even were we to abstract from their truth. These are the principles we would favor in an ideal state of social and epistemic equality. As such, our faith in them — our faith in reason, as it were — is not blind. It is an expression of our commitment to democracy itself.


I was just kidding about the goats!

Wednesday, November 2, 2011

The One Right Way to Run - Guaranteed!


When you’re stalking barefoot runners, camouflage helps. “Some of them get kind of prancy when they notice you filming,” Peter Larson says. “They put on this notion of what they think barefoot running should be. It looks weird.” Larson, an evolutionary biologist at Saint Anselm College in New Hampshire who has been on the barefoot beat for two years now, is also a stickler about his timing. “You don’t want to catch them too early in a run, when they’re cold, or too late, when they’re tired.”

If everything comes together just right, you’ll be exactly where Larson was one Sunday morning in September: peeking out from behind a tree on Governors Island in New York Harbor, his digital video camera nearly invisible on an ankle-high tripod, as the Second Annual New York City Barefoot Run got under way about a quarter-mile up the road. Hundreds of runners — men and women, young and old, athletic and not so much so, natives of 11 different countries — came pattering down the asphalt straight toward his viewfinder.

http://video.nytimes.com/video/2011/11/02/magazine/100000001149415/the-lost-secret-of-running.html

About half of them were actually barefoot. The rest wore Vibram FiveFingers — a rubber foot glove with no heel cushion or arch support — or Spartacus-style sandals, or other superlight “minimalist” running shoes. Larson surreptitiously recorded them all, wondering how many (if any) had what he was looking for: the lost secret of perfect running.

It’s what Alberto Salazar, for a while the world’s dominant marathoner and now the coach of some of America’s top distance runners, describes in mythical-questing terms as the “one best way” — not the fastest, necessarily, but the best: an injury-proof, evolution-tested way to place one foot on the ground and pick it up before the other comes down. Left, right, repeat; that’s all running really is, a movement so natural that babies learn it the first time they rise to their feet. Yet sometime between childhood and adulthood — and between the dawn of our species and today — most of us lose the knack.

We were once the greatest endurance runners on earth. We didn’t have fangs, claws, strength or speed, but the springiness of our legs and our unrivaled ability to cool our bodies by sweating rather than panting enabled humans to chase prey until it dropped from heat exhaustion. Some speculate that collaboration on such hunts led to language, then shared technology. Running arguably made us the masters of the world.

So how did one of our greatest strengths become such a liability? “The data suggests up to 79 percent of all runners are injured every year,” says Stephen Messier, the director of the J. B. Snow Biomechanics Laboratory at Wake Forest University. “What’s more, those figures have been consistent since the 1970s.” Messier is currently 11 months into a study for the U.S. Army and estimates that 40 percent of his 200 subjects will be hurt within a year. “It’s become a serious public health crisis.”

Nothing seems able to check it: not cross-training, not stretching, not $400 custom-molded orthotics, not even softer surfaces. And those special running shoes everyone thinks he needs? In 40 years, no study has ever shown that they do anything to reduce injuries. On the contrary, the U.S. Army’s Public Health Command concluded in a report in 2010, drawing on three large-scale studies of thousands of military personnel, that using shoes tailored to individual foot shapes had “little influence on injuries.”

Two years ago, in my book, “Born to Run,” I suggested we don’t need smarter shoes; we need smarter feet. I’d gone into Mexico’s Copper Canyon to learn from the Tarahumara Indians, who tackle 100-mile races well into their geriatric years. I was a broken-down, middle-aged, ex-runner when I arrived. Nine months later, I was transformed. After getting rid of my cushioned shoes and adopting the Tarahumaras’ whisper-soft stride, I was able to join them for a 50-mile race through the canyons. I haven’t lost a day of running to injury since.

“Barefoot-style” shoes are now a $1.7 billion industry. But simply putting something different on your feet doesn’t make you a gliding Tarahumara. The “one best way” isn’t about footwear. It’s about form. Learn to run gently, and you can wear anything. Fail to do so, and no shoe — or lack of shoe — will make a difference.

That’s what Peter Larson discovered when he reviewed his footage after the New York City Barefoot Run. “It amazed me how many people in FiveFingers were still landing on their heels,” he says. They wanted to land lightly on their forefeet, or they wouldn’t be in FiveFingers, but there was a disconnect between their intentions and their actual movements. “Once we develop motor patterns, they’re very difficult to unlearn,” Larson explains. “Especially if you’re not sure what it’s supposed to feel like.”

The only way to halt the running-injury epidemic, it seems, is to find a simple, foolproof method to relearn what the Tarahumara never forgot. A one best way to the one best way.

Earlier this year, I may have found it. I was leafing through the back of an out-of-print book, a collection of runners’ biographies called “The Five Kings of Distance,” when I came across a three-page essay from 1908 titled “W. G. George’s Own Account From the 100-Up Exercise.” According to legend, this single drill turned a 16-year-old with almost no running experience into the foremost racer of his day.

I read George’s words: “By its constant practice and regular use alone, I have myself established many records on the running path and won more amateur track-championships than any other individual.” And it was safe, George said: the 100-Up is “incapable of harm when practiced discreetly.”

Could it be that simple? That day, I began experimenting on myself.

When I called Mark Cucuzzella to tell him about my find, he cut me off midsentence. “When can you get down here?” he demanded.

“Here” is Two River Treads, a “natural” shoe store sandwiched between Maria’s Taqueria and German Street Coffee & Candlery in Shepherdstown, W.Va., which, against all odds, Cucuzzella has turned into possibly the country’s top learning center for the reinvention of running.

“What if people found out running can be totally fun no matter what kind of injuries they’ve had?” Cucuzzella said when I visited him last summer. “What if they could see — ” he jerked a thumb back toward his chest — “Exhibit A?”

Cucuzzella is a physician, a professor at West Virginia University’s Department of Family Medicine and an Air Force Reserve flight surgeon. Despite the demands of family life and multiple jobs, he still managed enough early-morning miles in his early 30s to routinely run marathons at a 5:30-per-mile pace. But he constantly battled injuries; at age 34, severe degenerative arthritis led to foot surgery. If he continued to run, his surgeon warned, the arthritis and pain would return.

Cucuzzella was despondent, until he began to wonder if there was some kind of furtive, Ninja way to run, as if you were sneaking up on someone. Cucuzzella threw himself into research and came across the work of, among others, Nicholas Romanov, a sports scientist in the former Soviet Union who developed a running technique he called the Pose Method. Romanov essentially had three rules: no cushioned shoes, no pushing off from the toes and, most of all, no landing on the heel.

Once Cucuzzella got used to this new style, it felt suspiciously easy, more like playful bouncing than serious running. As a test, he entered the Marine Corps Marathon. Six months after being told he should never run again, he finished in 2:28, just four minutes off his personal best.

“It was the beginning of a new life,” Cucuzzella told me. “I couldn’t believe that after a medical education and 20 years of running, so much of what I’d been taught about the body was being turned on its head.” Two weeks before turning 40, he won the Air Force Marathon and has since completed five other marathons under 2:35. Shortly before his 45th birthday this past September, he beat men half his age to win the Air Force Marathon again. He was running more on less training than 10 years before, but “felt fantastic.”

When he tried to spread the word, however, he encountered resistance. At a Runner’s World forum I attended before the Boston Marathon in April 2010, he told the story of how he bounced back from a lifetime of injuries by learning to run barefoot and relying on his legs’ natural shock absorption. Martyn Shorten, the former director of the Nike Sports Research Lab who now conducts tests on shoes up for review in Runner’s World, followed him to the microphone. “A physician talking about biomechanics — I guess I should talk about how to perform an appendectomy,” Shorten said. He then challenged Cucuzzella’s belief that cushioned shoes do more harm than good.

No matter. Cucuzzella went home and began hosting his own conferences. Peter Larson traveled from New Hampshire for Cucuzzella’s first gathering on a snowy weekend this past January. “I was a bit curious about how many people might show up to such an event in rural West Virginia,” Larson says. “Were the panelists going to outnumber the audience?” In fact, more than 150 attendees crowded right up to the dais.

Since then, West Virginia has become a destination for a growing number of those who are serious about the grass-roots reinvention of running. Galahad Clark, a seventh-generation shoemaker who created the Vivobarefoot line, flew in from London with the British running coach Lee Saxby for a one-day meeting with Cucuzzella. International researchers like Craig Richards, from Australia, and Hiro Tanaka, chairman of Exercise Physiology at the University of Fukuoka, have also visited, as well as scientists from a dozen different American states.

“He has turned a small town in an obese state into a running-crazed bastion of health,” Larson says. “Mark’s effort in transforming Shepherdstown is a testament to what a single person can accomplish.”

Not that he has everything figured out. I was at one of Cucuzzella’s free barefoot running clinics in May when he confronted his big problem: how do you actually teach this stuff? He had about 60 of us practicing drills on a grassy playground. “Now to run,” he said, “just bend forward from the ankles.” We all looked down at our ankles.

“No, no,” Cucuzzella said. “Posture, remember? Keep your heads up.”

We lifted our heads, and most of us then forgot to lean from the ankles. At that moment, a young girl flashed past us on her way to the monkey bars. Her back was straight, her head was high and her bare feet skittered along right under her hips.

“You mean like — ” someone said, pointing after the girl.

“Right,” Cucuzzella said. “Just watch her.”

So what ruined running for the rest of us who aren’t Tarahumara or 10 years old?

Back in the ’60s, Americans “ran way more and way faster in the thinnest little shoes, and we never got hurt,” Amby Burfoot, a longtime Runner’s World editor and former Boston Marathon champion, said during a talk before the Lehigh Valley Half-Marathon I attended last year. “I never even remember talking about injuries back then,” Burfoot said. “So you’ve got to wonder what’s changed.”

Bob Anderson knows at least one thing changed, because he watched it happen. As a high-school senior in 1966, he started Distance Running News, a twice-yearly magazine whose growth was so great that Anderson dropped out of college four years later to publish it full time as Runner’s World. Around then, another fledgling operation called Blue Ribbon Sports was pioneering cushioned running shoes; it became Nike. Together, the magazine and its biggest advertiser rode the running boom — until Anderson decided to see whether the shoes really worked.

“Some consumer advocate needed to test this stuff,” Anderson told me. He hired Peter Cavanagh, of the Penn State University biomechanics lab, to stress-test new products mechanically. “We tore the shoes apart,” Anderson says. He then graded shoes on a scale from zero to five stars and listed them from worst to first.

When a few of Nike’s shoes didn’t fare so well in the 1981 reviews, the company pulled its $1 million advertising contract with Runner’s World. Nike already had started its own magazine, Running, which would publish shoe reviews and commission star writers like Ken Kesey and Hunter S. Thompson.

“Nike would never advertise with me again,” Anderson says. “That hurt us bad.” In 1985, Anderson sold Runner’s World to Rodale, which, he says, promptly abolished his grading system. Today, every shoe in Runner’s World is effectively “recommended” for one kind of runner or another. David Willey, the magazine’s current editor, says that it only tests shoes that “are worth our while.” After Nike closed its magazine, it took its advertising back to Runner’s World. (Megan Saalfeld, a Nike spokeswoman, says she was unable to find someone to comment about this episode.)

“It’s a grading system where you can only get an A,” says Anderson, who went on to become the founder and chief executive of Ujena Swimwear.

Just as the shoe reviews were changing, so were the shoes: fear, the greatest of marketing tools, entered the game. Instead of being sold as performance accessories, running shoes were rebranded as safety items, like bike helmets and smoke alarms. Consumers were told they’d get hurt, perhaps for life, if they didn’t buy the “right” shoes. It was an audacious move that flew in the face of several biological truths: humans had thrived as running animals for two million years without corrective shoes, and asphalt was no harder than the traditional hunting terrains of the African savanna.

In 1985, Benno Nigg, founder and currently co-director of the University of Calgary’s Human Performance Lab, floated the notion that impact and rear-foot motion (called pronation) were dangerous. His work helped spur an arms race of experimental technology to counter those risks with plush heels and wedged shoes. Running magazines spread the new gospel. To this day, Runner’s World tells beginners that their first workout should be opening their wallets: “Go to a specialty running store . . . you’ll leave with a comfortable pair of shoes that will have you running pain- and injury-free.”

Nigg now believes mistakes were made. “Initial results were often overinterpreted and were partly responsible for a few ‘blunders’ in sport-shoe construction,” he said in a speech to the International Society of Biomechanics in 2005. The belief in the need for cushioning and pronation control, he told me, was, in retrospect, “completely wrong thinking.” His stance was seconded in June 2010, when The British Journal of Sports Medicine reported that a study of 105 women enrolled in a 13-week half-marathon training program found that every single runner who was given motion-control shoes to control excess foot pronation was injured. “You don’t need any protection at all except for cold and, like, gravel,” Nigg now says.

Of course, the only way to know what shoes have done to runners would be to travel back to a time when no one ever wore them. So that’s what one anthropologist has effectively done. In 2009, Daniel Lieberman, chairman of Harvard’s human evolutionary biology department, located a school in Kenya where no one wore shoes. Lieberman noticed something unusual: while most runners in shoes come down hard on their heels, these barefoot Kenyans tended to land softly on the balls of their feet.

Back at the lab, Lieberman found that barefoot runners land with almost zero initial impact shock. Heel-strikers, by comparison, collide with the ground with a force equal to as much as three times their body weight. “Most people today think barefoot running is dangerous and hurts, but actually you can run barefoot on the world’s hardest surfaces without the slightest discomfort and pain.”

Lieberman, who is 47 and a six-time marathoner, was so impressed by the results of his research that he began running barefoot himself. So has Irene Davis, director of Harvard Medical School’s Spaulding National Running Center. “I didn’t run myself for 30 years because of injuries,” Davis says. “I used to prescribe orthotics. Now, honest to God, I run 20 miles a week, and I haven’t had an injury since I started going barefoot.”

Last fall, at the end of a local 10-mile trail race, I surprised myself by finishing five minutes faster than I had four years ago, when I was in much better shape. I figured the result was a fluke — until it happened again. No special prep, awful travel schedule and yet a personal best in a six-mile race.

“I don’t get it,” I told Cucuzzella this past June when we went for a run together through the Shepherd University campus in Shepherdstown. “I’m four years older. I’m pretty sure I’m heavier. I’m not doing real workouts, just whatever I feel like each day. The only difference is I’ve been 100-Upping.”

It was five months since I discovered W. G. George’s “100-Up,” and I’d been doing the exercise regularly. In George’s essay, he says he invented the 100-Up in 1874, when he was a 16-year-old chemist’s apprentice in England and could train only during his lunch hour. By Year 2 of his experiment, the overworked lab assistant was the fastest amateur miler in England. By Year 5, he held world records in everything from the half-mile to 10 miles.

So is it possible that a 19th-century teenager succeeded where 21st-century technology has failed?

“Absolutely, yes,” says Steve Magness, a sports scientist who works with top Olympic prospects at Nike’s elite “Oregon Project.” He was hired by Alberto Salazar to create, essentially, a squad of anti-Salazars. Despite his domination of the marathon in the ’80s, Salazar was plagued with knee and hamstring problems. He was also a heel-striker, which he has described as “having a tire with a nail in it.” Magness’s brief is to find ways to teach Nike runners to run barefoot-style and puncture-proof their legs.

“From what you’re telling me, it sounds promising,” Magness told me. “I’d love to see it in action.”

Mark Cucuzzella was just as eager. “All right,” he said in the middle of our run. “Let’s get a look at this.” I snapped a twig and dropped the halves on the ground about eight inches apart to form targets for my landings. The 100-Up consists of two parts. For the “Minor,” you stand with both feet on the targets and your arms cocked in running position. “Now raise one knee to the height of the hip,” George writes, “bring the foot back and down again to its original position, touching the line lightly with the ball of the foot, and repeat with the other leg.”

That’s all there is to it. But it’s not so easy to hit your marks 100 times in a row while maintaining balance and proper knee height. Once you can, it’s on to the Major: “The body must be balanced on the ball of the foot, the heels being clear of the ground and the head and body being tilted very slightly forward. . . . Now, spring from the toe, bringing the knee to the level of the hip. . . . Repeat with the other leg and continue raising and lowering the legs alternately. This action is exactly that of running.”

Cucuzzella didn’t like it as a teaching method — he loved it. “It makes so much physiological and anatomical sense,” he said. “The key to injury-free running is balance, elasticity, stability in midstance and cadence. You’ve got all four right there.”

Cucuzzella began trying it himself. As I watched, I recalled another lone inventor, a Czechoslovakian soldier who dreamed up a similar drill: he’d throw dirty clothes in the bathtub with soap and water, then jog on top. You can’t heel strike or overstride on slippery laundry. There’s only one way to run in a tub: the one best way.

At the 1952 Olympics, Emil Zatopek became the only runner ever to win gold medals in all three distance events: 5,000 meters, 10,000 meters and the marathon, the first he ever ran. Granted, “the Human Locomotive” wasn’t a pretty sight. During his final push to the finish line, his head would loll and his arms would grab at the air “as if he’d just been stabbed through the heart,” as one sportswriter put it.

But from the waist down, Zatopek was always quick, light and springy, like a kid swooping across a playground — or like this once-arthritic physician in front of me, laughing with excitement as he hopped up and down in his bare feet in a parking lot.