Thursday, May 26, 2016
John Freeman in Literary Hub:
As he begins his seventies, his gifts show no sign of diminishment. Barnes won the Man Booker Prize five years ago for The Sense of an Ending, the tale of a man looking back on a botched moment of youth, but to begin reading Barnes here might be misleading.
He has written novels about aviators and detectives, Flaubert and theme-park life, short stories about aging, riffs on life in the kitchen, journalism about French politics, translations of the great diarist Alphonse Daudet, dozens of expertly argued reviews of modern art, tales of sex and revenge, and—for a brief period—a mystery series under the name Dan Kavanagh.
Barnes’s latest novel expands the territory even further. The Noise of Time is a harrowing, swift novel about the composer Dmitri Shostakovich and his attempt to compose music under Stalin. The book begins with the agonized Dmitri smoking a cigarette outside his apartment door, so terrified of the knock in the night he begins sleeping with his clothes on. And then not sleeping at all, just waiting in the hall. He doesn’t want to wake his wife.
Amira Nowaira in The Guardian:
Freethinking is perhaps not one of the strongest suits of modern Islam. For one thing, the list of books that have been banned for challenging prevalent religious orthodoxies and sensibilities during the past hundred years is disconcertingly long.
Modern Islamic clerics and scholars in various Muslim countries are often highly selective about which parts of the Islamic heritage to emphasise and bring to light. Out of the countless and varied sources from centuries of vigorous debates, commentaries and controversies, they seem to dig out, and revel in, interpretations that are hopelessly conservative or frustratingly and grotesquely at odds with the life of modern Muslims.
It may therefore come as a surprise to many people that there is a long and vibrant intellectual tradition of dissidence and freethinking going back to the Middle Ages. The Islamic thinkers of the early medieval period expressed ideas and engaged in debates that would appear strangely enlightened in comparison with the attitudes and views adopted by modern Islamic scholarship.
This is the basic argument presented by From the History of Atheism in Islam by the renowned Egyptian thinker Abdel-Rahman Badawi. Published in Arabic in 1945, the book was reprinted only once in 1993. It discusses the work of the Islamic philosopher-scientists of the medieval period and the way they upheld reason, freedom of thought and humanist values, while questioning and often refuting some basic Islamic tenets.
Akim Reinhardt in his own blog:
Well, actually, it was just two hours on Thursday afternoons as a volunteer with the Native men’s group at Nebraska State Penitentiary in Lincoln, Nebraska.
I could gussy up the experience and say I was teaching inmates. But mostly I was just hanging out. Many prisoners, particularly those who’ve been in a while, are starved for new faces and happy to get some fresh conversation.
Sometimes I’d talk to people about serious issues. Other times we’d just shoot the breeze. One day while inside, I was talking to a guy. Nothing serious. I don’t even remember about what. He asked something of me. I said, “You got it, chief.”
Now here’s the thing. Growing up in New York City, “chief” was (and still is) in the same class of words as “boss” and “buddy.” They’re all informal monikers one man might casually give another if you don’t actually know each other’s names, or as a temporary nickname even when you do. It’s a sign of modest respect and affection in the moment. In a typical New York City context, they’re all completely harmless words and have zero racial connotation.
But the moment “chief” slipped out of my mouth in prison, I immediately remembered that of course this particular word has a very heavy connotation for Native people, particularly men.
To get the full picture, you really need to read The Big Picture, but you can get a good taste for some of the content of the book by watching this:
The all-conquering encyclopedia of the twenty-first century is, famously, the first such work to have been compiled entirely by uncredentialled volunteers. It is also the first reference work ever produced as a way of killing time during coffee breaks. Not the least of Wikipedia’s wonders is to have done away with the drudgery that used to be synonymous with the writing of reference works. An army of anonymous, tech-savvy people – mostly young, mostly men – have effortlessly assembled and organized a body of knowledge unparalleled in human history. “Effortlessly” in the literal sense of without significant effort: when you have 27,842,261 registered editors (not all of them active, it is true), plus an unknown number of anonymous contributors, the odd half-hour here and there soon adds up to a pretty big encyclopedia.
One of the most common gripes about Wikipedia is that it pays far more attention to Pokémon and Game of Thrones than it does to, say, sub-Saharan Africa or female novelists. Well, perhaps; the most widely repeated variants of “Wikipedia has more information on x than y” are in fact largely fictitious (https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_has_more…). Given the manner of its compilation, the accursed thing really is a whole lot more reliable than it has any right to be. Like many university lecturers, I used to warn my own students off using Wikipedia (as pointless an injunction as telling them not to use Google, or not to leave their essay to the last minute). I finally gave up doing so about three years ago, after reading a paper by an expert on South Asian coinage in which the author described the Wikipedia entry on the Indo-Greek Kingdom (c.200 BC–AD 10) as the most reliable overview of Indo-Greek history to be found anywhere – quite true, though not necessarily as much of a compliment to Wikipedia as you might think.
This photo is an enigma. Even I can’t say for sure what’s happening. I didn’t know what I had taken at the time. It was only afterwards, when I developed the film, that I saw the handbag.
It was April 1984 and I was on assignment in China, which was just opening up to foreigners. I had no particular commission, though: I could shoot whatever I wanted. On this day, I was visiting a monastery at Xindu in the Sichuan province. There was a symbol on the wall that meant “happiness”. The place was full of Chinese tourists and the tradition was to stand 20 metres from the sign, then walk towards it with eyes closed and try to touch the centre of the four raised points.
As a photographer, I have always been interested in gestures – I was once described as someone who made arms dance. And now I found myself in front of this extraordinary ballet: a young man who has just touched the sign and a second, in a hat, approaching with his hand out. I remember the sensation of something moving, but I really don’t remember the handbag.
In the Buckland household, oddness was next to godliness. Drawing room tables were decorated with lizard feces and clumps of lava from Mount Etna; instead of hobbyhorses, the children had the corpses of crocodiles to ride around on; they learned to distinguish between types of animal urine by taste alone. Francis took his father’s gleeful, childlike curiosity about the wondrous variety of life on earth and magnified it into a philosophy for living, and the core of a defiantly strange personality. At his boarding school, he shared his room with rats, an owl, a buzzard, a magpie, and a raccoon, and he became popular for providing the other boys with feasts of grilled trout and field mice poached from a neighboring farmer’s land. As a student at Oxford, his menagerie took a turn for the exotic: an eagle, a jackal, a pariah dog, marmots, guinea pigs, snakes, a chameleon, a monkey, and a bear came under his care, some sharing his rooms. The bear and the monkey, in particular, were prone to roaming, and on several occasions Francis had to charge across plush college quadrangles in pursuit of them. It earned him local celebrity, but somehow avoided irking the dons.
Buckland’s interest in wildlife was obviously sincere, but his brood also helped him to deflect scrutiny and cover deficiencies. His father, after all, had achieved great academic success, and there was pressure on Francis to follow in the old man’s shoes. But he struggled constantly in his studies: his time at Oxford was four years of stress, failed exams, and running to stand still. What he lacked in academic sharpness he made up for with zeal and an outsize personality.
Campari and Smoke Rings
In the side street bar,
below the church of St. Catherine,
the drinks are cheap and the music loud.
Girls in tight jeans lean into young men
who might be Zeffirelli extras
with their dark halos of curls.
It’s easy sitting here, with my glass
of rough red, watching them laugh
and flirt as the evening gathers
in among the Campari and neon,
to imagine my life just like theirs;
in the bar’s smeared glass.
Yet when I look back
through the fug of smoke rings,
down the dark alley
the way I just came, I realise
that around the next bend
there’s no unknown room
filled with the scent of crushed roses,
where muslin curtains lift
on the evening breeze and clouds
of swallows circle and circle
under the eaves in the growing dark.
by Sue Hubbard
Emily Greco in Moyers & Company:
Ana Maria Archila, one of the Center for Popular Democracy’s two co-executive directors, gleefully introduced Elizabeth Warren at her grassroots organization’s gala Tuesday night. “I have a feeling she’s going to say some really deep stuff, some really inspiring stuff, some really tweetable stuff,” Archila told the diverse crowd of progressive policy wonks and community organizers assembled in the Hyatt Regency Washington on Capitol Hill ballroom. “Those of you who tweet, get your thumbs together. Get ready because we are really going to hear something very, very awesome and important.” The senior senator from Massachusetts didn’t disappoint. Warren tore into Donald Trump, caricaturing the presumptive Republican presidential nominee as a greedy narcissist. She questioned whether the billionaire real estate mogul ever pays a dime in taxes, dismissing him as “a man who will never be president of the United States” because he is prone to “kissing the fannies of poor, misunderstood Wall Street bankers” and “so desperate for power he will say and do anything to get elected.”
“Donald Trump was drooling over the idea of a housing meltdown — because it meant he could buy up more property on the cheap,” Warren said. “What kind of a man does that? What kind of a man roots for people to get thrown out of their house? What kind of a man roots for people to get thrown out of their jobs? To root for people to lose their pensions?” To date, the former financial regulator, a hero to her party’s progressive wing for her tough stands against corporate abuse, has refrained from endorsing either of the candidates for her party’s presidential nomination — opting out of the sparring between supporters of Hillary Clinton and Bernie Sanders. While Warren’s positions on many issues align with Sanders’ platform, some of her comments Tuesday night echoed a damning new Clinton campaign ad that recalls Trump saying he “sort of hoped” a real estate bust would happen before the Great Recession that left millions of Americans underwater.
This synchronization prompted The Washington Post to label Warren Clinton’s “new weapon against Trump.”
Emily Bobrow in More Intelligent Life:
Matt, a father in his early 40s with soulful eyes, thinning hair and a ready smile, is doing his best to explain why he has a more intense relationship with his son than he does with his daughter. Over a mojito at a bar in Brooklyn, near the flat he shares with his wife and two children, he admits that he is not a stereotypically macho guy. Most of his friends are women, he says. He was never much of an athlete and his marriage is a fairly egalitarian two-career juggling act. Yet there is something about his bond with his boy that feels particularly profound. Partly, he thinks, it is because his four-year-old son is older, and therefore more interesting. As the first-born, his son is also teaching Matt how to be a parent, which provokes all sorts of potent new emotions and anxieties. But perhaps the most compelling reason is also the simplest: “I really identify with him,” Matt says. “He just looks a lot like me, and he’s like me in certain ways. Every time I look at him I see myself when I was four years old.” Of course he adores his daughter, “but it’s just different. I don’t know how to make a little girl happy the way I fundamentally know how to make a boy happy, so I worry I’m going to somehow screw that up.”
Such candour can be uncomfortable for parents. In rich countries, where children are more like luxury goods than savvy economic investments, and where gender is simply one attribute among many, parents tend to pride themselves on their open-hearted, unconditional love for every member of their brood. Admitting a stronger emotional connection with one child over another, one sex over the other, is taboo. Yet the presence or absence of children of either sex has a real impact on the dynamics of a family – even, it seems, on whether the family survives as a unit.
Wednesday, May 25, 2016
David Sloan Wilson in Evonomics:
One of the most influential articles published in the field of economics is Milton Friedman’s (1953) “The Methodology of Positive Economics”, in which he argues that people behave as if the assumptions of neoclassical economic theory are correct, even when they are not. One of the most influential articles in the field of evolution is Stephen Jay Gould and Richard Lewontin’s (1979) “The Spandrels of San Marco and the Panglossian Paradigm”, which argues against excessive reliance on the concept of adaptation.
Different disciplines, different decades. No wonder these two classic articles have not been related to each other. Yet, there is much to be gained by doing so, for one reveals weaknesses in the other that are highly relevant to current economic and evolutionary thought.
The reason they can be related to each other is because Friedman relied upon an evolutionary argument for his “as if” justification of neoclassical economics.
Natalie Wolchover in Quanta:
The boundary does not pass between some huge finite number and the next, infinitely large one. Rather, it separates two kinds of mathematical statements: “finitistic” ones, which can be proved without invoking the concept of infinity, and “infinitistic” ones, which rest on the assumption — not evident in nature — that infinite objects exist.
Mapping and understanding this division is “at the heart of mathematical logic,” said Theodore Slaman, a professor of mathematics at the University of California, Berkeley. This endeavor leads directly to questions of mathematical objectivity, the meaning of infinity and the relationship between mathematics and physical reality.
More concretely, the new proof settles a question that has eluded top experts for two decades...
FROM A DISTANCE, the causes of the Brazilian crisis seem obvious. A corrupt government, after fourteen years in power, begins to suffer the consequences of erratic policies: a deep recession follows, and protesters then take to the streets to overthrow the government. This explanation isn’t so much unfounded as insufficient. The government is corrupt, but so are all the other parties. The economy is in recession, but there have been other periods of turbulence in the past, and not all of them led to a coup. Protesters are on the streets, yet they make up a small demographic, and are unrepresentative of the larger population. To state that a couple of organs in a body have failed says little of the disease that overtook it. I’m not sure, though, that watching the corpse decompose from up close yields any kind of special explanatory power. It may well be that those farther away are better equipped to explain things.
To watch the maggots scuffle and reproduce on the body—think of them as various members of the executive, legislative, and judiciary—is less revolting than it is profoundly boring. And yet none of us here are able to look away. In Rio, where I live and write for a monthly magazine, the crisis often seems to be the only topic of conversation. Once a week, I head to the magazine’s offices in Ipanema. There, for about twenty or thirty minutes, my colleagues and I exchange pleasantries and drink coffee. Then someone will pull up a video of the latest outrage: a right-wing congressman justifying his vote to oust President Dilma Rousseff by paying homage to a deceased torturer from the military dictatorship; a lawyer, responsible for filing the impeachment request against the President, waving the Brazilian flag in her hand as she shouts incantations against the “republic of the snake”; the popular leftist musician and supporter of the ruling Workers’ Party (PT) Chico Buarque confessing that he didn’t in fact write his own songs, but rather bought the lyrics and melodies off the street from a guy named Ahmed.
Refugees, asylees, IDPs (internally displaced persons), PRSs, stateless persons: these are new categories of human beings created by an international state-system in turmoil, human beings who are subject to a special kind of precarious existence. Although they share with other "suffering strangers" the status of victimhood and become the objects of our compassion – or as the UNHCR report puts it, become "persons of concern" – their plight reveals the most fateful disjunction between so-called "human rights" – or "the rights of man", in the older locution – and "the rights of the citizen"; between the universal claims to human dignity and the specificities of indignity suffered by those who possess only human rights. From Hannah Arendt's famous discussion of the "right to have rights" in The Origins of Totalitarianism to Giorgio Agamben's homo sacer to Judith Butler's "precarious lives" and Jacques Rancière's call to "the enactment of rights", the asylum seeker, the stateless and the refugee have become metaphors as well as symptoms of a much deeper malaise in the politics of modernity.
Yet as political fatigue about internationalism has gripped the United States in the wake of the interventions in Afghanistan and Iraq, and president Obama's politics of caution in Syria has created further moral quagmires, we have moved from "the right to have rights" to the "critique of humanitarian reason." Didier Fassin, who for many years worked with Médecins Sans Frontières in a high capacity, and to whom we owe this term, defines it as follows: "Humanitarian reason governs precarious lives: the lives of the unemployed and the asylum seeker, the lives of sick immigrants and people with AIDS, the lives of disaster victims and victims of conflict – threatened and forgotten lives that humanitarian government brings into existence by protecting and revealing them."
“There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy,” says Shakespeare’s Hamlet, a troubled dropout struggling with questions of responsibility, to his best friend. Even by the Elizabethan era, it seems, a discipline that had begun in classical times as a practical method for discerning how best to live life had devolved into something increasingly hermetic. Wind the clock forward, to the late 19th century, and philosophy had become an exclusively academic profession, focused on seemingly arcane questions of aesthetics, epistemology, and ethics.
The 20th century, improbably, changed all that. The Great War, the sweeping away of imperial dynasties, waves of technological revolutions, a worldwide economic collapse, the rise of global totalitarianism, World War II, the atom bomb, the Cold War—all these events destroyed cultural certainties and introduced bewilderment and anxiety. And it was philosophy—as a means of understanding who we are, why the world is as it is, what we ought to do—that came to the rescue.
Or at least that’s the message of At the Existentialist Café, the sprightly, elegant, occasionally unsatisfying new book by Sarah Bakewell, author of the 2010 National Book Critics Circle Award–winning How to Live, or, A Life of Montaigne in One Question and Twenty Attempts at an Answer.
Sara Reardon in Nature:
Children from impoverished families are more prone to mental illness, and alterations in DNA structure could be to blame, according to a study published on 24 May in Molecular Psychiatry. Poverty brings with it a number of different stressors, such as poor nutrition, increased prevalence of smoking and the general struggle of trying to get by. All of these can affect a child’s development, particularly in the brain, where the structure of areas involved in stress response and decision-making has been linked to low socioeconomic status. Poor children are more prone to mental illnesses such as depression than their peers from wealthier families, and they are also more likely to have cognitive problems. Some of these differences are clearly visible in brain structure and seem to appear at birth, which suggests that prenatal exposure to these stressors may be involved.
From birth to adolescence
But neurodevelopment does not stop at birth. Neuroscientist Ahmad Hariri of Duke University in Durham, North Carolina, suspected that continual exposure to stressors might affect older children as well. He decided to test this idea by studying chemical tags known as methyl groups, which alter DNA structure to regulate how genes are expressed. There is some evidence that methylation patterns can be passed down through generations, but they are also altered by environmental factors, such as smoking.
To test whether these mechanisms are involved in the increased likelihood of depression seen in impoverished children, Hariri and his colleagues zeroed in on a gene called SLC6A4, which encodes a protein that transports the brain-signalling molecule serotonin into neurons. The gene has long been known to be involved in depression, and the serotonin transporter is the target of many antidepressant drugs. Hariri and his colleagues collected blood samples from 183 Caucasian children aged 11–15, and tested the children for symptoms of depression. They also examined how the children responded to stress by scanning their brains to monitor activity as the children were shown a picture of a frightened face. People who are highly sensitive to threats show more activity in the amygdala — the brain’s ‘fight or flight’ centre — when they see such an emotion.
I love you
because the Earth turns round the sun
because the North wind blows north
because the Pope is Catholic
and most Rabbis Jewish
because the winters flow into springs
and the air clears after a storm
because only my love for you
despite the charms of gravity
keeps me from falling off this Earth
into another dimension
I love you
because it is the natural order of things
I love you
like the habit I picked up in college
of sleeping through lectures
or saying I’m sorry
when I get stopped for speeding
because I drink a glass of water
in the morning
and chain-smoke cigarettes
all through the day
because I take my coffee Black
and my milk with chocolate
because you keep my feet warm
though my life is a mess
I love you
because I don’t want it
any other way
by Nikki Giovanni
Tuesday, May 24, 2016
Tim Parks in the New York Review of Books:
It has become commonplace, in this age of globalization, to speak of novelists and poets who change language, whether to find a wider audience or to adapt to life in a new country. But what about those writers who move to another country and do not change language, who continue to write in their mother tongue many years after it has ceased to be the language of daily conversation? Do the words they use grow arid and stiff? Or is there an advantage in being away from what is perhaps only the flavor of the day at home, the expressions invented today and gone tomorrow? Then, beyond specifically linguistic concerns, what audience do you write toward if you are no longer regularly speaking to people who use your language?
The most famous candidate for a reflection on this situation would be James Joyce, who left Ireland in 1904 aged twenty-two and lived abroad, mainly in Trieste and Paris, until his death in 1941. Other writers one could speak of would be W. G. Sebald, writing in German while living in England, Dubravka Ugrešić writing in Croatian while living in Holland, or Aleksandr Solzhenitsyn and Joseph Brodsky, who went on writing in Russian after being forced into exile in the United States. One could go back and look at Robert Browning’s fifteen years in Italy, or Italo Calvino’s thirteen years in Paris. There are many others. Yet the easiest example, the only one I can write about with some authority, and, frankly, one of the most extreme, for length of time away and level of engagement with the foreign language and foreign country, is myself. What has happened to my English over thirty-five years in Italy? How has this long expatriation—I would never call it exile—changed my writing?
Jimena Canales in Nautilus:
On April 6, 1922, Einstein met a man he would never forget. He was one of the most celebrated philosophers of the century, widely known for espousing a theory of time that explained what clocks did not: memories, premonitions, expectations, and anticipations. Thanks to him, we now know that to act on the future one needs to start by changing the past. Why does one thing not always lead to the next? The meeting had been planned as a cordial and scholarly event. It was anything but that. The physicist and the philosopher clashed, each defending opposing, even irreconcilable, ways of understanding time. At the Société française de philosophie—one of the most venerable institutions in France—they confronted each other under the eyes of a select group of intellectuals. The “dialogue between the greatest philosopher and the greatest physicist of the 20th century” was dutifully written down. It was a script fit for the theater. The meeting, and the words they uttered, would be discussed for the rest of the century.
The philosopher’s name was Henri Bergson. In the early decades of the century, his fame, prestige, and influence surpassed that of the physicist—who, in contrast, is so well known today. Bergson was compared to Socrates, Copernicus, Kant, Simón Bolívar, and even Don Juan. The philosopher John Dewey claimed that “no philosophic problem will ever exhibit just the same face and aspect that it presented before Professor Bergson.” William James, the Harvard professor and famed psychologist, described Bergson’s Creative Evolution (1907) as “a true miracle,” marking the “beginning of a new era.” For James, Matter and Memory (1896) created “a sort of Copernican revolution as much as Berkeley’s Principles or Kant’s Critique did.” The philosopher Jean Wahl once said that “if one had to name the four great philosophers one could say: Socrates, Plato—taking them together—Descartes, Kant, and Bergson.” The philosopher and historian of philosophy Étienne Gilson categorically claimed that the first third of the 20th century was “the age of Bergson.” He was simultaneously considered “the greatest thinker in the world” and “the most dangerous man in the world.” Many of his followers embarked on “mystical pilgrimages” to his summer home in Saint-Cergue, Switzerland.
Jacob Harris in The Atlantic:
One of the joys of modern technology is how easy it is to immerse yourself in the past. Every day, more libraries and archives are pushing pieces of their collections online in easily browsable interfaces.
The New York Public Library, for instance, has historic menus and interactive floor plans. Chronicling America is a searchable repository of newspapers published between 1836 and 1922 from the Library of Congress, which is also one of the many institutions in the Flickr Commons public image archive. Wikipedia has its own Wikimedia Commons, to which anybody can upload images and videos. Project Gutenberg continues to add new public-domain books to its collection every day, and New York’s Metropolitan Museum of Art has posted thousands of images online with metadata as part of its Open Access for Scholarly Content initiative.
My personal favorite, however, is TimesMachine, a site available to all New York Times subscribers that lets readers virtually flip through any historical issue of The New York Times all the way up through 2002. The site delivers the reader directly to the past, making you feel like a cross between a tourist and an archaeologist. You might start by visiting a historic event—say, coverage of the Titanic sinking—but the real fun is wandering off the beaten path and exploring all the other news of the day. On the same day the Titanic sank, there was also coverage of a gun battle in Greenwich Village, and a passenger lost in a runaway balloon. On any day, such vignettes sometimes become rabbit holes to the past.
This is the story of how I ended up captivated by a chance encounter with a 135-year-old newspaper advertisement—and how the random face staring back at me from the archives would reveal the surprising origins of ASCII art, a graphic design technique that’s usually associated with 20th-century computer art.
Alex Wellerstein in The New Yorker:
The demonstration began on the afternoon of May 21, 1946, at a secret laboratory tucked into a canyon some three miles from Los Alamos, New Mexico, the birthplace of the atom bomb. Louis Slotin, a Canadian physicist, was showing his colleagues how to bring the exposed core of a nuclear weapon nearly to the point of criticality, a tricky operation known as “tickling the dragon’s tail.” The core, sitting by itself on a squat table, looked unremarkable—a hemisphere of dull metal with a nub of plutonium sticking out of its center, the whole thing warm to the touch because of its radioactivity. It had been quickly molded into shape after the bombing of Nagasaki, to be used in another attack on Japan, then reallocated when it turned out not to be needed for the war effort. At that time, Slotin was perhaps the world’s foremost expert on handling dangerous quantities of plutonium. He had helped assemble the first atomic weapon, barely a year earlier, and a contemporary photograph shows him standing beside its innards with his shirt unbuttoned and sunglasses on, cool and collected. Back then, the bomb was a handmade, artisanal product.
Slotin’s procedure was simple. He would lower a half-shell of beryllium, called the tamper, over the core, stopping just before it was snugly seated. The tamper would reflect back the neutrons that were shooting off the plutonium, jump-starting a weak and short-lived nuclear chain reaction, on which the physicists could then gather data. Slotin held the tamper in his left hand. In his right hand, he held a long screwdriver, which he planned to wedge between the two components, keeping them apart. As he began the slow and painstaking process of lowering the tamper, one of his colleagues, Raemer Schreiber, turned away to focus on other work, expecting that the experiment would be uninteresting until several more moments had passed. But suddenly he heard a sound behind him: Slotin’s screwdriver had slipped, and the tamper had dropped fully over the core. When Schreiber turned around, he saw a flash of blue light and felt a wave of heat on his face.
Edward Docx in Prospect:
Dylan turns 75 on 24th May. For millions of devotees like myself—many of whom consider him the world’s greatest living artist—it is a moment of celebration tinged with apprehension. Joan Baez, his most significant early anointer-disciple (Joan the Baptist), best expresses what might be described as “the Dylan feeling” in Martin Scorsese’s excellent 2005 documentary when she says: “There are no veils, curtains, doors, walls, anything, between what pours out of Bob’s hand on to the page and what is somehow available to the core of people who are believers in him. Some people would say, ‘not interested,’ but if you are interested, he goes way, way deep.” I love this for lots of reasons but most of all because it captures not only the religious devotion that many who love him feel, but also the bemused indifference of the sane and secular who do not.
Of course, the first order of business when writing about Dylan is to urge readers to ignore writers who write about Dylan. We are like Jehovah’s Witnesses, forever tramping door to door with our clumsy bonhomie and earnest smudgy leaflets; in all honesty, you would be much better off seeking out the resonant majesty of the actual work. Indeed, you’ll be relieved—and possibly endeared—to hear that Dylan himself considers his disciples to be deranged. “Why is it when people talk about me they have to go crazy?” Dylan asked in a recent interview for Rolling Stone. “What the fuck is the matter with them?”