Thursday, October 01, 2015
In the spring of 1914, one of the most famous images of authorship in English literary history went on public display for the first time. Branwell Brontë’s portrait of his sisters, Charlotte, Emily and Anne, had been discovered in Ireland, on top of a wardrobe at Hill House, Banagher, formerly the home of Arthur Bell Nicholls, Charlotte Brontë’s widower, together with a portrait fragment of Emily Brontë from a lost work by Branwell, known as the “Gun Group” (Nicholls had cut the fragment from the painting and destroyed the rest).
Hurriedly purchased by the trustees of the National Portrait Gallery in London, at “a very moderate cost”, and relined but not restored, the heavily creased painting of the three sisters – folded at one time to an eighth of its original size – was hung next to a portrait of Robert Louis Stevenson. The portrait of Emily, purchased at the same time, was displayed directly beneath. As the public flocked to see the two paintings, articles in the press focused on “The Three Sisters” group, marvelling at its chance rediscovery, “negligible” status as a work of art, and compensating value as a historical relic. A few dissident voices attacked the late Mr Nicholls for his neglect of the painting and the consequent damage to it, as well as for his desecration of the “Gun Group”. “Oh, the barbarism of Charlotte’s husband”, lamented a reporter in the Daily Graphic.
I WENT TO THAT black barbershop for the reason millions like me have done so before—to feel at home. But for years, as Quincy Mills’s fascinating Cutting Across the Color Line reveals, black barbershops in America were unavailable to people of my lineage and color. Though they became a stereotypical image of a black social institution, crystallized best in Barbershop, they began as institutions of segregation and white supremacy. In the antebellum era, but also well into the period of Reconstruction, black barbershops—predominantly in the South but often in the North—only served white men. Prohibiting black men from cutting black hair for a profit allowed slave owners to control their slaves’ relationship to their own and to other black bodies. At the same time, slave owners profited from their enslaved barbers by hiring their slaves out to cut the hair of white townspeople. If the barber was lucky, his owner allowed him to take a percentage of the profits, which he sometimes used to purchase his freedom.
Their distance from harsh, manual labor made these positions relatively privileged ones, leading Mills to argue that barbers initially occupied an unstable class position. “As captive capitalists in a slave society,” Mills writes, free barbers represented “both the possibilities and limits of freedom for African Americans in the antebellum period.”
Besides the complexes that blokes in the 21st century may have about castration and the shivering joy many take in explaining all this to a psychoanalyst, there is another reason the castrato may continue to fascinate us. It is the old idea that while heard melodies are sweet, those unheard are haunting. Feldman writes that castrato voices had ‘strong resonance … understood as relative loudness and intensity, with timbral richness’. If we want to imagine what a castrato sounded like perhaps it would help to listen to a recording by a deep and powerful contralto – Hilde Rössel-Majdan, for example, or Maureen Forrester – and then follow this by listening to a countertenor, David Daniels, for example, or Andreas Scholl, or Iestyn Davies (or go on YouTube and listen to a recording of the last castrato, Alessandro Moreschi, who died in 1922, singing the Bach-Gounod ‘Ave Maria’, with what Feldman called a vibrato that is ‘often lush and plentiful’, and the ‘Crucifixus’ from Rossini’s Petite messe solennelle). But all of these offer merely clues.
Some of the clues are fascinating, however, perhaps because the language used to describe a castrato singing has its own luscious, plaintive sound. The French soprano Emma Calvé wrote in her autobiography about hearing the castrato Domenico Mustafà in 1891: ‘He had an exquisite high tenor voice, truly angelic, neither masculine nor yet feminine in type – deep, subtle, poignant in its vibrant intensity … He had certain curious notes which he called his fourth voice – strange, sexless tones, superhuman, uncanny!’ Another writer wrote of a castrato voice that it was ‘so soft, and ravishingly mellow, that nothing can better represent it than the Flute-stops of some Organs’, which themselves were ‘not unlike the gentle Fallings of Water’.
Variety of life: An effort to sequence thousands of people’s genomes reaches the end of the beginning
Editorial in Nature:
Modern science has a good grip on most of those very few laws that drive life forward, most tellingly on how genetic material copies itself from parent to offspring. The innumerable variations however? Not so much. They are, after all, innumerable. That does not mean that science is not trying, and on pages 68 and 75 of this issue, Nature publishes the latest progress reports from this colossal effort. The papers mark the completion of the 1000 Genomes Project, the largest work yet to sequence the genetic information of thousands of individuals in an attempt to tune into Mother Nature’s hum of human variation. It completes a set of genomic reference tools — resources of genetic data produced by international collaborations — that dates back 25 years to the start of the Human Genome Project. The bigger job, of tracking the relationships between genetic variation and human disease to help to develop effective treatments, is not finished, and may never be. But it is important from time to time to acknowledge and celebrate landmarks of achievement along the way. This week marks one such landmark.
...The final goal remains to make this flood of population-level genetic research relevant to personal health. Emerson would have approved. He was a proponent of individualism, a political philosophy that emphasizes the moral worth of the individual. He celebrated the non-conformist. And when it comes to the few laws that dictate the repetition of genetics, it is not just the 2,504 people whose variation is detailed this week who are the non-conformists. We all are.
Wednesday, September 30, 2015
Colin Dayan in the Boston Review:
The most striking thing happened as I began reading Lori Gruen’s book, Entangled Empathy: An Alternative Ethic for Our Relationships with Animals. I was sitting on the porch when a baby white-throated sparrow flew inside. Attempting to escape, the sparrow repeatedly dashed itself against the screens, head down in exhaustion. I tried to lead it to the open door. No luck. But then a male cardinal appeared outside. It hovered, went first to one side of the screen, then the other; held tight one moment, moved softly the next. Flying against the screen, it guided the captive bird, gradually, from side to side, up and down—all the while outside the porch—and led it to the open air. For twenty minutes I watched a bird save another not of its brood, and I thought: now that is empathy.
Yet empathy is a word I have always distrusted. Deep and enigmatic, at best it means being present to or with another being; at worst it calls forth a moral surround as exclusive as it is well intentioned. Along with sympathy, and often confused with it, empathy summons an intensely humanized world, where our emotional life—how much we feel for or with—matters more than the conditions that cause suffering and sustain predation. Examples are all around us. To consider but one, we all know the sad excesses of sentiment that followed the 2010 Haiti earthquake. Money flowed to the coffers of international aid organizations and NGOs, but it never reached the hundreds of thousands of Haitians who continued to live as displaced persons in camps. Inhumanity can easily be moderated, legitimized, and even reproduced by the humanitarian concern that is analogous to it.
As an Americanist, I learned from Edgar Allan Poe how the language of sentiment animates subordination. A slave, a piece of property, a black cat—once loved in the proper domestic setting, they arouse a surfeit of devotion, bonds of dependence that slavery apologists claimed could never be felt by equals.
From Science Alert:
Despite research telling us it’s a really bad idea, many of us end up working 50-hour weeks or more because we think we’ll get more done and reap the benefits later. And according to a study published last month involving 600,000 people, those of us who clock up a 55-hour week will have a 33 percent greater risk of having a stroke than those who maintain a 35- to 40-hour week.
With this in mind, Sweden is moving towards a standard 6-hour work day, with businesses across the country having already implemented the change, and a retirement home embarking on a year-long experiment to compare the costs and benefits of a shorter working day.
"I think the 8-hour work day is not as effective as one would think. To stay focused on a specific work task for 8 hours is a huge challenge. In order to cope, we mix in things and pauses to make the work day more endurable. At the same time, we are having it hard to manage our private life outside of work," Linus Feldt, CEO of Stockholm-based app developer Filimundus, told Adele Peters at Fast Company.
Filimundus switched to a 6-hour day last year, and Feldt says their staff haven't looked back. "We want to spend more time with our families, we want to learn new things or exercise more. I wanted to see if there could be a way to mix these things," he said.
To cope with the significant cut in working hours, Feldt says staff are asked to stay off social media and other distractions while at work and meetings are kept to a minimum.
For doubters, the enduring renown of The Great Gatsby is mystifying. It seems a wonder to them that Gatsby should cling to its lofty place on lists of Great American Novels, despite being so slender and so dated, and notwithstanding its ham-handed symbolism (the Valley of the Ashes, the Eyes of Doctor Eckleburg), simplistic structure (a series of set-pieces), clunky plot machinery (fancy cars roaring back and forth to Manhattan, merely to move pieces around the board), and flat characters (Tom Buchanan tilts toward caricature and Meyer Wolfsheim tips all the way over).
There is a solution to the mystery of Gatsby’s lasting fame, as believers know, and to my mind that solution is voice. The elixir that transforms the novel’s inert matter into music—that turns its static iconography into poetry—is its first-person narration: the subtle, compounded, compromised voice of Nick Carraway. A voice of hope infused with despair, of belief corroded by doubt. A voice suave and dapper on its surface but roiled and dark in its depths. It is the inviting but evasive voice of a new best friend who draws you into his confidence and promises alluring secrets, only to turn away from you, agitated, distracted, and weary.
Paradox: If Trotsky was correct at Kronstadt, then his own murder could also be construed as right. If his murder stinks (as I most certainly believe), then he was wrong at Kronstadt, in which case his murder again becomes justified so long as he supports Kronstadt-like actions. Like most paradoxes, this one ultimately fails to hold together—but only in the “real world.” Rostov is a reduction of a far more interesting and ambiguous man. But the protagonists of parables must be types, emblems, tropes. Rostov represents not who Trotsky was, but a certain principle that Trotsky stood for. If we feel willing to generalize and simplify, then this parable with its paradox does have something to tell us—for the events that haunted Bernard Wolfe reincarnate themselves endlessly.
“Then it amounts to this,” says a Mexican official to the dying Rostov’s wife. “Those who use all means will win, those who reject some means will lose. There is no remedy …” Can it be so? Trotsky believed it. Sometimes, so do I. (That is why I prefer to lose.) Exactly here we come face to face with Wolfe’s defective, unlikely greatness. His formulation must never be forgotten.
Among the many questions that surround the Cambridge spies, one has occupied historians ever since the scale of their treachery became fully known. Why did they choose to betray their country? Several reasons are given why Guy Burgess, Kim Philby, Donald Maclean, Anthony Blunt and John Cairncross – commonly known as the Cambridge Five, though there may have been others – decided to serve the Soviet state. In the 1930s they saw the USSR as the chief bulwark against the advance of Nazism and fascism; in the Second World War, they acted in response to Britain and the USSR being allies; during the cold war, they viewed the United States as the chief threat to world peace. Above all, the spies had an overriding ideological commitment to communism. Acting on this was more important for them than clinging to old loyalties of king and country.
No doubt all of these factors played a part, but they are less than thoroughly convincing. The spies were recruited in the 1930s, when the danger of Nazism was becoming clear; but they continued to serve the Soviet Union after it entered into a pact with Nazi Germany, when many other communist sympathisers fell away, and went on serving the Soviet state after it ceased to be Britain’s ally.
Yasmin Alibhai-Brown in The Independent:
Iran is seriously mistrusted by Israel and America. North Korea protects its nuclear secrets and is ruled by an erratic, vicious man. Vladimir Putin’s territorial ambitions alarm democratic nations. The newest peril, Isis, the wild child of Islamists, has shocked the whole world. But top of this list should be Saudi Arabia – degenerate, malignant, pitiless, powerful and as dangerous as any of those listed above.
The state systematically transmits its sick form of Islam across the globe, instigates and funds hatreds, while crushing human freedoms and aspiration. But the West genuflects to its rulers. Last week Saudi Arabia was appointed chair of the UN Human Rights Council, a choice welcomed by Washington. Mark Toner, a spokesperson for the State Department, said: “We talk about human rights concerns with them. As to this leadership role, we hope that it is an occasion for them to look into human rights around the world and also within their own borders.”
The jaw simply drops. Saudi Arabia executes one person every two days. Ali Mohammed al-Nimr is soon to be beheaded then crucified for taking part in pro-democracy protests during the Arab Spring. He was a teenager then. Raif Badawi, a blogger who dared to call for democracy, was sentenced to 10 years and 1,000 lashes. Last week, 769 faithful Muslim believers were killed in Mecca where they had gone on the Hajj. Initially, the rulers said it was “God’s will” and then they blamed the dead. Mecca was once a place of simplicity and spirituality. Today the avaricious Saudis have bulldozed historical sites and turned it into the Las Vegas of Islam – with hotels, skyscrapers and malls to spend, spend, spend. The poor can no longer afford to go there. Numbers should be controlled to ensure safety – but that would be ruinous for profits. Ziauddin Sardar’s poignant book Mecca: The Sacred City describes the desecration of Islam’s holiest site.
From The New Yorker:
The first installment in our For Your Consideration series is “Pink Grapefruit,” a ten-minute short by the writer-director Michael Mohan. The film—which premièred at Sundance, in January, and went on to win a jury award at South by Southwest—takes place in a serene vacation home in the Palm Springs desert. A young woman (Wendy McColm) arrives there with her friends, a slightly older married couple (Nora Kirkpatrick and Matt Peters), and we quickly learn that they are subjecting her to a rather intense version of a blind date: a single man she’s never met (Nathan Stewart-Jarrett) will soon be joining them for the weekend. Like any jaded millennial, the woman greets the impending setup with a sense of dread: “These things never work out!” she says on the car ride out. But, when her suitor arrives, things don’t go quite as expected. (And without spoiling anything, we hope, we should note that this film contains sexual situations.)
...But the story in “Pink Grapefruit,” of a young couple’s first encounter, turns out to be, as Mohan has put it, a cinematic Trojan horse. Shot in lush colors, with lingering images of the arid California hills, the film also makes use of an eerie desert silence, and the voyeurism of the glass-walled vacation home suggests that something pernicious is afoot between the two couples. What Mohan was really interested in exploring, he said, is how young adults “measure our happiness and success by comparing it to those around us.” Mohan, who also directs music videos and commercials (like a pair of very fun short films for Kate Spade, starring Anna Kendrick and Lily Tomlin), is currently beginning work on a new film project called “The Ends.” Co-written with Chris Levitus, who also co-wrote “Pink Grapefruit,” the film portrays the life of a young woman by examining her past breakups. Mohan said, “We want to show how our past relationships shape the person we ultimately become.”
Ellie Lee in Spiked:
The first episode of the new BBC TV series Countdown to Life: the Extraordinary Making of You, broadcast on Monday, showed us how this process works. The programme as a whole placed great emphasis on how ‘what you are’ is determined in the womb. Part of this argument for womb determinism drew on the alleged ‘amazing significance of what a mother-to-be eats’. The programme’s amazement at the profound import of maternal diet began with a section exploring the (sound) findings of the Dutch Famine Birth Cohort Study. This study showed how babies born to Dutch women who were literally starved during the Second World War were more likely to suffer from a range of serious diseases later in life; the environment in which fetal development occurred had serious detrimental effects for the health not only of the women, but also their children. This, combined with a Medical Research Council study about diet and health in Gambia, led programme presenter Michael Mosley to conclude: ‘You really are what your mother eats. Or more precisely, you really are what your mother ate when you were just a tiny little embryo, just a few cells big.’ Thus ends the article he wrote for BBC News to promote the programme: ‘If you are thinking of having a baby, then eating lots of leafy green vegetables, which are rich in B vitamins and folates, is certainly a good thing to do.’
Despite its gripping footage of life before birth – who could not be blown away by a film of the transformation of a ball of cells into a living, waking human being? – Countdown to Life is entirely in line with today’s propensity for parental determinism and scientism. The programme’s scientific content is neither new nor that interesting. Epigenetics has been around for a long time and the effects of the Dutch famine are well known. What is most telling is the ease with which the programme segues from discussing the extraordinary (the Dutch famine) through to the everyday (all women, the world over). You end up with what is really quite a bizarre message: that if pregnant women don’t eat what is today considered to be ‘good food’, then their babies will be damaged. But we are not ‘what our mothers ate’, and the suggestion that women should eat a lot of spinach if they are even thinking about having a baby burdens women with yet more health hectoring.
The Elusive Jellyfish Nebula
At the aquarium, the jellyfish are lit
from below—blue and pink hues
flash in time with the ebb
and flow of visitors come to see
The true sea is not so bright, though,
nor so clear—
Infinity reaches down from space
to the center of our waters
where jellyfish live in truth,
countless billions upon billions
of dead stars and living organisms
recycled into dust upon dust.
Near bright star Eta Geminorum,
the Jellyfish Nebula emits faint strands
of light, the remnants of a supernova gone
rogue, leaving only a neutron star to see
how the universe changes over time.
It is too far away, too large
to imagine what it would feel
like to touch those strands,
though the ones in the water sting
We imagine we know why jellyfish
are so fragile, dying easily or not at all,
but they say even stars die. We have faith
that’s true. When the aquarium closes,
the lights go out.
Tuesday, September 29, 2015
Tim Flannery reviews Carl Safina's Beyond Words: What Animals Think and Feel and Hal Whitehead and Luke Rendell's The Cultural Lives of Whales and Dolphins in the New York Review of Books:
The free-living dolphins of the Bahamas had come to know researcher Denise Herzing and her team very well. For decades, at the start of each four-month-long field season, the dolphins would give the returning humans a joyous reception: “a reunion of friends,” as Herzing described it. But one year the creatures behaved differently. They would not approach the research vessel, refusing even invitations to bow-ride. When the boat’s captain slipped into the water to size up the situation, the dolphins remained aloof. Meanwhile on board it was discovered that an expeditioner had died while napping in his bunk. As the vessel headed to port, Herzing said, “the dolphins came to the side of our boat, not riding the bow as usual but instead flanking us fifty feet away in an aquatic escort” that paralleled the boat in an organized manner.
The remarkable incident raises questions that lie at the heart of Carl Safina’s astonishing new book, Beyond Words: What Animals Think and Feel. Can dolphin sonar penetrate the steel hull of a boat—and pinpoint a stilled heart? Can dolphins empathize with human bereavement? Is dolphin society organized enough to permit the formation of a funeral cavalcade? If the answer to these questions is yes, then Beyond Words has profound implications for humans and our worldview.
Beyond Words is gloriously written. Consider this description of elephants:
Their great breaths, rushing in and out, resonant in the halls of their lungs. The skin as they moved, wrinkled with time and wear, batiked with the walk of ages, as if they lived within the creased maps of the lives they’d traveled.
Not since Barry Lopez or Peter Matthiessen were at the height of their powers has the world been treated to such sumptuous descriptions of nature.
Safina would be the first to agree that anecdotes such as Herzing’s lack the rigor of scientific experiments. He tells us that he is “most skeptical of those things I’d most like to believe, precisely because I’d like to believe them. Wanting to believe something can bias one’s view.” Beyond Words is a rigorously scientific work. Yet impeccably documented anecdotes such as Herzing’s have a place in it, because they are the only means we have of comprehending the reactions of intelligent creatures like dolphins to rare and unusual circumstances. The alternative—to capture dolphins or chimpanzees and subject them to an array of human-devised tests in artificial circumstances—often results in nonsense. Take, for example, the oft-cited research demonstrating that wolves cannot follow a human pointing at something, while dogs can. It turns out that the wolves tested were caged: when outside a cage, wolves readily follow human pointing, without any training.
Claude S. Fischer in Boston Review (image: "A U.S. Department of Agriculture photo showing a family grocery shopping using the SNAP (food stamp) program. Photo: USDA."):
Now that growing economic inequality is widely accepted as fact—it took a couple of decades for the stubborn to acknowledge this—some wonder why Americans are not more upset about it. Americans do not like inequality, but their dislike has not increased. This spring, 63 percent of Gallup Poll respondents agreed that “money and wealth in this country should be more evenly distributed,” but that percentage has hardly changed in thirty years. Neither widening inequality nor the Great Recession has turned Americans to the left, much less radicalized them.
This puzzle recalls the hoary question of why there is no socialism in America. Why is the United States distinctive among Western nations in the weakness of its labor movement, absence of universal health care and other public goods, and reluctance to redistribute income where the elderly are not concerned? Generations of answers have ranged from the American mindset (say, individualism) to exercises of brute political power (e.g., strike-breakers, campaign money) to the formal structure of government (such as single-member districts). Some recent research presents a cultural explanation—specifically, Americans’ tendency to see issues of inequality in terms of deservingness. Even economist Thomas Piketty, author of Capital in the Twenty-First Century, insists on the “key role” of “belief systems.”
Notions of who deserves what shape the American welfare state. The economic demographer Robert Moffitt has shown that, despite common misperceptions, total U.S. welfare support—social security, food stamps, disability insurance, and so on—has not declined since the days of the Great Society. Even bracketing health expenditures, per capita government spending on means-tested programs rose pretty steadily over the last forty-plus years. What has changed, Moffitt argues, is who gets help. Spending has shifted away from the jobless, single, childless, and very poor toward the elderly, disabled, working, married, parents, and those who are not poor.
Ta-Nehisi Coates in The Atlantic:
I want to respond to Greg Weiner’s contention that I’ve offered a distorted picture of Daniel Patrick Moynihan. There’s a lot wrong with Weiner’s note. I specifically object to the idea that the Moynihan Report left its author’s reputation “in tatters.”
It is certainly true that Moynihan suffered through more than his share of unfair criticism after the release of The Case for National Action. It is also true that within two years of the Moynihan Report’s release, the author was being hailed on the cover of TIME magazine as America’s “urbanologist.” That same year Life magazine lauded Moynihan as the “idea broker in the race crisis.” After leaving the Johnson administration, Moynihan went on to a lucrative post at Harvard, became the urban affairs guru for one president and the UN ambassador for another, and then served for an unbroken four terms in the Senate. Furthermore, Moynihan’s central idea—that the problems of families are key to ending the problems of poverty—dominates the national discourse today. I suspect the president would take no insult in being described as a disciple of Moynihan. If this is all part and parcel of having your reputation destroyed, it is an enviable specimen of the genre.
Weiner’s claim is, of course, much larger. He accuses me of merely hinting at Moynihan bearing some responsibility for mass incarceration, and cleverly leaving the nasty work to the editor’s note written by James Bennet:
Coates demonstrates that white Americans’ fear of black Americans, and their impulse to control blacks, are integral to the rise of the carceral state. A result is that one of every four black men born since the late 1970s has spent time in prison, at profound cost to his family. For this, Coates holds Moynihan, in part, responsible.
Since Weiner believes I was being coy, let me directly state that I wholly concur with this interpretation. My argument is that mass incarceration is built on a long history of viewing black people as unequal in general, and criminal in the specific. Both of these trends can be found in Moynihan’s arguments.
Sam Leith in The Guardian:
A couple of weeks ago I saw David Crystal give an after-dinner speech at the august annual conference of the Society of Indexers and the Society for Editors and Proofreaders. In it, he recalled having been an adviser on Lynne Truss’s radio programme about punctuation. She told him she was thinking of writing a book on the subject. He advised her not to: “Nobody buys books on punctuation.” “Three million books later,” he said, “I hate her.”
Making a Point is this prolific popular linguist’s entry into the same, or a similar, market. Truss’s book, Eats, Shoots & Leaves, was energised by her furious certainties about the incorrect use of all these little marks. Crystal’s is a soberer and, actually, more useful affair: he puts Truss’s apostrophe-rage in its sociolinguistic context, considers the evolution of modern usages, and gently encourages the reader to think in a nuanced way about how marks work rather than imagining that some Platonic style guide, if only it could be accessed, would sort all punctuation decisions into boxes marked “literate” and “illiterate”. (Or literate and illiterate, if you prefer.)
As Crystal writes, scribes started to punctuate in order to make manuscripts easier to read aloud: they were signalling pauses and intonational effects. Grammarians and, later, printers adopted the marks, and tried to systematise them, as aids to semantic understanding on the page. The marks continue to serve both purposes. “This,” Crystal writes, “is where we see the origins of virtually all the arguments over punctuation that have continued down the centuries and which are still with us today.”
His central argument, buttressed by countless well-chosen examples and enlivened by the odd whimsical digression, is that neither a phonetic, nor a semantic, nor a grammatical account of our punctuation system is singly sufficient.
Ken Roth in The Guardian:
The need to negotiate with leaders as unsavoury as Syria’s Bashar al-Assad is an unfortunate reality of diplomacy. But western leaders should be careful not to confuse that necessity with the idea promoted by Russia that the Syrian crisis can be resolved only if Assad stays in power. Nor should they believe that Assad’s ongoing rule is the only way to prevent the collapse of the Syrian state and protect Syria’s diverse communities.
Vladimir Putin has long sought to portray Assad as a bulwark against the self-declared Islamic State. But far from a stabilising factor or a solution to the Isis threat to basic rights, Assad is a major reason for the rise of extremist groups in Syria. In the early days of Syria’s uprising, between July and October 2011, Assad released from prison a number of jihadists who had fought in Iraq, many of whom went on to play leading roles in militant Islamist groups. These releases were part of broader amnesties, but Assad kept in prison those who backed the peaceful uprising.
These releases helped to change the complexion of the Syrian rebellion from one with largely democratic aims, to one dominated by jihadists. That transformation has enabled Assad to refocus the narrative from his vicious rule to his claimed indispensability in the fight against Isis.
Clive Cookson in the Financial Times:
The prosthetic, developed at the University of Southern California and Wake Forest Baptist Medical Centre in a decade-long collaboration, includes a small array of electrodes implanted into the brain.
The key to the research is a computer algorithm that mimics the electrical signalling used by the brain to translate short-term into permanent memories.
This makes it possible to bypass a damaged or diseased region, even though there is no way of “reading” a memory — decoding its content or meaning from its electrical signal.
“It’s like being able to translate from Spanish to French without being able to understand either language,” said Ted Berger of USC, the project leader.
The prosthesis has performed well in tests on rats and monkeys. Now it is being evaluated in human brains, the team told the international conference of the IEEE Engineering in Medicine and Biology Society in Milan.
More here. [Thanks to Ali Minai.]
Elite higher education in America has long been a Veblen good—a commodity that obeys few, if any, conventional laws of economic activity. In some cases (chiefly among the children of the serene professional elders perusing the Sunday New York Times), the higher the sticker price of a particular college or university, the more attractive it is. Raise the price and then offer a “discount,” and applications will fly in and better students will enroll. Private colleges and universities figured out this marketing strategy about twenty years ago. That’s a major reason that private college tuition has skyrocketed over the same time span, often at more than double the rate of inflation. Because university administrators know they have an essentially captive client base, they can mark up their sticker prices with impunity.
Economists call things “Veblen goods” when they violate standard models of supply and demand—mainly in cases when an ongoing spike in price works, perversely, to increase demand. Veblen goods are usually luxuries, or at least luxury versions of goods that would otherwise be considered necessities. Higher education seems to comport with the trend: as the prospects dim for earning a decent wage and forging a comfortable life without a bachelor’s degree, we are told we must increase the number of bachelor’s degrees floating around the economy. And as that number increases, some versions of the degree have become even more valuable in the eyes of tastemakers and nervous wealthy people.
Germany's new European hegemony is a product of the European Monetary Union in combination with the crisis of 2008. It was not Germany, however, that had wanted the euro. Since the 1970s, its export industries had lived comfortably with repeated devaluations of the currencies of Germany's European trading partners, in response to which much German manufacturing moved out of price-sensitive and into quality-competitive markets. It was above all France that sought a common European currency, to end the humiliation it felt at having to devalue the franc against the deutschmark and, after 1989, to bind united Germany firmly into a united Europe that would, it was hoped, be French-led.

From its conception, the euro was a highly contradictory construction. France and other European countries, such as Italy, were tired of having to follow the hard-currency interest rate policy of the Bundesbank, which had de facto become the central bank of Europe. By replacing the Bundesbank with a European central bank, they expected to recapture some of the monetary sovereignty they felt they had lost to Germany. Clearly the idea was also to make monetary policy in Europe less obsessed with stability and more accommodating of political objectives like full employment. At the same time, Mitterrand and his finance minister Jacques Delors, but also the Bank of Italy, hoped to gain political clout against national Communist parties and trade unions by foreclosing external devaluation and thereby forcing the Left to renounce its political-economic ambitions under the constraints of a harder, if not hard, currency.
In the two decades following the Second World War, depression was considered a relatively rare disorder, more likely to be experienced by hospitalized patients than otherwise healthy people. Today, however, the Centers for Disease Control and Prevention estimates that 9.1 percent of adults in the United States are currently experiencing depression. A recent editorial in Nature claimed that “measured by the years that people spend disabled, depression is the biggest blight on human society — bar none.” What accounts for this change?
It will help to identify two broad periods in psychiatry’s standard conception of depression: before 1980, when psychoanalysis still held sway, and after 1980, when depression became defined according to symptom-based classification. These two periods are marked by contrasting criteria for diagnosis in the DSM (Diagnostic and Statistical Manual of Mental Disorders), the “bible” of clinical psychiatry published by the American Psychiatric Association. While the use of the DSM in the everyday practice of clinical psychiatry varies greatly and some psychiatrists hardly use it at all, it standardizes definitions of mental disorders and supplies a lingua franca for research, thereby providing a basis for measuring the prevalence of mental disorders and agreeing on their diagnoses.
The change that occurred in 1980 was pivotal for two reasons: first, it introduced a qualitatively different notion of depression, one that focused on overt symptoms rather than internal psychological stresses; second, in ignoring patient history and social context as criteria for diagnosis, it unintentionally led to an increase in the number of diagnoses.