Friday, March 27, 2015
Carl Zimmer in his excellent blog, The Loom:
Malaria is caused by single-celled parasites called Plasmodium. A female mosquito carries them in its gut as it flies around in search of a victim to bite. After the parasites mature, they push through the insect’s gut wall, eventually making their way into its salivary glands. When the mosquito lands on a person and drills into the skin, it pushes some of its Plasmodium-laden saliva into the wound.
The parasites now begin their long journey through the human body. They get pushed by the surges of the bloodstream to the liver, where they invade cells and multiply inside them. The infected liver cells erupt with the next stage of Plasmodium’s life cycle, called merozoites. The merozoites end up back in the bloodstream, where they now invade red blood cells. They multiply yet again, rupturing the blood cells and invading new ones. Eventually the parasites achieve the next stage in their life cycle, when they’re ready at last to get sucked up by a hungry mosquito in a meal of blood.
If Plasmodium can’t get into a mosquito, all of this multiplication is for naught. So anything that the parasite can do to increase the odds of a successful exit can potentially be favored by natural selection. Last year, for example, a team of researchers found that mosquitoes were attracted to mice infected with Plasmodium parasites – but only when they were ready to leave their rodent host. The scientists found evidence that the parasites engineer this attraction by changing the odor of the mice. Infected mice give off odor molecules that draw mosquitoes to them.
Simon Radford in HIPPO Reads:
It’s not often that a working paper published on an academic website creates a stir, but it seems ours has! As the Guardian Observer reports, our paper shows that for the number of big donors being nominated for positions in the UK’s House of Lords to be coincidental, it would be the equivalent of entering the National Lottery five times in a row and winning the jackpot each time. 1 in 22 Tory big donors, 1 in 14 Labour big donors, and 1 in 7 Lib Dem big donors have been nominated for a peerage (a position in the House of Lords). Rumors have abounded in the vicinity of Westminster for some time that party leaders exchanged patronage for political financing, but denying that claim just got a whole lot harder. Calls for a wholly elected UK Second Chamber must now be deafening. However, while elections are less prone to corruption than a system of appointment, we can’t stop there. Developed democracies need to rethink the role of public financing in elections if they are going to eliminate private favors and serve the public good.
Critics might argue that a few shady characters shouldn’t be spun into a general lesson. However, by using a statistical analysis across time, across governments, and across different party leaders, we show that there is a structural problem—not a case of the corruption of a few individuals. This is the first time that academics have been able to show a direct link between donations and power over voting on specific laws, but, by adding to a literature that all points in the same direction, our work is also a window on a larger issue facing all developed democracies: inequality in economic power means that our political class responds to the wishes of the rich, not the average voter. This should worry voters in Los Angeles as much as in London, Brussels as much as in Bristol.
Of course, we’re not the first to tackle the subject: academics and activists have been doing their best to sound the alarm for the last few years.
The position once held by the European Left – that solidarity is to be valued above the homo homini lupus, and that the concept of freedom doesn’t merely have a negative character – has been abandoned. The attitude which Mark Fisher defines as ‘capitalist realism’ appears to have engulfed most of the mainstream Left. Although the recent successes of Syriza in Greece and Podemos in Spain seem timidly to hint at a possible revival of a radical Left, all the major democratic/labour parties in the West appear to converge towards a neoliberal and bleakly anti-humanist consensus. Only in Latin America does the Left still enjoy a comfortable hegemonic status, while also being able to present the future as a land of opportunities rather than a hostile wasteland. Although it is unlikely that Franciscus has read Berardi’s remarks on ‘the end of the future’ and on the consequences of its demise, he has grasped the immense political potential of reopening – and monopolising – the very concept of the time to come.
Consistent with these considerations, Franciscus has placed his pontificate under the bright red star of what was once considered a revolutionary Leftist worldview. In doing so he has been able at the same time to reinforce his presence in the Latin American countries – partly through a revival of the rhetoric and politics of Liberation Theology – and to present himself as the only credible candidate to occupy the gaping hole vacated on the left of the Western political spectrum. He has founded his attack on spectacularly populist tactics, made even more universally appealing by his repeated (yet slyly ambiguous) claim that many call him a Communist, but that he is no Communist – only a true Christian, faithful to the call of Love.
My European friends in China were largely agreed in their envy of my departure to the ‘civilized’ world. When I’d expressed any apprehensions about the move they had rushed to assure me. Things would be so much easier than in China, they’d stressed. Everything worked. You flushed the toilet and watched the toilet paper disappear instead of the water rising ominously out of the bowl. You might pay more for food and clothes but what you purchased was of assured quality. People in Europe were ethical. None of that lying and cheating that went on in China with its get-rich-quick culture. The air was clean, the neighbourhoods green. People queued at bus stops and didn’t spit up foaming gobs of phlegm on the roads.
Efficiency, quality, honesty: these words echoed in my head as our plane prepared for landing in Brussels on a late April day in 2009. An hour or so later I was desperately knocking at the door of the airport police station, wild-eyed and begging for help, having been robbed of my handbag and laptop case while expertly distracted by the thief’s accomplice. ‘Is this arrivals or departures?’ the partner in crime had asked, and when I’d turned to answer, his friend had quietly made off with my belongings.
The British Security Service, better known as MI5, released its file on Eric Hobsbawm last autumn. Hobsbawm, who had long desired to see it, had died two years earlier, at the age of 95. In his memoir, Interesting Times, he warned against autobiographical ‘post-mortem inquests in which the corpse pretends to be the coroner’, but whatever self-justifications he might have entered as evidence, the reading of his file is hampered by his absence. It is an unwritten rule of MI5 that Personal Files (PFs) are only released after their subjects have died. Another unwritten rule, among so many, is that it only releases such material after fifty years, which explains why the Hobsbawm file deposited at the National Archives in Kew ends in the mid-1960s. The rest is withheld, and researchers who ask for more will fare no better in their feeble supplications to the state than Hobsbawm, one of the pre-eminent British historians of the 20th century.
To this deficit must be added the blanks in the file left by the declassifiers (a posh word for ‘censors’), the silent deceptions by which deception is itself concealed. Many names are redacted, and some pages have been removed in toto and replaced with a white sheet on which is stamped this grammatically unappealing message: ‘THE ORIGINAL DOCUMENT RETAINED IN DEPARTMENT UNDER SECTION 3(4) OF THE PUBLIC RECORDS ACT 1958.’ Section 3(4) allows for the retention of a record for a ‘special reason’, which does not have to be given. No reason is given, either, for the absence of an entire folder of the Hobsbawm file. Retained? Lost in transit? Destroyed? Also withheld, as standard practice, is MI5’s intelligence assessment, the casework on the material collected (through surveillance, informers, plants etc) in a file.
Novelist Akhil Sharma on why his first response to winning the 2015 Folio Prize was not joy but shame
Gaby Wood in Telegraph:
Akhil Sharma’s deadpan autobiographical novel, Family Life, ends with a kind of beginning. The narrator has taken his beautiful new girlfriend to a resort hotel. As they lounge by the pool and she leans against him, he feels happier and happier. “The happiness,” Sharma writes, “was almost heavy.” And then comes the last line: “That was when I knew I had a problem.” When I ask Sharma how he’s feeling, the morning after he has won the £40,000 Folio Prize, he responds with a brief smile, a shrug and a flat-toned explanation of his tendency to pan the world for disappointment. “My mind is like a police scanner,” he says, “wondering what’s wrong.” The first thing he felt when he heard he’d won, he says, was shame.
If that sounds melodramatic, or inappropriately comic, the book itself goes some way towards explaining the background. Sharma’s novel (his second) tells the story of an Indian family who move to New Jersey to begin what they hope will be a better life. Just after the elder brother is granted a place at a distinguished high school in New York, he dives into a swimming pool, hits his head and remains underwater for long enough to provoke a coma and lifelong brain damage. All of this happened to Sharma’s family, and the story is told from the point of view of the younger sibling – Sharma’s alter ego, Ajay – with all the naive hope and pointed perception of a child. “I wondered if he was dead,” Ajay thinks when his aunt says she has to go to the hospital. “This last was thrilling. If he was dead, I would get to be the only son.” Ajay lives through the wreckage of his parents’ aspirations: his mother’s misery, his father’s alcoholism, the daily burden of caring for his brother. “Daddy, I am so sad,” he says at one point. “You’re sad?” comes the furious response. “I want to hang myself every day.” The book is so funny you almost feel guilty for laughing – some sort of alchemical transfer, one presumes, of Sharma’s shame into fictional gold. When his mother asks for a hearing aid, Ajay’s father replies: “Why? If by mistake some good news does come for you, I’ll write it down.”
Pierre-Alain Clavien and Joseph Deiss in Nature:
The academic world has changed greatly in recent decades, so demands on its leaders have too. Departmental chairs, deans, facility directors and other leaders are now expected to power research, attract funding, manage investments, engage with policy-makers, woo the media and train personnel. Finding people who can manage these demands simultaneously is difficult. Botched appointments are costly — intellectually, emotionally and financially — for universities, students, research and sometimes for hospitals and patients too. Surprisingly, there is little data on the selection processes of academic chairs [1, 2].
Here, we relate a recent exercise in selecting a chair for a position in clinical academic medicine that in our view holds lessons for the appointment of science leaders more generally. Through a formal consensus process involving leaders from industry, policy and academia, we have distilled a set of principles — telegraphed here (see ‘Checklist for high-level hiring’) — for making high-level hires. Although many seem unsurprising, they are too often ignored.
Seek strong emotional, personal and social skills. Leaders need to be highly intelligent in communication and relationship-building to support and motivate interdisciplinary teams, convey integrity, adapt to change [9] and empathize with patients. This feature cannot be compensated for by other qualities. People succeed when they treat the individuals around them well.
Find someone with fire in their belly and stoke it. Chairs need to be ready to fight for their academic mission and to identify strategies to minimize the administrative burden imposed on them and their academic colleagues. The passion of a new chair should be maintained by academic freedom, good infrastructure and room for development. These factors are much more important than salary benefits in attracting — and keeping — highly qualified individuals.
Christa Gray in OUPblog:
The Renaissance vision of Jerome (c. 347-420 AD), as depicted by Albrecht Dürer in a world-famous engraving of 1514, seems to represent an ideal type of the scholar: secluded in the desert, far removed from the bustle of ordinary life (with a lion to prove it), well-established in his institution (as shown by the cardinal’s hat), and devoted to his studies. However, even a casual reader of Jerome’s letters and pamphlets can see that the reality was much more tumultuous. Jerome left Rome for Bethlehem in 384 AD not out of pious devotion but because of a feud with the Roman clergy, who resented his ascetic programme. Even his Hebrew biblical translations, which would later form the core of the authoritative Latin version of the Catholic Church, were frowned upon by contemporaries, including Augustine, who upheld the sacred status of the Greek Septuagint. Moreover, Jerome’s close attachment to a rich and noble Roman widow, Paula, had given rise to salacious gossip. What sort of model can such a man be?
John Norman Davidson Kelly’s classic biography, Jerome: His Life, Writings, and Controversies, depicts him as a quarrelsome man, rarely at peace with himself, whose writings were often produced in a rush and could be severely lacking in tact. A case in point is his attack in 393 AD on the priest Jovinian, who had dared to claim that Christian virgins were not automatically superior in holiness to Christian married women. Jerome’s exaggerated and aggressive response caused embarrassment even to his supporters, who had urged him to respond to Jovinian’s claim in the first place. To us, his text reads like a choice piece of misogyny, the sort which many still associate with the Catholic Church. Yet at the time, the official church failed to embrace his stance. More interestingly, many of Jerome’s arguments in favour of celibacy have their roots in the classical – that is to say, the ‘pagan’ – tradition, which abounds in misogynistic treatises raging against marriage. In short, anyone who was prepared to be offended could find something to offend in Jerome.
But is there a way to combine Dürer’s idealised picture of Jerome with the one outlined by Kelly? Andrew Cain’s monograph, The Letters of Jerome: Asceticism, Biblical Exegesis, and the Construction of Christian Authority in Late Antiquity, has taught us how to read Jerome’s often immodest and immoderate statements. They are in fact part of a deliberate strategy to advertise his abilities as a writer and his authority as an ascetic scholar as widely as possible. Cain shows that, for Jerome, it was an essential necessity to attract patrons and sponsors if he wanted to continue his monastic life. He had little wealth of his own and even the vast resources of his friend Paula dried up in the process of supporting Jerome and maintaining the Bethlehem monastery they had founded together. Jerome’s outrageous provocations can be seen as part of a wider effort to draw attention to himself and his projects. It appears that there were just enough people at the time with an interest–political or otherwise–in feeding this particular type of troll.
Read the rest here.
In the morning
After the loaded trucks
That shattered the doors of sleep.
And the final ‘adieu’ of the day before
And the final steps on the damp tiles
And your last letter
In the arithmetic notebook from your childhood
Like the grill on the small window
Which slides up the parade of the morning’s
Joyous sun with perpendicular black lines.
by Manolis Anagnostakis
translation: Philip Ramp
Thursday, March 26, 2015
Dustin Illingworth in 3:AM Magazine:
Shrill and appalling, the words still hold something of their concussive effect: “God is dead.” A particular strain of modern agony, crystallized. But if Thus Spake Zarathustra heralded deicide, it was only in the context of a larger rebuttal of metaphysical tradition. Indeed, Nietzsche’s most quotable proclamation has the dubious distinction of also being his most vulgarly misunderstood. Popularly accepted as an incursion on religious belief as such, Zarathustra’s famous utterance has seen the broader implications of its meaning dissolved within a caricatured nihilism. For Nietzsche, God was dead – but so, too, was German Idealism, the polished systems of Hegel and Schelling, to say nothing of the Enlightenment project of an eminently rational progress. It was a disintegration of the reigning spiritual and intellectual frameworks as much as it was a rooting out of God from his many hiding places: morality, culture, grammar and art, to name but a few. Surmounting the God-shaped void, our lonely hero knew, was a task for a theorized posthumanity (or, at the very least, a hardier variety of late European).
The enormous difficulty of this challenge – of discovering a surrogate commensurate with the social, moral and political power of a departed Almighty – is the province of Terry Eagleton’s bracing intellectual history Culture and the Death of God. Its central argument – that genuine atheism is both difficult and rare – seems at first blush a bit of wishful apologism, the death rattle of a proud but exhausted cultural model. After all, the diminishment of the sacred is no longer merely the overbold conjecture of an intellectual fringe element. Withered by the profound secularization of capitalist culture, and bolstered by positivism’s new vogue beneath the banners of Dawkins and Harris, God seems, if not dead, then irreparably reduced – something approaching an antiquated curio, or the equivalent of a harmless knocking on dusty wood.
And yet, by way of an ironically Darwinian feat of cultural adaptation, He remains alive and well – if, admittedly, much transformed. His many secular guises constitute and complicate the last 300 years of European thought – from Enlightenment rationalism, to Romantic intuition, to the Modernist culture industry. Eagleton’s oeuvre, a formidable body of literary and cultural criticism deeply informed by his Marxist-Catholic convictions, can be taken as a hostile interrogation of this secularizing tradition. His lively 2009 book Reason, Faith, and Revolution: Reflections on the God Debate, adapted from his Yale lectures, was a polemical broadside against the liberal-humanist prejudices of New Atheism. Culture and the Death of God can be usefully read as a kind of companion volume to this previous work, as it guides the reader through a brisk circuit of recent European history to compile a damning index of secular failure.
Jo Littler interviews Nancy Fraser, in Eurozine (Photo: Scott Robinson. Source: Flickr):
JL: In your work you've often warned against trading-in a truncated economism for a truncated culturalism, and stressed the importance of combining both approaches. How would you locate yourself in relation to that paradigm? How have you yourself been shaped by the politics of recognition and redistribution?
NF: I grew up in Baltimore, Maryland in the days when it was a Jim Crow segregated city. The formative experience of my life, in my early teenage years, was the struggle for racial desegregation – to dismantle Jim Crow. This was a struggle for recognition of the most compelling and obviously just kind. And like many people of my generation, I moved in quick sequence from there to anti-Vietnam War struggles. I encountered Marxism in unorthodox, democratic, New Left form. That gave me a way to try and think conceptually about the various battles against different forms of domination that were so intense in that period. And soon second-wave feminism erupted and came into the mix. Now, all of this was going on in a time of relative prosperity. I don't think we in the New Left and the early second-wave feminist movement worried very much about how we would support ourselves. Of course we were young, and we often didn't have children; but there was very much a sense – which proved to be an illusion, but was a felt sense nonetheless – that the first-world model of Keynesian capitalist prosperity would continue. We certainly had a perspective about class, and we understood very well that racism correlated with poverty and exploitation. But we thought, looking through a quasi-Marxian socialist-feminist analytical lens, that what seemed to be a secure social-democratic drift meant that redistribution was relatively unproblematic, and that what we had to do was to fight to introduce the importance of recognition into the forms of traditional Marxism and economistic thinking that dominated even social democracy at the time. That proved to be wrong. I soon found myself getting more and more nervous, as the 1980s wore on into the 1990s, that the critique of political economy was being lost amongst the new social movements, the successor movements to the New Left – including feminism. I felt we were getting a one-sided development of the politics of recognition.
To me, recognition always only made sense when it was connected to the political economic dimension of society. Otherwise – as with feminism – you get women put on a pedestal and lots of lip service about how important care work is, but it's a sentimentalized, almost Victorian ethos unless you connect it to political economy. That's when I started saying "We had a great critique of economism of a vulgar sort – let's not make the same mistake and end up ourselves with some kind of a vulgar culturalism".
Above all in the US, but also elsewhere throughout the world, there was a paradigm shift towards the dimension of recognition, and it arose exactly at the moment – it's quite ironic – when the Keynesian social-democratic formation was beginning to unravel. We got the astonishing resurrection of liberal free-market ideas that everyone had assumed were in the dustbin of history forever.
Sean Carroll in Preposterous Universe:
Don Page is one of the world’s leading experts on theoretical gravitational physics and cosmology, as well as a previous guest-blogger around these parts. (There are more world experts in theoretical physics than there are people who have guest-blogged for me, so the latter category is arguably a greater honor.) He is also, somewhat unusually among cosmologists, an Evangelical Christian, and interested in the relationship between cosmology and religious belief.
Longtime readers may have noticed that I’m not very religious myself. But I’m always willing to engage with people with whom I disagree, if the conversation is substantive and proceeds in good faith. I may disagree with Don, but I’m always interested in what he has to say.
Recently Don watched the debate I had with William Lane Craig on “God and Cosmology.” I think these remarks from a devoted Christian who understands the cosmology very well will be of interest to people on either side of the debate.
In 1902, Jack London lost himself in the East End of his urban namesake. This Klondike adventurer’s temporary disappearance from relatively polite society – of oyster pirates and tramps, among others – was the result of a fateful improvisation on his part. Deeply in debt and in love with the socialist writer Anna Strunsky, London had been due to undertake a journalistic commission in South Africa. The commission fell through, however, and, as Earle Labor relates in his biography, London negotiated with his publisher for him to write a book instead, about life in what was reputed to be one of the worst slums on earth. This, characteristically, the young author knew he had to see from the inside.
On August 9, 1902, in the guise of an American sailor down on his luck, London walked into Trafalgar Square and joined the crowds celebrating the coronation of Edward VII. Heading east, he quickly found himself immersed in a “human hellhole” – “a vast shambles”, “utterly unnatural”, “a huge killing-machine”. Malnutrition, cramped and unhygienic lodgings (or no lodgings at all), hopeless insobriety, even the thought that the slightly better-off are unlikely to bequeath any security in life to their children: all of this London notes in horror, while counting his own blessings and reminding the reader of the unspeakable affluence to which the city is also home.
There is a well-established Dylan Thomas myth that goes something like this: early brilliance as an instinctive versifier drunk on words, which was then wasted through more conventional drunkenness, redeemed somewhat by the glamour associated with an early death. Thomas both recognised and helped fuel it by his ironic self-description as the “Rimbaud of Cwmdonkin Drive”. The myth fed into the centenary celebrations of his birth last year, which became a kind of year-long Bloomsday, a celebration more of Thomas the character than Thomas the obscure, often difficult, experimental poet.
Thomas’s public reputation rests mainly on the enduring popularity of Under Milk Wood, “A Child’s Christmas in Wales” and a handful of anthology-piece poems. His critical reputation, such as it is, is that of a somewhat marginal figure who was overshadowed in his youth by Auden and the New Country poets and then dismissed after his premature death by the Movement. Indeed, such is the legacy of this dismissal that he is now often bracketed with the poets of the “dismal 40s”, despite the fact that the bulk of his poems were written and published in the 1930s.
However, centenaries can also be occasions for reappraisal, a time when texts are republished and critically re-examined in the light of current scholarship. In the case of Thomas, the most significant act of rediscovery was the publication of this New Centenary Edition of the collected poems, the most comprehensive of the four collecteds to appear to date, and the first since 1988.
Williams’s legacy and influence, which had once seemed assured, have gradually shrunk. If, more than a quarter-century after his death, he is to become a vital rather than remembered or spent force it is necessary to do two things that might appear contradictory: to concede that, with the exception of Border Country, the fiction to which he devoted so much energy was dull; and to free the rest of his work from the once-modish tundra of cultural studies, let alone the pack ice of theory. Perhaps then he will be read with the same passion and adoration that still attends the discovery of John Berger.
A perverse and ironic fate: Williams, the internationalist, is seen as the worthy relic of a vanished, pre-Thatcherite Britain, a socialist writer read by a diminishing audience of Marxists, academics and students. It was the least surprising thing in the world to see, in the Occupy Camp at St Paul’s a few years ago, a much-pierced protester reading Berger’s Hold Everything Dear; it was equally unsurprising that no one was holding Williams’s The Country and the City.
Maryam Omidi in The Guardian (h/t Chapati Mystery):
In Karachi, dubbed “the world’s most violent megacity”, armed muggings, carjackings and extortion are part of everyday life, as political and criminal forces vie for ownership of the city. The result is a pervasive sense of fear – one that prevents many Karachiites from even leaving their own neighbourhoods, which are carved along wealth and ethnic lines.
“The culture of driving, and the security issue, disable you from visiting these other places,” says Farzana Mukhtar, an HR consultant. It’s 8am on Sunday, and the places Mukhtar is referring to are the streets of Saddar Town, Karachi’s former colonial centre. In contrast to the mid-week traffic, it is virtually deserted, leaving Mukhtar and her group of camera-wielding tourists to admire the remnants of the city’s colonial architecture and daily life with a sense of wonderment: the few hawkers who have woken early, and the tea shop owners preparing for the breakfast crowd. It’s a scene common to tourist sites everywhere; what’s unusual about this group is that many of them are from Karachi itself, on a tour to explore their own city.
Mukhtar and the group are part of a city bus tour organised by Super Savari Express, the first of its kind in the city. At 2,000 Pakistani rupees per ticket (£13), the tour, which launched late last year, attracts a relatively wealthy clientele: Mukhtar, who lives in Clifton, one of the city’s most affluent neighbourhoods, is typical.
“We have about 30 to 40 people on each tour, and they all know the political situation and the safety situation – and yet they’re here because they’re hungry to see what they can explore,” says Atif bin Arif, managing director of Super Savari Express. “These are the same people who fly to the Vatican to see the Sistine Chapel, even though we have beautiful churches here; or go to India to see temples, when we have Hindu temples here.”
Read the rest here.
Sophia Nguyen in Harvard Magazine:
On November 11, 1953, psychology professor B.F. Skinner sat in a fourth-grade math class, perturbed. It was Parents Day at his daughter Deborah’s school. The lesson seemed grossly inefficient: students proceeded through the material in lock-step, at the same pace; their graded assignments were returned to them sluggishly. A leading proponent of what he called “radical behaviorism,” Skinner had devoted his career to studying feedback. He denied the existence of free will and dismissed inner mental states as explanations for outward action. Instead, he focused on the environment and the organism’s response. He had trained rats to push levers and pigeons to play Ping-Pong. A signed photo of Ivan Pavlov presided over his study in Cambridge. Turning his attention to a particular subset of the human animal—the schoolchild—Skinner invented his Teaching Machine.
Roughly the size and shape of a typewriter, the machine allowed a student to progress independently through a curriculum, answering test items and getting instant feedback with a few pulls of a lever. “The student quickly learns to be right. His work is pleasurable. He does not have to force himself to study,” Skinner claimed. “A classroom in which machines are being used is usually the scene of intense concentration.” With hardly any hindrance from peers or teachers, thousands of students could receive knowledge directly from a single textbook writer. He told The Harvard Crimson, “There is no reason why the school room should be any less mechanized than the kitchen.”

Sixty years later, Skinner’s reductionist ideas about teaching and learning continue to haunt public education—especially as it’s once again being called upon to embrace technology. In December 2014, as part of a nationwide event promoting computer-science education called Hour of Code, Barack Obama hunched over a laptop alongside a group of New Jersey middle-schoolers, becoming the first president to write a line of code. The public-policy world frames computer science in K-12 education as a matter of economic urgency. Digital fluency is often called a twenty-first-century skill, equally necessary for personal workplace success and for the maintenance of America’s competitive edge.
Teaching machines with capabilities beyond Skinner’s imagining have proliferated in this century.
Tunku Varadarajan interviews Ayaan Hirsi Ali in the New York Times:
Ayaan Hirsi Ali is Islam’s best known—her critics would say most incendiary—dissident in the West. Born in Somalia, she escaped an arranged marriage by seeking asylum in the Netherlands, where she studied furiously, assimilated with a vengeance, and became a member of the Dutch parliament. Her political views attracted scandal from the very start. She was blunt in her condemnation of Islam and Islamists, earning her the ire not just of the Muslim objects of her criticism, but also of a liberal political establishment that found her vehemently pro-Western views impossible to digest.
Her life changed forever in November 2004: Theo van Gogh, a Dutch filmmaker with whom she had collaborated on “Submission,” a film that excoriated the treatment of women in Islam, was shot and stabbed to death in broad daylight by an Islamist assassin. Pinned by a knife to Van Gogh’s chest was a note that threatened Hirsi Ali with death. Not long after, she moved to the United States, where she now lives under round-the-clock protection.
Hirsi Ali is the author of four books, the most recent being Heretic: Why Islam Needs a Reformation Now...
Wednesday, March 25, 2015
Andreas Wagner in Aeon:
Classification requires comparison. In the process, we see how deeply similar the legs of birds and lions are, or the flowers of roses and marigolds. Such resemblances form a cornerstone of Darwin’s great insight that all life forms a grand family. Yet scientists such as Cuvier rejected the idea of evolution’s great chain of living beings, drawing support from the large gaps that then existed in the fossil record. ‘If the species have changed by degrees,’ he wrote in 1827, ‘we should find some traces of these gradual modifications.’ If he had seen the intermediate steps that we have now seen, perhaps he would have changed his mind.
But perhaps not. For the reasons to reject evolution go deeper than incomplete knowledge. In fact, we can follow them all the way back to Plato, whose influence looms so large that the 20th-century thinker Alfred North Whitehead could relegate the entirety of European philosophy to a ‘series of footnotes’ to his work.
For Plato, the perceptible material world is like a faint shadow of a higher reality. What really matters is the realm of abstract concepts. To a Platonist, the essence of soccer balls, golf balls and tennis balls is their ball-like shape. It is this pure, abstract and unchanging essence that is real, not the physical balls, whose existence is as fleeting and impermanent as a shadow.
A systematist’s task might be daunting, but it becomes manageable if each species is distinguished by its own Platonic essence. For example, a legless body and flexible jaws might be part of a snake’s essence, different from that of other reptiles. The task is to find a species’ essence. Indeed, the essence really is the species in the world of Platonists. To be a snake is nothing other than to be an instance of the form of the snake.
The only problem: the glass lizard. And hundreds of other creatures that defy easy categorisation, such as Eupodophis, from the late Cretaceous period, a snake with rudimentary hind legs. In an ever-changing Darwinian world, species incessantly spew forth new species whose traits can shade into one another. The 20th-century biologist Ernst Mayr called Plato the ‘great antihero of evolutionism’, and in fact it was Mayr who replaced the essentialist concept of species with a modern biological alternative, based on individuals in the same population that can interbreed.
But as has happened many times before, Plato might have the last word. We just need to look deeper than the ephemeral appearance of living things.
He’s sold more books than most of your favourite authors combined, but the master of the suburban thriller, Harlan Coben, isn’t getting complacent
Lydia Kiesling in The Guardian:
It’s odd to hear this from a man whose biography is studded with the kind of numbers that torture the more penurious sort of writers. He’s written 27 novels, seven of them New York Times No 1 bestsellers. He has 60m books in print in 41 languages, and his advances are well into seven figures. He’s won the big three in mystery awards – the Edgar, the Shamus and the Anthony. The blockbuster French film based on his novel, Tell No One, was nominated for nine Césars.
In short, Harlan Coben has more readers, and makes more money, than every writer I follow on Twitter combined.
Readers first fell in love with Coben in the 1990s through Myron Bolitar, a hapless former basketball star who solves mysteries with a waspy sociopath named Win. Coben’s atmospheric, twist-laden stand-alone novels cemented his popularity and earned him an annual spot at the top of the bestseller lists. The latest of these, The Stranger, which comes out in the United States and United Kingdom today, documents desperate acts in a serene suburban hamlet populated with lacrosse moms, and grapples with technological and moral dilemmas taken straight from the headlines.
Coben’s gregarious and voluble personality stands in direct contrast to his occasionally grim books.
Matthew Bishop in the New York Times:
For many people, Bretton Woods stands for that rarest of moments: when governments and experts come together to restore order to a chaotic global economy. After the financial meltdown of 2008, the president of the World Bank and the financier George Soros joined Bill Clinton’s and Tony Blair’s earlier call for a “new Bretton Woods.” It didn’t happen. The world and especially America may yet come to regret that.
To its admirers, many good things were achieved at the Bretton Woods conference over three hectic weeks in the summer of 1944. As the Allies made their final push to liberate Europe, 730 representatives of 44 countries gathered in New Hampshire to set the rules for the postwar economy. Crowded into the half-restored grandeur of a hotel named after nearby Mount Washington, they agreed to create two new institutions to oversee the world economy, the International Monetary Fund and World Bank, and to establish a managed system of exchange rates.
Anchored semi-rigidly to the dollar (which was pegged to the price of gold), this new system was intended to be fairer and more economically rational than the old gold standard, which had collapsed in 1933. It would also be more orderly and sustainable than the endless beggar-thy-neighbor currency devaluations of the subsequent foreign-exchange market free-for-all. In agreeing to this, according to the fans of Bretton Woods, the Allied governments had learned important lessons after World War I, when the determination of the victors to punish the vanquished, rather than rebuild their devastated economies, only added to the pressures that resulted in World War II.
“The Economic Consequences of the Peace,” John Maynard Keynes’s pamphlet pointing out the likely disastrous consequences of the victor-friendly policies adopted after World War I, had turned Keynes into the century’s first celebrity economist. At Bretton Woods, he led the British delegation. He was the dominant intellectual force at the conference, though that did not stop him from losing many of the crucial political battles to his American counterpart, Harry Dexter White. A Jew from a rough part of Boston with a dislike for British elitism, White was later accused of spying for the Soviets, but only after he had won some notable victories over the Cambridge don.
Bente Scheller at the Heinrich Böll Stiftung:
If you cannot overthrow the tyrant, co-operate with him – after four disastrous years in Syria, this seems to be the conclusion the international community has arrived at. While back in 2011 Bashar al-Assad’s days appeared to be drawing to a close, a growing number of people now suggest seeing him as part of the solution, as UN Special Envoy Staffan de Mistura recently illustrated in Vienna.
The more methodically and brutishly Syria’s dictator disregards human rights, the more he seems to assume the role of a potentially reliable partner in the eyes of some. That is primarily due to the Islamist terror army ISIS. Although there are few atrocities against civilians that the regime is not responsible for committing, and although it commits these crimes on a far greater, deadlier scale, Assad is readily seen as the “lesser evil”.
The notion that the situation in Syria could be pacified through co-operation with Assad in the battle against terrorism is as simple as it is ill-conceived in practice. The fight against ISIS requires three things: the means, the will and a strategy.
Assad’s regime is subject to international sanctions. However, it has been receiving vast amounts of financial and military support from Iran and Russia. How likely would Damascus’s current allies be to maintain this support if Assad were rehabilitated by the West? In view of the weak rouble and of the economic consequences of falling oil prices for Iran, it would be in the interest of both to scale down the liabilities created by their involvement in Syria. The history of Russian-Syrian relations, moreover, indicates that Moscow has been interested in co-operating with Syria only when doing so made a political statement against the West. A rehabilitation of Syria would come at an exorbitantly high price, politically as well as financially. How much is the West prepared to pay?
Approaching Those ‘Ruddy’ Belisha Beacons
Near the Post Office Again
You can see them from a long way off,
From when you pass the half-visible ponies
In the field where the school was
By the bus shelter with the bloke in it,
The bloke whose face is lit by his iPhone
Like a tallow-maker’s face is lit in an old master.
One Belisha Beacon off. One Belisha Beacon on.
Small parcels of light sent first class to each other;
Moons chucking glowing balls across the road’s net.
A car slows by the Post Office and a woman jumps out
And gives me a letter. ‘Can tha stick this in’t box for mi?’
She asks. I will, in a minute. Jogger walks by, gasping-gasp.
First I’ll hold the envelope up to the Belisha Beacon.
Not to read the letter inside, you understand,
Just to gaze at light on paper, light on writing.
by Ian McMillan
first published on Poetry International, 2014
A Belisha beacon (/bəˈliːʃə/) is an amber-colored globe lamp on a tall black and white pole, marking pedestrian crossings of roads in the United Kingdom, Ireland and other countries influenced by Britain.