Sunday, March 09, 2014
A nice empirical study of vaccine risk communication--and an unfortunate, empirically uninformed reaction to it
Dan Kahan at the Cultural Cognition Project at Yale Law School:
Pediatrics published (in “advance on-line” form) an important study yesterday on the effect of childhood-vaccine risk communication.
The study was conducted by a team of researchers including Brendan Nyhan and Jason Reifler, both of whom have done excellent studies on public-health risk communication in the past.
NR et al. conducted an experiment in which they showed a large sample of U.S. parents with children age 17 or under communications on the risks and benefits of childhood vaccinations.
Exposure to the communications, they report, produced one or another perverse effect, including greater concern over vaccine risks and, among a segment of respondents with negative attitudes toward vaccines, a lower self-reported intent to vaccinate any “future child” for MMR (measles, mumps and rubella).
The media/internet reacted with considerable alarm: “Parents Less Likely to Vaccinate Kids After Hearing Government’s Safety Assurance”; “Trying To Convince Parents To Vaccinate Their Kids Just Makes The Problem Worse”; “Pro-vaccination efforts, debunking autism myths may be scaring wary parents from shots”. Etc.
Actually, I think this is a serious misinterpretation of NR et al.
From The Telegraph:
Groucho Marx (1890-1977):
'I never forget a face, but in your case I’d be glad to make an exception.'
Les Dawson (1931-1993):
'My wife sent her photograph to the Lonely Hearts Club. They sent it back saying they weren't that lonely.'
Bob Newhart (1929-):
'I don't like country music, but I don't mean to denigrate those who do. And for the people who like country music, denigrate means 'put down'.'
Oscar Wilde (1854-1900):
'The English country gentleman galloping after a fox is the unspeakable in full pursuit of the uneatable.'
Dorothy Parker (1893-1967):
'If you want to know what God thinks of money, just look at the people he gave it to.'
W.C. Fields (1880-1946):
'Start every day off with a smile and get it over with.'
The Distracted Public: Saul Bellow on How Writers and Artists Save Us from the “Moronic Inferno” of Our Time
Maria Popova in Brain Pickings:
“The writer cannot make the seas of distraction stand still, but he [or she] can at times come between the madly distracted and the distractions.”
In 1990, fourteen years after he received the Nobel Prize in Literature and the Pulitzer Prize, and two years after being awarded the National Medal of Arts, Saul Bellow delivered a lecture at Oxford University titled “The Distracted Public.” Eventually included in It All Adds Up: From the Dim Past to the Uncertain Future (public library), Bellow’s talk laments the “moronic inferno” — a phrase he borrowed from Wyndham Lewis — produced by the “contemporary crisis” of distraction, “the apocalypse of our times,” calling on artists and writers to raise their voices in countering that “massive and worldwide” “hostile condition” of humanity.
Bellow begins by considering the role of the artist — the writer — in society, and in societies of various regimes:
The writer cannot make the seas of distraction stand still, but he [or she] can at times come between the madly distracted and the distractions. He [or she] does this by opening another world. “Another world,” I am fully aware, carries suggestions of never-never land, and people will be asking themselves how seriously any man can be taken who still believes that the moronic inferno can be put behind us, bypassed or quarantined by art. It isn’t as though the champions of art had won any great victories. Madame Bovary dies of arsenic, and Flaubert the artist-chronicler is dangerously wounded too. Tales of love and death can be mortal to the teller. Yet for many people … the abandonment of art cannot happen. Dictatorships did not succeed in frightening artists to death, nor has democracy done them in altogether, although some observers consider democracy to be by far the greater threat. In the West, Stalinism is sometimes seen as a political disaster but, to artists, a blessing in disguise. It kept them serious. They died, leaving us great works. With us, the arts sink into the great, soft, permissive bosom of basically indifferent and deadly free societies…
Simon Kuper in the FT:
In two recent columns I explained how to save France and the UK. Now that’s done, it’s time to save America. The solution is obvious. The US needs to model itself on its most sanctified institution: the military. I speak from experience. In 2007 and 2008 I spent time on a US military base in a southern state, giving seminars to officers. Being a typical pinko anti-war European, I’d expected to hate the place. Instead I found it idyllic, intellectual and safe. Pottering about the base, I saw several things that the US could learn from its military:
1. Build socialism. Life in the US military is much like life in Sweden (unless you’re off in Afghanistan spreading democracy). The officers in my seminars spent a quarter of their careers in education, because the US military believes in life-long learning. The military also provides socialised healthcare, subsidised childcare, early pensions etc. I’ve never seen a socialist paradise like it, and I grew up in the Netherlands in the 1970s. Most of the military’s entitlements will survive the budget cuts now being proposed by Chuck Hagel, the defence secretary.
2. Ban guns. I was surrounded by fearsome warriors yet I felt perfectly safe, partly because hardly anyone is allowed to carry guns on US military bases. The “right to bear arms” just doesn’t apply there.
3. Believe in science. Any institution that spends its time firing drones from Nevada at pedestrians in Yemen is going to be pro-science. The Pentagon frets about climate change, and the army aims to be “net zero energy” by 2030. The superhero-like Navy Seals are already fuelled partly by solar power.
Pagan Kennedy in The NYT:
If you walk into a farm-supply store today, you’re likely to find a bag of antibiotic powder that claims to boost the growth of poultry and livestock. That’s because decades of agricultural research have shown that antibiotics seem to flip a switch in young animals’ bodies, helping them pack on pounds. Manufacturers brag about the miraculous effects of feeding antibiotics to chicks and nursing calves. Dusty agricultural journals attest to the ways in which the drugs can act like a kind of superfood to produce cheap meat.
But what if that meat is us? Recently, a group of medical investigators have begun to wonder whether antibiotics might cause the same growth promotion in humans. New evidence shows that America’s obesity epidemic may be connected to our high consumption of these drugs. But before we get to those findings, it’s helpful to start at the beginning, in 1948, when the wonder drugs were new — and big was beautiful.
That year, a biochemist named Thomas H. Jukes marveled at a pinch of golden powder in a vial. It was a new antibiotic named Aureomycin, and Mr. Jukes and his colleagues at Lederle Laboratories suspected that it would become a blockbuster, lifesaving drug. But they hoped to find other ways to profit from the powder as well. At the time, Lederle scientists had been searching for a food additive for farm animals, and Mr. Jukes believed that Aureomycin could be it. After raising chicks on Aureomycin-laced food and on ordinary mash, he found that the antibiotics did boost the chicks’ growth; some of them grew to weigh twice as much as the ones in the control group.
Mutations in a gene associated with leukaemia cause a newly described condition that affects growth and intellectual development in children, new research reports. A study led by scientists at The Institute of Cancer Research, London, identified mutations in the DNA methyltransferase gene, DNMT3A, in 13 children. All the children were taller than usual for their age, shared similar facial features and had intellectual disabilities. The mutations were not present in their parents, nor in 1,000 controls from the UK population.
The new condition has been called 'DNMT3A overgrowth syndrome'. The research is published today (Sunday) in the journal Nature Genetics and is a part of the Childhood Overgrowth Study, which is funded by the Wellcome Trust, and aims to identify causes of developmental disorders that include increased growth in childhood. The DNMT3A gene is crucial for development because it adds the 'methylation' marks to DNA that determine where and when genes are active. Intriguingly, DNMT3A mutations are already known to occur in certain types of leukaemia. The mutations that occur in leukaemia are different from those in DNMT3A overgrowth syndrome and there is no evidence that children with DNMT3A mutations are at increased risk of cancer.
The Iron Bridge
I am standing on a disused iron bridge
that was erected in 1902
according to the iron plaque bolted to a beam,
the year my mother turned one.
Imagine—a mother in her infancy,
and she was a Canadian infant at that,
one of the great infants of the province of Ontario.
But here I am leaning on the rusted railing
looking at the water below,
which is flat and reflective this morning,
sky-blue and streaked with high clouds,
and the more I look at the water,
which is like a talking picture,
the more I think of 1902
when workmen in shirts and caps
riveted this iron bridge together
across a thin channel joining two lakes
where wildflowers now blow along the shore
and pairs of swans float in the leafy coves.
1902—my mother was so tiny
she could have fit into one of those oval
baskets for holding apples,
which her mother could have lined with a soft cloth
and placed on the kitchen table
so she could keep an eye on infant Katherine
while she scrubbed potatoes or shelled a bag of peas,
the way I am keeping an eye on that cormorant
who just broke the glassy surface
and is moving away from me and the bridge,
swiveling his curious head,
slipping out to where the sun rakes the water
and filters through the trees that crowd the shore.
And now he dives,
disappears below the surface,
and while I wait for him to pop up,
I picture him flying underwater with his strange wings,
as I picture you, my tiny mother,
who disappeared last year,
flying somewhere with your strange wings,
your wide eyes, and your heavy wet dress,
kicking deeper down into the lake
with no end or name, some boundless province of water.
by Billy Collins
from Sailing Alone Around the Room
Random House 2002
Saturday, March 08, 2014
In some ways, Stanley Crouch is the perfect candidate to write Bird’s biography. He’s been one of the boys on the beat of American culture for quite some time, with a MacArthur grant, several provocative essay collections, and a fine novel to his credit. Even better, Crouch has been one of the precious few public intellectuals to valorize jazz and insist and demonstrate how jazz can be seen as not only one of the pure products of America gone crazy but also its historic pulse, its backbeat, a trope that swings. One of the themes Crouch emphasizes is reflected in a quote from the Austrian novelist Hermann Broch: “the civilization of an epoch is its myth in action.” This insight is useful not only in giving a background for Parker’s eventual triumph and decline but also in showing how his music promised a certain kind of freedom one might have felt at a certain time and place, if you were willing to let it take you over. It’s the kind of democratic promise implicit in what they used to call American classical music, with collective improvisation and individual expression put in constant interplay, an offspring of the blues that reckoned with classical structures, music made for and by people who, with some notable exceptions, never found satisfaction anywhere else.
It’s for the best that Kansas City Lightning: The Rise and Times of Charlie Parker is the first volume of two. Some reviewers have complained about the novelistic, occasionally montage-like approach Crouch takes in telling the story of Parker’s youth and adolescence. It’s been suggested that Crouch is padding his material or being self-indulgent. I see the point, but I would argue that this stylistic choice isn’t even Crouch’s fault.
In the winter of 1933, an 18-year-old named Patrick Leigh Fermor set out from the Hook of Holland to cross Europe on foot. His goal was Istanbul, which he bookishly insisted on calling Constantinople. He had little more in his rucksack than a volume of Horace and a few blank notebooks. He also had a bad reputation: The masters who expelled him from school — for a flirtation with a local girl — saw only “a dangerous mixture of sophistication and recklessness.” He spent the next year charming his way through a doomed prewar landscape of landed aristocrats, feudal peasants and benevolent monks, sleeping alternately in schlosses and hayricks. It was a journey that would become legendary, not so much for the extraordinary things he saw and recorded as for his prose — an utterly unique, hybrid vehicle that combines youthful exuberance with a dense, dauntingly erudite display of verbal artifice. Unlike most authors of travel literature (a rattlebag genre that doesn’t really do him justice) Leigh Fermor does not confine his role to that of camera obscura. He builds dense whorls of wordplay to echo the carvings in an old church door; he slips into baroque historical fantasias, scattering a shrapnel of words like “gabions,” “hydromel,” “eyot” and “swingletrees” at the unsuspecting reader. In between salvos, there are moments of ferocious humor and quiet, lyrical beauty.
In part, this richness is a measure of the extraordinary gap between the experience and its narration.
Ultimately, however, none of these provides a satisfactory answer because it is indeed a genuine paradox, an irresolvable contradiction at the heart of human existence. The best we can do, suggests Slingerland, is “to not push too hard when trying is bad, and not think too much when reflection is the enemy”. If we do that, “the flow of life is always there, eager to pull us along in its wake.”
There is an important insight connecting all three of these books, one that much smart thinking neglects. It is that you cannot reduce anything truly worthwhile simply to a technique you can learn and use to get your desired result. Rather, what is most profoundly rewarding always springs from deeply held values. Klein, for example, mentions that to gain insight, it is important that the thing we are thinking about flows from our own interests. Slingerland also says that wu-wei involves “the absorption of the self into something greater” than yourself, and it is our values that tell us what we truly believe is greater. And Epley’s account suggests that unless you genuinely value the perspectives of others, and not just those that conform to your own, you are not going to understand them. Really effective smart thinking is not, therefore, just a means to an end: it has to be rooted in what we see as ends in themselves, the values by which we live.
Morgan Meis in The Smart Set:
What if happiness is impossible? What if “men are always discontented because they are always unhappy?” What if, in their hearts, “they feel and they are well aware that they are unhappy, that they suffer, that they do not find enjoyment, and in that they are not wrong?” What if this unhappiness is increased by the fact that men “think they have the right to be happy, to enjoy life, not to suffer, and in that too they would not be wrong, if it were not for the fact that what they expect is, if nothing else, impossible?”
Hard thoughts, especially for those of us who live in a country that declared, in one of its founding documents, that the pursuit of happiness is an inalienable right.
These and other fairly depressing thoughts about happiness can be found in a new English translation of a book called Zibaldone. Zibaldone — which translates roughly as “mental hodge-podge” — is the life’s work of the 19th-century Italian poet Giacomo Leopardi. The central thesis of Zibaldone is that life is miserable and there is nothing to be done about it. The work consists of interrelated notebook entries from throughout Leopardi’s life. The recent English version runs to a little over 2,000 pages, in very small font size. Last year it was released to what one would have expected to be complete silence.
Unexpectedly, people liked it. The book was the surprise hit of 2013. It was reviewed by prominent intellectuals in the New York Times, the New Statesman, Harper’s Magazine, the New Republic, the Financial Times, the New York Review of Books, and even here in The Smart Set.
The best way to read Zibaldone is to skip around on a theme. There’s no way to read the book in linear fashion. A person who attempted to read Zibaldone cover to cover would more than likely go insane. Since the book may cause you to blow a gasket anyway, why not do it on your own terms? Flip to the editorial index, find a subject that interests you, then go to the relevant section in the body of the text. Sooner or later you’ll hit a footnote, which will refer you to another section of the book. You can proceed in this way more or less indefinitely, or until you decide to pick up a new thread.
Colin Robinson in The Guardian:
A great deal has been written recently about the frustrations of publishing a book with Julian Assange, mainly in a widely discussed, marathon article for the London Review of Books by Andrew O'Hagan. O'Hagan relates his experiences when working as a ghostwriter on an autobiography of the WikiLeaks leader that ended up being published in opposition to its subject's wishes. I'm the co-publisher of Assange's most recent book (Cypherpunks: Freedom and the Future of the Internet) and I, too, have found the experience frequently exasperating.
Let me give an illustration. It's June of last year and I'm at a party in New York when a friendly, youngish man with a beard and a beer engages me in conversation. He tells me he is a journalist on one of the city's listings magazines and asks what I do for a job. I reply that I'm a publisher and he asks whose books I'm working on. I pick the one writer of whom I'm pretty certain he will have heard. "Well," I say, shouting to make myself heard above the music, "I've just published Julian Assange." The young man's demeanour changes abruptly and he fixes me with a sneer. "Assange," he echoes, "he's a bit of a cunt isn't he?"
I've become wearily accustomed to this over my time working with Assange: the vituperation heaped on my author, the scorn directed at me for giving him a platform. I know the general script that will follow. And, sure enough, here it so often comes, as if read from the page: "I mean, he's a weirdo isn't he? That massive ego. And the sex offences in Sweden."
It's almost impossible to counter this kind of attitude, with its shallow presumptions about the character of someone never met and the guilt of someone never tried.
Oliver Burkeman at CNN:
At first glance Pinker's implacable optimism, though in keeping with his sunny demeanour and stereotypically Canadian friendliness, presents a puzzle. His stellar career -- which includes two Pulitzer Prize nominations for his books How the Mind Works (1997) and The Blank Slate: The Modern Denial of Human Nature (2002) -- has been defined, above all, by support for the fraught notion of human nature: the contention that genetic predispositions account in hugely significant ways for how we think, feel and act, why we behave towards others as we do, and why we excel in certain areas rather than others.
This has frequently drawn Pinker into controversy -- as in 2005, when he offered a defense of Larry Summers, then Harvard's President, who had suggested that the under-representation of women in science and maths careers might be down to innate sex differences.
"The possibility that men and women might differ for reasons other than socialization, expectations, hidden biases and barriers is very close to an absolute taboo," Pinker tells me. He faults books such as Lean In, by Facebook's chief operating officer, Sheryl Sandberg, for not entertaining the notion that men and women might not have "identical life desires." But he also insists that taking the possibility of such differences seriously need not lend any justification to policies or prejudices that exclude women from positions of expertise or power.
Jan Morris in The Telegraph:
This tremendous book puts me in mind of a huge murky kaleidoscope, an ever-shifting display through which one image remains ambiguously constant. The scene is the tumultuous world of the Arabs during the last stages of the First World War; the enigmatic central figure is that of Thomas Edward Lawrence, a small Anglo-Irish archaeologist in his late twenties, later to be known as Lawrence of Arabia. It was a populist, even patronising epithet, because there was nothing Arabian about him. This hefty volume, though, by a scholarly American journalist, demonstrates how central he was to the infinitely convoluted, deceptive and contradictory goings-on that were eventually to bring into being the Middle East as we know it now.
Until the First World War the whole region, including today’s Iraq, Syria, Israel, Saudi Arabia, Yemen, the petty Persian Gulf emirates and Egypt, was nominally part of the Ottoman Empire with its capital at Constantinople – nominally, because the British Empire in effect governed Egypt and the Gulf states, and possessed the port of Aden. The impending collapse of the Ottomans in the so-called Great War meant that almost the entire region would eventually be up for grabs among the victors, and it is cosmopolitan opportunism, as the conflict approached its conclusion, that is Scott Anderson’s huge subject. The scramble has been repeatedly chronicled for the best part of a century, but nobody has explored the subject with quite such intensity, and from quite so many angles.
War is too weird a thing to make sense of when it’s actually happening. It’s not just the combat, which by its nature is unintelligible. Armed conflict so fundamentally alters the environment it takes hold of that no aspect of life escapes undistorted: not love, not friendship, not sleep, not trust, not conversation. In war, even boredom is strange. The war in Iraq is finally over, at least for Americans, which means, in a way, that we may finally begin to comprehend it. I don’t mean in a historical sense: A multitude of books have already dissected the war’s origins, costs and wider implications. I mean in a human sense: what the war felt like, what it did to people’s brains, how it changed the lives it did not consume. This is not, strictly speaking, the realm of journalism or history, but of fiction and memoir. The best literature of the Vietnam War, like Tim O’Brien’s “The Things They Carried,” captured the ways the conflict splintered the psyches of the men who fought it and how it rattled around in their minds, and in the minds of the people who loved them, long after the fighting ended. All wars do that, but O’Brien, drawing on his experience as a foot soldier, connected those dislocations to the very particular milieu that formed the American experience in Vietnam: the moral ambiguity, the invisible enemy, the jungle, the waste.
In “Redeployment,” Phil Klay, a former Marine who served in Iraq, grapples with a different war but aims for a similar effect: showing us the myriad human manifestations that result from the collision of young, heavily armed Americans with a fractured and deeply foreign country that very few of them even remotely understand. Klay succeeds brilliantly, capturing on an intimate scale the ways in which the war in Iraq evoked a unique array of emotion, predicament and heartbreak. In Klay’s hands, Iraq comes across not merely as a theater of war but as a laboratory for the human condition in extremis. “Redeployment” is hilarious, biting, whipsawing and sad. It’s the best thing written so far on what the war did to people’s souls.
Friday, March 07, 2014
A faux Rockefeller fooled author Walter Kirn for years until it became clear Christian Gerhartsreiter was a liar and a killer
Hector Tobar in the Los Angeles Times:
Walter Kirn's new profile of the serial liar and convicted murderer known as "Clark Rockefeller" is no ordinary work of true crime and literary journalism.
"Blood Will Out: The True Story of a Murder, a Mystery, and a Masquerade" is the chronicle of Kirn's ill-fated friendship with the con man. And it's surely one of the most honest, compelling and strangest books about the relationship between a writer and his subject ever penned by an American scribe.
Kirn is a magazine writer and author of novels such as "Up in the Air" and "Thumbsucker." But he was an insecure and not especially successful writer when he first met "Clark" in 1998. The faux Rockefeller was a preppy bon vivant who claimed to be estranged from his famous family. A mutual friend asked Kirn to do Clark a favor — deliver a semi-paralyzed dog from Montana, where Kirn was living, to Clark's home in Manhattan.
Unbeknownst to Kirn, "Clark Rockefeller" was the latest in a series of identities adopted by the German immigrant Christian Gerhartsreiter. As Clark, Gerhartsreiter hid his Bavarian roots behind a genteel, patrician accent and stories of his jet-setting lifestyle. Kirn, a son of working-class Midwesterners, was smitten. Like many an ambitious writer, he thought the charismatic and odd Clark might make a good character for a magazine article or even a novel.
Chris Mooney in Mother Jones:
The question may seem simple to answer: You are the citizen of a country, the resident of a city, the child of particular parents, the sibling (or not) of brothers and sisters, the parent (or not) of children, and so on. And you might further answer the question by invoking a personality, an identity: You're outgoing. You're politically liberal. You're Catholic. Going further still, you might bring up your history, your memories: You came from a place, where events happened to you. And those helped make you who you are.
Such are some of the off-the-cuff ways in which we explain ourselves. The scientific answer to the question above, however, is starting to look radically different. Last year, New Scientist magazine even ran a cover article titled, "The Great Illusion of the Self," drawing on the findings of modern neuroscience to challenge the very idea that we have seamless, continuous, consistent identities. "Under scrutiny, many common-sense beliefs about selfhood begin to unravel," declared the magazine. "Some thinkers even go so far as claiming that there is no such thing as the self."
What's going on here? When it comes to understanding this new and very personal field of science, it's hard to think of a more apt guide than Jennifer Ouellette, author of the new book Me, Myself, and Why: Searching for the Science of Self. Not only is Ouellette a celebrated science writer; she also happens to have been adopted, a fact that makes her life a kind of natural experiment in the relative roles of genes and the environment in determining our identities.
Jill Godmilow in IndieWire [via Chapati Mystery]:
Throughout the film, Oppenheimer encourages his collaborators to produce ostentatiously surreal and violent dramatic film reconstructions of their death squad activities. Ever since Robert Flaherty asked his Inuit collaborator, Nanook the Bear (his real name was "Allakariallak"), to fake the capture of a seal in 1922 – at the very beginning of ethnographic film tourism – we have seen hundreds of social actors perform “real” re-enactments of their lives for the cameras of documentary filmmakers. There is nothing new in “The Act of Killing” but carnage, and the special, cozy relationship we are urged to enjoy with the killers. Perhaps this is exactly what the critics are avoiding with their raves – that they have been duped into admiring, for an hour or two, the cool Rat Pack killers of Medan.
Collaboration is a way to share, with the social actors represented, responsibility for a film’s acts of description, strategies and arguments…a way to “keep it clean.” Some of the most useful films I’ve seen in the last twenty years – non-fiction and otherwise – have been the products of collaboration with the social actors represented, in unique and disparate ways. Carolyn Strachan and Alessandro Cavadini’s “Two Laws,” Kent MacKenzie’s “The Exiles,” and Rolf de Heer and Peter Djigirr’s “Ten Canoes” come quickly to mind.
First on this list should be Rithy Panh’s “S-21: The Khmer Rouge Killing Machine” – the perfect counter model to “The Act of Killing.” In S-21, the two survivors of the infamous Cambodian prison and their Khmer Rouge prison guards are brought together in a patient re-enactment of their crimes, which the traumatized guards cannot otherwise recollect. “The Act of Killing” is also a collaboration of sorts, but for me a non-productive, uncomfortable, even unclean one.
Amanda Little in Bookforum:
Last spring, a thirty-one-year-old college dropout–turned–energy executive named Billy Parish came to talk to my journalism class at Vanderbilt University. The course focused on climate reporting, and Parish had recently been profiled in Fortune magazine as a young virtuoso in the solar industry. Students wanted to hear his perspective as an innovator: What did he consider the most important untold story on climate change? “Easy,” he said, “it’s the story of our victory in progress, the story that we’re winning—not losing—the climate battle.” Most progressive journalists hate to talk about actual progress, Parish went on to argue, so they spend their time mewling about what’s not getting done on the climate-legislation front. Science writers, meanwhile, nitpick about important but arcane details of atmospheric warming in parts per million and other mind-numbing measurements. The skeptics, for their part, continue to chant, like skipping records, their groundless but vehement doubts about the problem’s very existence. Little wonder that Americans turn a deaf ear to this issue.
Maybe they wouldn’t, Parish argued, if they could read more climate literature that matters—stories about how America is actually innovating and adapting in response to this crisis, even as global treaties have languished and climate legislation collects dust on the shelves of Congress. Parish waxed technophilic, telling stories about new carbon-cutting innovations on the horizon: wind turbines designed like jet engines, not propellers; fuels made from algae and batteries made from viruses; nanotech solar cells that are smaller than gnats and can be integrated into paints, shingles, and glass. He explained that the cost of solar energy has come down 80 percent in the last five years, and solar production has grown more than 50 percent a year. “We’ve got to stop acting helpless,” he said. “We’ve got to start telling the stories of why we’re winning.” As a budding entrepreneur, Parish is notably prone to enthusiasm. But his argument stayed with me, and after his visit I began to see climate literature a bit differently, dividing it into two categories: The first, and overwhelmingly the largest, includes stories of conjecture about climate change itself—about whether it’s happening at all; whether humans are to blame; how severe the problem is or isn’t; how catastrophic the impacts may become. The second, and much more intriguing, category focuses on the tangible, practical ways we’re beginning to adapt: stories about innovators who are trying, against vertiginous odds, to get technologies and strategies in place that can make our transition to a low-carbon economy not just possible but seamless.
Andrew Pollack in The New York Times:
J. Craig Venter is the latest wealthy entrepreneur to think he can cheat aging and death. And he hopes to do so by resorting to his first love: sequencing genomes. On Tuesday, Dr. Venter announced that he was starting a new company, Human Longevity, which will focus on figuring out how people can live longer and healthier lives.
To do that, the company will build what Dr. Venter says will be the largest human DNA sequencing operation in the world, capable of processing 40,000 human genomes a year. The huge amount of DNA data will be combined with huge amounts of other data on the health and body composition of the people whose DNA is sequenced, in the hope of gleaning insights into the molecular causes of aging and age-related illnesses like cancer and heart disease. Slowing aging, if it can be done, could be a way to prevent many diseases, an alternative to treating one disease at a time. “Your age is your No. 1 risk factor for almost every disease, but it’s not a disease itself,” Dr. Venter said in an interview. Still, his company will also work on treating individual diseases of aging.