Tuesday, July 29, 2014
Julia Amalia Heyer interviews Yuval Diskin, in Spiegel (photo: Reuters):
SPIEGEL: What about Israel talking directly with Hamas?
Diskin: That won't be possible. Really, only the Egyptians can credibly mediate. But they have to put a more generous offer on the table: the opening of the border crossing from Rafah into Egypt, for example. Israel must also make concessions and allow more freedom of movement.
SPIEGEL: Are those the reasons why Hamas provoked the current escalation?
Diskin: Hamas didn't want this war at first either. But as things often are in the Middle East, things happened differently. It began with the kidnapping of three Israeli teenagers in the West Bank. From what I read and from what I know about how Hamas operates, I think that the Hamas political bureau was taken by surprise. It seems as though it was not coordinated or directed by them.
SPIEGEL: Netanyahu, though, claimed that it was and used it as a justification for the harsh measures against Hamas in the West Bank, measures that also targeted the joint Hamas-Fatah government.
Diskin: Following the kidnapping of the teenagers, Hamas immediately understood that they had a problem. As the army operation in the West Bank expanded, radicals in the Gaza Strip started launching rockets into Israel and the air force flew raids into Gaza. Hamas didn't try to stop the rockets as they had in the past. Then there was the kidnapping and murder of the Palestinian boy in Jerusalem and this gave them more legitimacy to attack Israel themselves.
SPIEGEL: How should the government have reacted instead?
Diskin: It was a mistake by Netanyahu to attack the unity government between Hamas and Fatah under the leadership of Palestinian President Mahmoud Abbas. Israel should have been more sophisticated in the way it reacted. We should have supported the Palestinians because we want to make peace with everybody, not with just two-thirds or half of the Palestinians. An agreement with the unity government would have been more sophisticated than saying Abbas is a terrorist. But this unity government must accept all the conditions of the Middle East Quartet. They have to recognize Israel, renounce terrorism and recognize all earlier agreements between Israel and the Palestinians.
SPIEGEL: The possibility of a third Intifada has been mentioned repeatedly in recent days, triggered by the ongoing violence in the Gaza Strip.
Diskin: Nobody can predict an Intifada because they aren't something that is planned. But I would warn against believing that the Palestinians are peaceful due to exhaustion from the occupation. They will never accept the status quo of the Israeli occupation. When people lose hope for an improvement of their situation, they radicalize.
Stephen T. Asma in the Chronicle of Higher Education:
September 11 changed the God conversation. Atheism was always a reasonable alternative to theological glitches like the problem of evil, and of course God seemed increasingly unnecessary after Darwin’s revolution, but atheism was a relatively quiet and confident minority position. Like opera fans who know they’re right but don’t bother to evangelize the unsophisticated, atheists were generally too imperious to go to the trouble of public debate.
But after 9/11, Christopher Hitchens, Sam Harris, Richard Dawkins, and Daniel Dennett, nicknamed the Four Horsemen of the new atheism, showed us the first wave of atheist response: anger, retaliatory logic, and self-loathing about the failure of flaccid liberalism—our impending cultural suicide from too much naïve tolerance. Pugilistic Islamic fundamentalism was taken as a token for religion generally, and the excesses in this world of otherworldly metaphysics led the Horsemen to call for the end of faith altogether.
Academics slight the essential day-to-day comforts that keep religion, or at least its spiritual secular offshoots, relevant.
Recent books offer a second wave, with political, economic, and philosophical takes on religion and its surrogates. Peter Watson’s The Age of Atheists (Simon & Schuster), Terry Eagleton’s Culture and the Death of God (Yale University Press), and Roger Scruton’s The Soul of the World (Princeton University Press) are much more historically aware, and more comfortable with the persistent ebb and flow of Western religion, than were the Horsemen’s admonitions. But in focusing on seductive macrosocial and lofty theological impulses, the new books slight the essential day-to-day comforts that keep religion, or at least its spiritual secular offshoots, relevant. They also largely dismiss the powerful light that science can shed on spiritual longing. They don’t miss the forest for the trees; they miss it for the sky above the trees.
Morgan Meis in The Smart Set:
Willem de Kooning made a portrait of Marilyn Monroe in 1954. The painting consists of a few splotches of yellow and blue paint. There are two sketchy and lopsided eyes in the middle of the canvas. Two wedges of red surely represent Marilyn’s lips. Is that an arm on the right? Maybe. There’s a human form in there somewhere. But this isn’t a portrait in any way that the Great Masters of European painting would have understood.
You can see de Kooning’s painting today at an exhibit in Washington D.C. at the National Portrait Gallery, part of the Smithsonian Institution. The exhibit is called “Face Value: Portraiture in the Age of Abstraction.” The point of the exhibit is to display the work of “mid-twentieth century artists who were reinventing portraiture at a moment when almost everyone agreed that figuration was dead as a progressive art form.” Thus, de Kooning’s offering. He was trying to salvage some aspect of the human figure at a time when realistic looking paintings were not at all in fashion and portrait painting had been relegated to Sears.
It hadn’t always been this way. For hundreds of years, a painted portrait was supposed to look like the person it portrayed. Even especially talented and artful portrait painters — like Hans Holbein the Younger (c. 1497-1543) — had to think of portraits primarily in terms of a good likeness. Holbein’s portrait of Christina of Denmark (1537) is, for instance, an especially beautiful painting.
Thomas’s reputation as popular bard—an Orpheus or Taliesin reincarnate—trailed him from his earliest career in Wales. From there, as detailed in Andrew Lycett’s Dylan Thomas: A New Life (and Adam Kirsch’s fine biographical essay in The New Yorker), he evolved into a proto-rock star. He may well have founded the clichés of the type: the whirlwind American tours, the adoring fans, the orgiastic indulgence, the death in the hotel later made infamous by the likes of Janis Joplin, Sid Vicious, and Leonard Cohen. And, of course, Bob Dylan.
Dylan’s adoption of Thomas’s name remains an uneasy asterisk over the poet’s legacy. Noting the popularity of the name “Dylan,” which was once obscure even in Wales, Kirsch concludes that “later Dylans only borrowed its aura of youthful, brooding rebellion; in the most literal sense, Dylan Thomas made his name.” True enough—but Paul Simon’s ’60s satire “A Simple Desultory Philippic” tells the rest of the story:
He's so unhip that when you say “Dylan,”
He thinks you’re talking about Dylan Thomas,
Whoever he was.
The man ain’t got no culture.
In that sense, Bob Dylan borrowed Thomas’s name and never gave it back.
In a well-known passage of The Origins of Totalitarianism, Hannah Arendt wrote: "We become aware of the existence of a right to have rights (and that means to live in a framework where one is judged by one's actions and opinions) and a right to belong to some kind of organized community, only when millions of people emerge who had lost and could not regain these rights because of the new global political situation [...] The right that corresponds to this loss and that was never even mentioned among the human rights cannot be expressed in the categories of the eighteenth-century because they presume that rights spring immediately from the 'nature' of man [...] the right to have rights, or the right of every individual to belong to humanity, should be guaranteed by humanity itself. It is by no means certain whether this is possible." The "right to have rights" has become the well-known phrase through which to capture the plight of the stateless, the refugee, the asylee and displaced persons – that is, the plight of those who have been cast out of the framework "where one is judged by one's actions and opinions."
Throughout this discussion, Arendt polemicizes against the grounding of human rights upon any conception of human nature or history. For her, conceptions of human nature commit the mistake of treating humans as mere substance, as if they were things in nature. But following Augustine and Heidegger, for her humans are the ones for whom the question of being has become a question.
The Burning of the World: A Memoir of 1914 is a document of one man’s attempt to repaint his broken landscape. It is remarkable how quickly his world was lost. In hindsight, we think of the First World War as a four-year affair. We forget, though, that Austria-Hungary lost half of its men within the first two weeks of the war — 400,000 men, including 100,000 who were taken prisoner by the Russians. At the war’s start, the grand Austro-Hungarian soldier, with his long ridiculous sword, was often killed or maimed within days of reaching the battlefield. The injured and insane were sent home to wander their cities like ghosts, to parade before the horrified eyes of their neighbors. And the war kept going on.
The Burning of the World covers only the first eight months of the war, but carries a lifetime of experience. When the book opens, Hungarian painter Béla Zombory-Moldován is enjoying a summer holiday with friends at a resort on the Adriatic. By the second page, war has started and Zombory-Moldován must report for duty. Before he sees any action, Zombory-Moldován finds himself in the abandoned, burned-out town of Rava Ruska, musing on its ruined state. By the middle of his memoir, Zombory-Moldován has been sent to the Galician front, been severely injured, and then been sent back to Budapest to recover. The remainder of the book follows his attempt to come to terms with life as a veteran, even though the war goes on, even though it has just started. Within weeks, his Budapest – his Hungary – is already a thing of the past. Béla Zombory-Moldován inhabits the city in a state of limbo. He passes by his favorite cafés but can’t bring himself to go in. The young ladies who once admired him now stare at his bloodied head, appalled. When the book ends, Zombory-Moldován reports once again for duty. It is March 1915. World War I still has three years and eight months to go.
Jonah Lehrer in Seed:
In the early 1920s, Niels Bohr was struggling to reimagine the structure of matter. Previous generations of physicists had thought the inner space of an atom looked like a miniature solar system with the atomic nucleus as the sun and the whirring electrons as planets in orbit. This was the classical model. But Bohr had spent time analyzing the radiation emitted by electrons, and he realized that science needed a new metaphor. The behavior of electrons seemed to defy every conventional explanation. As Bohr said, “When it comes to atoms, language can be used only as in poetry.” Ordinary words couldn’t capture the data. Bohr had long been fascinated by cubist paintings. As the intellectual historian Arthur Miller notes, he later filled his study with abstract still lifes and enjoyed explaining his interpretation of the art to visitors. For Bohr, the allure of cubism was that it shattered the certainty of the object. The art revealed the fissures in everything, turning the solidity of matter into a surreal blur.
Bohr’s discerning conviction was that the invisible world of the electron was essentially a cubist world. By 1923, de Broglie had already determined that electrons could exist as either particles or waves. What Bohr maintained was that the form they took depended on how you looked at them. Their very nature was a consequence of our observation. This meant that electrons weren’t like little planets at all. Instead, they were like one of Picasso’s deconstructed guitars, a blur of brushstrokes that only made sense once you stared at it. The art that looked so strange was actually telling the truth.
Stephanie Fairyington in The New York Times:
A few months ago, I was on a Manhattan-bound D train heading to work when a man with a chunky, noisy newspaper got on and sat next to me. As I watched him softly turn the pages of his paper, a chill spread like carbonated bubbles through the back of my head, instantly relaxing me and bringing me to the verge of sweet slumber. It wasn’t the first time I’d felt this sensation at the sound of rustling paper — I’ve experienced it as far back as I can remember. But it suddenly occurred to me that, as a lifelong insomniac, I might be able to put it to use by reproducing the experience digitally whenever sleep refused to come. Under the sheets of my bed that night, I plugged in some earphones, opened the YouTube app on my phone and searched for “Sound of pages.” What I discovered stunned me. There were nearly 2.6 million videos depicting a phenomenon called autonomous sensory meridian response, or A.S.M.R., designed to evoke a tingling sensation that travels over the scalp or other parts of the body in response to auditory, olfactory or visual forms of stimulation. The sound of rustling pages, it turns out, is just one of many A.S.M.R. triggers. The most popular stimuli include whispering; tapping or scratching; performing repetitive, mundane tasks like folding towels or sorting baseball cards; and role-playing, where the videographer, usually a breathy woman, softly talks into the camera and pretends to give a haircut, for example, or an eye examination. The videos span 30 minutes on average, but some last more than an hour.
...Dr. Carl W. Bazil, a sleep disorders specialist at Columbia University, says A.S.M.R. videos may provide novel ways to switch off our brains. “People who have insomnia are in a hyper state of arousal,” he said. “Behavioral treatments — guided imagery, progressive relaxation, hypnosis and meditation — are meant to try to trick your unconscious into doing what you want it to do. A.S.M.R. videos seem to be a variation on finding ways to shut your brain down.”
Monday, July 28, 2014
by Grace Boey
When Vladimir Nabokov’s Lolita was first published in 1955, the novel generated an enormous amount of controversy. Narrated by Humbert Humbert, a fictional literature professor in his late thirties, the tragicomedy depicts his obsessive sexual relationship with 12-year-old Dolores Haze—the eponymous Lolita.
Sixty years down the road, the book remains as controversial as ever. A large part of this seems to be that Lolita, despite our moral condemnation of child sex, somehow manages to elicit the reader’s sympathy for its pedophilic ‘protagonist’ (who is, possibly, more accurately described as a hebephile). Beyond our contempt for Humbert, there is also disgust with ourselves. How dare we even think of sympathizing with such a pervert? Surely by doing so we inch closer to condoning sex with children.
Such confusion reflects unresolved thoughts and feelings about sexual deviation in general. What does it mean to sympathize with perversion? Where, exactly, lies the wrong in what many of us think of as sexual deviance—such as pedophilia, zoophilia, homosexuality, and various other unusual forms of sexuality? What specifically is it that’s so outrageous about the affair between Humbert and Dolores? To answer such questions, we must delve into the field of sexual ethics.
Sex: the moral minefield
Why is the ethics of sex even a thing? For one, sex is a significant act which plays a big part in an individual’s life. How someone practices (or doesn’t practice) sex is intertwined with their emotions, relationships, expression and identity. Moreover, sex is an act involving our own bodies that we either wish to participate in, or don’t. In deontological terms or rights-speak, there are important rights and potential violations surrounding sex. From a consequentialist perspective, there is the potential for both great harm and utility to arise from sex. All this makes sex something we should tread around pretty carefully.
In the early 1980s wanting to be a naturalist — a coleopterist, in particular, that most Darwin-like of naturalists — I spent a couple of summer months in Killarney National Park, in Ireland, making a collection of chrysomelid beetles. This was the first of many such collecting trips, part of a series of increasingly violent engagements with the natural world that served as stepping stones that link my life as an Irish teen to the one I live now in Chicago. All of them involved the killing of animals or plants for the sake of science.
The Chrysomelidae had been offered up to me by Dr Jimmy O’Connor, an entomology curator at Ireland’s Natural History Museum (The Dead Zoo, as it was called in Dublin). Apparently, the Irish representatives of this group were poorly known, not having been taxonomically revised since early in the 20th Century. Chrysomelid beetles include a number of notorious pests such as Leptinotarsa decemlineata, the Colorado Potato beetle, but for the most part these insects go about their business without causing us much bother. They are remarkably pretty though, many of them possessing metallic elytra (the sclerotized outer-wing of the beetle), and when you train your eye to notice them you see them as a marvel of shimmer and vivid color. Some of them, the flea-beetles, have greatly enlarged hind-leg femora, so that when disturbed they erupt into action and spring away from you like a glorious idea that you thought you had but now cannot seem to fully recall.
Collecting them is easy enough. Using a sweep net, I thrashed my way across the grassier spots in the National Park; in other locations I’d search the under-leaves of shrubs and low-hanging plants, catching them on the tip of a wetted paintbrush.
The issue of killing them was quite another matter. After all, I wanted to collect them because I had conceived a liking for them, and I was concerned that, if they were neglected, we, the scientific community, would not know, ironically, whether these animals needed more vigorous protection. I loved them enough, I suppose, to want them dead; a couple of specimens of each species at the very least. I was the Noah of death and my ark was a killing jar.
However, when one sees glamorous creatures such as these looking up at you, as it were, from the bottom of the net, the ethical calculation concerning their dispatch is not an easy one to make. Should these few glimmering Isaacs be sacrificed so that others of their kind might flourish? Or perhaps more proximately, since the question of how data might be used is always somewhat further down the road, should they die so that the storehouse of my knowledge could grow?
by Akim Reinhardt
In February the word came in. My brother-in-law had a job offer in Orange County. He and my sister would finally be giving up the little apartment in far northern Manhattan and heading for the West Coast.
"Lemme know if I can help," I told my sister.
"You wanna drive the moving truck across the country with Noah?" she asked.
"Sure, I can do that," I said.
Monday, July 21
With luggage, I make the 20 minute walk to the light rail station. Train shows up, and the ride to the airport is uneventful. Not like last time when I had some drunk fool trying to pick a fight with me at 9:00 in the morning cause he thought I was "gay lookin'" at him. Goin' on about how he did a dime in prison and he'd kick my ass, except he's either about 60 years old or a very rough 50, and already lit, drinking tall boys out of paper bags, so no, he can't actually kick my ass. After not engaging, I finally had to tell him to shut the fuck up already, but that didn't help. Didn't make it worse either. Just kept on prattling his belligerent, drunken shit.
Nothing like that this time. To the airport, all good. Until you walk in to find your flight's been delayed two hours.
After what passes for a nice meal at BWI (decent beer, cured olives, mixed salad with goat cheese; actually, that's a nice meal anywhere), I mosey over to the gate. My gate's jammed, so I go to something a bit emptier. I open up Murdering McKinley by Eric Rauchway, a history prof up at UC Davis. He's a good writer, which isn't a given for a historian.
I mean, just look at this pablum.
About thirty pages in, this terribly annoying extended family sits next to me. Not a decent one in the lot.
I move on to a quieter spot. Then the guy behind me starts slurping the straw of his empty Dunkin' Donuts cup. And he won't stop. On and off for 20 minutes. I look behind me. He's about 50 years old.
Truly, there is no sense of decorum left in this country.
Jennifer West. Film Quilt, 2013.
Breaking the taboo of divorce in largely conservative India. Conceptual image by Sahil Mane Photography.
A Bit of Background
Last year, I put up this status message on Facebook: "Today, the 15th of February, is the 10th anniversary of my first wedding. It's interesting how far both of us, my ex-husband and I, have come since our divorce in 2006. And how different life—lives—would have been if I had stayed. Oh, thank god!"
People have always asked me why I talk about my divorce, including this article featured in Mirrors across India a few weeks after I got remarried two years ago. I have several reasons.
I got married to Shiv when I was 19 and he was 30, back in 2003, when the world was different, I was different. After one failed attempt in July/August, we got separated in December 2005, when I moved to Mumbai, and divorced 10 months later.
First, a caveat. I spoke casually about being divorced well before I got remarried, well before I found love with Sahil. I spoke about it when I was down, devastated and broke; when I was single; to friends and strangers; and at job interviews. I even spoke about considering one the very first time I met a woman who is now a friend—a young divorcee herself, she said (and I remember this vividly), "Are you sure, Tara*? I find now that I am perpetually ‘<Insert her own name> the Divorcee'." I put that in right upfront, as I realise it could seem convenient to talk about it now, when all has turned out okay. For instance, though there were many years in between, my grandparents didn't tell anyone in Dehradun, the small town in North India in which they live, that I was divorced until I got remarried (the veritable ‘happy ending').
[I realised this when I had gone for my granddad's 80th birthday celebrations a few years ago, only to be startled by questions of "Shiv kahan hain, beta?" "Aapke husband Indonesia se nahi aa paye?" ("Where is Shiv?" "Your husband wasn't able to come from Indonesia?") That's when I pieced together the story they had been telling, or letting brew, partly grounded in the truth—my ex-husband is, indeed, currently in Indonesia, just with a different wife.]
Because of this, I've been asked over and over, by the curious as well as the concerned, ‘what is the need to wash dirty linen in public?' So I'll tell you why I speak about it.
by Eric Byrd
On Facebook I follow a number of the US Department of the Interior's National Battlefield Parks, National Battle Sites, and Military Parks. Over the course of the Civil War's sesquicentennial, each of these sites has had its day in the social media sun, its special anniversary posts with pictures of the commemorative ceremonies. In May and June it was the turn of the parks that memorialize the battles of Grant's Overland Campaign.
In the spring of 1864 Ulysses Grant came east, to personally oversee the destruction of Robert E. Lee's Army of Northern Virginia, the veteran force that had in the previous two years baffled and humiliated every Federal drive on the rebel capital of Richmond. Baffled and humiliated – but never demoralized or destroyed, a distinction apparently lost on the aristocratic Lee – the eastern armies of the Union came on in early May and fought continuously for six weeks, chewing and choking "with a bulldog grip," as Lincoln would exhort via telegraph.
By the end of June, when the exhausted armies began to dig in for a long siege of the last rail hub supplying Richmond, Grant had lost 55,000 men and Lee 33,000. About half of each army. Lincoln said Grant was the general who could "face the arithmetic." Meaning he could fight all out, lose half his army while costing Lee half of his, replace his losses just when Lee could not, and then resume the offensive and finish the war. Grant, reflected one of his staff officers, "was assigned one of the most appalling tasks ever intrusted to a commander."
Sing Me a Song of Hyperobjects: Starting over with Humans and Other Creatures in the 21st Century CE
by Bill Benzon
Timothy Morton. Hyperobjects: Philosophy and Ecology after the End of the World. University of Minnesota Press, 2013. 229 pp.
This is a strange book, for it is three. There is the book that is easy to praise for its range of topics – quantum mechanics, La Monte Young, global warming, The Matrix, the Prisoner’s Dilemma, for example – and its quasi-virtuoso stylistic versatility. There is, as well, the book that is easy to criticize – though I’m sure some would regard that as too mild a word – for its conceptual instabilities, lapses in logic, and misreading of science.
And there is another book, the one leaking out of the cracks and pores in the first two. That book has the scattered beginnings of a framework in which we can construct a viable approach to the future. That's the book I'm writing about, making this essay as much an interpretation of Morton's fine Hyperobjects as a review of it.
Hyperobjects and Objects
Hyperobjects are “things that are massively distributed in time and space relative to humans” (p. 1). What isn’t a hyperobject is an object. Kumquats, automobiles, palm trees, squids, geosynchronous satellites, Olympic records, a promise, a rooster’s crow, these are all objects in the philosophical sense of the word. In the first paragraph of the book Morton lists these examples: the Lago Agrio oil field, Florida Everglades, the biosphere, the Solar System, “the sum total of all the nuclear materials on Earth; or just the plutonium, or the uranium,” Styrofoam, plastic bags, or “the sum of all the whirring machinery of capitalism.”
The philosophical sense of object is not quite the same as the ordinary sense, which tends toward physical things that are neither very large nor very small. Roughly speaking, for Morton and other proponents of object-oriented ontology (OOO) – a recent school of Continental philosophy – anything that can be designated by a noun or a noun phrase is an object. Anything. Including, of course, hyperobjects.
by Brooks Riley
I am Jewish by birth. My family wasn’t particularly religious; we went to synagogue at Rosh Hashanah, the Jewish New Year, and Yom Kippur, the Day of Atonement, but not really any other time. My brother and I went through years of Hebrew school, but we came home and ate bacon sandwiches. As a teenager, I became involved in BBYO, a Jewish teen organization, and through it became quite heavily exposed to conversations about Israel, the general evilness of the Palestinians and the righteousness of the concept of a Jewish homeland. When I was 17, a school friend and I went away together to Israel for our first trip without our parents. We chose not to do a tour or work on a kibbutz but instead to make our own way around Israel, staying in youth hostels.
I remember the moment we got off the plane onto the tarmac, thinking, “this is it, I’m in Israel, the Jewish homeland.” It was a transcendent moment that made me feel connected to my heritage and to a community that, for the most part, I had only ever skirted the edges of. I truly believed that this would be a transformative trip for me.
In the second hostel we stayed in, I fell in love with Abbud, a Palestinian man who was working there for the summer. We spent a few days and nights together and then my friend and I moved to another part of the country. But I promised to come back. When I did, Abbud wanted to show me his village in the West Bank. This was in 1986 just before the first Palestinian Intifada and, apart from the general lack of common sense shown by two young girls agreeing to travel across country with a man they hardly knew, there didn’t seem to be any good reason not to go with him.
We arrived in Abbud’s village in time for dinner and he took us to his cousin’s home. My friend and I were both vegetarian and so unable to eat much of what was put before us, and I was wearing quite a prominent Star of David around my neck; regardless, we were treated as honored guests. We stayed in his parents’ home that night. I woke up early the next morning and padded through the house in bare feet. I came across Abbud’s elderly father who spoke no English. He took off his sandals and gave them to me to wear. The gesture was so gracious and generous that I couldn’t say no and spent the rest of the morning flopping around in sandals that were much too big for me. I couldn’t help but wonder what kind of reception my family would give Abbud if he visited me in London.
by Leanne Ogasawara
The race was on: for whoever discovered a way to accurately measure longitude aboard a ship would be able to control the seas --and thereby control the riches of the world!
The search for longitude at sea was one of the great quests beginning in the late Renaissance. And it is how, in Umberto Eco’s novel The Island of the Day Before, a 17th-century nobleman named Roberto della Griva comes to find himself aboard a ship sailing southward toward Australia in search of the Prime Meridian.
Being obsessed by longitude, the characters in the book are also obsessed by notions of time. For to calculate longitude is, of course, to calculate time.
But to do this at sea is no easy feat. One needs to know only the local time at the ship's current meridian and the current time back at the meridian of departure (or at some fixed meridian, like, say, the Solomon Islands), yet that second time remained very difficult to determine accurately aboard ship. And inaccuracies in time would result in inaccuracies of place, as is well known.
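The conversion the navigators were chasing is, on paper, simple arithmetic: the Earth turns 360 degrees in 24 hours, so each hour of difference between shipboard local time and the reference meridian's time corresponds to 15 degrees of longitude. Here is a minimal sketch of that rule (the function name and example values are my own illustration, not anything from Eco's novel):

```python
# Longitude from a time difference: 360 degrees / 24 hours = 15 degrees per hour.

def longitude_from_times(local_hour: float, reference_hour: float) -> float:
    """Degrees east (+) or west (-) of the reference meridian."""
    diff = local_hour - reference_hour   # hours ahead (+) or behind (-)
    # Wrap into [-12, 12) so the result stays within [-180, 180)
    diff = (diff + 12) % 24 - 12
    return diff * 15.0

# A ship whose clock reads noon while the reference meridian's clock
# reads 15:00 is three hours behind, i.e. 45 degrees west.
print(longitude_from_times(12.0, 15.0))   # -45.0
```

The hard part was never the multiplication; it was knowing the reference time at all, since pendulum clocks were useless on a rolling deck. Hence the premium on any trick, astronomical or otherwise, for carrying a distant meridian's "now" across the ocean.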
You can see where this is going...
Strained Analogies Between Recently Released Films and Current Events: Dawn of the Planet of the Apes and the Microsoft Layoff
by Matt McKenna
Back in 2000, chances were that if you were using a computer, it was running a Microsoft operating system. In 2014, those chances have diminished considerably, and you are now more likely to be using a device running Apple's iOS or Google's Android software. Microsoft's stock price has responded accordingly, and its inflation-adjusted market cap is now less than half of what it was at its peak in 1999. How appropriate it is, then, that Matt Reeves's Dawn of the Planet of the Apes was released this month just as Microsoft announced plans to lay off 18,000 employees in what looks to be the dawn of the trivialization of Microsoft's standing as a technology leader in the same way that the rebooted Apes series chronicles the trivialization of the human species as a planetary leader. While there are many tempting social readings crawling along the surface of the Planet of the Apes series, the most coherent one invites viewers to imagine the story's fictional planet Earth as a metaphor for the consumer electronics industry, a metaphor in which the humans represent Microsoft and the various species of apes represent the various technology companies usurping Microsoft's dominance.
Dawn of the Planet of the Apes picks up immediately where its predecessor, Rise of the Planet of the Apes, left off. In Rise, the first film in the second reboot of the Planet of the Apes series, James Franco's character creates a supposedly benign virus that regenerates brain cells, resulting in the reversal of diseases such as Alzheimer's. Unfortunately, the virus has the annoying side effect of afflicting humans with flu-like symptoms (you can see where this is going). The apes on which the virus was tested, however, receive all the positive effects to their cognitive abilities without any of the negative effects to the rest of their bodies. At the end of Rise, the brainy apes storm the Golden Gate Bridge en route to Muir Woods, where they plan to live the peaceful simian life. The humans, on the other hand, are impotent to stop their zoological Frankenstein's monsters and can only look on while engaging in some ominous sneezing.
by Josh Yarden
from every tree of the garden eat
but from the tree of knowledge of good and bad
don't eat from it
because on the day you eat from it
you will die
What fruit grows on the Tree of Knowledge?
I posed that question to a class of intelligent high school students. A few were quick to provide the garden-variety answer: "Apples."
Of course. Who doesn't know that, after all? Those mediaeval illuminated texts do show a woman holding an apple, and it's just… well, common knowledge. Right?
"Hmmm… But I think apples only grow on apple trees," I replied.
"Figs!" one of them called, as if he had a winning lottery number. "I think I read that somewhere. They figured out it was a fig, because Adam and Eve covered themselves with fig leaves."
I could see I wasn't getting very far. "Well, I don't know who ‘they’ are—the ones who figured that out—but I've only seen figs grow on fig trees," I said, with a generous hint of ‘Get it?’ in my voice. "If apples grow on apple trees, and figs grow on fig trees, what kind of fruit grows on a knowledge tree?"
"We don't really know," another thoughtful student suggested. "The Bible just says ‘fruit.’ Maybe we're not supposed to know."
"Maybe…" I accepted that possibility. "Or maybe we're not supposed to know until we can figure it out for ourselves, and then we are supposed to know."
"Yeah, but it doesn't say that they are supposed to think for themselves. It says they are supposed to follow the rules. That's why it's a stupid story," offered up one of my more unruly students.
"Oh, I don't know… It doesn't seem too stupid to me. Read between the lines." There was still a blank look on many of their faces.
Sunday, July 27, 2014
Jeanne Guillemin in Bulletin of the Atomic Scientists:
Last week, six vials of smallpox virus were discovered in a disused closet at the National Institutes of Health, where they had lain, forgotten and misplaced, for over 30 years. Some of them were found to contain live specimens, meaning that this dangerous virus—once considered to have been eradicated from the face of the planet—had the capacity to infect and spread.
At nearly the same time, on July 16, the director of the Centers for Disease Control and Prevention, Thomas Frieden, admitted to a Congressional committee that he was advised of a somewhat similar blunder at the CDC, more than two months after its discovery. (Members of the CDC had accidentally contaminated an innocuous strain of avian influenza with the dangerous H5N1 strain and shipped this unknown hazard to a less secure laboratory.) And not long before, dozens of CDC lab employees had been exposed to virulent anthrax bacteria.
These incidents raise doubts about government vigilance, with the case of the misplaced smallpox vials being arguably the most shocking, because the 1979 global eradication of smallpox is rightly celebrated as one of the most important public health achievements in history.
Once you know what plankton can do, you’ll understand why fertilising the ocean with iron is not such a crazy idea
David Biello in Aeon:
‘Call me Victor,’ says the mustachioed scientist as he picks me up from the airport on a brisk, fall afternoon in Germany. Victor Smetacek is an esteemed marine biologist, but he’s decided to spend his golden years on an ambitious new pursuit. He has devised a plan to alter the mix of gases in Earth’s atmosphere, in order to ward off climate change. He is, in other words, an aspiring geoengineer.
I came to the ancient city of Bremen to ask Smetacek about an extraordinary experiment he performed more than half the world away, in a forbidding sea seldom visited by humans. This sea surrounds the vast, white continent of Antarctica with a chilly current, locking it in a deep freeze. This encircling moat reaches from the surface waters to the ocean bottom, spanning thousands of kilometres. It is known as the Southern Ocean and it is famously dangerous on account of icebergs that hide in the gloom that hovers above its surface. The churn of its swells sometimes serves up freak waves that tower so high they can flip ships over in a single go. It is in this violent, lashing place that Smetacek hopes to transform Earth’s atmosphere.
Sean Carroll in Preposterous Universe:
One of the most profound and mysterious principles in all of physics is the Born Rule, named after Max Born. In quantum mechanics, particles don’t have classical properties like “position” or “momentum”; rather, there is a wave function that assigns a (complex) number, called the “amplitude,” to each possible measurement outcome. The Born Rule is then very simple: it says that the probability of obtaining any possible measurement outcome is equal to the square of the corresponding amplitude. (The wave function is just the set of all the amplitudes.)
The Born Rule is certainly correct, as far as all of our experimental efforts have been able to discern. But why? Born himself kind of stumbled onto his Rule.
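The rule is simple enough to compute directly. A minimal numerical sketch (the two-outcome superposition is my own illustration, not Carroll's): square the modulus of each complex amplitude to get the probability of each outcome, and check that a normalized wave function's probabilities sum to one.

```python
def born_probabilities(amplitudes):
    """Born Rule: the probability of each measurement outcome is the
    squared modulus of its complex amplitude."""
    probs = [abs(a) ** 2 for a in amplitudes]
    # A physical wave function is normalized, so the probabilities sum to 1.
    assert abs(sum(probs) - 1.0) < 1e-9, "wave function is not normalized"
    return probs

# Equal superposition of two outcomes, e.g. (|0> + i|1>)/sqrt(2):
# each outcome has probability 0.5 (up to floating-point rounding).
amps = [1 / 2**0.5, 1j / 2**0.5]
print(born_probabilities(amps))
```

Note that the phase of each amplitude (the factor of `i` on the second outcome) drops out of the probabilities entirely, which is part of what makes the rule so mysterious: the wave function carries more information than any single measurement can reveal.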