Thursday, May 26, 2016
Campari and Smoke Rings
In the side street bar,
below the church of St. Catherine,
the drinks are cheap and the music loud.
Girls in tight jeans lean into young men
who might be Zeffirelli extras
with their dark halos of curls.
It’s easy sitting here, with my glass
of rough red, watching them laugh
and flirt as the evening gathers
in among the Campari and neon,
to imagine my life just like theirs;
in the bar’s smeared glass.
Yet when I look back
through the fug of smoke rings,
down the dark alley
the way I just came, I realise
that around the next bend
there’s no unknown room
filled with the scent of crushed roses,
where muslin curtains lift
on the evening breeze and clouds
of swallows circle and circle
under the eaves in the growing dark.
by Sue Hubbard
Emily Greco in Moyers & Company:
Ana Maria Archila, one of the Center for Popular Democracy’s two co-executive directors, gleefully introduced Elizabeth Warren at her grassroots organization’s gala Tuesday night. “I have a feeling she’s going to say some really deep stuff, some really inspiring stuff, some really tweetable stuff,” Archila told the diverse crowd of progressive policy wonks and community organizers assembled in the Hyatt Regency Washington on Capitol Hill ballroom. “Those of you who tweet, get your thumbs together. Get ready because we are really going to hear something very, very awesome and important.” The senior senator from Massachusetts didn’t disappoint. Warren tore into Donald Trump, caricaturing the presumptive Republican presidential nominee as a greedy narcissist. She questioned whether the billionaire real estate mogul ever pays a dime in taxes, dismissing him as “a man who will never be president of the United States” because he is prone to “kissing the fannies of poor, misunderstood Wall Street bankers” and “so desperate for power he will say and do anything to get elected.”
“Donald Trump was drooling over the idea of a housing meltdown — because it meant he could buy up more property on the cheap,” Warren said. “What kind of a man does that? What kind of a man roots for people to get thrown out of their house? What kind of a man roots for people to get thrown out of their jobs? To root for people to lose their pensions?” To date, the former financial regulator, a hero to her party’s progressive wing for her tough stands against corporate abuse, has refrained from endorsing either of the candidates for her party’s presidential nomination — opting out of the sparring between supporters of Hillary Clinton and Bernie Sanders. While Warren’s positions on many issues align with Sanders’ platform, some of her comments Tuesday night echoed a damning new Clinton campaign ad that recalls Trump saying he “sort of hoped” a real estate bust would happen before the Great Recession that left millions of Americans underwater.
This synchronization prompted The Washington Post to label Warren Clinton’s “new weapon against Trump.”
Emily Bobrow in More Intelligent Life:
Matt, a father in his early 40s with soulful eyes, thinning hair and a ready smile, is doing his best to explain why he has a more intense relationship with his son than he does with his daughter. Over a mojito at a bar in Brooklyn, near the flat he shares with his wife and two children, he admits that he is not a stereotypically macho guy. Most of his friends are women, he says. He was never much of an athlete and his marriage is a fairly egalitarian two-career juggling act. Yet there is something about his bond with his boy that feels particularly profound. Partly, he thinks, it is because his four-year-old son is older, and therefore more interesting. As the first-born, his son is also teaching Matt how to be a parent, which provokes all sorts of potent new emotions and anxieties. But perhaps the most compelling reason is also the simplest: “I really identify with him,” Matt says. “He just looks a lot like me, and he’s like me in certain ways. Every time I look at him I see myself when I was four years old.” Of course he adores his daughter, “but it’s just different. I don’t know how to make a little girl happy the way I fundamentally know how to make a boy happy, so I worry I’m going to somehow screw that up.”
Such candour can be uncomfortable for parents. In rich countries, where children are more like luxury goods than savvy economic investments, and where gender is simply one attribute among many, parents tend to pride themselves on their open-hearted, unconditional love for every member of their brood. Admitting a stronger emotional connection with one child over another, one sex over the other, is taboo. Yet the presence or absence of children of either sex has a real impact on the dynamics of a family – even, it seems, on whether the family survives as a unit.
Wednesday, May 25, 2016
David Sloan Wilson in Evonomics:
One of the most influential articles published in the field of economics is Milton Friedman’s (1953) “The Methodology of Positive Economics”, in which he argues that people behave as if the assumptions of neoclassical economic theory are correct, even when they are not. One of the most influential articles in the field of evolution is Stephen Jay Gould and Richard Lewontin’s (1979) “The Spandrels of San Marco and the Panglossian Paradigm”, which argues against excessive reliance on the concept of adaptation.
Different disciplines, different decades. No wonder these two classic articles have not been related to each other. Yet, there is much to be gained by doing so, for one reveals weaknesses in the other that are highly relevant to current economic and evolutionary thought.
The reason they can be related to each other is that Friedman relied upon an evolutionary argument for his “as if” justification of neoclassical economics.
Natalie Wolchover in Quanta:
The boundary does not pass between some huge finite number and the next, infinitely large one. Rather, it separates two kinds of mathematical statements: “finitistic” ones, which can be proved without invoking the concept of infinity, and “infinitistic” ones, which rest on the assumption — not evident in nature — that infinite objects exist.
Mapping and understanding this division is “at the heart of mathematical logic,” said Theodore Slaman, a professor of mathematics at the University of California, Berkeley. This endeavor leads directly to questions of mathematical objectivity, the meaning of infinity and the relationship between mathematics and physical reality.
More concretely, the new proof settles a question that has eluded top experts for two decades...
FROM A DISTANCE, the causes of the Brazilian crisis seem obvious. A corrupt government, after fourteen years in power, begins to suffer the consequences of erratic policies: a deep recession follows, and protesters then take to the streets to overthrow the government. This explanation isn’t so much unfounded as insufficient. The government is corrupt, but so are all the other parties. The economy is in recession, but there have been other periods of turbulence in the past, and not all of them led to a coup. Protesters are on the streets, yet they make up a small demographic, and are unrepresentative of the larger population. To state that a couple of organs in a body have failed says little of the disease that overtook it. I’m not sure, though, that watching the corpse decompose from up close yields any kind of special explanatory power. It may well be that those farther away are better equipped to explain things.
To watch the maggots scuffle and reproduce on the body—think of them as various members of the executive, legislative, and judiciary—is less revolting than it is profoundly boring. And yet none of us here are able to look away. In Rio, where I live and write for a monthly magazine, the crisis often seems to be the only topic of conversation. Once a week, I head to the magazine’s offices in Ipanema. There, for about twenty or thirty minutes, my colleagues and I exchange pleasantries and drink coffee. Then someone will pull up a video of the latest outrage: a right-wing congressman justifying his vote to oust President Dilma Rousseff by paying homage to a deceased torturer from the military dictatorship; a lawyer, responsible for filing the impeachment request against the President, waving the Brazilian flag in her hand as she shouts incantations against the “republic of the snake”; the popular leftist musician and supporter of the ruling Workers’ Party (PT) Chico Buarque confessing that he didn’t in fact write his own songs, but rather bought the lyrics and melodies off the street from a guy named Ahmed.
Refugees, asylees, IDPs (internally displaced persons), PRSs, stateless persons: these are new categories of human beings created by an international state-system in turmoil, human beings who are subject to a special kind of precarious existence. Although they share with other "suffering strangers" the status of victimhood and become the objects of our compassion – or as the UNHCR report puts it, become "persons of concern" – their plight reveals the most fateful disjunction between so-called "human rights" – or "the rights of man", in the older locution – and "the rights of the citizen"; between the universal claims to human dignity and the specificities of indignity suffered by those who possess only human rights. From Hannah Arendt's famous discussion of the "right to have rights" in The Origins of Totalitarianism to Giorgio Agamben's homo sacer to Judith Butler's "precarious lives" and Jacques Rancière's call to "the enactment of rights", the asylum seeker, the stateless and the refugee have become metaphors as well as symptoms of a much deeper malaise in the politics of modernity.
Yet as political fatigue about internationalism has gripped the United States in the wake of the interventions in Afghanistan and Iraq, and president Obama's politics of caution in Syria has created further moral quagmires, we have moved from "the right to have rights" to the "critique of humanitarian reason." Didier Fassin, who for many years worked with Médecins Sans Frontières in a high capacity, and to whom we owe this term, defines it as follows: "Humanitarian reason governs precarious lives: the lives of the unemployed and the asylum seeker, the lives of sick immigrants and people with AIDS, the lives of disaster victims and victims of conflict – threatened and forgotten lives that humanitarian government brings into existence by protecting and revealing them."
“There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy,” says Shakespeare’s Hamlet, a troubled dropout struggling with questions of responsibility, to his best friend. Even by the Elizabethan era, it seems, a discipline that had begun in classical times as a practical method for discerning how best to live life had devolved into something increasingly hermetic. Wind the clock forward, to the late 19th century, and philosophy had become an exclusively academic profession, focused on seemingly arcane questions of aesthetics, epistemology, and ethics.
The 20th century, improbably, changed all that. The Great War, the sweeping away of imperial dynasties, waves of technological revolutions, a worldwide economic collapse, the rise of global totalitarianism, World War II, the atom bomb, the Cold War—all these events destroyed cultural certainties and introduced bewilderment and anxiety. And it was philosophy—as a means of understanding who we are, why the world is as it is, what we ought to do—that came to the rescue.
Or at least that’s the message of At the Existentialist Café, the sprightly, elegant, occasionally unsatisfying new book by Sarah Bakewell, author of the 2010 National Book Critics Circle Award–winning How to Live, or, A Life of Montaigne in One Question and Twenty Attempts at an Answer.
Sara Reardon in Nature:
Children from impoverished families are more prone to mental illness, and alterations in DNA structure could be to blame, according to a study published on 24 May in Molecular Psychiatry. Poverty brings with it a number of different stressors, such as poor nutrition, increased prevalence of smoking and the general struggle of trying to get by. All of these can affect a child’s development, particularly in the brain, where the structure of areas involved in response to stress and decision-making has been linked to low socioeconomic status. Poor children are more prone to mental illnesses such as depression than their peers from wealthier families, but they are also more likely to have cognitive problems. Some of these differences are clearly visible in the brain structure and seem to appear at birth, which suggests that prenatal exposure to these stressors can be involved.
From birth to adolescence
But neurodevelopment does not stop at birth. Neuroscientist Ahmad Hariri of Duke University in Durham, North Carolina, suspected that continual exposure to stressors might affect older children as well. He decided to test this idea by studying chemical tags known as methyl groups, which alter DNA structure to regulate how genes are expressed. There is some evidence that methylation patterns can be passed down through generations, but they are also altered by environmental factors, such as smoking. To test whether these mechanisms are involved in the increased likelihood of depression seen in impoverished children, Hariri and his colleagues zeroed in on a gene called SLC6A4, which encodes a protein that transports the brain-signalling molecule serotonin into neurons. The gene has long been known to be involved in depression, and the serotonin receptor is the target of many antidepressant drugs. Hariri and his colleagues collected blood samples from 183 Caucasian children aged 11–15, and tested the children for symptoms of depression. They also examined how the children responded to stress by scanning their brains to monitor activity when shown a picture of a frightened face. People who are highly sensitive to threats show more activity in the amygdala — the brain’s ‘fight or flight’ centre — when they see such an emotion.
I love you
because the Earth turns round the sun
because the North wind blows north
because the Pope is Catholic
and most Rabbis Jewish
because the winters flow into springs
and the air clears after a storm
because only my love for you
despite the charms of gravity
keeps me from falling off this Earth
into another dimension
I love you
because it is the natural order of things
I love you
like the habit I picked up in college
of sleeping through lectures
or saying I’m sorry
when I get stopped for speeding
because I drink a glass of water
in the morning
and chain-smoke cigarettes
all through the day
because I take my coffee Black
and my milk with chocolate
because you keep my feet warm
though my life is a mess
I love you
because I don’t want it
any other way
Tuesday, May 24, 2016
Tim Parks in the New York Review of Books:
It has become commonplace, in this age of globalization, to speak of novelists and poets who change language, whether to find a wider audience or to adapt to life in a new country. But what about those writers who move to another country and do not change language, who continue to write in their mother tongue many years after it has ceased to be the language of daily conversation? Do the words they use grow arid and stiff? Or is there an advantage in being away from what is perhaps only the flavor of the day at home, the expressions invented today and gone tomorrow? Then, beyond specifically linguistic concerns, what audience do you write toward if you are no longer regularly speaking to people who use your language?
The most famous candidate for a reflection on this situation would be James Joyce, who left Ireland in 1904 aged twenty-two and lived abroad, mainly in Trieste and Paris, until his death in 1941. Other writers one could speak of would be W. G. Sebald, writing in German while living in England, Dubravka Ugrešić writing in Croatian while living in Holland, or Aleksandr Solzhenitsyn and Joseph Brodsky, who went on writing in Russian after being forced into exile in the United States. One could go back and look at Robert Browning’s fifteen years in Italy, or Italo Calvino’s thirteen years in Paris. There are many others. Yet the easiest example, the only one I can write about with some authority, and, frankly, one of the most extreme, for length of time away and level of engagement with the foreign language and foreign country, is myself. What has happened to my English over thirty-five years in Italy? How has this long expatriation—I would never call it exile—changed my writing?
Jimena Canales in Nautilus:
On April 6, 1922, Einstein met a man he would never forget. He was one of the most celebrated philosophers of the century, widely known for espousing a theory of time that explained what clocks did not: memories, premonitions, expectations, and anticipations. Thanks to him, we now know that to act on the future one needs to start by changing the past. Why does one thing not always lead to the next? The meeting had been planned as a cordial and scholarly event. It was anything but that. The physicist and the philosopher clashed, each defending opposing, even irreconcilable, ways of understanding time. At the Société française de philosophie—one of the most venerable institutions in France—they confronted each other under the eyes of a select group of intellectuals. The “dialogue between the greatest philosopher and the greatest physicist of the 20th century” was dutifully written down. It was a script fit for the theater. The meeting, and the words they uttered, would be discussed for the rest of the century.
The philosopher’s name was Henri Bergson. In the early decades of the century, his fame, prestige, and influence surpassed that of the physicist—who, in contrast, is so well known today. Bergson was compared to Socrates, Copernicus, Kant, Simón Bolívar, and even Don Juan. The philosopher John Dewey claimed that “no philosophic problem will ever exhibit just the same face and aspect that it presented before Professor Bergson.” William James, the Harvard professor and famed psychologist, described Bergson’s Creative Evolution (1907) as “a true miracle,” marking the “beginning of a new era.” For James, Matter and Memory (1896) created “a sort of Copernican revolution as much as Berkeley’s Principles or Kant’s Critique did.” The philosopher Jean Wahl once said that “if one had to name the four great philosophers one could say: Socrates, Plato—taking them together—Descartes, Kant, and Bergson.” The philosopher and historian of philosophy Étienne Gilson categorically claimed that the first third of the 20th century was “the age of Bergson.” He was simultaneously considered “the greatest thinker in the world” and “the most dangerous man in the world.” Many of his followers embarked on “mystical pilgrimages” to his summer home in Saint-Cergue, Switzerland.
Jacob Harris in The Atlantic:
One of the joys of modern technology is how easy it is to immerse yourself in the past. Every day, more libraries and archives are pushing pieces of their collections online in easily browsable interfaces.
The New York Public Library, for instance, has historic menus and interactive floor plans. Chronicling America is a searchable repository of newspapers published between 1836 and 1922 from the Library of Congress, which is also one of the many institutions in the Flickr Commons public image archive. Wikipedia has its own Wikimedia Commons, to which anybody can upload images and videos. Project Gutenberg continues to add new public-domain books to its collection every day, and New York’s Metropolitan Museum of Art has posted thousands of images online with metadata as part of its Open Access for Scholarly Content initiative.
My personal favorite, however, is TimesMachine, a site available to all New York Times subscribers that lets readers virtually flip through any historical issue of The New York Times all the way up through 2002. The site delivers the reader directly to the past, making you feel like a cross between a tourist and an archaeologist. You might start by visiting a historic event—say, coverage of the Titanic sinking—but the real fun is wandering off the beaten path and exploring all the other news of the day. On the same day the Titanic sank, there was also coverage of a gun battle in Greenwich Village, and a passenger lost in a runaway balloon. On any day, such vignettes sometimes become rabbit holes to the past.
This is the story of how I ended up captivated by a chance encounter with a 135-year-old newspaper advertisement—and how the random face staring back at me from the archives would reveal the surprising origins of ASCII art, a graphic design technique that’s usually associated with 20th-century computer art.
Alex Wellerstein in The New Yorker:
The demonstration began on the afternoon of May 21, 1946, at a secret laboratory tucked into a canyon some three miles from Los Alamos, New Mexico, the birthplace of the atom bomb. Louis Slotin, a Canadian physicist, was showing his colleagues how to bring the exposed core of a nuclear weapon nearly to the point of criticality, a tricky operation known as “tickling the dragon’s tail.” The core, sitting by itself on a squat table, looked unremarkable—a hemisphere of dull metal with a nub of plutonium sticking out of its center, the whole thing warm to the touch because of its radioactivity. It had been quickly molded into shape after the bombing of Nagasaki, to be used in another attack on Japan, then reallocated when it turned out not to be needed for the war effort. At that time, Slotin was perhaps the world’s foremost expert on handling dangerous quantities of plutonium. He had helped assemble the first atomic weapon, barely a year earlier, and a contemporary photograph shows him standing beside its innards with his shirt unbuttoned and sunglasses on, cool and collected. Back then, the bomb was a handmade, artisanal product.
Slotin’s procedure was simple. He would lower a half-shell of beryllium, called the tamper, over the core, stopping just before it was snugly seated. The tamper would reflect back the neutrons that were shooting off the plutonium, jump-starting a weak and short-lived nuclear chain reaction, on which the physicists could then gather data. Slotin held the tamper in his left hand. In his right hand, he held a long screwdriver, which he planned to wedge between the two components, keeping them apart. As he began the slow and painstaking process of lowering the tamper, one of his colleagues, Raemer Schreiber, turned away to focus on other work, expecting that the experiment would be uninteresting until several more moments had passed. But suddenly he heard a sound behind him: Slotin’s screwdriver had slipped, and the tamper had dropped fully over the core. When Schreiber turned around, he saw a flash of blue light and felt a wave of heat on his face.
Edward Docx in Prospect:
Dylan turns 75 on 24th May. For millions of devotees like myself—many of whom consider him the world’s greatest living artist—it is a moment of celebration tinged with apprehension. Joan Baez, his most significant early anointer-disciple (Joan the Baptist), best expresses what might be described as “the Dylan feeling” in the excellent Martin Scorsese 2005 documentary when she says: “There are no veils, curtains, doors, walls, anything, between what pours out of Bob’s hand on to the page and what is somehow available to the core of people who are believers in him. Some people would say, ‘not interested,’ but if you are interested, he goes way, way deep.” I love this for lots of reasons but most of all because it captures not only the religious devotion that many who love him feel, but also the bemused indifference of the sane and secular who do not.
Of course, the first order of business when writing about Dylan is to urge readers to ignore writers who write about Dylan. We are like Jehovah’s Witnesses, forever tramping door to door with our clumsy bonhomie and earnest smudgy leaflets; in all honesty, you would be much better off seeking out the resonant majesty of the actual work. Indeed, you’ll be relieved—and possibly endeared—to hear that Dylan himself considers his disciples to be deranged. “Why is it when people talk about me they have to go crazy?” Dylan asked in a recent interview for Rolling Stone. “What the fuck is the matter with them?”
Ralph Jones in New Humanist:
Few atheists know the Bible as intimately as Dan Barker. Few, after all, can profess to have begun their careers as fundamentalist Christian preachers. Currently co-president of the Freedom from Religion Foundation, an American non-profit organisation, Barker was a self-proclaimed “extremist” for 19 years, until he renounced the faith. Given how vehemently the 66-year-old now defends a life free of any supernatural authority, I ask him if he regrets the consequences that his Christian ministry may have had on people he would now describe as vulnerable. “Yes, I do regret a lot of it,” he says with candour. “I would counsel people to pray for healing. That’s dangerous. That’s harmful. People die from that. And I acted irresponsibly with my health, because I knew that God was going to take care of me.” This is a window that, once opened, is difficult to close. Barker reels off multiple instances in which he believes that he seriously damaged the lives of his parishioners.
In Arizona, a woman approached him, looking for faith healing to cure her of an illness. The two prayed together and when, inevitably, it did nothing, he said, “Let it be unto you according to your faith” (a reference to a line originally found in Matthew 8:13). “In other words,” Barker says, “it was her fault. She walked out of that meeting not only not healed but feeling chastised. It’s not a kind way to treat another human being.” In his mid-twenties, he counselled a woman who was struggling with an abusive husband. Barker told her to persevere with him because, as the Bible says, he would eventually see the light. “So I counselled a woman to stay in an abusive relationship, because the Bible says that you are married for life.” What would he say if she approached him with the same problem now? “I would tell her to run for the nearest shelter and get out of there.” Barker may have left religion behind but he is still a preacher of sorts. His latest book, God: the Most Unpleasant Character in All Fiction, draws on his knowledge of scripture to attack the Bible’s claim to moral authority.
...I am interested in Barker’s views on Donald Trump, the man taking alarmingly large strides up the escalator of US politics. “It seems to me that there’s an awful lot of shallow support for people like Trump,” he says. The Republican candidate has appealed to the supposed “Christian” character of the US as a way to mobilise prejudice against Muslims. His followers seem to believe that he is a Christian but Barker sees this more as identity politics than evangelism. “He doesn’t know that much about the Bible. He doesn’t speak the Christian lingo.”
Jennifer Hackett in Scientific American:
For most people a single bee or wasp sting is one too many. But University of Arizona entomologist Justin Schmidt is a dramatic exception: By his own estimation he has been stung more than 1,000 times by at least 80 kinds of insects as part of his job. After unintentionally collecting a few different types of stings while conducting fieldwork to investigate the social behavior of stinging insects, Schmidt decided to take a cue from medical science and create a sting pain index that ranked each sting on a scale of 1 to 4 with eloquent, almost poetic descriptions of the pain (or lack thereof) they caused. The scale, Schmidt hoped, would help reveal how the ability to sting—and the type of sting delivered—serve different insects and enable their respective social structures.
In his new book The Sting of the Wild, which came out this week and was published by Johns Hopkins University Press, Schmidt explains the roles of stings in insect society in great detail. He devotes chapters to how different insects inflict their respective flavors of pain, covering creatures from fire ants to tarantula hawk wasps to honeybees. For the first time, Schmidt’s full sting pain index and his thoughts on each experience—including such comments as “like coffee, but oh so bitter” for a low-level sting or “like spilling a beaker of hydrochloric acid on a paper cut” for a higher one—are published at the end of the book. Even though the pain-laced topic might leave you wincing, Schmidt’s engaging and entertaining writing makes for a tale worth reading.
Afternoon in Siena
Soon I will know this room.
It will have become familiar.
Then sometime after I’ve left
they’ll rent it to another writer
or student, a couple on holiday
for a long weekend.
For now I’ll try to fix it in my mind,
this ordinary room with its cold
tile floor without a rug,
the low chair and ugly wardrobe
with its foxed glass,
the shuttered windows that open
onto the narrow street where
in the evening a small dog yaps
and yelps beneath the washing line,
the purple canopy of wisteria.
And in the corner, of course,
the messy bed, where in another life
we might have made love -
the afternoon sun
bathing us in liquid light -
if only I knew who you were.
by Sue Hubbard
Monday, May 23, 2016
by Scott F. Aikin and Robert B. Talisse
Argumentation is the term used to denote the activity of arguing with a real interlocutor, in real time, over claims that are actually in dispute. When argumentation is properly conducted, the parties involved exchange arguments, objections, criticisms, and rejoinders, all aimed at discerning the truth (or at least what one would be most justified in accepting to be true). To be sure, argumentation does not always result in a consensus among disputants; even when argumentation is impeccably conducted, disagreement often persists. But this is no strike against argumentation. This is for a few reasons. First, the open exchange of reasons, evidence, and criticism is, after all, the best means we have for rationally resolving disputes and pursuing the truth. Insofar as we want a rational resolution, this is not only our best means, it's our only means. Furthermore, even when argumentation does not dispel disagreement, it can provide disputants with a firmer grasp of precisely where they differ. So even if argument doesn't yield consensus, it does yield fecundity. And, as John Stuart Mill famously observed, understanding the views of one's critics is an essential element of understanding one's own views.
We have frequently claimed in this column that argumentation comes naturally to human beings. People aspire to form and maintain true beliefs and eschew false beliefs, and the central way in which they enact this aspiration is by arguing with each other. Of course, that people are naturally disposed to engage in argumentation does not entail that people are naturally adept at it. The pitfalls of human reasoning are abundant, and there is rightly a substantial academic industry devoted to identifying, studying, and cataloguing them.
Yet detecting argumentative pitfalls is itself part of the activity of argumentation. When we argue, to be sure, we argue about things. And so most argument has all the vocabulary of any other talk about the world. But when we argue, we aren't just looking at the things we are talking about; we are also evaluating what we've said as reasons. And so we must have a vocabulary that doesn't merely track the things we are talking about, but also tracks how we've talked about them. That's what it is to assess whether someone's reasoning is acceptable. The issue isn't only whether you accept what an interlocutor says, but also how the things they say logically relate to each other.
by Daniel Ranard
Math works pretty well. We can count apples and oranges; we can scribble equations and then launch a rocket that lands gently upright. When an argument is indisputable, we colloquially say "do the math," and we speak of events that will happen with "mathematical certainty." Math works so well that you're forced to wonder: why, and what does it mean about our world? I won't fully answer these questions, but I'll offer a few perspectives.
You don't need to know much math to see it works. Say you go apple-picking with a friend; you count 12 as you pick them, and your friend counts 19 of her own. How many apples are in the basket? Maybe you crunch the numbers on a scrap of paper, just to be sure. You manipulate symbols on a page, and afterward you make a claim about reality: you know how many apples you would count if you pulled them out.
But was that math or just common sense? If you're not impressed by addition, let's try multiplication. I suspect many of us encounter our first real mathematical "theorem" when we learn that A times B is B times A. As Euclid wrote circa 300 BC, "If two numbers multiplied by one another [in different orders] make certain numbers, then the numbers so produced equal one another." This fact may be so familiar you forget its meaning: 4 x 6 = 6 x 4, or rather 6 + 6 + 6 + 6 = 4 + 4 + 4 + 4 + 4 + 4. It may be obvious, but a curious child would still ask, why? The equation demands proof, much like the Pythagorean Theorem. Euclid gave a proof in Book VII, Proposition 16 of the Elements. And though he proved an abstract fact using abstract symbols, the world seems to obey this arithmetic rule: if you have four groups of six apples, Euclid predicts you can always rearrange them into six groups of four.
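Euclid's claim can be checked mechanically. A toy illustration in Python (my own, not from the essay) spells out multiplication as repeated addition and then rearranges the apples just as Euclid predicts:

```python
# Multiplication as repeated addition: 4 x 6 means four 6s, 6 x 4 means six 4s.
four_sixes = sum([6] * 4)   # 6 + 6 + 6 + 6
six_fours = sum([4] * 6)    # 4 + 4 + 4 + 4 + 4 + 4
assert four_sixes == six_fours == 24

# Euclid's prediction about apples: four groups of six
# can always be rearranged into six groups of four.
apples = ["apple_%d" % i for i in range(24)]
groups_of_six = [apples[i:i + 6] for i in range(0, 24, 6)]
groups_of_four = [apples[i:i + 4] for i in range(0, 24, 4)]
assert len(groups_of_six) == 4
assert len(groups_of_four) == 6
```

Of course, running one example is not a proof; Euclid's argument in Book VII covers all numbers at once, which is exactly what makes it a theorem rather than an observation.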
Maybe it's no surprise we can use arithmetic to make these predictions. But what about the success of more sophisticated math and physics?
"I got my own pure little bangtail mind and
the confines of its binding please me yet."
~ Neal Cassady, letter to Jack Kerouac
One of the curious phenomena that computing in general, and artificial intelligence in particular, has emphasized is our inevitable commitment to metaphor as a way of understanding the world. Actually, it is even more ingrained than that: one could argue that metaphor, quite literally, is our way of being in the world. A mountain may or may not be a mountain before we name it - it may not even be a mountain until we name it (for example, at what point, either temporally or spatially, does it become, or cease to be, a mountain?). But it will inhabit its ‘mountain-ness' whether or not we choose to name it as such. The same goes for microbes, or the mating dance of a bird of paradise. In this sense, the material world existed, in some way or other, prior to our linguistic entrance, and these same things will continue to exist following our exit.
But what of the things that we make? Wouldn't these things somehow be more amenable to a more purely literal description? After all, we made them, so we should be able to say exactly what these things are or do, without having to resort to some external referents. Except we can't. And even more troubling (perhaps) is the fact that the more complex and representative these systems become, the more irrevocably entangled in metaphor do we find ourselves.
In a recent Aeon essay, Robert Epstein briefly guides us through a history of metaphors for how our brains allegedly work. The various models are rather diverse, ranging from hydraulics to mechanics to electricity to "information processing", whatever that is. However, there is a common theme, which I'll state with nearly the force and certainty of a theorem: the brain is really complicated, so take the most complicated thing that we can imagine, whether it is a product of our own ingenuity or not, and make that the model by which we explain the brain. For Epstein - and he is merely recording a fact here - this is why we have been laboring under the metaphor of brain-as-a-computer for the past half-century.
by Jonathan Kujawa
(this is the sequel to last month's 3QD essay on the Pancake Problems)
I frequently come across a rafter of wild turkeys on bike rides through the countryside near my home. This particular group is recognizable thanks to having a peahen as an honorary member. Just this morning I was treated to a startling surprise: the peahen was busily herding a brood of chicks! I would have thought peacocks and turkeys were too distantly related to successfully breed. Apparently nobody told the peahen. I haven't seen any other peacocks in the neighborhood, so it would seem that she is more than friends with one of her turkey buddies. According to the internet, peacock/turkey hybrids (turcocks? peakeys?) are a thing which can happen.
Going by looks and their natural geographic ranges, my wrong guess was that peacocks and turkeys should be pretty distant on the tree of life. In the not-too-distant past, classification of species depended on such observational data.
Nowadays we can dig directly into the DNA to look for answers about relatedness. In the past decade it has become possible to sequence the entire DNA of an organism, and to do so quickly and cheaply. In fifteen years we've gone from the Human Genome Project, which took thirteen years and $2.7 billion to sequence the human genome, to being able to do it in days for $1,000. The progress in this field puts Moore's Law to shame.
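To see how lopsided that comparison is, here is a back-of-envelope calculation (my own arithmetic, using the figures above and the usual eighteen-month doubling period for Moore's Law):

```python
# Sequencing cost fell from $2.7 billion to about $1,000 over roughly 15 years.
cost_drop = 2_700_000_000 / 1_000   # about a 2.7-million-fold improvement

# Moore's Law: transistor counts double roughly every 1.5 years,
# so over 15 years it predicts about a thousand-fold improvement.
moores_law = 2 ** (15 / 1.5)        # 2**10 = 1024

# Sequencing outpaced Moore's Law by a factor in the thousands.
assert cost_drop / moores_law > 2_000
```

The exact numbers depend on which year and price estimate you pick, but the gap is robust: sequencing improved thousands of times faster than chips did over the same period.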
It's one thing to have the data, it's another to put it to use. To deal with the flood of information pouring out of DNA sequencers an entirely new field called computational molecular biology has sprung up. It's a wonderful combination of biology, mathematics, and computer science.
A good example of this is turnips. Looking at them in the garden you might guess that they are more closely related to radishes than to cabbage. In the 1980s Jeffrey Palmer and his collaborators looked at the mitochondrial genomes of turnips and cabbage and found that the genes they contained were nearly identical. What was different was the order of those genes. The random mutations which occurred over the years didn't change the genes themselves, only their position in the DNA.
Even better, Palmer and company saw that the rearrangements which occur are of a specific kind. When a mutation occurs, a segment of DNA consisting of some number of genes is snipped out, flipped around, and put back in, now in reverse order. For example, if the genes were the numbers one through five, a typical sequence of mutations might look like:

1 2 [3 4] 5 → 1 [2 4 3] 5 → 1 3 4 2 5

Here at each step the segment of genes to be snipped out and reversed is marked with brackets. Because each mutation reverses the order of some of the genes, folks call it a reversal.
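The reversal operation itself is easy to sketch in code. Here is a minimal Python illustration (the function name and the particular mutation sequence are my own, not from Palmer's work):

```python
def reversal(genes, i, j):
    """Snip out the segment genes[i:j], flip it, and reinsert it in place."""
    return genes[:i] + genes[i:j][::-1] + genes[j:]

# A possible sequence of mutations, starting from genes numbered 1 through 5:
g = [1, 2, 3, 4, 5]
g = reversal(g, 2, 4)   # flip [3, 4]    -> [1, 2, 4, 3, 5]
g = reversal(g, 1, 4)   # flip [2, 4, 3] -> [1, 3, 4, 2, 5]
print(g)                # prints [1, 3, 4, 2, 5]
```

Counting the minimum number of reversals needed to turn one gene order into another then gives a measure of evolutionary distance between species, which is where the mathematics (and the connection to last month's pancake flipping) comes in.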