Monday, December 05, 2016
by Scott F. Aikin and Robert B. Talisse
Still reeling from the unexpected outcome of the US presidential election, commentators understandably have begun diagnosing the political and intellectual condition of the country. One assessment that has been gaining traction, especially among Left-leaning intellectuals, is that, in electing Mr. Trump to the Presidency, the United States has embraced a "post-truth" politics. Though quickly becoming a predominant theme of political commentary, the term does not yet have a unified meaning. Arguably, it refers instead to several related but distinct phenomena. But if the term is going to serve any useful diagnostic function, it is necessary to disambiguate its central uses.
Most commonly, "post-truth" is employed to mark the fact that apparently a large segment of the electorate holds that although Ms. Clinton's alleged dishonesty disqualifies her from office, Mr. Trump's dozens of demonstrable lies, deceptions, and whole-cloth fabrications are acceptable, if not positively admirable. That is, our politics has become "post-truth" in that lying and dissembling no longer necessarily count against a politician. But when one regards lying as disqualifying only for one's political opponents, one reveals that one's concern isn't really with truth-telling at all. One is appealing to truth in a strictly opportunistic way.
A second deployment of "post-truth" is closely related, though more radical. It concerns the newfound force of unrelenting denial. When caught in a lie or fabrication, Mr. Trump's leading tactic is to flatly deny that he ever said the thing that, provably, he said. This goes beyond President Bill Clinton's infamously tortured semantics concerning "the meaning of 'is'." Here, Trump embraces the view of Humpty Dumpty, whose words mean whatever he at any moment declares them to mean. When meaning is fixed wholly by the speaker's will, what the speaker has said is no longer evaluable by anyone but the speaker himself. Hence our politics is post-truth in the further sense that lying, and even asserting falsehood, becomes conceptually impossible. There could be no such thing as lying for those who, like Carroll's egg, refuse to be mastered by words, even by their own words.
by Jonathan Kujawa
On the occasion of Richard Guy's 100th birthday.
Human beings have a talent for spotting patterns. No doubt this was handy in our early days. If both Ug and Oka were violently ill after eating purple berries from a particular bush, our clan was well served to notice and avoid those berries in the future. If anything, evolution encourages us to be overly quick to infer patterns. Better safe than sorry. No need for a double-blind, randomized trial with a large sample size when life and death are on the line.
Our keen sense for what is random and what is not works both for and against us. After all, as we saw last month, it was Pick's keen eye for patterns which led him to discover his beautiful formula. For an easier example, let's write down the first few numbers of the Fibonacci sequence:

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, ...
It only takes a minute to spot the rule used to generate these numbers. With a little more time we can even guess exactly which ones are even and which ones are odd.
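For readers who want to check the rule and the parity pattern themselves, here is a minimal Python sketch (not from the original post; the function name `fibonacci` is my own): each number is the sum of the previous two, and every third Fibonacci number turns out to be even.

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 1, 1."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])  # each term is the sum of the previous two
    return seq[:n]

fibs = fibonacci(12)
print(fibs)  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

# Parity pattern: odd, odd, even, odd, odd, even, ...
print(["even" if f % 2 == 0 else "odd" for f in fibs])
```

Running it makes the pattern visible at a glance: the even entries sit at every third position (2, 8, 34, 144, ...), because odd + odd = even and even + odd = odd.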
Now imagine I tell you that I flipped two coins twenty-one times and got:
Coin A: TTTTHTTHHTTHTTTTHHHTT,
Coin B: HTTHTTHTTHTTHTTHTTHTT.
You would have to be an awfully trusting person to believe that both are fair coins and that I faithfully flipped and recorded the results. Our pattern-seeking brain notices the regularity of Coin B. It violates our sense of what a "random" sequence of coin flips should look like. Mathematically we know each string of coin flips is equally likely. But even the most steely mathematician would be hard-pressed to bet against Coin B next coming up heads.
A great icebreaker on the first day of teaching is to ask the students to generate two lists of 100 coin flips: one by flipping a real coin and one by hand. Even though they do this while the instructor is out of the room, the instructor invariably can identify which list is which. The secret is to look for long runs of heads, tails, or other patterns (like Coin B). Since all outcomes are equally likely to the coin, it will happily generate strings of heads in a row. Human brains, however, notice such patterns and veto them in their effort to fake a random sequence of heads and tails. There is a running joke on this theme in Rosencrantz and Guildenstern Are Dead.
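The instructor's trick can be mechanized. A rough sketch in Python (my own illustration, not from the original post): every specific 21-flip string has probability (1/2)^21, yet a fair coin routinely produces long runs that hand-fakers avoid, so the longest run of identical flips is a serviceable tell.

```python
def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1  # extend the run or start a new one
        best = max(best, run)
    return best

coin_a = "TTTTHTTHHTTHTTTTHHHTT"  # the plausibly real sequence from the text
coin_b = "HTTHTTHTTHTTHTTHTTHTT"  # the suspiciously regular "HTT" sequence

print(longest_run(coin_a))  # 4
print(longest_run(coin_b))  # 2

# Each particular 21-flip string is equally likely:
print(0.5 ** 21)  # about 4.8e-07
```

In a genuinely random list of 100 flips, a run of six or more identical outcomes is very likely to appear somewhere; hand-faked lists almost never contain one, which is exactly what gives the students away.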
when words make love sentences are born
the world’s heft is changed by the weight of nouns,
the hesitations of hyphens and commas,
like the space between breaths,
tell the rhythm of what’s new and what’s been,
the dead stops of periods spell the end of what a breath holds,
adjectives, like the blood blush of infants
color clauses, articles wrap things in skin,
pronouns, unlike the particular names of new beings,
often identify the generalities of their forms by inclusion,
by saying, “We,” suggesting that mine and thine share,
and verbs are the darting eyes of fresh life,
the spastic gestures of unfamiliarity, the random smiles
that pass in the features of infants, sudden, uncalled-for
and of course the cautious steps of the old
reaching for footholds that once came naturally,
without thought, before the foreshadows of final words
by Daniel Ranard
—A.R. Ammons, in Guide
The world is usually grasped by its pieces. We may speak of a "holistic" perspective, but it's hard to understand all of a thing at once, or even to talk about it all at once. So we delineate pieces within the whole; we analyze the pieces and their interactions. For instance, to understand the recent political changes in the United States, it is uninformative to treat society monolithically. On the other hand, it's impossible to consider the disparate lives of every citizen at once. Instead, we delineate groups within the whole of society – political groups, racial groups, geographic groups – and discuss their interplay. Some delineations are more useful than others; it's probably difficult to understand the 2016 election in terms of cat- and dog-lovers. And all delineations share some degree of blurriness or arbitrariness: race is already a blurry construction, and though it may be crystallized in a bureaucratic form, the distinction only makes it more arbitrary. Through these delineations, we come to understand the otherwise formless and chaotic world and even to create our experience within it.
Imagine viewing all of human history as a video on fast-forward; observed all at once, you see only the tangled scurrying of human beings. You cannot usefully describe what you see, nor guess what might happen next. But to the historian, history is not formless, though it may require hard work and ingenuity to find the right units of analysis. The traditional divisions of global society into self-described nation states and political institutions may serve you well, but new divisions may also be informative. In What is Global History?, Sebastian Conrad outlines the view that historians must not "accept political entities a priori as the boundaries of analysis, but instead trace the actual scope of entanglements and interconnections and work from there." For instance, Marx chose to analyze history as an interplay of social classes. New perspectives might require more than the simple re-organization of people into different groups; historians might also invent new abstractions, such as the concept of capital, that help organize their observations.
These abstractions can be world-changing and world-making. That is, we live in the world of our chosen abstractions, which shape our narratives about ourselves and society. Although we choose these delineations and abstractions ourselves, I would not draw the radical conclusion that all choices are equal, or that the world is anything we make of it. Some choices are more useful, or maybe also more true or beautiful.
“Pierre Chareau: Modern Architecture and Design”, current exhibition at The Jewish Museum, New York.
"Commenting on Chareau’s work and the exhibition design, DS+R’s (exhibition design) founding partner, Elizabeth Diller, noted, “Pierre Chareau: Modern Architecture and Design is an opportunity to return to a significant figure in every architect’s education, but one primarily known through only one masterwork, the Maison de Verre. This exhibition is a rare opportunity to see so much of Chareau’s creative output brought together in one place. The challenge in undertaking its design was to provide a multi-faceted and imaginative backdrop that would highlight, but not compete with, his exceptional mastery of detailing and assemblage. By engaging with Chareau’s furniture, interiors, and collected ephemera, we are able to absorb and represent his idiosyncratic voice, which has had relatively little exposure in the U.S.'"
by Olivia Zhu
A month ago, I wrote about haikus that wended their way through the Internet and found me. Really, truly—I hadn’t gone looking for them, had read them casually, and thought I might forget them. “And yet, and yet…” Chiyo-ni’s and Issa’s haikus kept nagging at me, so I looked for the translations that spoke most to me and wrote about them. Their mark lingers still, though, and when I passed by a used copy of Harold G. Henderson’s An Introduction to Haiku: An Anthology of Poems and Poets from Bashō to Shiki in a bookstore, I had to pick it up.
Relatively speaking, it’s quite an old book on haiku. Published in 1958, it now appears to be out of print—so I really felt quite fortunate stumbling across it as a clueless neophyte. It’s so old that inflation dictated I pay almost double the $1.45 that’s listed on the duck-decorated paperback cover, even in the book’s well-loved condition—still well worth it, of course. From what little I’ve read online, it seems likely that Henderson contributed to the initial interest in haiku in America, as he helped found the Haiku Society of America and published some of the earliest English-language works on Japanese haiku.
by Jalees Rehman
Less than one fifth of PhD students in the United States will be able to pursue tenure-track academic faculty careers once they graduate from their program. Reduced federal funding for research and dwindling institutional support for tenure-track faculty are some of the major reasons why there is such an imbalance between the large number of PhD graduates and the limited availability of academic positions. Upon completing the program, PhD graduates have to consider non-academic job opportunities, such as industry, government agencies, and non-profit foundations, but not every doctoral program is equally well-suited to prepare its graduates for such alternate careers. It is therefore essential for prospective students to carefully assess the doctoral program they want to enroll in and the primary mentor they would work with. The best approach is to proactively contact prospective mentors, meet with them, and learn about the research opportunities in their group, but also to discuss how completing the doctoral program would prepare them for their future careers.
The vast majority of professors will gladly meet a prospective graduate student and discuss research opportunities as well as long-term career options, especially if the student requesting the meeting clarifies its goal. However, there are cases when students wait in vain for a response. Is it because their email never reached the professor, lost in the internet ether or a spam folder? Was the professor simply too busy to respond? A research study headed by Katherine Milkman from the University of Pennsylvania suggests that the lack of response may in part be influenced by the perceived race or gender of the student.
by Brooks Riley
by Dwight Furrow
There is an ingrained set of assumptions and attitudes about creativity in the arts that harms our understanding of art and ultimately human existence. That is the idea of the artist as a relatively unconstrained maker, a fashioner ex nihilo who brings something new into being solely through the force of her imagination and capacity for self-expression. We might contrast this with an older view of art perhaps best expressed by this quote attributed to Michelangelo:
"In every block of marble I see a statue as plain as though it stood before me, shaped and perfect in attitude and action. I have only to hew away the rough walls that imprison the lovely apparition to reveal it to the other eyes as mine see it."
On the view expressed by Michelangelo, an artist is like a skillful craftsperson who attends to the inherent qualities of a piece of raw material, its shape, grain, texture, or color, and then decides what she can do with it. Art is too varied and complex to wholly fit either description, both of which are drawn too starkly, but I want to make the case that Michelangelo's view has more to recommend it than first meets the eye.
Aesthetic appreciation is often described in terms of adopting an aesthetic attitude, a state of mind in which one attends sympathetically and with focused attention to the aesthetic features of objects. Part of that aesthetic attitude is a willingness to be receptive to what is in the work, to refrain from imposing preconceptions on it, to let the work speak for itself. The viewer or listener must open herself up to being moved by the work and to discovering all there is to be discovered in it. As important as this attitude of openness and receptivity is to appreciating art, it would be exceedingly odd if this aesthetic attitude were not also part of the process of creating the work. But if we take this receptive attitude seriously, it shows the limitations of our assumptions about artists as ultimate masters.
by Shadab Zeest Hashmi
We are at the Original Pancake House. Walking through these double doors, a life-size mirror is always the first to meet me, but I am never shocked by the reflection as I am now. Does the wallpaper seem more ivory than before? Are the white tablecloths casting a glare? Am I more brown? I catch several glances in the mirror, of breakfast-eaters looking up to see who has entered. Complexion is the new currency. The Election season of 2016 has ended with rising fear and fading nuance. Shades of skin are suddenly coining new dialects of silence. Language, as we know it, is shedding gradations— those gray areas where subtlety of thought resides; nuance is becoming obsolete like discarded snakeskin, but unlike snakeskin, it will not regenerate on its own.
My husband and I discuss the menu in our usual English mixed with Urdu; he orders a German pancake and a vegetarian omelet. Years ago, when we came here for the first time, he declared that the German pancakes at this restaurant were authentic and reminded him of his mother. His mother is originally from Berlin where she met and married his father (a student from Pakistan) after she converted to Islam in the ‘60s. They moved to Karachi a few years later and have lived there ever since.
Sunday, December 04, 2016
Ray Monk in the New York Review of Books:
Thus, in the New Year of 1929, was Ludwig Wittgenstein’s return to Cambridge announced by John Maynard Keynes in a letter to his wife, Lydia Lopokova. Wittgenstein had previously been at Cambridge before World War I as a student of Bertrand Russell, but had acquired his godlike status through the publication after the war of his first and only book, Tractatus Logico-Philosophicus, which was very quickly recognized as a work of genius by philosophers in both Cambridge and his home city of Vienna. Wittgenstein himself was initially convinced that it provided definitive solutions to all the problems of philosophy, and accordingly gave up philosophy in favor of schoolteaching. In 1929, however, he returned to Cambridge to think again about philosophical problems, having become convinced that his book did not, in fact, solve them once and for all.
What drew him back to Cambridge was not the prospect of working again with Russell, who by this time (having been stripped of his fellowship at Trinity College, Cambridge, because of his opposition to World War I) was a freelance journalist, a political activist, and only intermittently a philosopher. Rather, it was the opportunity of working with Frank Ramsey, the man who had persuaded him of the flaws in the Tractatus. Most significantly, Ramsey had shown that the account Wittgenstein gives of the nature of logic in the Tractatus could not be entirely correct.
Belén Fernández in Jacobin:
The late Alexander Cockburn, reflecting on the work of decorated New York Times foreign affairs columnist and neoliberal warmonger extraordinaire Thomas Friedman, once observed: “Friedman’s is an industrial, implacable noise, like having a generator running under the next table in a restaurant. The only sensible thing to do is leave.”
But while generators at least serve a rather obvious function, the same can’t usually be said of Friedman, who has just spewed out his latest unnecessarily humongous book Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations.
In the nearly eight hundred pages that comprise my electronic version of the manuscript, there is approximately one glimmer of hope: the point at which Friedman remarks that this is “maybe my last book.”
The title Thank You for Being Late is a reference to Friedman’s realization that when his Washington, DC breakfast companions are a few minutes tardy, he can use the time not only to people-watch and eavesdrop on neighboring conversations but also to have ideas. Who knew?
Ethan Siegel in Forbes:
Perhaps the greatest discovery of all announced in 2016 was the direct detection of gravitational waves. Even though they had been predicted by Einstein's general theory of relativity 101 years prior, it took the development of a laser interferometer sensitive to ripples in space that would displace two mirrors separated by multiple kilometers by less than 10^-19 meters, or 1/10,000th the width of a proton. This finally came to pass during LIGO's 2015 data run, and two bona fide black hole-black hole merger events unambiguously popped out of the data. But how does physics actually allow this? Mārtiņš Kalvāns wants to know:
This question has puzzled me for a long time. Articles about LIGO discovery state that some percentage of black hole merger mass was radiated away, leaving [a] resulting black hole smaller than [the] sum of [the] original mergers. Yet it is accepted that nothing escapes black holes [...] So my question is: how was energy radiated from black hole mergers?
This is a really deep question, and goes straight to the heart of black hole physics and general relativity.
How Stigma Sows Seeds of Its Own Defeat: Defending the liberal project is a Sisyphean task in part because successfully inculcating liberal norms leads to habits that weaken the ability to sustain them
Conor Friedersdorf in The Atlantic:
In the Western world, the percentage of people who say that it is essential to live in a democracy is in precipitous decline. In the United States, only 19 percent of millennials agree that it would be illegitimate for the military to take control of government. The president-elect routinely speculates about authoritarian policies, like stripping citizenship from those who burn the American flag in protest.
During a bygone crisis in global politics, when the liberal order was under sustained attack, Friedrich Hayek published this diagnosis of the challenge before liberals:
If old truths are to retain their hold on men’s minds, they must be restated in the language and concepts of successive generations. What at one time are their most effective expressions gradually become so worn with use that they cease to carry a definite meaning. The underlying ideas may be as valid as ever, but the words, even when they refer to problems that are still with us, no longer convey the same conviction; the arguments do not move in a context familiar to us; and they rarely give us direct answers to the questions we are asking. This may be inevitable because no statement of an ideal that is likely to sway men’s minds can be complete: it must be adapted to a given climate of opinion, presuppose much that is accepted by all men of the time, and illustrate general principles in terms of issues with which they are concerned.
The passage resurfaced this week when Will Wilkinson, in-house philosopher at the Niskanen Center, cited it to suggest that the Sisyphean task of saving liberalism is now ours, the boulder at our feet, the struggle of the hill looming once again.
“If the old truths are not updated for each new age, they will slip from our grasp and lose our allegiance,” he wrote. “The terms in which those truths have been couched will become hollow, potted mottoes, will fail to galvanize, inspire, and move us. The old truths will remain truths, but they’ll be dismissed and neglected as mere dogma, noise. And the liberal, open society will again face a crisis of faith.”
Across the Western world, liberals are grappling with how to execute that project. And while I have no pat answer, I do see an obstacle to success that’s worth understanding.
On Auden’s Musee des Beaux Arts
Not for me so much do I care
what it means—
the parent smiling while
her child’s skating,
cutting figure eights over
a pond’s ice,
veil between two worlds.
One- a world to laugh & breathe in.
The other, you drown in.
Or, seeing something
fall from the sky—
who speaks for
him or her that never grew wings
or simply dreamed the possible—
mention the torturer’s horse
casually scratching its ass,
see how quickly one’s thoughts turn
soft & nuzzy.
Now is the time
to further expand the metaphor:
Off goes the gilded Jolly Roger,
a smiley face
o’er its skull & bones.
At the tiller, a pirate steering,
tacking further, each instant
more distant from those
casually orphaned of human love and care.
See how it gathers speed with all the available air.
Andrew O'Hehir in Salon:
How much of the “news” is fake? How much of reality is “real”? After an election cycle driven by lies, delusions and propaganda — including lies about lies, multiple layers of fake news and meta-fake news — we are about to install a fake president, elected by way of the machineries of fake democracy. The country that elected him is fake too, at least in the sense that the voters who supported Donald Trump largely inhabit an imaginary America, or at least want to. They think it’s an America that used to exist, one they heard about from their fathers and grandfathers and have always longed to go back to. It’s not. Their America is an illusion that has been constructed and fed to them through the plastic umbilicus of Fox News and right-wing social media to explain the anger and disenfranchisement and economic dislocation and loss of relative privilege they feel. All of which are real, if not necessarily honorable; it represents the height of liberal uselessness to keep on quarreling about whether Trump’s fabled “white working class” suffers real economic pain or is just a cesspool of racism. That argument is really about other things, to be sure: It’s about whether the Democratic Party — whose long-promised era of permanent demographic hegemony and middle-class multiculturalism keeps being delayed into the indefinite future, defeat after defeat after defeat — requires a major reconstruction or just a little cosmetic surgery. Meanwhile, out in the pseudo-reality of Trumpian America, racial resentment and economic suffering are so profoundly intertwined that there’s no way to disentangle them. Arguments that the so-called left should pretty much ignore the deplorables who keep on voting against their own interests, or should abandon “identity politics” in quest of some middle-road economic populism that blends Bill Clinton and FDR, are both missing the point. 
In a nation where a candidate who won the popular vote by roughly 2.5 million did not win the election, we are no longer dealing with reality, at least as it used to exist.
Hillary Clinton was the ultimate Establishment candidate facing the ultimate outsider, and also a quintessential old-media personality facing a veritable Voldemort of social media. Given that, she came pretty damn close to pulling it off. But Clinton was also a candidate from reality facing a shimmering celebrity avatar, a clownish prankster who took physical form in our universe but who could say anything and do anything because he was self-evidently not real. That disadvantage proved impossible to overcome. Furthermore, Trump’s supporters may be delusional and misguided, but they aren’t half as dumb as they often look to “coastal elites.” Many of them understood, consciously or otherwise, that his incoherent promises could not be taken literally and that his outrageous personality did not reflect the realm of reality. They were sick of reality, and you can’t entirely blame them. For lots of people in “middle America” (the term is patronizing, but let’s move on) reality has been so debased, or so much replaced, as to seem valueless.
Christina Beck in The Christian Science Monitor:
One hundred and eighty-four years ago today, a literary giant was born to a small, struggling family in Pennsylvania. Yet within just a few decades, Louisa May Alcott won herself both a reputation and the hearts and minds of generations with her prose. Google Doodle creator Sophia Diao decided to depict Ms. Alcott with her three sisters in commemoration of her birthday and her most beloved work, "Little Women." Alcott was born in 1832, the daughter of prominent (but impecunious) Transcendentalist intellectual Amos Bronson Alcott. Mr. Alcott moved his family around frequently, finally settling in Concord, Mass., in 1840 when Louisa was eight years old. In Concord, the Alcotts found, if not earthly wealth, then a bounty of friends and intellectual sparring partners. Prominent Transcendentalists and New England intellectuals Ralph Waldo Emerson and Henry David Thoreau also lived in town, as did Margaret Fuller and Nathaniel Hawthorne. In the midst of this intellectual bounty, however, the Alcotts continued to struggle financially, forcing Louisa to take jobs as a school teacher and a seamstress. As abolitionists and New Englanders, the Alcotts supported the North during the American Civil War, and Louisa took her commitment to the cause a step further when she served as a nurse in Union hospitals.
Her experiences during the Civil War inspired her first book, "Hospital Sketches," which won her some small notice in the literary world. "Hospital Sketches," although rarely read today, served to highlight Louisa’s writing talent, and attracted the notice of a publisher. That publisher met with her father, and the two men struck a deal, urging Louisa to write a book for young girls. Contingent on Louisa’s writing was a book contract for her father. At first, Louisa was reluctant to write the book, saying that it was not her preferred type of writing. Yet her manuscript, called "Little Women," which drew on her own experiences with her three sisters, quickly became a massive success, finally lifting the Alcotts out of their longstanding poverty.
Saturday, December 03, 2016
Jerusalem’s veneer of harmony, tolerance and inclusiveness is as thin and as alluring as the fine layer of gold covering the gray lead dome standing on the top of the contested Temple Mount, or the Haram al-Sharif as it is called in Arabic. Timeless conflict brews under the beautiful surface of the sacred city, whose many names are yet another manifestation of the continuing rivalry around the “ownership” of its holy sites and symbolic history. The controversial resolution passed by Unesco in October, which attempted to classify the Western Wall, one of Judaism’s holiest sites, as a part of the Muslim Al Aqsa Mosque compound, is yet another step in this long tradition of conflict.
This everlasting rivalry, paradoxically, has only enhanced the beauty and cultural richness of Jerusalem. The various churches and mosques, each competing to have the tallest structure in the sacred city, have invested a fortune in building magnificent minarets and bell towers meant to own the Jerusalem skyline. The warring Christian sects, in their struggle to dominate the sacred Church of the Holy Sepulcher (a struggle that at times has led to almost comical fistfights among priests, monks and ministers), have made great efforts to enhance their part of the space and make it outshine the others. In the same tradition, the great resources allocated by the Israelis to restore and celebrate Jerusalem’s Jewish past have made it an attractive destination for travelers from all around the world.
Kathleen Collins was a professor of film history at New York’s City College who made a groundbreaking contribution to the subject that she taught. “Losing Ground” (1982), which Collins wrote and directed, was one of the first feature-length dramas made by an African American woman. Collins, who was also an activist and playwright, never got the chance to make another film. She died in 1988, at age 46, after a bout with breast cancer — a life, and a life’s work, cut brutally short.
“Losing Ground” is the story of a marriage in crisis and an intimate portrait of the black creative class in New York in the 1970s. Sarah, a promising young academic, is married to Victor, an older and somewhat louche painter who has just made his first major sale to a museum. (Notably, his work is acquired not by an American institution but by the Louvre.) To celebrate, they rent a summer house in a majority-Puerto Rican community in the Hudson Valley, where Victor becomes smitten with the local culture (and a local woman) while Sarah starves for intellectual and emotional attention, until one of her students asks her to come back to the city to star in a film of his.
Richard Brody, writing in the New Yorker this past spring, called “Losing Ground” “a nearly lost masterwork” and noted ruefully that “[h]ad it screened widely in its time, it would have marked film history.”
Sometimes you change your mind about a writer. Perhaps, when you first read them you were only pretending to admire what you’d been told to admire. But also your tastes change. For instance, at 25 I was more open to writers telling me how to live and how to think; by 65 I had come to dislike didacticism. I don’t want to be told how to think and how to live by, say, Bernard Shaw, or D H Lawrence or the later Tolstoy. I don’t like art – especially theatrical art – whose function seems to be to reassure us that we are on the right side. Sitting there complacently agreeing with a playwright that war is bad, that capitalism is bad, that bad people are bad. “You don’t make art out of good intentions,” is one of Flaubert’s wiser pronouncements.
Sometimes, when our tastes become more defined, they become narrower. But this doesn’t have to be the case. I want to address a rarer changing of the mind, which is altogether more enriching: when a writer you had previously been indifferent to, indeed actively despised, suddenly makes sense to you, and you realise – with, yes, a kind of joy – that at last you see the point of them.
I first read EM Forster when an English master handed out a list of Great Books to be read one summer holiday. A Passage to India was on that list. I still have the Penguin edition – a reprint of 1960, costing three shillings and sixpence – in which I read the novel. There are no notes in the margin, not a single cry of “Irony!” It clearly made little impression on me. Later, of my own volition, when I was about 20, I read A Room With a View, and actively began to take against Forster.
Jonathan Safran Foer in The Guardian:
The first time my father looked at me was on a screen, using technology developed to detect flaws in the hulls of ships. His father, my grandfather, could only rest his hand on my grandmother’s belly and imagine his infant in his mind. But by the time I was conceived, my father’s imagination was guided by technology that gave shape to sound waves rippling off my body. The Glasgow-based Anglican obstetrician Ian Donald, who in the 1950s helped bring ultrasound technology from shipyard to doctor’s office, had devoted himself to the task out of a belief that the images would increase empathy for the unborn, and make women less likely to choose abortions. The technology has also been used, though, to make the decision to terminate a pregnancy – because of deformity, because the parent wants a child of a certain sex. Whatever the intended and actual effects, it is clear that the now iconic black and white images of our bodies before we are born mediate life and death. But what prepares us to make life-and-death decisions? My wife and I debated learning the sex of our first child before birth. I raised the issue with my uncle, a gynaecologist who had delivered more than 5,000 babies. He was prone neither to giving advice nor anything whiffing of spirituality, but he urged me, strongly, not to find out. He said, “If a doctor looks at a screen and tells you, you will have information. If you find out in the moment of birth, you will have a miracle.”
I don’t believe in miracles, but I followed his advice, and he was right. One needn’t believe in miracles to experience them. But one must be present for them. Psychologists who study empathy and compassion are finding that, unlike our almost instantaneous responses to physical pain, it takes time for the brain to comprehend “the psychological and moral dimensions of a situation”. Simply put, the more distracted we become, and the more emphasis we place on speed at the expense of depth – redefining “text” from what fills the hundreds of pages of a novel, to a line of words and emoticons on a phone’s screen – the less likely and able we are to care. That’s not even a statement about the relative worth of the contents of a novel and a text, only about the time we spend with each.
Costica Bradatan in the Los Angeles Review of Books:
For some, he was one of the most subversive thinkers of his time — a 20th-century Nietzsche, only darker and with a better sense of humor. Many, especially in his youth, thought him to be a dangerous lunatic. According to others, however, he was just a charmingly irresponsible young man, who posed no dangers to others — only to himself perhaps. When his book on mysticism went to the printers, the typesetter — a good, God-fearing man — realizing how blasphemous its contents were, refused to touch it; the publisher washed his hands of the matter and the author had to publish the blasphemy elsewhere, at his own expense. Who was this man?
Emil Cioran (1911–1995) was a Romanian-born French philosopher and author of some two dozen books of savage, unsettling beauty. He is an essayist in the best French tradition, and even though French was not his native tongue, many think him among the finest writers in that language. His writing style is whimsical, unsystematic, fragmentary; he is celebrated as one of the great masters of aphorism. But the “fragment” was for Cioran more than a writing style: it was a vocation and a way of life; he called himself “un homme de fragment.”