Sunday, April 23, 2017
Harry Lewis in the Washington Post:
When should traditional liberal values be sacrificed to important but narrower ends? That is the question behind Harvard University’s effort to subordinate freedom of association and freedom of speech to a locally fashionable form of “nondiscrimination.”
Last spring, the university decided to attack the off-campus, all-male Final Clubs by disqualifying their members from Rhodes Scholarships and other distinctions — unless the clubs admitted women. A few of these clubs are infamous for loud parties and drunken misbehavior. The new strategy against them had the merit of novelty, even in the absence of evidence that coed clubs would behave any better.
Faculty members reacted with alarm, recalling Sen. Joseph McCarthy’s persecution of Harvard professors in the 1950s simply for belonging to a hated organization. Students deserve a better lesson from Harvard than an attempt to solve social problems by blackballing members of unpopular groups.
The policy covers all “single-gender social organizations” consisting of Harvard students, so the same sanctions would be visited on women’s clubs, including sororities. More women than men are affected, even though most of the women’s clubs don’t have real estate, much less raucous parties. Hundreds of women staged a surprise protest in response.
HM Naqvi in Dawn:
In Ways of Seeing, John Berger, the late, great, British art critic, posits that, “The way we see things is affected by what we know or what we believe. In the Middle Ages when men believed in the physical existence of Hell, the sight of fire must have meant something different from what it means today… History always constitutes the relation between a present and its past.”
…In his opus, Image and Identity, the late, great, Pakistani art critic, Dr Akbar Naqvi, announces, “Pakistan’s history is older than its age… The thesis of this book is that Pakistan is inseparable from [the] heritage of Al-Hind, and without that it has no identity.” Although this might not be news to serious historians, in the ever-evolving exclusionary socio-political ecosystems of the subcontinent, the assertion was like a brick lobbed in the oft stagnant pond of popular discourse. This theme pervades Dr Naqvi’s oeuvre, from Shahid Sajjad’s Sculpture to Sadequain and the Culture of Enlightenment.
“The subcontinent was partitioned,” he writes elsewhere, “but its people continued to share myths, histories, cultures and a multifaceted civilisation.” Consequently, we in India, Pakistan, Bangladesh, and Sri Lanka are “joint custodians” of the discourse of the subcontinent. Dr Naqvi observes this shared ethos in the work of Abanindranath Tagore, who responded to late Mughal miniatures, Abdur Rehman Chughtai, who routinely rendered icons from Hindu mythopoetics, Syed Sadequain Naqvi, who distilled everything from tantric symbolism to Arabic calligraphy, and Ustad Allah Buksh. “Buksh’s art was the Indian face of European painting and accepted national art … [i]n this style, local romantic lore and mythological subjects were painted according to Euro-Indian conventions of Raj art schools.” Manifestly, we, demonyms of the subcontinent, “have several histories converging upon us.”
For Dr Naqvi, history is not good, bad, some sort of binary, or for that matter, linear: Picasso’s Cubism, derived from African masks, in turn influenced the likes of Jean Metzinger and Albert Gleizes or Zubeida Agha and Shakir Ali.
More here. (Note: Thanks to Professor Sadia Abbas)
Saturday, April 22, 2017
Yuval Harari in the New York Times:
In “The Knowledge Illusion,” the cognitive scientists Steven Sloman and Philip Fernbach hammer another nail into the coffin of the rational individual. From the 17th century to the 20th century, Western thought depicted individual human beings as independent rational agents, and consequently made these mythical creatures the basis of modern society. Democracy is founded on the idea that the voter knows best, free market capitalism believes the customer is always right, and modern education tries to teach students to think for themselves.
Over the last few decades, the ideal of the rational individual has been attacked from all sides. Postcolonial and feminist thinkers challenged it as a chauvinistic Western fantasy, glorifying the autonomy and power of white men. Behavioral economists and evolutionary psychologists have demonstrated that most human decisions are based on emotional reactions and heuristic shortcuts rather than rational analysis, and that while our emotions and heuristics were perhaps suitable for dealing with the African savanna in the Stone Age, they are woefully inadequate for dealing with the urban jungle of the silicon age.
Sloman and Fernbach take this argument further, positing that not just rationality but the very idea of individual thinking is a myth. Humans rarely think for themselves. Rather, we think in groups. Just as it takes a tribe to raise a child, it also takes a tribe to invent a tool, solve a conflict or cure a disease. No individual knows everything it takes to build a cathedral, an atom bomb or an aircraft. What gave Homo sapiens an edge over all other animals and turned us into the masters of the planet was not our individual rationality, but our unparalleled ability to think together in large groups.
From Columbia University Press:
This selection of poetry and prose by Ghalib provides an accessible and wide-ranging introduction to the preeminent Urdu poet of the nineteenth century. Ghalib's poems, especially his ghazals, remain beloved throughout South Asia for their arresting intelligence and lively wit. His letters—informal, humorous, and deeply personal—reveal the vigor of his prose style and the warmth of his friendships. These careful translations allow readers with little or no knowledge of Urdu to appreciate the wide range of Ghalib's poetry, from his gift for extreme simplicity to his taste for unresolvable complexities of structure.
Beginning with a critical introduction for nonspecialists and specialists alike, Frances Pritchett and Owen Cornwall present a selection of Ghalib's works, carefully annotating details of poetic form. Their translation maintains line-for-line accuracy and thereby preserves complex poetic devices that play upon the tension between the two lines of each verse. The book includes whole ghazals, selected individual verses from other ghazals, poems in other genres, and letters. The book also includes a glossary, the Urdu text of the original poetry, and an appendix containing Ghalib's comments on his own verses.
Go here to read the introduction to the book.
Video length: 1:00:42
Rick Nauert in Psych Central:
Music is primal, said neuroradiologist Jonathan Burdette, M.D., of Wake Forest Baptist Medical Center in North Carolina. It affects all of us, but in very personal, unique ways.
“Your interaction with music is different than mine, but it’s still powerful,” he said.
“Your brain has a reaction when you like or don’t like something, including music. We’ve been able to take some baby steps into seeing that, and ‘dislike’ looks different than ‘like’ and much different than ‘favorite.’”
To study how music preferences might affect functional brain connectivity — the interactions among separate areas of the brain — Burdette and his fellow investigators used functional magnetic resonance imaging (fMRI), which depicts brain activity by detecting changes in blood flow.
Scans were made of 21 people while they listened to music they said they most liked and disliked from among five genres (classical, country, rap, rock, and Chinese opera) and to a song or piece of music they had previously named as their personal favorite.
Those fMRI scans showed a consistent pattern: The listeners’ preferences, not the type of music they were listening to, had the greatest impact on brain connectivity, especially on a brain circuit known to be involved in internally focused thought, empathy, and self-awareness.
A Venetian critic named Bruno Alfieri saw:
(in Jackson Pollock’s work)
—absolute lack of harmony
—complete lack of structural organization
—total absence of technique, however rudimentary
—once again, chaos
from Art In America, February 1994
Being true to what we are, what is,
frayed around the edges, perhaps, and growing weird.
Born in NYC, and from there, no movement.
It is our own terror, our own making,
abandoned in the high-rise night
like an impotent frog.
2. Absolute lack of harmony
There are times when you can’t illuminate nothing, man.
Don’t open that door, they say, don’t even enter the room.
My second wife would know, she didn’t belong
among the pacifists making music. Every
day you encounter people going
straight to hell.
3. Complete lack of structural organization
The summer air, by itself, is enough.
Add a few fireflies at twilight for memory’s sake.
And measure all the green you’ve seen.
Insane with desire to go home again?
Listen, the sky is whirling overhead.
Listen to the silence.
Jessica Brown in NY Magazine:
It’s very likely that, even in the last 24 hours, you’ve switched so seamlessly from friend to employee, boss to parent, or customer to neighbor that you didn’t even notice yourself doing it. We all switch between multiple roles in a given day, requiring us to draw on different aspects of our personality, and even alter how we talk. (If you spoke to your newborn baby the same way you greeted your boss in the morning, for example, you’d probably be sent home to rest up.) One place most people juggle different identities is at work. Maybe you belong to a few different teams, for example, or maybe you both do and teach your job at the same time, like a doctor who also teaches medical students. Or you might have two or three different jobs entirely; perhaps you work part-time in a coffee shop to fund your freelance endeavors or tech start-up. Even within one role, you might be a supportive co-worker one minute, and deal-clinching boss the next, all before your morning coffee.
But while this constant juggling sounds exhausting, it doesn’t necessarily harm us, according to a study recently published in the journal Academy of Management. There are two main responses to identity-switching, according to Lakshmi Ramarajan, one of the study’s authors. Some of us will experience what she calls “identity conflict,” where we find it difficult to manage multiple identities, whereas others have “identity enhancement,” where different roles are seen as being complementary to each other. Ramarajan, from Harvard University — along with co-researchers Steffanie Wilk from Ohio State University, and Nancy Rothbard, from the University of Pennsylvania — argues, perhaps unsurprisingly, that your experience hinges on your outlook. Seeing multiple work identities as good for each other can help you be more productive and feel more motivated at work. Seeing your different identities as being in conflict with each other, however, could be putting a downer on your day.
Philippe Sands in The Guardian:
I was 19 when I first read If This Is a Man, and the book filled a gap created by the shadows cast across an otherwise happy childhood home by Auschwitz and Treblinka: my maternal grandparents, rare survivors of the horrors, never talked about their experiences or those who were disappeared, and in this way Levi’s account spoke directly, and personally, offering a fuller sense of matters for which words were not permitted. His has not been the only such book – there are others, including more recent works such as Thomas Buergenthal’s A Lucky Child, Göran Rosenberg’s A Brief Stop on the Road from Auschwitz, and Marceline Loridan-Ivens’s But You Did Not Come Back – but it was the first. He was a messenger of detail, allowing me to see and feel matters of dread and horror: waiting for a deportation order; travelling in a cattle cart by train; descending a ramp for selection; imagining what it must be like to know you are about to be gassed and cremated; struggling for survival surrounded by people you love and hate. Levi’s voice was especially affecting, so clear, firm and gentle, yet humane and apparently untouched by anger, bitterness or self-pity. If This Is a Man is miraculous, finding the human in every individual who traverses its pages, whether a Häftling (prisoner) or Muselmann (“the weak, the inept, those doomed to selection”), a kapo or a guard.
Levi, a 23-year-old chemist, was arrested in December 1943 and transported to Auschwitz in February 1944. There he remained until the camp was liberated on 27 January 1945. He arrived back home in Turin in October, unrecognisable to the concierge who had seen him only a couple of years earlier. This and more I learned from Ian Thomson’s nuanced biography, Primo Levi, which enriches our understanding of the author. On Levi’s return, stories were told and notes prepared, as he went back to work at a paint factory. By February 1946, he had completed a first draft about the last 10 days of his time in the camp, a section that would come to be the book’s last chapter, written “in furious haste”. Ten months later, there was a complete text, worked on “with love and rage”, reflecting a vow “never to forget”.
Friday, April 21, 2017
Jonathan Guyer and Surti Singh in the Los Angeles Review of Books:
“We find absurd, and deserving of total disdain, the religious, racist, and nationalist prejudices that make up the tyranny of certain individuals who, drunk on their own temporary omniscience, seek to subjugate the destiny of the work of art.” So wrote 37 Egypt-based artists and writers in their 1938 manifesto Long Live Degenerate Art, expressing solidarity with their counterparts in Europe suffering under fascism. This was the beginning of the Art and Liberty Group, an avant-garde movement also known as Egypt’s Surrealists.
“Modern art in Egypt was always a pale copy and a delayed copy,” says the contemporary Egyptian painter Adel El Siwi, “but for the first time in our history, we have this very rare moment where what was going on in Paris was in parallel to other things going on in Cairo.” The Art and Liberty Group forged connections with Surrealists and Trotskyists abroad while shaping their own identity. Working in tandem with their European peers, they also grappled with the circumstances of an increasingly militarized Egyptian capital, where trends in art and publishing remained conservative. They responded to the fault lines of interwar Cairo and were of a piece with them.
By the time of the 1952 Free Officers’ coup in Egypt, which led to the presidency of Gamal Abdel Nasser and the rise of a new Egyptian nationalism and later pan-Arabism, the members of the Art and Liberty Group had been dispersed: many were exiled or imprisoned.
Hannah Devlin in The Guardian:
Push an object and Newton’s laws (and common experience) dictate that it will accelerate in the direction in which it was shoved.
“That’s what most things that we’re used to do,” said Michael Forbes, a physicist at Washington State University and co-author of the paper, which shows that normal intuitions do not always apply to physics experiments. “With negative mass, if you push something, it accelerates toward you.”
Negative mass has previously cropped up in speculative theories, including those suggesting the existence of wormholes, a form of cosmological shortcut between two points in the universe. Just as electric charge can be either positive or negative, matter could, hypothetically, have either positive or negative mass.
For an object with negative mass, Newton’s second law of motion, in which a force is equal to the mass of an object multiplied by its acceleration (F=ma), would be experienced in reverse.
Theoretically, this sounds straightforward, but picturing how this behaviour would work in the real world is bewildering, even for experts.
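Rearranging the second law as a = F/m makes the sign flip explicit: if m is negative, the acceleration points opposite to the applied force. A minimal sketch of that arithmetic (illustrative only; the numbers are hypothetical and not from the article or the experiment):

```python
def acceleration(force: float, mass: float) -> float:
    """Newton's second law, F = m * a, rearranged to a = F / m."""
    return force / mass

# Push both objects with the same +1 N force (positive = "away from you").
a_ordinary = acceleration(1.0, 2.0)    # ordinary mass: a is positive, object moves away
a_negative = acceleration(1.0, -2.0)   # negative mass: a is negative, object moves toward you

print(a_ordinary, a_negative)  # 0.5 -0.5
```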
Deirdre Coyle in Electric Literature:
For a while, I was seeing a guy who really liked David Foster Wallace. He once forced me to do cocaine by shoving it inside me during sex. He wasn’t the first man to recommend Wallace, but he’s the last whose suggestion I pretended to consider. So while I’ve never read a book by Wallace, I’m preemptively uninterested in your opinion about it.
These recommendations from men have never inspired me to read Wallace’s magnum opus, Infinite Jest, or his essays, or stories, or even to take the path of least resistance and see the Jason Segel movie about him. Said recommendations have, however, festered over such a long period that they’ve mutated into deeply felt opinions about Wallace himself: namely, that he was an overly self-aware genius who needed a better editor and that I’d hate his writing.
Wallace-recommending men are ubiquitous enough to be their own in-joke. New York Magazine notes that “Wallace, too, has become lit-bro shorthand…some women [treat] ‘loves DFW’ as synonymous with ‘is one of those motherfuckers’” (hi, it’s me). When conservative Supreme Court nominee Neil Gorsuch cited Wallace in a hearing, The New Republic asserted that “Wallace is the lingua franca of a certain subset of overeducated, usually wealthy, extremely self-serious (mostly) men.” Onion-esque news outlet Reductress clickbaited me perfectly with “Why I’m Waiting for The Right Man to Tell Me I Should Read ‘Infinite Jest.’” Wallace is on a list of books that literally all white men own.
Video length: 15:12
FRANCIS PICABIA is famous above all for his flamboyant stylistic and ideological diversity. This diversity has created a legend. The legend has to do with freedom: Picabia is heralded—especially by artists—as the insouciant trickster deity of modernism, the Aquarian hero of artistic self-determinacy in the face of all sorts of orthodoxies, even (especially) the right ones.
The legend is productive and, given the pictorial efficacy of so many of his best works, deserved. It does, however, confuse the central problem of Picabia’s career. It mistakes the symptom (the artist’s matchless stylistic diversity) for the cause, which is philosophical. What made Picabia, among all the artists of the historical avant-garde, so apparently immune to stylistic and ideological coherence and codification? And what, if anything, separates his project from mere decadence (however attractive) or dilettantism?
The fullness, clarity, and informative depth of his recent retrospective at the Museum of Modern Art allowed us to revisit such questions. Picabia, who was born into wealth and privilege, lived in a world in which everything was possible because nothing had any meaning or purpose. He experienced freedom as a curse—the curse of a man falling forever in endless space—and sought desperately to escape it. The curse is best called nihilism, and was diagnosed most famously by Nietzsche, whom Picabia adored: “What does nihilism mean? That the highest values devalue themselves. The aim is lacking; ‘why?’ finds no answer.”
Babies are hermeneutic subjects par excellence. When they come out of the womb, none of our dichotomies apply, not even outside and inside one’s body, day and night, me and you. And every waking hour they start interpreting the world: noticing patterns (nap then lunch, bath-book-song then sleep), contrasts (wet/dry, mom’s arms/dad’s arms, banging on a small yogurt pot/on a large one), cruxes of signifiers (mom’s endlessly changing facial expression, sounds, movement, versus the mobile above the crib), and reference points that anchor their lives into a recognizable, hospitable, shall we say human world (the doudou, mom’s smell, the blankie, soon something like “home”). As much as we try to read them, they are readers of the world: they approach the most ordinary object as a universe to explore, a mystery to decipher. Not a single object is common, because at first nothing has anything in common with anything else. Before categories exist to sap our enjoyment of the here and now by concealing everything under a name, thus creating the illusion we know them, each thing is a unique instance of just itself. So here they are, navigating a sea of ever changing information, where very little is ever the same for lack of being remembered or even consciously differentiated or apprehended as separate from a magma of other singular experiences, in a learning experiment that spans everything from what air feels like in one’s lung to the difference between liquids and solids, experiencing the world without ever naming it. Quite an immersion program, where no possible translation into any reference language or culture exists, where the very shape of space and time, the boundaries of one’s skin are still fluid. The amount of “newness” in a single minute of a baby’s day is daunting. And the rate of their epistemological adjustment is staggering.
Unlike the other long-lived megafauna, Steller’s sea cows, one of the last of the Pleistocene survivors to die out, found their refuge in a remote scrap of the ocean instead of on land. The sea cows were relatives of the manatee and dugong. Unlike those two species, they were adapted to living in frigid Arctic waters. They were also much larger, growing to be as long as 30 feet from tail to snout, versus 10 for a manatee. Before the Ice Age, they seem to have been ubiquitous along the edge of the Pacific, living everywhere from Japan to the Baja Peninsula. By the 18th century, when they were first made known to Western science, the sea cows were confined to waters surrounding two tiny Arctic islands in the Commander chain, in between the Aleutians and the Kamchatka Peninsula.
The sea cows were first described by the German naturalist Georg Steller in the 18th century. Steller was part of an expedition led by the Danish explorer Vitus Bering. Financed by the Imperial Russian government, its mission was to chart the waters between Siberia and North America, and find a workable route between the two if possible.
Nathaniel Scharping in Discover:
“Cancer has been cured a thousand times.”
So says Christopher Austin, the director of the National Center for Advancing Translational Sciences (NCATS) at the National Institutes of Health. Austin should know — as the director of NCATS, his focus is on exactly these kinds of groundbreaking laboratory studies. His proclamation comes with a significant caveat that will pop the bubbles in your champagne. Austin is so interested in these studies because they all happened in mice, in a lab. When the hundreds of different drugs that made mouse tumors disappear were carried forward to human trials, they went in and came out without doing what they promised. Or worse, they turned out to be toxic. The failure of drugs and procedures to translate from animal models to humans plagues the entirety of medical research. An astounding 90 percent of drug trials never make it from the first phase of development to FDA approval, and the humble lab mouse shoulders much of the blame. Long held as the standard model for animal research, the mouse has seen its reign called into question by recent scientific advances. But should scientists ditch their furry lab model altogether, or can they simply design a better mouse?
Mice were introduced into the lab back in the 1920s, when an ambitious young geneticist named Clarence Cook Little believed he had found the perfect model for studying cancer. Little strongly believed that cancer was an inheritable disease, and mice, short-lived and low maintenance, turned out to be the ideal subjects for his experiments. Little would go on to found The Jackson Laboratory and sell mouse strains to researchers all over the country, even going so far as to secure the mouse as the official animal model for research funded with government grants following the passage of the National Cancer Institute Act in 1937.
Michael Price in Science:
For women looking to become pregnant through in vitro fertilization (IVF), a diamond petri dish could be a girl’s best friend. That’s one conclusion from a new study, which finds that human sperm cells live longer and move more efficiently on diamond surfaces compared with traditional polystyrene petri dishes. The researchers also discovered that shining a red light on the sperm cells improved their performance. Combining these techniques might significantly increase the chances of IVF success. During IVF procedures, sperm is introduced to an egg in a petri dish. If the egg is successfully fertilized, the resulting zygote is implanted into the woman’s uterus. The critical fertilization stage usually takes place on polystyrene, a plastic from which almost all petri dishes are made. Sperm, like most cells, exude harmful, cell-disrupting molecules known as reactive oxygen species (ROS). Inside the body, these ROS last only fractions of a second and are quickly neutralized as they bind with nearby molecules. But polystyrene naturally forms a thin, gluelike nano-layer of water on its surface, which traps the ROS. “The sperm is stewing in its own ROS,” says Andrei Sommer, a physicist who led the study while working at Ulm University in Germany and who is currently working as an independent scientist. “This longer exposure is highly, highly, highly destructive to the cell.” The upshot is that many sperm cells exposed to polystyrene quickly lose their motility, turning from “guided missiles” into barely moving or immobile cells incapable of fertilizing an egg. Generally, the more highly motile sperm cells a sample contains, the more likely it is that an IVF procedure will result in pregnancy, though a host of other factors like sperm count and egg quality also play a role. (Eggs produce less ROS and therefore are less susceptible to oxidative damage than are sperm cells.)
Building on previous work by his lab, Sommer and colleagues wondered whether keeping sperm cells on a material like diamond, which forms a slick, not sticky, surface layer of water, would protect them from ROS. The researchers coated quartz petri dishes with a superthin layer of diamond less than a micron thick, and put human sperm cells on them. They assigned the cells one of four grades ranging from A (rapidly moving and the most likely to fertilize) to D (completely immobile and incapable of fertilization). Then they did the same thing for sperm cells taken from the same sample but placed on traditional polystyrene petri dishes.
A lake sunken among
cedar and black spruce hills:
On the ice a woman skating,
red against the white,
concentrating on moving
in perfect circles.
(actually she is my mother, she is
over at the outdoor skating rink
near the cemetery. On three sides
of her there are streets of brown
brick houses; cars go by; on the
fourth side is the park building.
The snow banked around the rink
is grey with soot. She never skates
here. She’s wearing a sweater and
faded maroon earmuffs, she has
taken off her gloves)
Now near the horizon
the enlarged pink sun swings down.
Soon it will be zero.
With arms wide the skater
turns, leaving her breath like a diver’s
trail of bubbles.
Seeing the ice
as what it is, water:
seeing the months
as they are, the years
in sequence occurring
the miniature human
figure balanced on steel
needles (those compasses
floated in saucers) on time
time circling:        miracle
Over all I place
a glass bell
by Margaret Atwood
from Selected Poems
Oxford University Press, 1976
Thursday, April 20, 2017
Mitch Waldrop in National Geographic:
Albert Einstein was already a world-famous physicist when the FBI started keeping a secret dossier on him in December 1932. He and his wife Elsa had just moved to the United States from their native Germany, and Einstein had been very vocal about the social issues of his time, arguing publicly against racism and nationalism.
By the time of Einstein’s death on April 18, 1955, that FBI file would be 1,427 pages long. Agency director J. Edgar Hoover was deeply suspicious of Einstein’s activism; the man was quite possibly a communist, according to Hoover, and was certainly “an extreme radical.”
Einstein himself probably would have laughed out loud at those labels if he’d known about them; he’d heard far worse from the Nazis back home. And he was not at all intimidated by officialdom. “Unthinking respect for authority is the greatest enemy of truth,” he declared in 1901.
Video length: 1:08:36
[Thanks to Yohan John.]
Mark Blyth and Matthias Matthijs in Review of International Political Economy:
Like the Bourbons of old, who learned nothing and forgot nothing, observers failed to register the narrow loss of right-wing populist Norbert Hofer in the Austrian presidential elections, or the defeat of Prime Minister Matteo Renzi's constitutional reforms in Italy just a month after Trump's victory, as important, let alone as connected to prior events. At the time of writing, the spread between German Bunds and French Trésors has steadily widened as financial markets began to price in the real possibility that Marine Le Pen, leader of the extreme-right Front National, may actually win the French Presidential elections in May 2017. Contemporary International Political Economy (IPE) theory has pretty much nothing to say about these events. Sadly, this is nothing new, since it was no different a mere decade before.
Back in early 2007, both scholarly and elite conventional wisdom agreed on the following. The ongoing multilateral Doha round of the World Trade Organization (WTO) was going to take some time to negotiate, but it would be completed in the not too distant future. Unsustainable global macroeconomic imbalances were thought to increase the risk of a US dollar crash. The world economy was blessed by a ‘Great Moderation’ in volatility wrought by the technocratic competence of independent central bankers. The euro was going to be the new international reserve currency of choice and a beacon of monetary stability. Further economic and political integration was still the choice for Europe despite temporary setbacks around the adoption of a proposed constitutional treaty. The IMF looked like it would soon have to shut down its operations because there were no more financial crises lurking around the corner. The ‘BRICs’ were going to substantially increase their overall clout in global economic governance. And finally, international migration flows would only continue to intensify. So what actually happened?
Alan McIntyre in The Scottish Review:
So why has Trump latched onto the idea that coal mining is an industry that must be saved at all costs? It's for the same reason that he peddles the fantasy of bringing steel mills back to Ohio – a nostalgia for an industrial America that shaped and supported working-class communities and which technological progress is inexorably erasing. The truth, whether it's miners in West Virginia or shipyard workers on the Clyde, is that hard physical work in homogenous communities created social bonds and social capital that isn't easily replaced. Whether those communities are in the remote valleys of Appalachia or the somewhat less remote valleys of South Wales, the closing of a mine wreaks social havoc.
Thirty years on from Arthur Scargill's failed strike, the parallels are clear. It wasn't just about jobs and economics, it was about trying to protect a way of life that, while dirty and dangerous, was also unique. The jobs may eventually be replaced, but loose networks of Uber drivers are unlikely to spawn many male voice choirs, world-class brass bands, or the social and cultural corona that surrounded the pits. A key insight from J D Vance's recent award-winning Appalachian family history, 'Hillbilly Elegy', is that when the social capital anchored in a concentration of traditional jobs gets eroded, whole communities can descend into a purposeless existence in which drug abuse, alcoholism and domestic violence find fertile ground.
The political current that Trump has expertly tapped into is the need for 'meaningful work' rather than just a job.
Inside every utopia is a dystopia striving to get out. World-changing plans to bring all human life and activity under beneficent control devolve inevitably into regimentation and compulsion. Edenic life-affirming communes descend into chaos and waste. Our presently evolving techutopia has barely reached its peak, and yet in it this horror-movie process has already begun: information must be free, and so lies and manipulations proliferate; common human connections are degraded; limits on power and self-dealing erode. Inequality increases with differential access. And all this in less than a single generation.
The utopian promises of the mid-twentieth century (modernism, broadly understood) stayed alive for longer, largely because its projects, which depended on design, manufacturing processes, materials, and city planning, took years or decades to be fully realized, while the world seemed to stay much the same. In 1939 the greater part of America was still a land of Toonerville trolleys, boarding houses, balky mules, door-to-door salesmen, pump handles, iceboxes, A&P’s, nerve tonics, kerosene, two-bit haircuts, hand-rolled cigarettes, incurable diseases, and patched inner-tubes, even as the idea of the future was brought closer with every newsreel and skyscraper and issue of Life or Look.
While older utopias often were predicated on returning to the virtues of an imagined past, a key figure behind this utopia of the new was Norman Bel Geddes, a theatre designer turned industrial designer. Bel Geddes is best known for designing the General Motors Futurama exhibit at the 1939 New York World’s Fair, a huge and hugely celebrated vision of the world of 1960, full of towering modernist skyscrapers in new cities and lots and lots of cars.