Culture

ACamp1900

Counting my ‘bet against ND’ winnings
Messages
48,951
Reaction score
11,234
All I remember from those movies:

-they are generally really shitty
-they killed Jazz at some point which made them even shittier
 

connor_in

Oh Yeeaah!!!
Messages
11,433
Reaction score
1,006
What's your opinion?

It's fine as a thought experiment as far as it goes, I guess, and within the limited scope of the thing I would try to save the little girl too.

But, whether Shapiro is being douchey or not (Tomlinson, whom Shapiro was responding to, was being douchey too; he said that Shapiro proved his point, yet didn't actually get the point, and then blocked him), if you start to adjust the factors, the answer could easily change. The freezer could possibly survive, and would have a better chance than the girl. Just because a person chooses the girl, does that mean it's the right choice? Change the little girl to Mother Teresa...same choice? Hitler? The future of the human race hangs in the balance...same choice?

Do you throw yourself on a grenade for your buddies or man the SAM to try to shoot down the plane that might destroy a whole village/the base/etc?

I have never been a huge fan of thought experiments, or of drawing conclusions from them, because I have always been someone who plays out the devil's advocate voice in my head, whether for myself or for a group. So I can completely understand Shapiro's arguments.
 

connor_in

Oh Yeeaah!!!
Messages
11,433
Reaction score
1,006
All I remember from those movies:

-they are generally really shitty
-they killed Jazz at some point which made them even shittier

I have not seen the last 3...and not sure I ever finished the one before that
 

Domina Nostra

Well-known member
Messages
6,251
Reaction score
1,388
What's your opinion?

You save the girl. There is no reason to accept the utilitarian's interpretation that saving one girl over 1,000 embryos means that you don't value the others as human life.

The choice is a false one. You could hypothetically concede the premise that a little girl's life is more valuable than 1,000 non-implanted lab embryos while still holding the compatible premise that those embryos have real worth and should be protected insofar as possible, just as you can concede that the President's life should be the first priority of the Secret Service without conceding that the civilian lives that might be lost while the President is being secured are worthless.

The primary purpose of insisting on the fact that embryos are human beings is not to make sure that scientists and rescue personnel treat them like human beings in the course of their lab experiments. It's to ensure that they don't conduct inhumane lab experiments with human beings in the first place.

Insofar as scientists decide that they (as expert technicians), or 51% of eligible voters, or the Supreme Court, or an ethics board, or anyone else gets to make an arbitrary decision about what human life is and when it starts, you are going to end up with really morally twisted, untenable situations like this one. Here it's pretty much the extreme: 1,000 human lives in a metal box, about to be burned. That is messed up.

But you still save the girl, because emotionally it would be almost impossible and unthinkable to do otherwise. Can you really live the rest of your life with a little girl's screams stuck in your head, knowing you could have stopped them? Could you really look her parents, or brothers and sisters, in the eye and explain why 1,000 is more important than 1? At the same time, you have every right to think that the mad scientists who are mass-producing humans in their lab are barbaric. But you aren't suddenly complicit in their insanity because you saved a little girl. Those lives are on them, not you.

This, of course, raises the question of whether it would be morally acceptable to save the 1,000 embryos over the little girl. Personally, I think to do so exhibits a lack of judgment that leans toward a cold, over-abstract utilitarianism. They aren't implanted. Their road to further development is too tenuous to be weighed against a little girl's in that moment. But I wouldn't blame an earnest 13-year-old for doing it in good conscience.
 
Last edited:

connor_in

Oh Yeeaah!!!
Messages
11,433
Reaction score
1,006
<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">Lady Gaga, Joe Biden team up for PSA on sexual assault <a href="https://t.co/JK8oj7JxCz">https://t.co/JK8oj7JxCz</a> <a href="https://t.co/rGpdk2gwXw">pic.twitter.com/rGpdk2gwXw</a></p>— The Hill (@thehill) <a href="https://twitter.com/thehill/status/923377153856634880?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>



HAHAHAHAHA...Joe Biden?

<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="und" dir="ltr"> <a href="https://t.co/fuedGUDekh">pic.twitter.com/fuedGUDekh</a></p>— GayPatriot™ (@GayPatriot) <a href="https://twitter.com/GayPatriot/status/923392550798544898?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>



...and in the interest of equal time...

Second Woman: George H.W. Bush Groped Me
 

Legacy

New member
Messages
7,871
Reaction score
321
The primary purpose of insisting on the fact that embryos are human beings is not to make sure that scientists and rescue personnel treat them like human beings in the course of their lab experiments. It's to ensure that they don't conduct inhumane lab experiments with human beings in the first place.

Insofar as scientists decide that they (as expert technicians), or 51% of eligible voters, or the Supreme Court, or an ethics board, or anyone else gets to make an arbitrary decision about what human life is and when it starts, you are going to end up with really morally twisted, untenable situations like this one. Here it's pretty much the extreme: 1,000 human lives in a metal box, about to be burned. That is messed up.

Reps. (A selective quote from above.) This is the ethical question, and the reason I posted the article about surgeons in China making DNA alterations to an artificially fertilized egg outside the womb to correct beta thalassemia, and the article on surgery in Germany on a months-old fetus (or child, if you wish) to correct spina bifida.

Fetal surgery

If you can do that surgery and you believe that the fetus is a child, shouldn't you do that surgery?

If you can treat genetic diseases with stem cells, and the fetus (or child) has a high likelihood of spontaneous abortion (or not), aren't you obligated to do so?

If you feel that your decisions on the use of stem cells must take into account how those stem cells were obtained and their source, shouldn't the same decision process apply to your choice of whom to save from the fire, namely that the embryos were artificially created in a lab? Then you choose to save the girl.

Inherited Genetic Diseases: Stem Cell Treatments
 
Last edited:

wizards8507

Well-known member
Messages
20,660
Reaction score
2,661
If you can do that surgery and you believe that the fetus is a child, shouldn't you do that surgery?

If you can treat genetic diseases with stem cells, and the fetus (or child) has a high likelihood of spontaneous abortion (or not), aren't you obligated to do so?
Not necessarily. From the Catholic Medical Association:

"Extraordinary means are medicines, treatments and operations that do not provide a benefit, and/or involve excessive burden, pain, or expense. While people may use extraordinary means, they are not morally obligated to do so since earthly life for humans is not an absolute good and because, at some point, medical interventions are no longer effective and/or because the costs and burdens of medical interventions are out of proportion to the good of earthly life that they are intended to serve."

If you feel that your decisions on the use of stem cells must take into account how those stem cells were obtained and their source, shouldn't the same decision process apply to your choice of whom to save from the fire, namely that the embryos were artificially created in a lab? Then you choose to save the girl.

Inherited Genetic Diseases: Stem Cell Treatments
That shouldn't matter. Even if the embryos were created immorally, they're still human lives and should be treated as such.

Regardless, the implications of the analogy are flawed in a different way. The implication, clearly, is that if you let the embryos die, then you're a hypocrite who doesn't REALLY believe that they're human lives. One can recognize that they're human lives and still make morally coherent decisions that allow them to die. It's the difference between a direct abortion and an indirect abortion. For example, consider an ectopic pregnancy, in which an embryo implants inside a Fallopian tube. The only possible outcome is that the child will die, but there are medical decisions that need to be made for the life and well-being of the mother. The mother is at risk of a ruptured Fallopian tube, which can result in death or, at the very least, severe consequences for her future fertility. It is never licit to directly kill the embryo, but it's acceptable to allow the embryo to die as a byproduct of the mother's treatment. To connect this example to the analogy, you're saving the mother (the girl), which results in the embryos dying, but that's a very different thing than directly killing the embryos yourself.
 

Legacy

New member
Messages
7,871
Reaction score
321
Not necessarily. From the Catholic Medical Association:

"Extraordinary means are medicines, treatments and operations that do not provide a benefit, and/or involve excessive burden, pain, or expense. While people may use extraordinary means, they are not morally obligated to do so since earthly life for humans is not an absolute good and because, at some point, medical interventions are no longer effective and/or because the costs and burdens of medical interventions are out of proportion to the good of earthly life that they are intended to serve."

With regard to fetal surgery to correct genetic defects, the teaching on indirect abortion and ectopic pregnancy in Humanae Vitae says:
"the Church does not consider at all illicit the use of those therapeutic means necessary to cure bodily diseases, even if a foreseeable impediment to procreation should result therefrom — provided such impediment is not directly intended."

Fetal surgery to correct those defects is therapeutic and allowed? But stem cell therapy to correct those defects is not?
 
Last edited:

Whiskeyjack

Mittens Margaritas Ante Porcos
Staff member
Messages
20,894
Reaction score
8,126
<blockquote class="twitter-video" data-lang="en"><p lang="en" dir="ltr">Some "highlights" from an interview with the first robot citizen <a href="https://t.co/qwrPkvOWpK">pic.twitter.com/qwrPkvOWpK</a></p>— Dave Jorgenson (@davejorgenson) <a href="https://twitter.com/davejorgenson/status/923582231507030017?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">Some "highlights" from an interview with the first robot citizen <a href="https://t.co/qwrPkvOWpK">pic.twitter.com/qwrPkvOWpK</a></p>— Dave Jorgenson (@davejorgenson) <a href="https://twitter.com/davejorgenson/status/923582231507030017?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>

Dave Jorgenson (pronounced YOR-GEN-SEN) is much less funny than he thinks. And you can tell him I said that, too.
 

Whiskeyjack

Mittens Margaritas Ante Porcos
Staff member
Messages
20,894
Reaction score
8,126
Dave Jorgenson (pronounced YOR-GEN-SEN) is much less funny than he thinks. And you can tell him I said that, too.

It was tweeted into my timeline with "This couldn't possibly go wrong." Shared it mostly for the creepy video. Didn't even read that guy's commentary.
 

dublinirish

Everestt Gholstonson
Messages
27,329
Reaction score
13,092
<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">Lady Gaga, Joe Biden team up for PSA on sexual assault <a href="https://t.co/JK8oj7JxCz">https://t.co/JK8oj7JxCz</a> <a href="https://t.co/rGpdk2gwXw">pic.twitter.com/rGpdk2gwXw</a></p>— The Hill (@thehill) <a href="https://twitter.com/thehill/status/923377153856634880?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>



HAHAHAHAHA...Joe Biden?

<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="und" dir="ltr"> <a href="https://t.co/fuedGUDekh">pic.twitter.com/fuedGUDekh</a></p>— GayPatriot™ (@GayPatriot) <a href="https://twitter.com/GayPatriot/status/923392550798544898?ref_src=twsrc%5Etfw">October 26, 2017</a></blockquote>



...and in the interest of equal time...

Second Woman: George H.W. Bush Groped Me

Biden has a track record of supporting women's rights. Not sure what point you're trying to make here.
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
It was tweeted into my timeline with "This couldn't possibly go wrong." Shared it mostly for the creepy video. Didn't even read that guy's commentary.

I haven't bothered with it either; he's an old friend of mine. I knew he started at WaPo, but I never thought he'd end up being shared by other friends.
 

wizards8507

Well-known member
Messages
20,660
Reaction score
2,661
Fetal surgery to correct those defects is therapeutic and allowed? But stem cell therapy to correct those defects is not?
There's nothing wrong with stem cell therapy if those stem cells are harvested from adults or cord blood or anything like that. The issue with embryonic stem cell therapy is the deliberate fertilization of embryos for the purpose of destroying them.
 

Whiskeyjack

Mittens Margaritas Ante Porcos
Staff member
Messages
20,894
Reaction score
8,126
Mary Eberstadt just published an article in The Weekly Standard titled "The Primal Scream of Identity Politics":

Just when it seemed as if the election of Donald Trump had rendered his supporters incoherent with triumphalism and his detractors incoherent with rage—thereby dumbing-down political conversation for a long time to come—something different and more interesting happened. A genuine debate has sprung up among liberals and progressives about the subject of the hour: identity politics.

Jump-started by a short manifesto called The Once and Future Liberal: After Identity Politics by Columbia University professor Mark Lilla, it’s a conversation worth following for reasons beyond partisanship. As in his New York Times essay published 10 days after Trump’s electoral victory, Lilla’s purpose in this broadside is two-fold: to excoriate identity politics, sometimes called “identity liberalism,” and to convince his “fellow liberals that their current way of looking at the country, speaking to it, teaching the young, and engaging in practical politics has been misguided and counterproductive.”

The discussion now underway on the left illuminates a fault line that has yet to be sufficiently mapped or explained. The deeper question raised is not the instrumental concern of Lilla and others—how liberalism can retool itself in order to win more elections. Rather, it’s the elemental one: How has the question of “identity” come to be emotional and political ground zero for so many in America, and elsewhere in the Western world?

As the online Stanford Encyclopedia of Philosophy explains in its entry on identity politics, “wherever they line up in the debates, thinkers agree that the notion of identity has become indispensable to contemporary political discourse.” In The Once and Future Liberal, Lilla offers one kind of answer to why that’s so. “[T]hirty years of economic growth and technological advance that followed the Second World War,” he argues, combined with new geographic, institutional, and erotic mobility and led to a “hyperindividualistic bourgeois society, materially and in our cultural dogmas.”

Flush with prosperity and unprecedented new freedoms, we moderns, Lilla believes, went on to atomize ourselves: “Personal choice. Individual rights. Self-definition. We speak these words as if a wedding vow.” By the 1980s, such hyperindividualism coalesced into what he calls the “Reagan Dispensation,” which prized self-reliance and small government over the collective—thus marking a radical break from the preceding “Roosevelt Dispensation” emphasizing more communal attachments, including duty and solidarity.

By embracing the politics of identity, Lilla says, liberals and progressives have unwittingly contaminated their politics with a “Reaganism for lefties,” resulting in the toxic consequences visible today: shutdowns of free speech on campuses, out-of-touch urban and globalized elites, and a political order deformed into a “victimhood Olympics.”

In effect, his is a supply-side answer to the “why” question: Identity politics became the order of the day because it could. What’s lacking from this analysis—as from other critiques, right as well as left—is what might be called the demand-side answer: Why have so many people found in identity politics the very center of their political being?

After all: That identitarianism is now the heart and soul of politics for many is a visceral truth—as raw as the footage of violent political clashes making headlines with a frequency that would have shocked most citizens only a decade ago. What’s singular about such politics is exactly its profound and immediate emotivism, its frightening volatility, its instantaneous ignition into unreasoned violence. Lilla acknowledges this reality obliquely in describing “a kind of moral panic about racial, gender and sexual identity”—all true, as far as it goes. But the problem is that it doesn’t go nearly far enough.

When a mob of young men attack a 74-year-old man and a middle-aged woman, as happened at Middlebury College in March in the case of Charles Murray and Allison Stanger, something deeper is afoot than American individualism run amok. When debate after campus debate is preemptively shut down due to social media threats of violence, reasoned talk of a “Reagan Dispensation” doesn’t begin to capture the menace there. Berkeley spent $600,000 on “security” for a visit by the conservative author and pundit Ben Shapiro. Non-progressive speakers who have nothing to do with racism or supremacism are regularly harassed, threatened, disinvited, and shouted down on campuses across the country. To ascribe these transgressions to identitarian narcissism alone is to miss what’s truly novel about them. And most chilling.

What’s unfolding on campuses today isn’t merely the “pseudo-politics of self-regard” of Lilla’s description. It’s all panic, all the time. Even “assaults on free speech” doesn’t capture the gravity of the new menace, though of course they are that, too. Dangerous collective hysteria is more like it.

Writing after she gave a 2015 lecture at Oberlin on feminism that was mocked and jeered and protested, including by people whose mouths were covered in duct tape, Christina Hoff Sommers observed that “some of those students need the services of a professional deprogrammer. What I saw was very cult-like.” “The inmates ran the asylum,” Charles Murray reported of the attack at Middlebury, adding that he had “never encountered anything close to this. . . . and the ferocity.” Ben Shapiro, who has been heckled all over the country, has pronounced his protesters “delusional.”

The trend toward preemptive silencing is, moreover, escalating. As Stanley Kurtz has documented in National Review, there were as many anti-speech incidents on U.S. campuses in the first six weeks of the fall 2017 semester as in the entire spring semester, including the “disruption of a lecture at Reed College, the shout-down of former FBI Director James Comey at Howard University, the disruption of an immigration debate at the University of Pittsburgh, the shout-down of a spokesman for the ACLU at William and Mary, and the attempted shout-down of the President of Virginia Tech.”

This aggressive irrationalism goes missing from The Once and Future Liberal, as it does from most other accounts by liberals of identity politics. It is true, as Lilla observes, that today’s culture of victimization encourages people to “descend into the rabbit hole of self.” But the question remains: What gravitational force pulls them toward that hole in the first place?

In a widely discussed essay in the Atlantic in 2015, “The Coddling of the American Mind,” Jonathan Haidt and Greg Lukianoff offered another answer of sorts. “Something strange is happening at American colleges and universities,” they reported, and “some recent campus actions border on the surreal.” The authors dubbed the phenomenon “vindictive protectiveness”—a runaway effort to protect students from psychological harm, including by punishing putative transgressors.

Alarmed by this development for several reasons—not least because they fear that it teaches students to think “pathologically”—the authors pointed to empirical measures of campus devolution. Most arresting, they noted, rates of mental illness in young adults have been rising, both on campus and off, in recent decades. Some portion of the increase is surely due to better diagnosis and greater willingness to seek help, but most experts seem to agree that some portion of the trend is real. Nearly all of the campus mental-health directors surveyed in 2013 by the American College Counseling Association reported that the number of students with severe psychological problems was rising at their schools [emphasis added].

The authors also mentioned “The rate of emotional distress reported by students themselves is also high, and rising.”

Such a generation-wide descent into psychiatric trouble calls for explanation. Haidt and Lukianoff, to their credit, were uncertain about the why question, writing, “It’s difficult to know exactly why vindictive protectiveness has burst forth so powerfully in the past few years.” They zeroed in on several possible contributors: the surge in crime in the 1960s and ’70s that made parents more protective of their children; the “zero tolerance” policies in schools after the Columbine shootings; increased political polarization; and the rise of social media.

No doubt social media are an inescapable part of what ails us. The question is no longer whether Google is making us stupid, as Nicholas Carr put it in 2008. It’s instead whether Facebook and Snapchat and Instagram are driving many to mutually assured social destruction. Yet for all that public life is being configured and disfigured by connectivity, even social media and the Internet do not answer the why question about identity frenzy. They beg it, for two reasons: first, because identity politics predates the Internet; and second, because the self-absorption and insecurity amplified by nonstop introspection online just summon the point of causality all over again. Why can’t Narcissus stop looking at himself? The frantic, habitual electronic construction of one’s self or selves underscores that identity is all the rage—often literally—especially, though not only, among people in their teens, 20s, and 30s.

Other writers suggest a third explanation of sorts for the fury behind identity politics: white racism.

In a piece titled “America’s First White President,” published on December 10, 2016, Salon executive editor Andrew O’Hehir delivered an example of this line of thought. “Trump,” he wrote, “is the first president defined by whiteness, the first whose glaring and overwhelming whiteness is a salient issue that lies at the core of his appeal.” The “presidential candidate’s race played a central role in his campaign, and is one of the key factors that got him elected.” The 2016 result, in O’Hehir’s telling, amounts to retribution of some kind for America’s having, twice, elected a black president (“the election of Barack Obama inflicted a psychic wound that demanded immediate payback, at almost any cost”).

In an essay published in September, “The First White President,” bestselling author Ta-Nehisi Coates issued a related indictment in the Atlantic. Drawn from a new collection called We Were Eight Years in Power: An American Tragedy, Coates’s piece asserted that: “To Trump, whiteness is neither notional nor symbolic but is the very core of his power”; Trump “is a white man who would not be president were it not for this fact”; and Trump is “the first president whose entire political existence hinges on the fact of a black president.” The essay also included an attack on a number of high-profile writers, Mark Lilla among them, as unreliable commentators on identity politics—on the grounds that “those charged with analyzing [Trump] cannot name his essential nature, because they too are implicated in it.”

As these analyses and associated commentary show, the idea that contemporary politics is rooted fundamentally in white racism endures. Once again, though, as an explanation for the prevalence and emotional staying power of identity politics at large, white racism doesn’t suffice—for the simple reason that so many other members in the identitarian coalition claim other motivations, other oppressors, and other grievances.

For starters, identity politics isn’t just a left-wing thing. As Washingtonian magazine noted of a pro-Trump rally on the National Mall in September, “There were Hispanics for Trump, Grandmas for Trump, Gays for Trump.” A member of the last explained, “Identity politics is very popular, and very important.” Some on the “alt-right” regard themselves as identitarians, too.

Then there are, perhaps most notable of all, the identitarians of sexual politics, whose influence on law and culture has been especially prodigious during the past quarter-century. In addition to the epiphenomenal manifestations of the obsession over sexual and gender identity—Facebook’s 71 genders, media focus on intersex and transgendered people, the “bathroom wars,” and the rest—there are areas into which sexual identitarianism has sunk lasting roots.

At least eight countries—including India, Germany, and Australia—now allow for identification as something other than male or female, and a growing number of states and other authorities leave gender identity in official forms to personal say-so. Marriage, adoption, commercial surrogacy, and other areas of family law have been reconfigured around the world. In fact, viewing the whole of “identity politics” through the single lens of public efficacy, one would have to say that sexual identitarians have both exercised and obtained more power than any other single group.

The legalization of same-sex marriage, as observers both for and against the 2015 Obergefell decision came to agree, owed most to one factor: empathy for the moral claim that attraction to one’s own sex is like pigmentation or DNA, immutable and immune to change. Yet a split cultural second later, exactly the opposite case has come to be made for the intersex, transgendered, and other sexual minorities: that identity is fluid, indeterminate, perhaps even recalcitrant, rather than born that way.

In this head-on collision of purported creation stories about sexual and gender identity that cannot possibly both be true, we see once more that the question Who am I? is the most fraught of our time. It has become like a second skin: something that can’t be sloughed off, or even scratched, without excruciating pain to the subject—reason and logic and the rest of persuasion-as-usual be damned.

White racism, past and present, explains many terrible things. So do other evils, including the kind just revealed in the Harvey Weinstein scandal. But neither racism nor sexual predation nor related injustices can explain the primordial emotionalism and fierce irrationality that have come to be part and parcel of identitarianism for all.

Writing in New York magazine in September, Andrew Sullivan delivered an insight in the direction of the why question. American politics, he wrote, has become a war between “two tribes”: “Over the past couple of decades in America, the enduring, complicated divides of ideology, geography, party, class, religion, and race have mutated into something deeper, simpler to map, and therefore much more ominous.”

Yet what, exactly, has caused so many Americans to want to join one of these tribes in the first place? Sullivan advanced a list of many “accelerants” from the past few decades: the failed nomination of Judge Robert Bork to the Supreme Court, mass illegal Latino immigration, Newt Gingrich’s GOP revolution, talk radio, Fox News, MSNBC, partisan gerrymandering, the absence of compulsory military service, multiculturalism, declining Christianity, the rural brain drain, and more.

No doubt, taken together, these disparate events explain something about the political trajectory now behind us. But does one really become part of a horde, defined in opposition to other hordes, over relatively quotidian prompts like these? Doesn’t the very word “tribal” suggest that something more primal may be in the mix too?

Of course it does.

Just as “tribe” is antecedent to the state, something else is antecedent to the tribe—something missing from all the high-profile talk, pro and con, about how American and other Western societies have become mired in identitarianism.

In laying out the particulars of today’s “tribes,” Sullivan wrote of “unconditional pride, in our neighborhood and community; in our ethnic and social identities and their rituals; among our fellow enthusiasts. There are hip-hop and country-music tribes; bros; nerds; Wasps; Dead Heads and Packers fans; Facebook groups. . . . And then, most critically, there is the Uber-tribe that constitutes the nation-state, a megatribe that unites a country around shared national rituals, symbols, music, history, mythology, and events.” And here we reach a turning point, not just in this essay but also in the widening argument, because that list omits what the majority of humanity would call the most important “tribe” of all.

It’s not that “America Wasn’t Built for Humans,” as the title of Sullivan’s piece has it. It’s rather that America, like other civilizations, was built for humans who learned community not from roving bands of unrelated nomads, but from those around them—beginning in the small civilization of the family.

In Democracy in America, Alexis de Tocqueville wrote of how democratic governance shapes familial relations, rendering fathers and sons more equal and closer and less hierarchical than they are in its aristocratic counterparts. If it’s obvious that a form of government can shape the family, isn’t it even more obvious that the first polity to which future citizens belong—the family—will shape the kind of citizens they become?

Our macro-politics have gone tribal because our micro-politics are no longer familial. This, above all, is what’s happened during the five decades in which identity politics went from being unheard of to ubiquitous.

To quote from the Stanford Encyclopedia of Philosophy once more, “although ‘identity politics’ can draw on intellectual precursors from Mary Wollstonecraft to Frantz Fanon, writing that actually uses this specific phrase, with all its contemporary baggage, is limited almost exclusively to the last thirty years.” Its founding document is “The Combahee River Collective Statement,” a 1977 declaration that grew out of several years of meetings among black feminists in Massachusetts.

The key assertion of this manifesto, which prefigured the politics to come, is “This focusing on our own oppression is embodied in the concept of identity politics. We believe that the most profound and potentially most radical politics come directly out of our own identity, as opposed to working to end somebody else’s oppression.”

And who is the “somebody else” to whom the document refers? Men. “Contemporary Black feminism,” the Combahee River Collective explained, “is the outgrowth of countless generations of personal sacrifice, militancy, and work by our mothers and sisters [emphasis added].” When men are mentioned in the Combahee statement, it is largely as adversaries with “habitually sexist ways of interacting with and oppressing Black women.” The writers mourn that male reaction to feminism “has been notoriously negative.” Most evocative of all is the note of dejection: “We realize that the only people who care enough about us to work consistently for our liberation are us.”

The founding document of identity politics, in other words, reflects reality as many African American women would have found it in the 1970s—one in which they were the canaries in the coal mine of the sexual revolution. It’s a world in which men are ever less trusted, relations between the sexes are chronically estranged, and marriage is thin on the ground. African American women were—and still are—disproportionately affected by aspects of the sexual revolution like abortion, out-of-wedlock births, and fatherless homes. Isn’t it suggestive that the earliest collective articulation of identity politics came from the community that was first to suffer from the accelerated fraying of family ties, a harbinger of what came next for all?

Identity politics cannot be understood apart from the preceding and concomitant social fact of family implosion. The year before the Combahee document’s publication—1976—was a watershed of a sort. The out-of-wedlock birth rate for black Americans tipped over the 50-percent mark (the 1965 Moynihan Report worried over a rate half as high). This rate has kept climbing and exceeded 70 percent in 2016. At the same time, other measures indicating the splintering of the nuclear and extended family expanded too. By 2012, Millennial women—who were then under the age of 30—exhibited for the first time the out-of-wedlock birth rate of black women in 1976: i.e., more than 50 percent. Millennials, of course, are the demographic backbone of identity politics.

And the out-of-wedlock birth rate is just one measure of the unprecedented disruption of the family over the last half-century-plus. Consider, just in passing, the impact of abortion. In 2008, the Guttmacher Institute reported that 61 percent of women terminating pregnancies were already mothers of at least one child. Many children—and many grown children—have been deprived of potential siblings via pregnancy termination.

Abortion, like single motherhood, is only one engine of a phenomenon that has come to characterize more and more American lives during the past half-century: what might be called the “family, interrupted.” Many post-sexual revolutionary people now pass through life vaguely aware of family members who could have been but aren’t—whether via parental disruption in childhood or the long string of exes now typical in Western mating or abortion or childlessness by choice or other romantic and sexual habits that did not exist en masse until after the 1960s.

Many of us now live in patterns of serial monogamy, for instance, in which one partner is followed by another. When children occur, this means a consistently shifting set of family members to whom one is sometimes biologically related and sometimes not: stepfathers, half-siblings, “uncles,” and “cousins.” As couples form and un-form, finding new partners and shedding old ones, these relations morph with them. The result for many people is the addition and subtraction of “family” members on a scale that was unimaginable until reliable contraception for women—the FDA approved the first oral contraceptive in 1960—and the legalizing of abortion. Together they made the de-institutionalization of traditional marriage and family possible.

P.D. Eastman’s famous children’s book Are You My Mother? was published in 1960. In it, a baby bird goes from one creature to another trying to find one like him, finally to be re-united in a happy maternal ending. Imagine playing something like that game today.

Is That Your Stepsister? Maybe yes—if your mother is still married to that person’s biological father. If instead this parental unit has split up and her father has moved with his daughter to a new state and acquired a new step-mother and new stepsiblings, likely no.

Is That Your Uncle? This too depends entirely on what other adults in the picture have decided to do. If your so-called “uncle” was your mother’s boyfriend several boyfriends ago and she hasn’t seen him in years, then you and he are probably not “related” anymore—or anyway, would be unlikely to describe yourselves as such. On the other hand, if that “uncle” is your biological father’s biological brother, then likely the bond still holds—even if your biological mother and father never married.

Is That Your Niece? If she’s your sister’s biological or adopted child, you’d probably say yes. But if instead she’s your sister’s new live-in boyfriend’s child from a previous liaison, you’d hesitate. By similar logic, say, the adult children of a man who takes a trophy wife their age are unlikely to refer to her as “Mom.”

And round and round the game of musical identity chairs goes.

The result of all these shifting and swirling selves is that many people no longer know what almost all of humanity once knew, including in the great swath of history that was otherwise nastier, more brutish, and shorter than ours: a reliable circle of faces, many biologically related to oneself, present during early and adolescent life. That continuity helped to make possible the plank-by-plank construction of identity as son or daughter, cousin or grandfather, mother or aunt, and the rest of what’s called, tellingly, the family tree.

For many people, for all kinds of reasons, the roots of that tree no longer hold. Whether you miss Ozzie and Harriet or are instead Modern Family’s biggest fan—as the previous president claimed to be—is immaterial. The relative stability of yesterday’s familial identity could not help but answer the question at the heart of today’s politics—Who am I?—in a way that many of us can’t answer it anymore.

And, of course, these tributaries poled by isolated pilots are pulled into powerful currents of politics. It is in this sense that identity politics does indeed explain something of Donald Trump’s election—not so much because he is “our first white president,” but because he’s obviously a place-holder for something else. The faction of the country that includes the “resistance” treats him more like an abusive stepfather than an elected head of state. Then there is his base, whose loyalty in the face of one transgression after another has been remarked upon for many months. For at least some of those people, Trump is—as the alt-right provocateur Milo Yiannopoulos put it—“Daddy.”

As a final proof that the roots of identity politics owe much to what used to be called modern nurture—or the lack of it—consider one more phenomenon baffling to non-identitarians that becomes clearer on applying this proposed familial lens: the otherwise-inexplicable frenzy over “cultural appropriation.”

The emblematic eruption came at Yale in 2015, when the university’s intercultural affairs committee preemptively asked students to avoid certain Halloween costumes that might offend. Faculty member Erika Christakis offered a mild demurral, suggesting in an email the logical consequences of such a policy—that it might bar blonde toddlers, say, from dressing up as Asian characters from a popular Disney film. Her dissent sparked a protest letter signed by hundreds; an ugly public confrontation between menacing students and Christakis’s husband, sociologist Nicholas Christakis; a social media campaign against both of them; and, ultimately, her departure from Yale.

Yet this was only the most visible of the costume controversies. The president of the University of Louisville issued a public apology in 2015 after it was revealed that he and a group of staffers had worn sombreros and other Mexican-themed attire at a lunch party. Surveying the Halloween-costume parameters handed down by authorities at Tufts, a writer for the Daily Beast in 2016 noted that “students who heed the above guidelines are presumably restricted from dressing up as samurais, hombres, geishas, belly dancers, Vikings, ninjas, rajas, French maids, Bollywood dancers, Rastafarians, Pocahontas, Aladdin, Zorro, or Thor.” Even lingerie peddlers aren’t immune from the politics of “appropriation.” Victoria’s Secret was outed in the fashion pages last year not because of what its models weren’t wearing, ironically, but because of what they were: accessories that made sartorial reference to Chinese New Year and similar taboos.

Again, to perplexed bystanders who think a bongo drum is just a bongo drum and that tacos can be enjoyed by everyone, the cacophony over cultural “ownership” makes no sense. That’s why appropriation-protesters are typically written off by non-progressives as “snowflakes,” say, or the products of misguided “helicopter parenting”—i.e., spoiled brats. But what if the truth lies somewhere else?

“Mine! Mine! It’s mine!” The manifest panic behind cries of “cultural appropriation” is real—as real as the tantrum of a toddler. It’s as real as the developmental regression seen in the retreat to campus “safe spaces,” those tiny non-treehouses stuffed with candy, coloring books, and Care Bears. In social science, the toddler’s developmental “mine!” is called the “endowment effect”—the notion that humans ascribe extra value to possessions simply because they’re theirs. Some theorists consider it a subset of another human proclivity: loss aversion.

Maybe that cultural scream of “mine!” is issuing from souls who did have something taken from them—only something more elemental than the totemic objects now functioning as figurative blankies for lost and angry former children. As of today, less than 65 percent of American children live with both biological parents, even as other familial boughs have broken via external forces like the opioid crisis, criminality and incarceration, and globalization. Maybe depression and anxiety have been rising steadily among children and teenagers for a reason. Maybe the furor over “appropriation” unveils the true foundation of identity politics, which is pathos.

Did anyone really think things would turn out otherwise—that the massive kinship dislocations of the past 60 years wouldn’t produce increasingly visible, transformative effects not only in individual lives and households, but on politics and culture, too?

After all, it defies common sense to believe that the human surroundings during one’s formative years have no effect on the life to come. There’s also a library of social science, now over half a century in the making, tracing the links between fatherless homes and higher risks of truancy, criminality, psychiatric trouble, and the rest of the ledger suggesting that ripping up primordial ties hasn’t done society any favors. It’s all there, no matter how many of us have deep reasons for wishing otherwise.

One irony is certain. While identity politics has become an object of conversation in the left-leaning circles of Anglo-American and European political thought, deliverance from today’s disfigurations cannot come from the same quarter. The reason is simple. Not only identitarians but also liberals and progressives who are now anti-identitarian or identitarian-skeptical all agree on one big thing: The sexual revolution is off-limits for revision anywhere, anytime. It is their moral bedrock.

No-fault divorce, out-of-wedlock births, paid surrogacy, absolutism about erotic freedom, disdain for traditional moral codes: The very policies and practices that chip away at the family and drive the subsequent flight to identity politics are those that liberals and progressives embrace.

Then there are related family-unfriendly social realities that they also deem benign. Pornography, which once upon a time some feminists objected to, is now the stuff of their full-throated enthusiasm. Prostitution has been re-defined as the more anodyne “sex work.” And, of course, abortion is—in the unnervingly theological modifier applied to it by Hillary Clinton and many others on the left—“sacrosanct.” In the end, asking liberals and progressives to solve the problem of identity politics is like asking the proverbial orphan with chutzpah who murdered his parents.

Yes, conservatives have missed something major about identity politics: its authenticity. But the liberal-progressive side has missed something bigger. Identity politics is not so much politics as a primal scream. It’s the result of what might be called the Great Scattering—the Western world’s unprecedented familial dispersion.

Anyone who’s ever heard a coyote in the desert, separated at night from the pack, knows the sound. Maybe the otherwise-unexplained hysteria of today’s identity politics is just that: the collective human howl of our time, sent up by inescapably communal creatures who can no longer identify their own.

Since this is the Standard, Eberstadt makes no mention of her own side's complicity in this state of affairs, though I thought it worth sharing here since it connects a lot of dots pretty well. Add in that the Sexual Revolution was simply the logical outworking of capitalism in the realm of social relations, and it'd be much easier to fully endorse.
 

Old Man Mike

Fast as Lightning!
Messages
8,979
Reaction score
6,471
This is too complicated to make any reasonable response to, but I'm urged to make a few inadequate (to say the least) remarks:

1). the core ailment of modern people (as described by the author anyway --- not all folks are so lacking in "family" etc) --- is that they are afraid. They are afraid because they have almost no one to really count on, or perhaps only one decent "friend." Some of this can be traced to all that (important) insight/commentary about families being destroyed by the rending vortices of American Life, dispersing people to all ends of the country and planet, and terminally destroying relationships. Some of this can be traced in a different dimension to the society offering so much "candy" (of both pleasures and power) that the individual never grows out of adolescence "until it's too late" for real relationships. ... and there are other intertwined things. These American never-look-back, grab-what-you-can, self-immersed hedonist/survivors Dance Dance Dance until they find that their Dance is heading for a cliff and they need (badly) someone else (who isn't there).

2). not everyone's that childish or self-immersed in America, however. I saw lots of self-immersed fools not trying to grow up, while I did my college prof thing for 30 years. But I also had classes full of students who did NOT project this. Those classes were the ones I taught as Professor of Environmental Studies. Those kids were constantly "in community" with others of all sorts, but united around the feeling that the world, society, the village was a lot bigger and more important than just themselves. Instead of seeing life as basically "Intake", they viewed Intake as merely the requirement that needed to be addressed in order for them to have "Outflow" to contribute to a higher good. You can have this same non-ego-immersed micro-community with cores other than environmentalism, of course --- certain sorts of religions have it, certain forms of "causes" also, even the military, or a type of business. Sometimes those alternate families aren't genuine, but if the cause is "expansive" enough they can be.

3). America drives people away from these alternative families just as effectively as it destroys natural ones. With the new self-immersed "gaming" and VR it will just get more isolating and worse --- people don't want to hear that any more than they want to hear about substance abuse and alcoholism, because it's just too much fun. "Fun" is for the self-immersed and it is a strong person who can moderate the Fun when they have the means to go for it. I fear the impact of uber-real VR very greatly by the way. It should turn out to be the ultimate drug. I can imagine self-immersed people dying at their stations, wasting away both flesh and mental acuity (to say nothing of time and money), becoming freaks of isolation lacking social graces let alone true friends or families, acting out fantasies in the real world, including the grossest of violences.

4). My environmental kids were not on that trajectory. We examined our deep values all the time (even had a whole course, which I invented, called Values and Sustainability, to help do so). Like those values or not, these young persons valued things beyond themselves deeply, and saw their lives as properly lived trying to aid the sustainability of natural, social, and technological systems, not just for their sakes but for everyone in the future. And those kids hung together --- even after graduation. They typically held on to their natural biological families, but also these new "families" that they formed in school. I know this because they even considered this old man a member to be visited now and then back in Kalamazoo. The bottom line here is that one needs to stop obsessing in the Candystore and in the Mirror. Look out the Window, then get out there and find something that you genuinely feel is more important than yourself, and start supporting it. If it's a true and noble "cause", "family" will form naturally around you.


This is just the tiniest bit of reflection on this gargantuan problem that we have.
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
Here's an Opinion piece from the NY Times underscoring what I've been railing against lately. https://www.nytimes.com/2017/10/27/opinion/sunday/happiness-is-other-people.html?smid=tw-nytopinion&smtyp=cur

In a particularly low moment a few years back, after arriving friendless and lonely from Britain to live in the United States, I downloaded a “happiness app” onto my phone. It was surprisingly hard to choose one. There were close to a thousand bliss-promising options in the app store — ones that would teach you to meditate or be grateful, or that would send you photomontages of sunsets and puppies or unfeasibly flattering shots of your loved ones (giving you a moment to temporarily ignore your actual, less attractive loved ones).

The app I eventually chose messaged me every hour or so with a positive affirmation that I was supposed to repeat to myself over and over. “I am beautiful,” or “I am enough.” The problem was, every time my phone buzzed with an incoming message, I would get a Pavlovian jolt of excitement thinking an actual person was trying to contact me. “I am enough,” I would snarl bitterly upon realizing the truth, unable to shake the feeling that, without friends or community, I really wasn’t.

“Happiness comes from within,” said the inspirational photo-card in my Facebook news feed a few days later, the loopy white meme-font set against a backdrop of a woman contorted into a yoga pose so tortuous it looked as though she might actually be investigating her own innards trying to locate her bliss.

Having spent the last few years researching and writing a book about happiness and anxiety in America, I’ve noticed that this particular strain of happiness advice — the kind that pitches the search for contentment as an internal, personal quest, divorced from other people — has become increasingly common. Variations include “Happiness is determined not by what’s happening around you, but what’s happening inside you”; “Happiness should not depend on other people”; and the perky and socially shareable “Happiness is an inside job.” One email I received from a self-help mailing list even doubled down on the idea with the turbocharged word mash-up “withinwards” (although when the subject heading “Go Withinwards” landed in my inbox I briefly thought it was an ad for a nose-to-tail offal restaurant).

In an individualistic culture powered by self-actualization, the idea that happiness should be engineered from the inside out, rather than the outside in, is slowly taking on the status of a default truism. This is happiness framed as a journey of self-discovery, rather than the natural byproduct of engaging with the world; a happiness that stresses emotional independence rather than interdependence; one based on the idea that meaningful contentment can be found only by a full exploration of the self, a deep dive into our innermost souls and the intricacies and tripwires of our own personalities. Step 1: Find Yourself. Step 2: Be Yourself.

This isolationist philosophy is showing up not just in the way that many Americans talk about happiness, but in how they spend their time. People who study these things have observed a marked increase in solitary “happiness pursuits” — activities carried out either completely alone or in a group without interaction — with the explicit aim of keeping each person locked in her own private emotional experience.

Spiritual and religious practice is slowly shifting from a community-based endeavor to a private one, with silent meditation retreats, mindfulness apps and yoga classes replacing church socials and collective worship. The self-help industry — with its guiding principle that the search for happiness should be an individual, self-focused enterprise — is booming, with Americans spending more than $1 billion on self-help books a year to help guide them on their inner journeys. Meanwhile, “self-care” has become the new going out.

But while placing more and more emphasis on seeking happiness within, Americans in general are spending less and less time actually connecting with other people. Nearly half of all meals eaten in this country are now eaten alone. Teenagers and young millennials are spending less time just “hanging out” with their friends than any generation in recent history, replacing real-world interaction with smartphones.

And it’s not just young people. The Bureau of Labor Statistics’ Time Use Survey shows that the average American now spends less than four minutes a day “hosting and attending social events,” a category that covers all types of parties and other organized social occasions. That’s 24 hours a year, barely enough to cover Thanksgiving dinner, and your own child’s birthday party.

The same time-use data also allocates another, broader category to “socializing and communicating,” a designation that includes not just the good stuff — the heart-to-heart with an adoring spouse, or setting the world to rights with a dear friend over a bottle of wine — but any kind of socializing and communicating at all between two adults, where this is their main activity rather than an incidental part of something else, like working. All in all — and that includes daily bouts of nagging, arguing and whining — the average American spends barely more than half an hour a day on social communication. Compare that to time per day spent watching television (three hours) or even “grooming” (one hour for women, and just over 44 minutes for men).

Self-reflection, introspection and some degree of solitude are important parts of a psychologically healthy life. But somewhere along the line we seem to have gotten the balance wrong. Because far from confirming our insistence that “happiness comes from within,” a wide body of research tells us almost the exact opposite.

Academic happiness studies are full of anomalies and contradictions, often revealing more about the agendas and values of those conducting them than the realities of human emotion. But if there is one point on which virtually every piece of research into the nature and causes of human happiness agrees, it is this: our happiness depends on other people.

Study after study shows that good social relationships are the strongest, most consistent predictor there is of a happy life, even going so far as to call them a “necessary condition for happiness,” meaning that humans can’t actually be happy without them. This is a finding that cuts across race, age, gender, income and social class so overwhelmingly that it dwarfs any other factor.

And according to research, if we want to be happy, we should really be aiming to spend less time alone. Despite claiming to crave solitude when asked in the abstract, when sampled in the moment, people across the board consistently report themselves as happier when they are around other people than when they are on their own. Surprisingly, this effect is not just true for people who consider themselves extroverts but equally strong for introverts as well.

What’s more, neglecting our social relationships is actually shockingly dangerous to our health. Research shows that a lack of social connection carries with it a risk of premature death comparable to that of smoking, and is roughly twice as dangerous to our health as obesity. The most significant thing we can do for our well-being is not to “find ourselves” or “go within.” It’s to invest as much time and effort as we can into nurturing the relationships we have with the people in our lives.

Given all that, the next time you have the choice between meditating and sitting in a bar with your friends complaining about meditation class, you should probably seriously consider going to the bar, no matter what your happiness app says.

Ruth Whippman is the author of “America the Anxious: How Our Pursuit of Happiness Is Creating a Nation of Nervous Wrecks.”
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
FYI, this thread doesn't show up on the Front Page anymore. Was it supposed to be included with the political threads in that announcement?
 

Legacy

New member
Messages
7,871
Reaction score
321
I don't think most Trump supporters feel like the government owes them a job. But they believe that the government has made affirmative policy choices that have consistently undermined their quality of life and employment prospects over the last several decades. The ability of big box companies to move in and crush all the small businesses was created and is still actively encouraged by our policy decisions, as is the ability of our corporate giants to flout anti-trust law, off-shore jobs, abuse H1B visas, etc. You and wizards frequently default to this mode of argumentation that assumes neoliberal economics and maximal globalization as a given. It's not. There were other policy decisions we could have made, other priorities we could have pursued, but we didn't. And the route we opted for produced a lot of losers.



Again, you and wiz love to point to the instability of our lower classes as evidence that their suffering is all self-inflicted. It's a convenient argument if you're looking to justify washing your hands of these people, but it's not nearly that simple. Much of that instability is due to policies and priorities that our elites have affirmatively chosen. The current opioid crisis is a great example, which is in turn a sign of the hopelessness these people experience after watching all the meaningful connections in their lives dissolve in the acid of liberalism.



I guess it comes down to differing ideas about what really matters in life. I don't think it's obvious that moving away in pursuit of a better salary and a higher material standard of living is the "correct" course of action over sacrificing some economic benefits to remain near family. Lots of people live happy purposeful lives without much in the way of material comforts; but I don't see many people living happy purposeful lives without thick communal bonds.



See my first paragraph above. I don't think the destruction of your family business by globalization was an unavoidable outcome.



I don't think it's a generational thing. People are waking up to the fact that the decisions being made in Washington are purposefully designed to enrich bankers, CEOs and professionals at the expense of those in "fly over" country. American culture has been very resistant to any form of broad-based class consciousness in the past, but that may be changing.

Reply to IrishLax, post 366. Reps.

Offered for discussion without taking a stance. The Father-Fuhrer (National Review, by Kevin Williamson)
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
Here's a great article about architecture and the need for beautiful buildings.

https://www.currentaffairs.org/2017/10/why-you-hate-contemporary-architecture

The British author Douglas Adams had this to say about airports: “Airports are ugly. Some are very ugly. Some attain a degree of ugliness that can only be the result of special effort.” Sadly, this truth is not applicable merely to airports: it can also be said of most contemporary architecture.

Take the Tour Montparnasse, a black, slickly glass-panelled skyscraper, looming over the beautiful Paris cityscape like a giant domino waiting to fall. Parisians hated it so much that the city was subsequently forced to enact an ordinance forbidding any further skyscrapers higher than 36 meters.

Or take Boston’s City Hall Plaza. Downtown Boston is generally an attractive place, with old buildings and a waterfront and a beautiful public garden. But Boston’s City Hall is a hideous concrete edifice of mind-bogglingly inscrutable shape, like an ominous component found left over after you’ve painstakingly assembled a complicated household appliance. In the 1960s, before the first batch of concrete had even dried in the mold, people were already begging preemptively for the damn thing to be torn down. There’s a whole additional complex of equally unpleasant federal buildings attached to the same plaza, designed by Walter Gropius, an architect whose chuckle-inducing surname belies the utter cheerlessness of his designs. The John F. Kennedy Building, for example—featurelessly grim on the outside, infuriatingly unnavigable on the inside—is where, among other things, terrified immigrants attend their deportation hearings, and where traumatized veterans come to apply for benefits. Such an inhospitable building sends a very clear message, which is: the government wants its lowly supplicants to feel confused, alienated, and afraid.


[Image caption: The Tour Montparnasse. Who can possibly defend this? And if there’s something clearly wrong with it, which there is, what is it and why don’t we talk about it more in other cases?]

The fact is, contemporary architecture gives most regular humans the heebie-jeebies. Try telling that to architects and their acolytes, though, and you’ll get an earful about why your feeling is misguided, the product of some embarrassing misconception about architectural principles. One defense, typically, is that these eyesores are, in reality, incredible feats of engineering. After all, “blobitecture”—which, we regret to say, is a real school of contemporary architecture—is created using complicated computer-driven algorithms! You may think the ensuing blob-structure looks like a tentacled turd, or a crumpled kleenex, but that’s because you don’t have an architect’s trained eye.

Another thing you will often hear from design-school types is that contemporary architecture is honest. It doesn’t rely on the forms and usages of the past, and it is not interested in coddling you and your dumb feelings. Wake up, sheeple! Your boss hates you, and your bloodsucking landlord too, and your government fully intends to grind you between its gears. That’s the world we live in! Get used to it! Fans of Brutalism—the blocky-industrial-concrete school of architecture—are quick to emphasize that these buildings tell it like it is, as if this somehow excused the fact that they look, at best, dreary, and, at worst, like the headquarters of some kind of post-apocalyptic totalitarian dictatorship.


The New York Times says this is the building that showed Brutalism could be “playful.” This may be true, but only in the sense that the cat tormenting a mouse, or the torturer doing “eeny-meeny” to determine which testicle to zap first, is being “playful.”
Let’s be really honest with ourselves: a brief glance at any structure designed in the last 50 years should be enough to persuade anyone that something has gone deeply, terribly wrong with us. Some unseen person or force seems committed to replacing literally every attractive and appealing thing with an ugly and unpleasant thing. The architecture produced by contemporary global capitalism is possibly the most obvious visible evidence that it has some kind of perverse effect on the human soul. Of course, there is no accounting for taste, and there may be some among us who are naturally disposed to appreciate blobs and blocks. But polling suggests that devotees of contemporary architecture are overwhelmingly in the minority: aside from monuments, few of the public’s favorite structures are from the postwar period. (When the results of the poll were released, architects harrumphed that it didn’t “reflect expert judgment” but merely people’s “emotions,” a distinction that rather proves the entire point.) And when it comes to architecture, as distinct from most other forms of art, it isn’t enough to simply shrug and say that personal preferences differ: where public buildings are concerned, or public spaces which have an existing character and historic resonances for the people who live there, to impose an architect’s eccentric will on the masses, and force them to spend their days in spaces they find ugly and unsettling, is actually oppressive and cruel.

The politics of this issue, moreover, are all upside-down. For example, how do we explain why, in the aftermath of the Grenfell Tower tragedy in London, more conservative commentators were calling for more comfortable and home-like public housing, while left-wing writers staunchly defended the populist spirit of the high-rise apartment building, despite ample evidence that the majority of people would prefer not to be forced to live in or among such places? Conservatives who critique public housing may have easily-proven ulterior motives, but why so many on the left are wedded to defending unpopular schools of architectural and urban design is less immediately obvious.


Here is one hospital in Barcelona. Source: AcidCow

Here is another. Where would you rather convalesce?
There have, after all, been moments in the history of socialism—like the Arts & Crafts movement in late 19th-century England—where the creation of beautiful things was seen as part and parcel of building a fairer, kinder world. A shared egalitarian social undertaking, ideally, ought to be one of joy as well as struggle: in these desperate times, there are certainly more overwhelming imperatives than making the world beautiful to look at, but to decline to make the world more beautiful when it’s in your power to do so, or to destroy some beautiful thing without need, is a grotesque perversion of the cooperative ideal. This is especially true when it comes to architecture. The environments we surround ourselves with have the power to shape our thoughts and emotions. People hemmed in on all sides by ugliness are often unhappy without even knowing why. If you live in a place where you are cut off from light, and nature, and color, and regular communion with other humans, it is easy to become desperate, lonely, and depressed. The question is: how did contemporary architecture wind up like this? And how can it be fixed?

For about 2,000 years, everything human beings built was beautiful, or at least unobjectionable. The 20th century put a stop to this, as evidenced by the fact that people often go out of their way to vacation in “historic” (read: beautiful) towns that contain as little postwar architecture as possible. But why? What actually changed? Why does there seem to be such an obvious break between the thousands of years before World War II and the postwar period? And why does this seem to hold true everywhere?


Caltrans District 7 Headquarters. Photo credit: Morphosis Architects. Oh my fucking god, just look at it. Look at it! Does this make you happy? Does it nourish your spirit? What’s with all the little random protrusions? Aaaaagghh.
A few obvious stylistic changes characterize postwar architecture. For one, what is (now somewhat derisively) called “ornament” disappeared. At the dawn of the 20th century, American architect Louis Sullivan proclaimed the famous maxim that “form follows function.” Even though Sullivan’s own buildings were often highly ornate, adorned with elaborate Art Nouveau ironwork and Celtic-inspired masonry, “form follows function” was instantly misinterpreted as a call for stark utilitarian simplicity. A few years later, architect and theorist Adolf Loos, in a 1908 essay called “Ornament and Crime,” dramatically declared that a lack of ornamentation was a “sign of spiritual strength.” These two ideas quickly became dogmas of the architectural profession. A generation of architects with both socialistic and fascistic political leanings saw ornament as a sign of bourgeois decadence and cultural indulgence, and began discarding every design element that could be considered “mere decoration.”


This is the kind of thing Louis Sullivan designed and yet people think “form follows function” means you can’t do this anymore for reasons that go unexplained.
A contempt for ornament imbued the imagination of those architects who saw themselves as dedicated to social engineering rather than the mere creation of beautiful trifles. This mindset is best exemplified by the French architect Le Corbusier, who famously characterized the house as a “machine for living.” Corbusier’s ideas about planning and design continued to be taken seriously even after he proposed his “Plan Voisin” for Paris, which would have involved demolishing half of the city north of the Seine and replacing it with about a dozen enormous uniform skyscrapers. (Thankfully, nobody took him quite seriously enough to let him do it.) Corbusier may have done more than anyone to convince architects that they were no longer allowed to decorate their creations, issuing unquestionable pronouncements, like “the desire to decorate everything about one is a false spirit and an abominable small perversion” and “the more a people are cultivated, the more decor disappears.” He condemned “precious and useless objects that accumulated on the shelves,” and decried the “rustling silks, the marbles which twist and turn, the vermilion whiplashes, the silver blades of Byzantium and the Orient… Let’s be done with it!”


Intricate details used to be a thing. Now they are not. They should be. See also: the amazing lobby of the Guardian Building in Detroit—this is Art Deco, the last truly impressive movement in architecture. Why can’t we do things like this anymore? Why is this just one building instead of every building? Nobody knows. Go ask the people who gave a prize to these things.
This paranoid revulsion against classical aesthetics was not so much a school of thought as a command: from now on, the architect had to be concerned solely with the large-scale form of the structure, not with silly trivialities such as gargoyles and grillwork, no matter how much pleasure such things may have given viewers. It’s somewhat stunning just how uniform the rejection of “ornament” became. Since the eclipse of Art Deco at the end of the 1930s, the intricate designs that characterized centuries of building, across civilizations, from India to Persia to the Mayans, have vanished from architecture. With only a few exceptions, such as New Classical architecture’s mixed successes in reviving Greco-Roman forms, and Postmodern architecture’s irritating attempts to parody them, no modern buildings include the kind of highly complex painting, woodwork, ironwork, and sculpture that characterized the most strikingly beautiful structures of prior eras.


Ceiling of The Temple of Heaven in Beijing, China
The anti-decorative consensus also accorded with the artistic consensus about what kind of “spirit” 20th century architecture ought to express. The idea of transcendently “beautiful” architecture began to seem faintly ludicrous in a postwar world of chaos, conflict, and alienation. Life was violent, discordant, and uninterpretable. Art should not aspire to futile goals like transcendence, but should try to express the often ugly, brutal, and difficult facts of human beings’ material existence. To call a building “ugly” was therefore no longer an insult: the concept of ugliness had lost its meaning, and to the extent that it retained any, art could and should be ugly, because life is ugly, and the highest duty of art is to be honest about who we are rather than deluding us with comforting fables.

This idea, that architecture should try to be “honest” rather than “beautiful,” is well expressed in an infamously heated 1982 debate at the Harvard School of Design between two architects, Peter Eisenman and Christopher Alexander. Eisenman is a well-known “starchitect” whose projects are inspired by the deconstructive philosophy of Jacques Derrida, and whose forms are intentionally chaotic and grating. Eisenman took his duty to create “disharmony” seriously: one Eisenman-designed house so departed from the normal concept of a house that its owners actually wrote an entire book about the difficulties they experienced trying to live in it. For example, Eisenman split the master bedroom in two so the couple could not sleep together, installed a precarious staircase without a handrail, and initially refused to include bathrooms. In his violent opposition to the very idea that a real human being might actually attempt to live (and crap, and have sex) in one of his houses, Eisenman recalls the self-important German architect from Evelyn Waugh’s novel Decline and Fall, who becomes exasperated by the need to include a staircase between floors: “Why can’t the creatures stay in one place? The problem of architecture is the problem of all art: the elimination of the human element from the consideration of form. The only perfect building must be the factory, because that is built to house machines, not men.”


A Peter Eisenman building. Note the total lack of plant life. Plant life might accidentally make you feel happy and comfortable, and happiness is a bourgeois illusion. The tiny figures on the left seem to be attempting a picnic on the curve. They are probably cold and windswept—as they should be.
Alexander, by contrast, is one of the few major figures in architecture who believes that an objective standard of beauty is an important value for the profession; his buildings, which are often small-scale projects like gardens or schoolyards or homes, attempt to be warm and comfortable, and often employ traditional—what he calls “timeless”—design practices. In the debate, Alexander lambasted Eisenman for wanting buildings that are “prickly and strange,” and defended a conception of architecture that prioritizes human feeling and emotion. Eisenman, evidently trying his damnedest to behave like a cartoon parody of a pretentious artist, declared that he found the Chartres cathedral too boring to visit even once: “in fact,” he stated, “I have gone to Chartres a number of times to eat in the restaurant across the street — had a 1934 red Mersault wine, which was exquisite — I never went into the cathedral. The cathedral was done en passant. Once you’ve seen one Gothic cathedral, you have seen them all.” Alexander replied: “I find that incomprehensible. I find it very irresponsible. I find it nutty. I feel sorry for the man. I also feel incredibly angry because he is fucking up the world.”


A Christopher Alexander building. Not necessarily saying it’s better, but we do feel we’d be somewhat less likely to be spontaneously overcome with existential dread here.
The 1982 debate is perhaps one of the most aggressive public exchanges in the history of design. It is also illuminating, both because of Eisenman’s honesty in defending buildings that make people unhappy and uncomfortable—“If we make people so comfortable in these nice little structures,” he declared, “we might lull them into thinking that everything’s all right, Jack, which it isn’t”—and because of Alexander’s wildly inaccurate prophecy that architects and the public would soon see through Eisenman’s deconstructionist mumbo-jumbo and return to a love of traditional forms and values. In fact, the opposite happened: Alexander sank into relative obscurity, and Eisenman became yet more famous, winning the National Design Award and garnering prestigious commissions across the world.




The contemporary architect’s passion is aligning elements in ways that are intentionally jarring, disorderly, and frustrating.

Architects often get mad when non-architects conflate the terms “modernism,” “postmodernism,” “Brutalism,” etc. They love telling people that, say, “Frank Gehry is actually REACTING to postmodernism.” These terminological disputes can obscure the fact that everything under discussion is actually just a minor variation on the same garbage.
But can these two schools of design, the comfortable and the unsettling, peacefully co-exist? After all, Eisenman insisted that the world had room for both his brand of monumental, discordant poststructuralist architecture and Alexander’s small-scale, hand-made traditional architecture. The extraordinary fact about architecture over the last century, however, is just how dominant certain tendencies have been. Aesthetic uniformity among architects is remarkably rigid. Contemporary architecture shuns the classical use of multiple symmetries, intentionally refusing to align windows or other design elements, and preferring unusual geometric forms to satisfying and orderly ones. It follows a number of strict taboos: classical domes and arches are forbidden; a column must never be fluted; symmetrical pitched roofs are an impossibility. Forget about cupolas, spires, cornices, arcades, or anything else that recalls pre-modern civilization. Nothing built today must be mistakable for anything built 100 or more years ago. The rupture between our era and those of the past is absolute, and this unbridgeable gap must be made visible and manifest through the things we build. And since things were lovely in the past, they must, of necessity, be ugly now.


If it doesn’t make you feel desperately, crushingly alone, it’s probably not a piece of prize-winning contemporary architecture.
For many socialists in the 20th century, the abandonment of decorative elements and traditional forms seemed to be a natural outgrowth of a revolutionary spirit of simplicity, solidarity, and sacrifice. But the joke was on the socialists, really, because as it turned out, this obsession with minimalism was also uniquely compatible with capitalism’s miserable cult of efficiency. After all, every dollar expended on fanciful balusters or stained glass rose windows needed to produce some sort of return on investment. And since such things can be guaranteed to produce almost no return on investment, they had to go. There was a good reason why, historically, religious architecture has been the most concerned with beauty for beauty’s sake: the more time is spent elegantly decorating a cathedral, the more it serves its intended function of celebrating God’s glory, whereas the more time is spent decorating an office building, the less money will be left over for the developer.

But let’s leave aside God’s glory—what about ordinary human happiness? One of the most infuriating aspects of contemporary architecture is its willful disdain for democracy. When people are polled, they tend to prefer older buildings to postwar buildings; very few postwar buildings make it onto lists of most treasured places. Yet architects are reluctant to build in the styles that people find more beautiful. Why? Well, Peter Eisenman has spoken for a lot of architects in being generally dismissive of democracy, saying that the role of the architect is not to give people what they want, but what they should want if they were intelligent enough to have good taste. Eisenman says he prefers to work for right-wing clients, because “liberal views have never built anything of value,” due to their incessant concern with public process and public needs. (On a side note, it’s no accident that Howard Roark, protagonist of Ayn Rand’s The Fountainhead and the arch-hero of the American conservative literary canon, is an architect who intentionally dynamites a public housing project because somebody had the gall to add balconies to his original design without his consent.) Eisenman suggests that if we deferred to public taste in music, we would all be listening to Mantovani rather than Beethoven, and uses this as evidence that architects should impose taste from above rather than deferring to democratic desires. Indeed, there is always a “Thomas Kinkade” problem in believing that art should be “democratic.” If you deferred to public taste as judged by sales volume, Kinkade would be the greatest artist in the world. Taylor Swift would be the best musician, and the Transformers series would be the best cinema. Of course, we don’t trust democratic judgment in matters of taste, because people often like things that are garbage.


The original Penn Station, a breathtaking space for ordinary travelers. It was beautiful, so it had to be destroyed. There’s a proposal to rebuild it like it was, but it’s almost certain something far worse will be built instead.
But architecture is very different from other forms of art: people who hate Beethoven aren’t obligated to listen to him from 9 to 5 every weekday, and people who hate the Transformers series aren’t obligated to watch it every night before bed. The physical environment in which we live and work, however, is ubiquitous and inescapable; when it comes to architecture, it is nigh-impossible for people to simply avoid the things they hate and seek out the things they like. It’s also true that intellectuals are too quick to write off the public as stupid and unable to decide things for themselves. There are plenty of instances where, when something truly great comes along, the public is perfectly capable of recognizing it. Shakespeare’s plays, for instance, have consistently been incredibly popular, despite being complex and intellectual pieces of literature, because they work on multiple levels. They are accessible enough to be loved and appreciated widely, but deep enough to offer fodder for centuries of reflection and analysis. Likewise, the masses tend to like, for example, Gothic cathedrals and Persian mosques, which are breathtakingly intricate and complex works of art.



The left, in particular, should eagerly embrace a conception of architecture that is both democratic and sophisticated. Many of the worst parts of contemporary architecture have echoes of the “bad” parts of leftism: the dreariness of the Soviet Union, the dehumanizing tendency to try to impose from above a grand conception of a new social order. They exemplify what James Scott calls “high modernism,” the twisted effort to “rationalize” human beings rather than accept them as they are and build places that suit them and that they like. The good kind of leftism, on the other hand, operates from the bottom up rather than the top down. It helps people create their own places, rather than creating monolithic structures into which they are placed for their own good. It looks far more like a village than a tower block, decentralized and with a strong connection between the makers of a place and the inhabitants of a place.


Every public space should do its best to uplift you.
At the moment, the needs or wishes of the people who actually have to use buildings are rarely considered at all. Architecture schools do not actually teach students anything about craft or about emotion; most of the courses are highly mathematical, dedicated to engineering and theories of form rather than to understanding traditional modes of building or understanding what people want out of their buildings. Unless the client is uber-wealthy, the users of buildings rarely have much input into the design process. Students do not get to say what kind of school they would like, office workers do not get to say whether they would prefer to work in a glass tower or in a leafy complex of wifi-enabled wooden pagodas. Some of this may come from the design process itself. Unlike in the age of artisanship, there is today a strong separation between the process of designing and the process of making. Frank Gehry designs his work using CAD software, then someone else has to go out and actually build it. But that rupture means that architecture becomes something imposed upon people. It isn’t participatory, and it doesn’t adapt in response to their needs. It’s prefabricated, assembled beforehand off-site and then dumped on the unwitting populace. We are not meant to live in modern buildings; they are made for people who do not poop. Good architecture is made better by the life that people bring to it, but one gets the sense in a contemporary structure that one is befouling the place with one’s odors and filth.

In fact, everyday good architecture should not even be about the building, it should be about the people. If the building isn’t intended as some kind of public monument or centerpiece, it shouldn’t draw much attention to itself. Frank Gehry is a wanton violator of this rule: when he decided to design homes for the Lower Ninth Ward in post-Katrina New Orleans, he created a discordant batch of hyper-contemporary houses that “riffed” on the region’s traditional vernacular architecture. Rather than giving people comfortable houses that fit in with their surroundings and suit the preferences of the residents, Gehry designed houses that screamed for attention and were fundamentally about themselves rather than about the people of the city he ostensibly cared about. Good buildings recede seamlessly into their surroundings; Gehry’s blare like an industrial klaxon. Similarly, when a building like Peter Cook and Colin Fournier’s Kunsthaus in Graz, Austria (the building at the top of this article) is placed in the middle of an old village, the entire fabric of the village is disrupted. The Kunsthaus (a representative example of “blobitecture”) cannot coexist peacefully with the things surrounding it, because it’s impossible to stop looking at it. Like the streaker at the football game, the building parades in front of us with such vulgar shamelessness that no amount of willpower can peel our eyes away.


“Hey, look at me! I am a series of jarring asymmetric block-shapes like everything else!”
Architecture’s abandonment of the principle of “aesthetic coherence” is doing serious damage to ancient cityscapes. The belief that “buildings should look like their times” rather than “buildings should look like the buildings in the place where they are being built” leads toward a hodge-podge, with all the benefits that come from a distinct and orderly local style being destroyed by a few buildings that undermine the coherence of the whole. This is partly a function of the free market approach to design and development, which sacrifices the possibility of ever again producing a place on the village or city level that has an impressive stylistic coherence. A revulsion (from both progressives and capitalist individualists alike) at the idea of “forced uniformity” leads to an abandonment of any community aesthetic traditions, with every building fitting equally well in Panama City, Dubai, New York City, or Shanghai. Because decisions over what to build are left to the individual property owner, and rich people often have horrible taste and simply prefer things that are huge and imposing, all possibilities for creating another city with the distinctiveness of a Venice or Bruges are erased forever.

Once upon a time, socialists liked to make beautiful things; the works of William Morris, John Ruskin, and Oscar Wilde are filled with both celebrations of classical aesthetics and pleas to liberate human beings from the miseries of economic deprivation. The core idea of leftism is that people should be free to flourish, in both body and mind, and they should thus be able to do so materially, spiritually, intellectually, and artistically. Handcrafts and ornament are not bourgeois, they are democratic, in that a society of artisans is a society of people who are getting to maximize their creative capabilities, whereas people housed in clean-swept Corbusier-style skyscrapers have been reduced to specks, robbed of their individuality, stripped of their ability to make the world their own.

How, then, do we fix architecture? What makes for a better-looking world? Cutting through all of the colossally mistaken theoretical justifications for contemporary design is a major project. But a few principles may prove helpful.


Need to build more of these…
OVERCOMING FEARS
Postwar architecture has been characterized by fear and taboo. Architects are terrified of producing so much as a fluted column, because they believe their peers will think they are stupid, nostalgic, and unsophisticated. As a result, they produce structures that are as inscrutable and irrational as possible, so that people will think they are clever. But they need not be afraid! Their architect friends might think they are stupid if they put in a decorative archway. But we won’t.

1. THE FEAR OF BEAUTY — There is a misconception that if beauty is “subjective,” it therefore doesn’t exist or can’t be discussed. This is wrong: the fact that people disagree about something doesn’t put it beyond discussion. There is no “objectively best film,” yet we argue productively about which films are best; people’s values differ, yet we still debate morality. Beauty is no different.


London’s Brutalist “Alexandra Road” development, which Sam Kriss says is more beautiful than St. Paul’s Cathedral.

St Paul’s Cathedral.
There is a widespread conception, reinforced by conservative classicists, that “beauty” is just a euphemism for European imperialist art. [Now-disgraced] leftist writer Sam Kriss, who has ludicrously and incorrectly argued that London’s Brutalist Alexandra Road is more beautiful than St. Paul’s Cathedral, writes that “sentimental traditionalists talk a lot about beauty, but if beauty means proportion, regularity and harmony then modernism does it very well. But, of course, that’s not what they mean by beauty; they mean some ineffable organic connection to the life and striving of the nation.” But beauty doesn’t need to just mean “proportion” and it doesn’t mean “the life and striving of the nation.” It can’t simply mean simplicity and proportion, for many things are simple and proportionate that are not beautiful. And it can’t be nationalistic, because ancient mosques and temples are among the most beautiful of structures. When we talk about architectural beauty, we’re talking about a quality held in common across civilizations, one that unifies Indians and Mayans and Spaniards.

People are actually uncomfortable with the idea of beauty because they think it’s subjective. But we can’t actually rid ourselves of it; there are places we find beautiful, and places we don’t, and it’s important to have the conversation if we are to keep ourselves from continuing to make places that we don’t find beautiful. Without developing a language to talk about beauty, we will end up confusing the impressive with the attractive and creating spaces that are extraordinary from an engineering perspective and yet dead and discomforting.


Usually the more elaborate and intricate, the more mesmerizing…
2. THE FEAR OF ORNAMENT — Ornament is not an indulgence; it’s an essential part of the practice of building. In fact, “ornament” really just means attention to the micro-level aesthetic experience. It’s the small things, and small things matter. The idea of decoration as decadent is particularly ludicrous in the age of monumental design projects. How many more resources are wasted trying to make Frank Gehry’s latest pretzel stay standing than it would take to install some attractive stonework on a far simpler structure? When we sacrifice decoration we forfeit a slew of extraordinary aesthetic tools and forgo the possibility of incredible visual experiences. An allergy to ornament sentences humanity to eternal tedium, with nothing interesting to look at, nothing that we will notice on a building the second time that we did not see the first time.



3. THE FEAR OF TRADITION — It was astonishingly hubristic and careless for architects to craft a theory that forbade the possibility of ever again using traditional styles. Tradition is important, and severing oneself from it is pointless and suicidal. We have inherited a palette of possibilities from the architectural practice of all prior cultures, and to squander it is both ungrateful and needless. Memory and continuity are not mere nostalgia. Of course, tradition has gotten a bad reputation, simply because most “neo-traditional” architecture is so bad and Disneylike. Recreations and pastiches are not the solution, and the mindless conservative love for everything Greek, Roman, and Victorian is a mistake. The point is not to just mindlessly love old things; that gets you McMansions. Rather, instead of recreating the exact look of traditional architecture, one should try to recreate the feeling that these old buildings give their viewers. Don’t build a plastic version of Venice. Build a city with canals and footbridges and ornate pastel houses dangling above the water, and give that city its own special identity. McMansions are an attempt to superficially remind people of beautiful things rather than doing the real work it takes to make something beautiful. But tradition is crucial, old things were generally better things, and if we abandon them we doom ourselves to creating mindless new shape after mindless new shape.






From the Arab and Indian worlds to the synagogues of Europe and the subways of Moscow, complex symmetries have always mesmerized us. It’s not a Greco-Roman thing. It’s a human thing.
4. THE FEAR OF SYMMETRY — The tendency toward discord has to end. Symmetry is nice. Multiple overlapping symmetries can be dazzling. A building doesn’t need to be lopsided. You can line the windows up. It’s okay. It will look better. Don’t worry. We won’t tell your professor.

5. THE FEAR OF LOOKING FOOLISH — The people who most loudly disdain traditional architecture are those most concerned to convince others of their own intellectual seriousness. Designing a comforting, pleasing, and, yes, nostalgic space is simply not smart enough. People are afraid to say that they don’t “get” a building or find it ugly. It sounds childlike to say you wish it was a pastel color or you wish the two sides matched or you wish it didn’t look like it hated you. But it should be okay to say those things. Buildings shouldn’t hate you. They probably shouldn’t be weird-looking and they shouldn’t grate on the eyeballs. They should be comforting and attractive, because we have to live in them.



BOTH COMPLEXITY AND SIMPLICITY
One of the elements that makes a place truly beautiful is a careful balance of complexity and simplicity. Contemporary architecture frequently just goes for the simplicity and forgets the complexity, or it makes up for the simplicity of its appearance with complexity in the technical processes necessary to build it. But the old buildings that please us most are frequently simple at the larger level and complex at the micro-level. For example: the buildings in New Orleans’ French Quarter are not actually elaborate. Most of them are simple, rectangular structures in a straight line along the street. But they are given pleasant colors, and adorned with colorful shutters and intricate iron galleries, and decorated with flowers and tropical plants. And it’s those complex elements that give the place life. The harmonious balance of simplicity and complexity, the complexity of a floral arrangement combined with the simplicity of a plain building painted well, make a place a delight to stroll through.

INTEGRATING NATURE
Plant life is actually one of the most important elements of architecture. One of the most serious problems with postwar architecture is that so much of it is entirely devoid of nature. It presents us with blank walls and wide-open spaces with nary a tree or shrub to be seen. Generally speaking, the more plant life is in a place, the more attractive it is, and the less nature there is, the uglier it is. This is because nature is much better at designing things than we are. In fact, even Brutalist structures almost look livable if you let plants grow all over them; they might even be downright attractive if you let the plants cover every last square inch of concrete. Every building should look like the Hanging Gardens of Babylon. We need plants and water to be happy. One of the reasons tower blocks are so insidious is that they deprive people of access to gardens. Gardens should be integrated seamlessly into everything; there is a reason being banished from a garden was the most terrible fate God could think to inflict on humankind.


Rule of thumb: the greener and more lush a place, the lovelier it becomes.
FEELINGS OVER FORMS
There is, generally speaking, too great a desire for architecture to convey ideas. Architects obsess over the ideas that they are embodying in their buildings. But most people who use a building don’t understand whatever abstract theoretical notion the architect was trying to convey. Far more important than “ideas” are the feelings that a building generates, the experiences people will have in it, and these should be given priority.

Likewise, “form” is dwelled on excessively; architects care far more about the shape of the building than whether its inhabitants are comfortable. Hence “blobitecture”: the architect precisely designs the exact perfect kind of blob, using elaborate digital design and engineering tools, without stopping to wonder whether people actually like blobs. The website of Zaha Hadid Architects brags that the buildings for a new project are “iconic in both their scale and ambition… creating a unique twisted, intertwined silhouette that punctures the skyline.” But architects should not want to create things that are “iconic in scale” or to “puncture the skyline.” This is precisely the wrong thing to care about; it suggests the architect simply craves attention rather than the creation of perfect beauty and comfort. You’re not supposed to be puncturing! You’re supposed to be adding another delicate and perfect note to the skyline’s gorgeous symphony.

Most of the theoretical justifications for these forms are transparent nonsense. Witness Frank Gehry explaining how he didn’t want to “do” decorations or “historical stuff” and decided instead to be inspired by the shapes of fish:

“I was looking for a way to deal with the humanizing qualities of decoration without doing it. I got angry with it—all the historical stuff, the pastiche. I said to myself, If you have to go backward, why not go back 300 million years before man, to fish? And that’s when I started with this fish shtick, as I think of it, and started drawing the damn things, and I realized that they were architectural, conveying motion even when they were not moving. I don’t like to portray it to other people as a complicated intellectual endeavor. Most architects avoid double curves, as I did, because we didn’t have a language for translation into a building that was viable and economical. I think the study of fish allowed me to create a kind of personal language.”

If this came from an ordinary person, we’d dismiss it as a madman’s ravings. But Gehry is the architects’ favorite architect, so he can get away with admitting that he’s just doodling fish, and people will think he’s very profound.

THE NEED FOR COHERENCE
Frank Lloyd Wright’s Guggenheim Museum is an impressive building. Unfortunately, it doesn’t bear any actual relationship to its surroundings; it could have been placed anywhere. Wright’s Fallingwater house, on the other hand, was designed to cohere with its location. Aesthetic coherence is very important; a sense of place depends on every element in that place working together. The streets of the Beacon Hill neighborhood in Boston are beautiful because there are many different elements, but they are all aesthetically unified. The Tour Montparnasse in Paris is horrifying, because it doesn’t flow with the surrounding buildings and draws attention to itself. Capitalism eats culture, and it makes ugly places. Money has no taste.


Beacon Hill, where everything just fits together perfectly.
DEMOCRATIC VALUES
We can see the fruits of Peter Eisenman’s anti-democratic philosophy in the places he builds. A former student at the Cooper Union recalls seeing an Eisenman design for a dormitory and thinking “I wouldn’t want to live there,” because Eisenman had used oddly-angled walls, making it impossible to place furniture well, and had put the windows at floor level, so one would have to get on one’s knees to see outside. The person who assumes they know what people ought to want generally doesn’t actually know very much at all. Places should be liked; they should make people comfortable. Architects should find out which buildings people like best (hint: it’s generally the older ones) and should try to make new buildings that give people those same feelings of pleasure. Brutalism is the opposite of democracy: it means imposing on people something they hate, all for the sake of some narrow and arbitrary formalistic conceptual scheme. Deferring to popular taste doesn’t have to mean Las Vegas; it can mean elaborate cathedrals and gardens with fountains.


Jaisalmer Fort (जैसलमेर का किला)
THE ABOLITION OF THE SKYSCRAPER
It should be obvious to anyone that skyscrapers should be abolished. After all, they embody nearly every bad tendency in contemporary architecture: they are not part of nature, they are monolithic, they are boring, they have no intricacy, and they have no democracy. Besides, there is plenty of space left on earth to spread out horizontally; the only reasons to spread vertically are phallic and Freudian. Architect Leon Krier has suggested that while there should be no height limit on buildings, no building should ever be more than four stories (so, spires as tall as you like, and belfries). This seems a completely sensible idea.

But more than just abolishing skyscrapers, we must create a world of everyday wonder, a world in which every last thing is a beautiful thing. If this sounds impossible, it isn’t; for thousands of years, nearly every building humans made was beautiful. It is simply a matter of recovering old habits. We should ask ourselves: why is it that we can’t build another Prague or Florence? Why can’t we build like the ancient mosques in Persia or the temples in India? Well, there’s no reason why we can’t. There’s nothing stopping us except the prison of our ideas and our horrible economic system. We must break out of the prison and destroy the economic system.

There’s an easy test for whether a building is beautiful or not. Ask yourself: if this building could speak, would it sound like the Rubaiyat or the works of Shakespeare, or would it make a noise like “Blorp”? For nearly 100 years, we have been stuck in the Age of Blorp. It is time to learn to speak again.
 

wizards8507

Well-known member
Messages
20,660
Reaction score
2,661
Here's a great article about architecture and the need for beautiful buildings.

https://www.currentaffairs.org/2017/10/why-you-hate-contemporary-architecture
"Decadence wasn't always so ugly" — The American Lyceum (@AmLyceum), November 5, 2017 (pic.twitter.com/2hNR1RFJDe)
 

Old Man Mike

Fast as Lightning!
Messages
8,979
Reaction score
6,471
Adams is a bit of a conundrum to me. He's a self-described "radical atheist," by which he means to tell you that he REALLY means it, and the famous Hitchhiker's Guide is really pure pathos and meaningless at bottom (no matter how clever/funny). Given this basic hopeless pessimism about anything enduring, his view of beauty (or not) in architecture leaves one (me) curious --- is Adams merely looking at beauty as temporary solace for a purposeless existence? A better feeling as one whiles away the meaningless minutes of an existential life?

The Greeks were culturally assured that Love and Truth and Beauty were in essence the same things, and were projections into Earthly matters from more godly realms --- so even with their crude understanding they saw something spiritual (and not trivially spiritual) in Beauty.

So, while I appreciate his wit, and echo his call for a more "artful" architecture (this reminds me of Jake Stone's characteristic motto in The Librarians: "Architecture is Art in which we live"), I still believe that he has missed the major "truth" in all of this (and is consequently hardly a wisdom-teller or a prophet.)
 
C

Cackalacky

Guest
Well... as a structural engineer who works with architects every day, IMO there are many drivers of what the final product looks like.

Money
Time
Space
Resources
Building Code


Each of these has driven the form out of architecture and more into the realm of function.

Furthermore, more complicated architecture requires more complicated engineering, and in today's world of natural disasters, lawsuits, and cheap building materials, doing what works and being able to produce things at a high level is of utmost importance. I can say that, as structural engineers go, we are pushed to the limits as it is and have to back the architects down when they get out there.

I posted this in the thread of the random but I doubt we ever see anything such as this ever again:

[Image: Sagrada Família nave roof detail]


The days of being able to spend 10-15 years developing and constructing buildings with both form and function are over, IMO, without a massive change in cultural aesthetics.
 

Whiskeyjack

Mittens Margaritas Ante Porcos
Staff member
Messages
20,894
Reaction score
8,126
The days of being able to spend 10-15 years developing and constructing buildings with both form and function are over, IMO, without a massive change in cultural aesthetics.

Many European cathedrals were built by small communities over a number of years. Such beautiful architecture has endured and continues to attract tourists precisely because it was the product of a place and time in which people had a much clearer conception of what is Good, True and Beautiful. As the article above mentions, the ugliness of our built environment is one of the clearest indictments of the prevailing cultural forces. Capitalism is inhumane, which is why it forces us to live in ruthlessly utilitarian boxes of glass, metal and drywall completely devoid of ornamentation. We can reclaim public beauty and walkable neighborhoods once capital stops calling the shots.

n+1 just published a review of Malcolm Harris's Kids These Days: Human Capital and the Making of Millennials:

ON A RECENT VISIT TO MY PARENTS, my mother asked me whether I want to have kids. Being 30 and single, an uncle to a niece and a nephew through both my siblings, I’ve started to get questions from older generations about my plans to reproduce. This began later for me than it does for women and is a fraction as oppressive, but to be honest I’d thought male privilege would shield me from it entirely. When this defense failed, I forestalled a line of inquiry from my mother by talking about climate change. Even as I said it, I knew it was an already hackneyed form of stonewalling. You can defend any uncertainty these days by evoking melting ice sheets and disappearing permafrost.

But she’d never heard anyone take this tack before—at least not since her own generation’s “population bomb” version of the same story. “That,” my mom said slowly, “is so heavy.” Over the course of the rest of my visit, she mentioned it to others my age for confirmation, to others her age in incredulity. “Gabe says nobody in his generation wants to have kids because of climate change. Did you know about this?”

How could the gap between us be so great? What seemed to me such a commonplace as to be evasive and impersonal appeared to my mother as a serious human quandary—which in fact it is. I’m more politically optimistic than my mother, yet I was taken aback to realize how much darker the future seems to me than to her. Then I remembered: she’s a boomer, I’m a millennial, and this is the song of the season.

There hasn’t been a generational divide this pronounced since the 1960s. The flareups that have occurred have been aftershocks of the 1960s—as in the 1992 confrontation between World War II veteran George H. W. Bush and draft dodger Bill Clinton with the wife who didn’t want to bake cookies. Generational analysis rarely got beyond generic psychobabble: The “greatest generation” were stoic, laconic survivors, boomers the spoiled offspring of Dr. Spock, et cetera. The actual “life chances” of the generations were not meaningfully different, and politics did not line up with the generations. Clinton’s best generational slice of the electorate in 1992 was the senior vote, but he performed pretty evenly overall, winning between 41 and 50 percent in every age category. Neither party enjoyed any significant preference from the young or the old in particular.

The contrast with today could hardly be starker. Republicans have consolidated the elder vote and Democrats enjoy the default support of the young, who largely don’t vote anyway: as we know from the maps on our social media feeds, Hillary Clinton would have won something like forty-five states if 18–25 year olds had cast the only ballots. And she was the distant second choice of these so-called young millennials in the Democratic primaries, far behind the left-wing challenger. The reawakening global left of the last decade, of which Bernie Sanders was the American electoral incarnation, is, in terms of its age distribution, uniform: the Indignados and Podemos, Syriza (alas), the Arab Spring, Occupy, the Gezi Park protests, South Africa’s Economic Freedom Fighters, Hong Kong’s Umbrella Movement, the Kurdish revolution, Black Lives Matter, Nuit Debout and La France Insoumise, Momentum and Jeremy Corbyn, the Democratic Socialists of America—all are or were movements of the young.

While striking, this massive political generation gap is a symptom of something deeper. Whatever it is, we register it in complaints about the supposedly “bad” work ethic of young employees or scolding about keeping good habits: our smartphone-induced fidgetiness; our infamous predilection for avocado toast over mortgages; the decline of Applebee’s and Buffalo Wild Wings, laid low by millennial distaste. (Cf the swelling genre of listicles about what consumer brands millennials are killing. Cf also all the articles about millennials and their love of listicles.) You hear it in stories of adult children who move back in with their parents. It’s in the 50 percent of teenagers who have told surveyors that they don’t identify as straight, and the perplexing—if encouraging—news of resurgent youthful interest in public libraries. For good or ill, something has gone profoundly awry in the intergenerational transmission process.

Under ordinary circumstances, the institutions built by the old are repopulated by the young, who adjust them for new circumstances but leave them basically the same, in turn handing them over to the next generation. The possibility of successful passage through the institutions of society is what makes a person follow a normative rather than deviant life course: being a woman or a man roughly the way she or he is supposed to, partnering and reproducing in the socially standard fashion; trying to get ahead or at least get by according to prevailing ethics of education and work. In our society, this has meant (in ideal-typical middle class terms): homeownership, an occasional vacation, sending your kids to college, and retirement. Historical continuity—the integrity of social institutions over time—works itself out on the individual level: people may feel they are making distinct, agonizing life choices, but for the most part they are living out those institutions predictably. An institution is, at the end of the day, just a pattern of social behavior repeated long enough. On the other hand, if the institutions aren’t processing enough people into the proper form—if too many can’t or won’t do family or school or work or sex approximately the way they’ve been done before—then large-scale historical continuity can’t happen. The society can’t look tomorrow like it does today.

WHILE IT FEELS as though we are heading toward some such break, there have not yet been many serious efforts to understand our national crisis in terms of generations. This is why Malcolm Harris’s new book, Kids These Days, is a landmark. Remarkably for an author of a trade book on such an on-trend topic, Harris makes a politically radical argument, undergirded by a coherent and powerful Marxist analysis. You can very well imagine buying this book in an airport, and because Harris is a powerful and funny writer, you’d get through it before you landed. But you might land a different person; the book is devastating. “American Millennials come from somewhere—we didn’t emerge fully formed from the crack in an iPhone screen,” he writes. In Harris’s view, we are, down to our innermost being, the children of neoliberalism. The habits so often mocked and belittled in the press are in fact adaptations to tightening repressive and exploitative pressures, the survival strategies of a demographic “born into captivity.”

Capitalism’s generation-long crisis, in Harris’s diagnosis, has imposed enormous competitive pressure on the young to produce “human capital.” This concept, a core one in neoliberal economic thought, is meant to quantify the bundle of economically valuable human qualities—education, skills, discipline—accumulated over the course of a life. It’s in the book’s subtitle because it’s the key to Harris’s argument. The hidden hand shaping millennials, producing our seemingly various and even contradictory stereotyped attributes, is the intensifying imperative—both from the outside, and also deeply internalized—to maximize our own potential economic value. “What we’ve seen over the past few decades is not quite a sinister sci-fi plot to shape a cohort of supereffective workers who are too competitive, isolated, and scared to organize for something better,” writes Harris. “But it has turned out a lot like that.” Capitalism is eating its young. It’s only feeding us avocados to fatten us up first.

Harris works through this argument by following the millennial through the stages of life—as far as we’ve yet gotten. A remarkable feature of the book is how Harris is able to apply this single explanation to dozens of disparate, if familiar symptoms. Again and again, he yanks the disguise from some behavior seeming to belong to a discrete field—parenting, or education, or pop culture, or the labor market—and finds that it was actually neoliberalism all along. Harris points out that, beginning in childhood, so-called “helicopter parenting” and the measurable decline of unstructured play are actually forms of risk management. Given how social inequality in the world at large has worsened competition to get ahead, “parents are told—and then communicate to their children—that their choices, actions, and accomplishments have lasting consequences, and the consequences grow by the year.”

From the preoccupation with bullying to the design of playgrounds and the school policy of zero tolerance, Harris finds the world of childhood increasingly redefined by actuarial caution. Most of all, though, he finds it in the classroom. School, after all, is just a form of unwaged work, masked by the ideology of pedagogy. The surplus that kid-labor creates, rather than going to any immediately present boss, pools up in the students themselves, to be tapped by future bosses. When they do schoolwork, children labor on themselves. “By looking at children as investments, we can see where the product of children’s labor is stored: in the machine-self, in their human capital.” The steady increase in homework, the growing apparatus of testing and school accountability, and the pressure for longer schooldays and schoolyears is just what you would expect once children have been turned into financial assets. Many of the observed social-psychological attributes of the young generation result from undergoing such processing into a human commodity-form. Childhood is a “high-stakes merit-badge contest,” teaching kids to be “servile, anxious, and afraid.”

At the end of childhood, some millennials go to college to continue accumulating human capital. Harris is a peerless observer of the harrowing economic costs of “meritocracy,” and his chapter on college abounds in withering apercus. “College admissions offices are the rating agencies for kids,” he writes. “And once the kid-bond is rated, it has four years until it’s expected to produce a return.” Because the pressure to accumulate human capital is so intense, students will bear enormous costs to do it. Far from the coddled children of stereotype, Harris points out, most college students are “regular people—mostly regular workers—who spend part of their work-time on their own human capital like they’ve been told to.” Exhaustion, overwork, and even food insecurity are common. Colleges themselves, meanwhile, reap obscene rewards from their gatekeeping position by offering a worse product for a higher price: hollowed out pedagogy from exploited adjuncts and graduate students, masked with “shiny extras unrelated to the core educational mission.” Aggrandizing administrations bloat on student debt, the key to the whole scheme. Student debt, Harris argues, is a bloodsucking Keynesian stimulus, turning the value of the future labor of young borrowers into the capital to build stadiums and luxury dorms today, jacking up tuition even higher and allowing another round of borrowing and building.

But not every kid-bond matures. Students who can’t keep up are diagnosed, drugged, and punished. The extraordinary proliferation of mood and attention disorders among the young, and their development into a lucrative pharmaceutical market, is only the logical complement of the human capital accumulation regime of testing, supervising, and debt-collecting. Depression, Harris notes, is up 1,000 percent over the last century, “with around half of that growth occurring since the late 1980s.” While there’s always a question about changing diagnoses with this sort of figure, Harris is convincing that there’s more to this phenomenon than an artifact of measurement. So too the growing punitive apparatus waiting to catch kids who fall off the track: “We can draw a straight line between the standardization of children in educational reform and the expulsion, arrest, and even murder of the kids who won’t adapt.” On this account, mass incarceration, too, is a generational phenomenon, and it makes its first appearances inside schools, which are now heavily policed zones, as are the public spaces in which working-class kids congregate. “Millennials are cagey and anxious, as befits the most policed modern generation,” Harris writes. In this way, the book effectively argues that widely different experiences of neoliberalism—from the grasping student’s anxiety for good grades to the policed young person of color dodging the cops—are nonetheless part of the same social process.

The immediate impulse driving human capital accumulation is the need to compete on a labor market more unforgiving than anything in memory. This is one of the most familiar elements of the millennial critique of the world we’ve inherited, perhaps best embodied in the meme of “Old Economy Steve”—a yearbook image of a smug and blotchy young white man with an echt mid-’70s look: pageboy haircut, wide-lapeled shirt, some kind of necklace. “Why don’t you call and ask if they’re hiring?” says the supertext on one version; the subtext reads, “Hasn’t been on a job hunt since 1982.” “Pays into social security,” offers another. The kicker: “receives benefits.” The story Harris tells here isn’t new, but it lies at the core of millennial experience. “It’s harder to compete for a good job, the bad jobs you can hope to fall back on are worse than they used to be, and both good and bad jobs are less secure. The intense anxiety that has overcome American childhood flows from a reasonable fear of un-, under-, and just plain lousy employment.” Indeed, the meme itself conveys something distinctively millennial: not just precarious employment, but awareness of our own precariousness, which our elders refuse to accommodate or even acknowledge.

Though media stereotypes often portray millennials as brittle, wheedling, and demanding, for the most part young workers are docile enough to have bent themselves into whatever shape capital has required. Millennials aren’t fragile—they’re overstretched. This is the most human capital-intensive generation in history, productive far beyond the wages it garners. “From our bathroom breaks to our sleep schedules to our emotional availability, millennials are growing up highly attuned to the needs of capital markets,” Harris writes. “We are encouraged to strategize and scheme to find places, times, and roles where we can be effectively put to work. Efficiency is our existential purpose, and we are a generation of finely honed tools.” Racing to stay ahead, young workers accept low wages, sweated working conditions, and precarious arrangements. They do not expect that their employment will grant access to the social benefits enjoyed by their parents and grandparents—written off by Harris as history’s most entitled generations, anomalies not likely to occur again. Rather, the emblematic figure of the millennial workforce, the clearest expression of its tendencies, is the intern: the worker whose labor is disavowed entirely, made out to be for her benefit, like the labor of schoolchildren.

For Harris, even millennial forms of creativity and self-expression are captive to this logic. The young are transforming the entire culture, he argues, through the way that their human capital accumulation strategies are working themselves out in the culture industries. Pointing to professional and college athletes, musicians, tween entertainers, online pornography, and YouTube stars, he repeats his point: depressed entertainment profits have produced an arms race, causing aspiring performers, actors, and writers to laboriously produce their own stardom rather than wait to be discovered. He tells the story of Chicago rapper Chief Keef, who released forty of his own songs for free online and created his own label at age 16 without any corporate involvement. “By the time Interscope signed Keef, he was already a bona fide star, with the kind of brand they would have otherwise had to spend millions developing.”

While the restructuring of these industries allows for some Cinderella stories, its overall effect is to intensify exploitation, by others or by oneself. For Harris, this is both a way of interpreting mass culture today and a metonym for a classic millennial habit. “Older Americans like to complain about the way many young people obsessively track our own social media metrics, but it’s a complaint that’s totally detached from the behavior’s historical and material causes,” he writes. “Personal branding shifts work onto job-seekers.”

On social media—the heart of the matter, naturally, for a book on millennials—personal branding becomes indistinguishable from social life in general. The destruction of childhood as we once knew it by parents, teachers, and police has driven kids into social media for their “flirting, fighting, and friending.” Once online, these formerly free activities can be commodified, the living activities of childhood vacuumed up as data and monetized. Despite cyclical moral panics over new drugs or hookup culture, teens—busy engaging in their social lives online when they’re not doing homework—actually use fewer recreational drugs, drink less, and have less sex than their equivalents in prior generations. It’s risk-aversion again, says Harris—fuel for Silicon Valley profits now and more disciplined workers later.

The summation Kids These Days gives us is harrowing: here is a generation hurrying to give in to the unremitting, unforgiving commodification of the self. Harris predicts a future of debt servitude, confinement for the “malfunctioning,” worsening misogyny (though his gender analysis is less coherent than the rest of his argument), and total surveillance. Millennials, that is, are the first generation to live in the dystopia to come. Harris’s politics are revolutionary, and he dismisses any lesser mode of collective response to the thoroughgoing crisis as—in his metaphor—akin to playing with a toy. Ethical consumption, electoral politics, philanthropy and nonprofits, and social protest are all just switches and buttons, yielding fun noises and flashing lights but not having any effect: “The series of historical disasters that I’ve outlined, the one that characterizes my generation, is a big knot. There’s not a single thread we can pull to undo it, no one problem we can fix to make sure the next generation grows up happier and more secure.” What you do with a knot that you can’t untie is cut it.

There's more to it than what I shared above, but it's an interesting analysis of my generational cohort through Marxist principles.
 

wizards8507

Well-known member
Messages
20,660
Reaction score
2,661
From our bathroom breaks to our sleep schedules to our emotional availability, millennials are growing up highly attuned to the needs of capital markets.
That might be the most pretentious sentence I've ever read. Translation: Young people check Twitter while they're taking a dump.

This entire review reeks of urban / coastal bias. I'm 28 years old, I live in one of the bluest states on the map, and I work for a company that many consider to be a propaganda arm of the Democrat Party. And I have never, in my life, heard somebody actually claim that they were holding off on having children because of climate change. Yet this is the ubiquitous argument that an entire demographic cohort is supposedly making en masse? The rest of the analysis is similarly disconnected. It's not a generational divide outside of the major metropolitan centers. The people in small towns and suburbs are behaving as their parents did.
 

zelezo vlk

Well-known member
Messages
18,013
Reaction score
5,055
That might be the most pretentious sentence I've ever read. Translation: Young people check Twitter while they're taking a dump.

This entire review reeks of urban / coastal bias. I'm 28 years old, I live in one of the bluest states on the map, and I work for a company that many consider to be a propaganda arm of the Democrat Party. And I have never, in my life, heard somebody actually claim that they were holding off on having children because of climate change. Yet this is the ubiquitous argument that an entire demographic cohort is supposedly making en masse? The rest of the analysis is similarly disconnected. It's not a generational divide outside of the major metropolitan centers. The people in small towns and suburbs are behaving as their parents did.

I have several friends who refuse to have more than a couple kids (1 stopped at a single daughter) due to the bogeyman of overpopulation and its effects on climate change.
 

Old Man Mike

Fast as Lightning!
Messages
8,979
Reaction score
6,471
I never agree with Wizards about anything, but I'll break "tradition" on this: that there is any movement by young people to NOT have families based on environmental issues like Global Climate Change is howlingly preposterous. One can find exceptions to everything, but persons who would really bring climate change into a family-creating situation have hysterical personalities.

Getting back to my own "hysteria": GCC exists, and I have spent two decades studying its many dimensions. What's frustrating for an environmental scientist is that the science is so threatening to the American lifestyle of never denying oneself anything that such threats are psychologically silenced, and a childish ignorance is preferred by our "this moment is all you have, so grab it for all it's worth, and Devil take the hindmost" culture.

Paradoxically, I wish (just a little) that the emotional insanities would shift (slightly) toward greater concern for the future. Would that mean more hysterical wrecks not having children? Maybe. But if it also meant our citizens genuinely giving a damn about their grandchildren and breaking the hold of "never say no to yourself," I'd take it.
 