zelezo vlk
Great great show
If true, you're the biggest non-prostate owner on this entire site. *reported
It's not a jump at all. "People have been doing it for thousands of years" is a shit reason to defend anything. "It's natural" is also a shit reason to defend anything. People have been doing 100% natural things that are really really bad for them since before the dawn of civilization. If you want to defend something, fine. But find a better argument than those two.
Does vegetarian/vegan fall under culture? Didn't think it fell under PC so putting this here
Pinellas Park Chick-fil-A protest angers parents, scares kids | WFLA.com
But it's still where the poop comes out.
That's not how a vagina works.
And that makes it less desirable than where the piss comes out?
That's not how a vagina works.
A couple of the women present said that they had forced themselves to have toe-curlingly embarrassing conversations with their teenagers on the subject. “I want my son to know that, despite what he might see on his laptop, there are things you don’t expect a girl to do on a first date, or a fifth date, or probably never,” said Jo.
A GP, let’s call her Sue, said: “I’m afraid things are much worse than people suspect.” In recent years, Sue had treated growing numbers of teenage girls with internal injuries caused by frequent anal sex; not, as Sue found out, because they wanted to, or because they enjoyed it – on the contrary – but because a boy expected them to. “I’ll spare you the gruesome details,” said Sue, “but these girls are very young and slight and their bodies are simply not designed for that.”
Her patients were deeply ashamed at presenting with such injuries. They had lied to their mums about it and felt they couldn’t confide in anyone else, which only added to their distress. When Sue questioned them further, they said they were humiliated by the experience, but they had simply not felt they could say no. Anal sex was standard among teenagers now, even though the girls knew that it hurt.
There was stunned silence among the mothers around that dinner table, although I think some of us may have let out involuntary cries of dismay and disbelief.
For Sue’s surgery isn’t in some inner-city borough where kids may have been brutalised or come from cultures where such practices are commonly used as contraception. Sue works in the leafy heart of Hampshire. The girls presenting with incontinence were often under the age of consent and from loving, stable homes. Just the sort of kids who, only two generations ago, would have been enjoying riding and ballet lessons, and still looking forward to their first kiss, not being coerced into violent sex by some kid who picked up his ideas about physical intimacy from a dogging video on his mobile.
No doubt no doubt. Today I logged on to IE to see wizard talking about how a vagina works.
I needed that after a long day.
If you think that the world today is a much better place than it was 50 or 60 years ago, consider the fact that, as I write this, there is a man in Japan whose company sells robotic sex dolls meant to simulate the experience of raping a child.
Shin Takagi, whose products have been routinely seized by customs authorities in the United Kingdom and Australia, is a self-proclaimed pedophile, albeit one who says he has never harmed a child.
"We should accept that there is no way to change someone's fetishes," he told The Atlantic in an interview last year. "I am helping people express their desires, legally and ethically. It's not worth living if you have to live with repressed desire." He insists that if he had not been allowed to perform repeated acts of self-abuse in the presence of his youthful androids, he would now be a rapist.
It is worth pointing out that Takagi's optimistic view of the quasi-therapeutic use of his products — or, as he insists, works of art — is generally dismissed in the mainstream robotic sex community. "Treating pedophiles with robot sex-children is both a dubious and repulsive idea," Professor Patrick Lin of California Polytechnic State University, San Luis Obispo, an expert in "robotic ethics," recently told the Telegraph of London.
The fact that "raping kids is wrong even if it's just make believe!" is an actual point that needs to be addressed by a credentialed academic in an obscure discipline might seem depressing. But I cannot help but take it as a sign of hope.
Perhaps without realizing it, Lin has pulled the rug out from under higher liberalism, that alliance of progressives who think there is a connection between people doing what they want with their private parts and economic justice, cynical neoliberals, #woke capitalists, and libertarian absolutists. For adherents of the higher liberalism, all moral questions can be decided with reference to the so-called "harm principle" of John Stuart Mill, according to which nothing can be considered immoral — and therefore subject to legal penalty — if it does not inflict injury upon a person other than whoever is responsible for the action in question. It is a simple, elegant, and hysterically wrong-headed argument that is nevertheless difficult to refute.
Part of this is because it is difficult to arrive at a broadly agreed upon definition of what exactly constitutes "harm." Surely bodily injury in the strict sense cannot be the limit. Everyone, even those who do not believe in the soul, acknowledges that there is such a thing as psychological harm. But generally the view of what is morally licit ends up coming down to whether or not two or more adults consent to a given course of action: fornication, sodomy, adultery, appearing in humiliating pornographic videos. (It is generally admitted that there are certain things — cannibalism, for instance — that no one can ever consent to, though an earnest libertarian did once admit to me that he had no problem with someone signing a contract during his lifetime stipulating that his body was to be given to necrophiliacs in exchange for a sum of money.)
Which brings us back to kid sex robots. In this case, there is no issue of anyone, a child or otherwise, being subjected to bodily harm in the strict sense. Nor is there any party whose consent must be obtained. It ought, according to standard harm-principle reasoning, to be a straightforward case of "Let 'er rip!"
Yet one imagines that Lin speaks for most people when he says that this cannot be countenanced. One could say that this is because it is possible that indulging in these appetites in an ostensibly harmless manner will sooner or later encourage pedophiles to seek out the real thing. But I doubt it.
The truth is that all of us at some level or another understand as if by instinct that certain desires are in and of themselves wrong. They should not be acted upon, placated, appeased, or in any sense met halfway. Most of us, one hopes, feel this way about pedophilia and bestiality (how far away are we, I wonder, from dog sex robots?), at the very least. It is the moral duty of people who want to hurt children or animals to banish such desires from their mind, to seek help, to pray — to do whatever it takes to ensure not only that they never carry out their fantasies but that they no longer have them. "Acting upon them" is beside the point; that anyone anywhere is contemplating such things is inherently evil.
Once this truth is acknowledged and it is accepted that certain impulses are immoral not simply because they have potential to lead to others' being harmed but because they are in themselves wicked, it becomes much more difficult to make the usual facile arguments in favor of everything from legalized marijuana to secular liberals' redefinition of marriage. The argument is no longer about abstract "harm" but about those old stalwarts good and evil.
Liberalism is going to need a new toolbox.
Gonorrhea May Soon Be Resistant to all Antibiotics (Scientific American)
On November 4, 2014, sixteen-year-old Cameron Lee, a popular, athletic, straight-A student at Henry M. Gunn High School in Palo Alto, California, leapt in front of a commuter train. His suicide note provided no clear reason for his act; there were no apparent signs of mental illness, and he was not a bullied misfit. His death followed two other student suicides just three weeks prior, one from the same school and another from a nearby private high school. Three months later, another senior at Gunn, by then known to local students as “the suicide school,” jumped to his death from the roof of his family’s home.
Gunn High School is located in one of the wealthiest school districts in the country and has some of the nation’s highest test scores. Its students succeed brilliantly in the meritocratic game of standardized tests and college admissions. But the pressure to perform has left them susceptible to feelings of worthlessness. If one can’t measure up and make the grade—what then?
Gunn saw a similar cluster of suicides in 2009. In separate incidents, three current students, an incoming freshman, and a recent graduate all jumped in front of the local Caltrain. That year another recent graduate of the school died by hanging himself. Following these suicide clusters, a 2014 survey of Palo Alto high school students revealed that 12 percent of them had very seriously contemplated suicide in the past year. Another recent report summarizing national and state-level surveys of American high school students put this number at 17 percent.
The largest school district in California, Los Angeles Unified, recorded more than five thousand incidents of suicidal behavior or deliberate self-harm (such as cutting) last year. When this district began tracking these issues in the 2010–2011 school year, there were only 255 incidents. Angus Deaton, a Princeton economist who won the Nobel Prize for his work on the intricacies of measuring human well-being, has been following what is now a national epidemic of suicide and depression. In a recent study, he found that since 1999 there has been an alarming national increase in deaths from drugs, alcohol abuse, and suicide—a trend that is especially pronounced among white Americans born since 1975. Deaton calls these “deaths of despair.”
Due to this epidemic of premature deaths, the overall life expectancy in the U.S. has begun to decline for the first time since the 1930s. In the year 2000, the outbreak of deaths of despair was concentrated in the Southwest (Nevada, Arizona, New Mexico). By 2007, the trend had spread to Appalachia, Florida, and the West Coast. By 2014, the epidemic was countrywide, found in both rural and urban areas in every region of the U.S. Add to this the drug overdose epidemic of the past few years—the worst drug crisis in U.S. history in terms of mortality—and these deaths of despair show no signs of slowing.
Depression is now the most common serious medical or mental health disorder in the United States. According to the World Health Organization, depression is the leading cause of disability worldwide. Sixteen percent of Americans will have an episode of major depression at some time in their lives, and six percent of all Americans—14 million—have suffered from major depression in the past year. Furthermore, rates of disabling depression have markedly increased over the past several decades, particularly among young people. According to data from the Department of Health and Human Services, more than three million adolescents reported at least one major depressive episode in the past year, and more than two million reported severe depression that impeded their daily functioning. A recent national study found that the share of twelve- to twenty-year-olds who had suffered major depression in the last year increased by 37 percent from 2005 to 2014. We are witnessing a rising plague of melancholy.
Most people who die by suicide are suffering some form of depression, whether major depressive disorder, the depressive phase of bipolar disorder, or alcohol- and drug-induced depressive states. The most recent data from the Centers for Disease Control indicates that, between 1999 and 2014, suicide in the U.S. rose dramatically for both men and women in every age bracket up to age seventy-five. Social scientists have been particularly baffled by the fact that the suicide rate among girls ages ten to fourteen has tripled. We should let these numbers sink in: Suicide is now the second leading cause of death among adolescents and young adults, and the tenth leading cause of death overall in the United States.
Rising rates of suicide, drug abuse, and depression can all be traced to increased social fragmentation. Since the 1980s, reported loneliness among adults in the U.S. has increased from 20 percent to 40 percent. The recently retired surgeon general announced last year that social isolation is a major public health crisis, on par with heart disease or cancer. He noted that loneliness is associated with increased risk of heart disease, stroke, premature death, and violence. It works in a way comparable to smoking or obesity: increasing a whole host of health risks and decreasing life expectancy. It is no accident that one of the most severe punishments we inflict on prisoners is solitary confinement—a condition that eventually leads to sensory disintegration and psychosis. It is not good for man to be alone.
Even where familial or other social connections remain intact, these ties are often weaker and the mutual obligations less binding today than in decades past. I recall one young adult patient who had given his depressed mother explicit permission to kill herself if she someday chose to do so. “I don’t want her to do it, but who am I to tell her she needs to continue living? It’s her decision.” If she was dying of despair, he was not going to get in the way.
Economic explanations alone cannot account for the rise in depression and suicide. Adolescent suicide, for example, is equally common among the very wealthy and the very poor. According to Deaton, the rise in suicides depends “on family, on spiritual fulfillment, and on how people perceive meaning and satisfaction in their lives in a way that goes beyond material success.” Family is the first society in which we gain social identity and security, and its declining fortunes have left many Americans vulnerable to despair. While overall divorce rates have declined modestly since a peak in the 1980s, divorce rates remain high for those without a college degree, and more Americans are simply opting out of marriage entirely.
Sociologists have documented the close connection between the retreat from marriage and declining religious participation, especially among the working class. As a consequence of these changes, many Americans have “lost the narratives of their lives,” as Deaton puts it. This leads to a loss of meaning and hope. In a survey of 35,000 people from all fifty states, the Pew Research Center found that the percentage of Americans who believe in God, attend religious services, and pray daily declined significantly from 2007 to 2014. This drop is more pronounced among whites than blacks, and is largely attributable to the “nones”—the growing cohort of Americans, particularly among the millennial generation, who say they do not belong to any organized religion. The religiously unaffiliated now account for 23 percent of the adult population, up from 16 percent in 2007.
What is behind these trends? There are doubtless complex factors in play, including economic problems. Predictably, liberals are calling for a stronger safety net and a single-payer health-care system, while conservatives are calling for a deregulated free market that will spur economic growth and raise all boats. Neither solution addresses the deeper cultural dynamics.
In 1897, Émile Durkheim published Suicide, an early attempt to understand the connection between culture and suicide. Noting the difference in suicide rates between Catholic and Protestant Germans, Durkheim argued that higher levels of social integration in Catholic societies helped reduce suicide, while greater individual autonomy and social isolation in Protestant societies tended to increase it. He identified two typical forms of suicide: There is egoistic suicide, stemming from a lack of integration into a community and leading over time to a sense of meaninglessness and ennui. Then there is anomic suicide, which increases during periods of social and economic upheaval—times at which people lose their communal moorings and drift toward despair.
In recent times, America has experienced both a weakening of social connections and rapid forms of cultural change. Robert Putnam of Harvard has documented a dramatic decline in social capital—the fabric of connections to family, friends, neighbors, and mediating institutions of society—over the past several decades. There has been a loss of blue-collar jobs (with an attendant loss of responsibility and social esteem for men), changing roles and expectations for women, increasingly unstable family structures, isolated suburban living, and absorption in television and the Internet.
What is lost with this decline of social capital? Thick social networks (the real, not virtual, variety) facilitate the exchange of ideas and information, as well as norms of mutual aid and reciprocity, collective action and solidarity. These help form our identities and give our lives a strong sense of purpose and belonging.
Too many people today have lost these moorings. Social bonds are weakening, and the social fabric is fraying. We are at risk of losing a solid identity, a clear orientation, and the coherent narratives that give meaning to our individual and shared lives. In a world stripped of universally binding truths, the sense that we are losing solid foundations leads to free-floating angst. This is a condition that cannot be tolerated for long.
William Styron’s memoir of melancholy is aptly titled Darkness Visible, a phrase taken from Milton’s description of hell in Paradise Lost. Styron recounts that his depression was a condition so mysteriously painful and elusive as to exceed description. The inability of others to understand this experience is part of what makes depression so isolating. Preferring the older term melancholia, Styron lodges a protest against the very word depression, a term used indiscriminately to describe an economic downturn or a rut in the road—a truly wimpy word for such a serious illness.
The medical and psychological sciences have taught us a lot about this affliction, but the full story of depression is more complex. Innate biological and genetic factors contribute, but social and cultural factors also play a role. In short, while depression does indeed involve a “chemical imbalance in the brain,” this does not mean that it is nothing but a chemical imbalance. Your serotonin and dopamine levels may be out of kilter, but you may still have a problem with your Tinder compulsion and dinners alone in front of the television.
We now have a sizable body of medical research which suggests that prayer, religious faith, participation in a religious community, and practices like cultivating gratitude, forgiveness, and other virtues can reduce the risk of depression, lower the risk of suicide, diminish drug abuse, and aid in recovery. To cite just one finding from among a growing body of medical research on this subject, Tyler VanderWeele of Harvard’s T. H. Chan School of Public Health recently published a study of suicide and religious participation among women in the U.S. Against the grim backdrop of increasing suicide rates, this study of 89,000 participants found that some groups remain protected from the rising tide of despair and self-harm. Between 1996 and 2010, those who attended any religious service once a week or more were five times less likely to commit suicide. Those who identified as either Catholic or Protestant had a suicide rate about half that of U.S. women in general. Of the 6,999 Catholic women who said they attended Mass more than once a week, none committed suicide. Religious practice turned out to be more important than mere affiliation; self-identified Catholics who did not attend Mass had suicide rates comparable to those of other women who were not active worshipers.
There are straightforward reasons why religious practice protects against suicide. Church attendance is a social activity that protects people against loneliness and isolation. While this is not of course a unique benefit of religion, certain things are. Judaism, Christianity, and (in most cases) Islam have strong moral prohibitions against suicide. In Hinduism and Buddhism, suicide is considered bad karma. When these moral prohibitions are internalized, they reduce the risk of deliberate self-destruction. Furthermore, religious faith can instill a sense of meaning and purpose that transcends present exigencies; this helps people not only survive periods of intense anguish, but even to find meaning in suffering. As a patient of mine once put it, “If not for my relationship with Jesus, I would have killed myself a long time ago.”
Finally, long-term studies of individuals at high risk for suicide—patients who have been hospitalized for suicidal ideation or a suicide attempt—are telling. To investigate the differences between high-risk patients who survive and those who die by suicide, researchers have analyzed medical and mental health diagnoses, symptoms, physical pain, social and economic factors, and so forth. Over a ten-year span, it turns out that the one factor most strongly predictive of suicide is not how sick the person is, nor how many symptoms he exhibits, nor how much physical pain he is suffering, nor whether he is rich or poor. The most dangerous factor is a person’s sense of hopelessness. The man without hope is the likeliest candidate for suicide.
Hope cannot be delivered by a medical prescription. Yet we know it is essential for mental health. Hope allows us to live today, here, now, even as it orients us toward the future. Those who survived the Nazi concentration camps later recalled that death camp prisoners knew whenever a fellow prisoner had abandoned the last vestiges of hope. The despair could be seen in his eyes and countenance, in the very way that he carried himself. In time, the prisoners developed a name for such people: “the walking dead.” Before long, the person who had lost hope would stop eating or drinking, would come down with a terminal infection, or would straggle and be shot. We cannot live without hope.
Contrary to popular myths about lemmings, suicide is a uniquely human behavior. Man is the only animal that deliberately takes his own life. Suicide is an act that requires rational self-reflection and awareness of one’s future. And it is influenced by one’s philosophical outlook and social context. Behavioral scientists describe depression as a response to toxic environments. Like the pain a child feels when he places his hand on a burner, depression can be a sign that an environment has become dangerous to the human organism. What are the toxic elements of contemporary culture that have led so many to withdraw into depression?
In a meritocratic age, we are valued for our usefulness. Whether in the rich precincts of Palo Alto, where children face high pressure to perform, or the forgotten stretches of West Virginia, Americans are increasingly told that they are valuable only insofar as they contribute to a productive economy. Old sources of meaning—fatherhood, fraternity, civic involvement, church membership—have receded in significance before the SAT and future earning power. When the useful replaces the good and efficiency becomes the highest value, human beings are instrumentalized. This happens at a personal level when freedom is seen as doing what you want, making life a mere means of gaining pleasure. Rather than opening up new vistas of freedom, economic and social liberation has made men subject to a logic of utility. Among the dreary death works produced by today’s culture industry, there are T-shirts that proclaim, “I’m not saying I hate you, but I would unplug your life support to charge my phone.”
The law is a teacher, and American law increasingly teaches indifference to life when it runs up against respect for radical autonomy. California and Colorado recently joined four other states in permitting doctors to assist terminally ill patients to take their own lives. In the same week that Gov. Brown signed the California bill, two British scholars published a study showing that laws permitting assisted suicide in Oregon and Washington have led to a rise in overall suicide rates in those states.
These findings should not surprise us. We know that publicized cases of suicide tend to produce copycat cases, often disproportionately among young people. Recall the recent spate of adolescent suicides in Silicon Valley. Social scientists call this “the Werther effect,” from Goethe’s eighteenth-century novel The Sorrows of Young Werther, in which the protagonist, thwarted in his romantic pursuits, takes his own life with a pistol. After the book’s publication, a rash of suicides among young men using the same means alarmed authorities in Germany.
A related phenomenon influences suicide trends in the opposite direction. Portrayals of people with suicidal ideation who do not attempt suicide, but instead find strategies to cope with adversity, are associated with decreased suicide rates. The so-called “Papageno effect” is named after a lovesick character in Mozart’s opera The Magic Flute whose planned suicide is averted by three child spirits who remind him of alternatives to death.
The case of fourteen-year-old Valentina Maureira, a Chilean girl who suffered from cystic fibrosis, illustrates both effects while highlighting the power of social influences. Maureira made a YouTube video begging her government to legalize assisted suicide. She admitted that the idea to end her life began after she heard about the case of Brittany Maynard, the twenty-nine-year-old woman who campaigned for the legalization of assisted suicide before ending her own life. Maureira, however, later changed her mind after meeting another young woman suffering from cystic fibrosis who encouraged her to persevere in the face of adversity. Her father complained that the media were only interested in her story when she wanted to die.
Besides the impact of publicized cases, we have evidence that suicidal behavior tends to spread person to person through social networks. These effects are measurable and reach up to three degrees of separation. My decision to take my own life raises not just my friends’ suicide risk; it raises that risk for my friends’ friends’ friends. No man is an island. Living as though we are self-creating, self-determining, atomized entities is dangerous to ourselves and to others.
As solidarity and mutual affection disappear from our public spaces, as the horizon darkens and loneliness grows, the small lights emanating from cohesive communities—grounded in faith and motivated by charity—will shine more brightly. Connections between one lonely individual and another will become all the more precious in a society that can only value individuals for their utility.
A few years ago, a man in his thirties took his own life by jumping off the Golden Gate Bridge (as more than fifteen hundred other people have done since the bridge was built). After his death, his psychiatrist went with the medical examiner to the man’s apartment, where they found his diary. The last entry, written just hours before he died, said, “I’m going to walk to the bridge. If one person smiles at me on the way, I will not jump.”
Aaron Kheriaty is associate professor of psychiatry and director of the Medical Ethics Program at the University of California Irvine School of Medicine.
Between 1996 and 2010, those who attended any religious service once a week or more were five times less likely to commit suicide.
I'm not sure that statistic implies what the article implies that it implies. It doesn't necessarily say that religious people are any less depressed. It's very possible that they're just as miserable as everyone else but the fear of eternal damnation keeps them from offing themselves.
I'm not sure that statistic implies what the article implies that it implies...
Whiskey posts an excellent article as always.
With a billionaire real estate tycoon occupying America’s highest office, the effects of riches upon the soul are a reasonable concern for all of us little guys. After all, one incredibly wealthy soul currently holds our country in his hands. An apocryphal exchange between F. Scott Fitzgerald and Ernest Hemingway holds that the only difference between the rich and the rest of us is that they have more money. But is that the only difference?
We didn’t used to think so. We used to think that having vast sums of money was bad and in particular bad for you — that it harmed your character, warping your behavior and corrupting your soul. We thought the rich were different, and different for the worse.
Today, however, we seem less confident of this. We seem to view wealth as simply good or neutral, and chalk up the failures of individual wealthy people to their own personal flaws, not their riches. Those who are rich, we seem to think, are not in any more moral danger than the rest of us. Compare how old movies preached the folk wisdom of wealth’s morally calamitous effects with how contemporary movies portray wealth: Mr. Potter in It’s A Wonderful Life, say, versus Tony Stark (that is, Iron Man) in the Avengers series of movies.
The idea that wealth is morally perilous has an impressive philosophical and religious pedigree. Ancient Stoic philosophers railed against greed and luxury, and Roman historians such as Tacitus laid many of the empire’s struggles at the feet of imperial avarice. Confucius lived an austere life. The Buddha famously left his opulent palace behind. And Jesus didn’t exactly go easy on the rich, either — think camels and needles, for starters.
The point is not necessarily that wealth is intrinsically and everywhere evil, but that it is dangerous — that it should be eyed with caution and suspicion, and definitely not pursued as an end in itself; that great riches pose great risks to their owners; and that societies are right to stigmatize the storing up of untold wealth. And we should. That’s why Aristotle, for instance, argued that wealth should be sought only for the sake of living virtuously — to manage a household, say, or to participate in the life of the polis. Here wealth is useful but not inherently good; indeed, Aristotle specifically warned that the accumulation of wealth for its own sake corrupts virtue instead of enabling it. For Hindus, working hard to earn money is a duty (dharma), but only when done through honest means and used for good ends. The function of money is not to satiate greed but to support oneself and one’s family. The Koran, too, warns against hoarding money and enjoins Muslims to disperse it to the needy.
Some contemporary voices join this ancient chorus, perhaps none more enthusiastically than Pope Francis. He’s proclaimed that unless wealth is used for the good of society, and above all for the good of the poor, it is an instrument “of corruption and death.” And Francis lives what he teaches: Despite access to some of the sweetest real estate imaginable — the palatial papal apartments are the sort of thing that President Trump’s gold-plated extravagance is a parody of — the pope bunks in a small suite in what is effectively the Vatican’s hostel. In his official state visit to Washington, he pulled up to the White House in a Fiat so sensible that a denizen of Northwest D.C. would be almost embarrassed to drive it. When Francis entered the Jesuit order 59 years ago, he took a vow of poverty, and he’s kept it.
According to many philosophies and faiths, then, wealth should serve only as a steppingstone to some further good and is always fraught with moral danger. We all used to recognize this; it was a commonplace. And this intuition, shared by various cultures across history, stands on firm empirical ground.
Over the past few years, a pile of studies from the behavioral sciences has appeared, and they all say, more or less, “Being rich is really bad for you.” Wealth, it turns out, leads to behavioral and psychological maladies. The rich act and think in misdirected ways.
When it comes to a broad range of vices, the rich outperform everybody else. They are much more likely than the rest of humanity to shoplift and cheat, for example, and they are more apt to be adulterers and to drink a great deal. They are even more likely to take candy that is meant for children. So whatever you think about the moral nastiness of the rich, take that, multiply it by the number of Mercedes and Lexuses that cut you off, and you’re still short of the mark. In fact, those Mercedes and Lexuses are more likely to cut you off than Hondas or Fords: Studies have shown that people who drive expensive cars are more prone to run stop signs and cut off other motorists.
The rich are the worst tax evaders, and, as The Washington Post has detailed, they are hiding vast sums from public scrutiny in secret overseas bank accounts.
They also give proportionally less to charity — not surprising, since they exhibit significantly less compassion and empathy toward suffering people. Studies also find that members of the upper class are worse than ordinary folks at “reading” people’s emotions and are far more likely to be disengaged from the people with whom they are interacting — instead absorbed in doodling, checking their phones or what have you. Some studies go even further, suggesting that rich people, especially stockbrokers and their ilk (such as venture capitalists, whom we once called “robber barons”), are more competitive, impulsive and reckless than medically diagnosed psychopaths. And by the way, those vices do not make them better entrepreneurs; they just have Mommy and Daddy’s bank accounts (in New York or the Cayman Islands) to fall back on when they fail.
Indeed, luxuries may numb you to other people — that Louis Vuitton bag may be a minor league Ring of Sauron. Some studies go so far as to suggest that simply being around great material wealth makes people less willing to share. That’s right: Vast sums of money poison not only those who possess them but even those who are merely around them. This helps explain why the nasty ethos of Wall Street has percolated down, including to our politics (though we really didn’t need much help there).
So the rich are more likely to be despicable characters. And, as is typically the case with the morally malformed, the first victims of the rich are the rich themselves. Because they often let money buy their happiness and value themselves for their wealth instead of anything meaningful, they are, by extension, more likely to allow other aspects of their lives to atrophy. They seem to have a hard time enjoying simple things, savoring the everyday experiences that make so much of life worthwhile. Because they have lower levels of empathy, they have fewer opportunities to practice acts of compassion — which studies suggest give people a great deal of pleasure. They tend to believe that people have different financial destinies because of who they essentially are, so they believe that they deserve their wealth, thus dampening their capacity for gratitude, a quality that has been shown to significantly enhance our sense of well-being. All of this seems to make the rich more susceptible to loneliness; they may be more prone to suicide, as well.
How did we lose sight of the ancient wisdom about wealth, especially given the ample evidence for it in recent studies?
Some will say that we have not entirely forgotten it and that we do complain about wealth today, at least occasionally. Think, they’ll say, about Occupy Wall Street; the blowback after Mitt Romney’s comment about the “47 percent”; how George W. Bush painted John Kerry as out of touch. But think again: By and large, those complaints were not about wealth per se but about corrupt wealth — about wealth “gone wrong” and about unfairness. But the idea that there is no way for the vast accumulation of money to “go right” is hardly anywhere to be seen.
Getting here wasn’t straightforward. Wealth has arguably been seen as less threatening to one’s moral health since the Reformation, after which material success was sometimes taken as evidence of divine election. However, extreme wealth remained morally suspect, with the rich bearing particular scrutiny and stigmatization during periods like the Gilded Age. This stigma persisted until relatively recently; only in the 1970s did political shifts cause executive salaries to skyrocket, and the current effectively unprecedented inequality in income (and wealth) begin to appear, without any significant public complaint or lament.
The story of how a stigma fades is always murky, but contributing factors are not hard to identify. For one, think tanks have become increasingly partisan over the past several decades, particularly on the right: Certain conservative institutions, enjoying the backing of billionaires such as the Koch brothers, have thrown a ton of money at pseudo-academics and “thought leaders” to normalize and legitimate obscene piles of lucre. They produced arguments that suggest that high salaries naturally flowed from extreme talent and merit, thus baptizing wealth as simply some excellent people’s wholly legitimate rewards. These arguments were happily regurgitated by conservative media figures and politicians, eventually seeping into the broader public and replacing the folk wisdom of yore. But it is hard to argue that a company’s top earners are literally hundreds of times more talented than the lowest-paid employees.
As stratospheric salaries became increasingly common, and as the stigma of wildly disproportionate pay faded, the moral hazards of wealth were largely forgotten. But it’s time to put the apologists for plutocracy back on the defensive, where they belong — not least for their own sake. After all, the Buddha, Aristotle, Jesus, the Koran, Jimmy Stewart, Pope Francis and now even science all agree: If you are wealthy and are reading this, give away your money as fast as you can.
In a sepia-toned portrait that looks like a dark relic of the Soviet era, five figures stand frowning in profile: Karl Marx, Friedrich Engels, Vladimir Lenin, Joseph Stalin and finally a computer-generated hot dog wearing green headphones. The image appeared on Twitter in mid-July, where it circulated among various casual users before finding its way to my feed. The wiener is not a socialist icon; in fact, he is a breakdancing sausage from a Snapchat filter. His inclusion in a lineup of the U.S.S.R.’s patron saints doesn’t mean anything. Maybe nothing does.
I am not a nihilist, but a mood of grim, jolly absurdism comes over me often, as it seems to come over many of my young peers. To visit millennial comedy, advertising and memes is to spend time in a dream world where ideas twist and suddenly vanish; where loops of self-referential quips warp and distort with each iteration, tweaked by another user embellishing on someone else’s joke, until nothing coherent is left; where beloved children’s character Winnie the Pooh is depicted in a fan-made comic strip as a 9/11 truther, and grown men in a parody ad dance to shrill synth beats while eating Totino’s pizza rolls out of a tiny pink backpack. In this weird world of the surreal and bizarre, horror mingles with humor, and young people have space to play with emotions that seem more and more to proceed from ordinary life — the creeping suspicion that the world just doesn’t make sense.
When it comes to doubting the essential meaningfulness of the world, millennials have their reasons. Studies show that traditional sources of meaning, such as religion and family formation, are less relevant to the lives of young people than they were to our parents. The moral structure they produced has been vastly loosened and replaced with a soft, untheorized tendency toward niceness — smarminess, really, as journalist Tom Scocca put it in 2013. Long-lasting careers seem out of reach; millennials are told to go to college so they can make money, but mostly they just amass debt and then job-hop in hopes of paying it off. In the meantime, they put off getting married, having kids, buying houses and so on. And waiting feels like — well, waiting. Millennials are not engaged at work (71 percent confessed this to Gallup), they have lost faith in our political system (only 19 percent say a military takeover is unacceptable), and many are lonely (57 percent reported such in a recent Match.com survey). Millennials aren’t strictly pessimistic by any means, but the occasional tussle with feelings of emptiness and despair seems de rigueur for my generation.
Yet the world is full of noise: Information is both more accessible (and perhaps more oppressively omnipresent) than ever and also less reliable; people select their own facts, and business-funded think tanks produce reports indistinguishable from hard data, except that they are not remotely true. Brands pose as friends on social media, especially to millennials, and if the line between real and artificial isn’t obliterated, it certainly seems to matter less than it once did.
Amid these trends, a particular style of expression has spread among young people. Rather than trying to restore meaning and sense where they’ve gone missing, the style aims to play with the moods and emotions of an illegible world. In a way, it’s a digital update to the surreal and absurd genres of art and literature that characterized the tumultuous early 20th century.
Tim Heidecker and Eric Wareheim are a pair of comedians whose work exists in the zone of the weird and grotesque, veering wildly between horror and humor. They made their debut on Adult Swim, basic cable’s top programming among 18-to-34-year-olds, back in 2006 and are due to release a new season of their series “Tim & Eric’s Bedtime Stories” this fall. Their skits run the gamut from slightly to extremely surreal, with lo-fi, retro graphics; distorted audio; and disjointed editing adding to the eerie feel. In one sketch, Tim and Eric compete in an increasingly deranged commercial to sell prices — fine European prices, premium prices, American-made prices, extremely small prices — no products, just prices. “It feels interesting to live in that surreal moment versus the horror of reality sometimes,” Wareheim told me, citing the prolonged, agonizingly uncomfortable shots and freakish close-ups in their show. There’s a sense of dull dread running through Heidecker and Wareheim’s work, but there’s also relief, an invitation to laugh at the awkward and absurd. “It’s an expression of that fear and anxiety,” Wareheim said, referring to one of their many skits focused on the tension of daily life. “But I just feel like it’s fun to watch our show, and you are transported to another dimension of similar things, but it’s not real, so you’re just like ‘ahh’ . . . it’s a pleasant surreal world.”
Tim and Eric are not alone. Other shows, such as Adult Swim’s “Rick and Morty” and Netflix’s “BoJack Horseman,” follow in this vein, imagining, as New Yorker critic Emily Nussbaum put it, “bleakness and joy” in a “teeming, surreal alternative universe.” Advertising aimed at young people, too, exhibits the trend. Consider a 2012 candy ad in which two teenagers stand nervously under the bleachers; one picks “Skittles pox” off the other’s pasty skin, then pops them in her mouth. Unlike the subcultural stoner comedy of yesteryear or the giddily absurd humor of classics like Monty Python, this breed of millennial surrealism is both mainstream and tangibly dark — it aims for wide swaths of young people, leaning in to feelings of worry, failure and dread.
Meanwhile, online culture allows more people to get in on the action, producing their own contributions to the meaningless, loopy, sometimes-sinister whirling gyre of the moment in the form of memes. In the simplest terms, memes are any pieces of cultural information that spread among groups by imitation, changing bit by bit along the way. In other words, distortion is a key attribute of this form, a warping effect that occurs as each instance of a meme grows more distant from its origin, sometimes losing any meaning whatsoever. (Gallows humor about the late Cincinnati Zoo gorilla Harambe, for instance, has transformed into a whole genre of jokes only tenuously related to the original ape.) For millennials, memes form the backdrop of life online.
Adam Downer is a 26-year-old associate staff editor at Know Your Meme, an online encyclopedia of the form where the oldest staffer tops out at about age 32, Downer told me. He spends his days scouring the Net for memes, documenting their origins and, when possible, explaining to readers what they mean. Since 2008, Know Your Meme’s staff has indexed some 11,228 memes and adds new entries to its database every day. The strangest meme he ever worked on, Downer says, was a bizarre mind-virus called “Hey Beter.” The meme consists of four panels, the first including the phrase “Hey Beter,” a riff on “Hey Peter,” referring to the main character of the comedy cartoon series “Family Guy.” What comes next seems to make even less sense: In one iteration, the Sesame Street character Elmo (wearing a “suck my a--” T-shirt) calls out to Peter, then asks him to spell “whomst’ve,” then blasts him with blue lasers. In the final panel, readers are advised to “follow for a free iphone 5.” (There is no prize.) “That one was inexplicably popular,” Downer told me. “I think it got popular because it was this giant emptiness of meaning. It was this giant race to the bottom of irony.”
Surrealism and its anarchic cousin dadaism are nothing new; neither is absurdism or weirdness in art. “The absurd,” Albert Camus wrote in 1942, “is born of this confrontation between the human need [for happiness and reason] and the unreasonable silence of the world.” Absurdity is the compulsion to go looking for meaning that simply isn’t there. Today’s surrealism draws aspects of all of these threads together with humor, creating an aesthetic world where (in common Internet parlance) “lol, nothing matters,” but things may turn out all right anyway.
After all, the weird — even the exceedingly weird — doesn’t have to be purely distressing. Consider the long-running Old Spice deodorant commercials in which a handsome hunk on a boat presents “ladies” with an oyster containing “two tickets to that thing you love,” which quickly become diamonds as he teleports onto a horse. (“I’m on a horse,” he coolly informs the 54 million people who have watched the clip on YouTube.) In his book “The Weird and the Eerie,” author Mark Fisher points out that, in most cases, “the response to the apparition of a grotesque object will involve laughter as much as revulsion.” And the weird, Fisher goes on, “is a signal that the concepts and frameworks which we have previously employed are now obsolete.” By staking out a playful space to meditate on emotions that are usually upsetting (like the dread and anxiety of living in a thoroughly postmodern world), millennial surrealism intermixes relief with stress and levity with lunacy.
There may be no mixture better suited for getting through ordinary life. In July, researchers at Harvard University announced that they had managed to store a gif inside living bacteria by altering the bacterium’s DNA. For scientists, the strange little success heralded important achievements in gene modification. Twitter user Honkimus Maximus welcomed the news with a meme depicting the “Simpsons” character Mr. Burns googly-eyed and sedate, receiving an injection of memes directly into his veins. “S O O N,” Maximus captioned the image. It already feels like now.
Elizabeth Bruenig just published an article in WaPo titled "Why is millennial humor so weird?"