Taken literally, Nietzsche's famous aphorism—"What doesn't kill me makes me stronger"—is not entirely correct. Some things that don't kill you can still leave you permanently damaged and diminished.
Yet in recent years, far too many parents, teachers, school administrators, and students themselves have become taken with the opposite idea—that what doesn't kill you makes you weaker. They have bought into a myth that students and children are inherently fragile. For the most part, this represents an understandable desire to protect children from emotional trauma. But overwhelming evidence suggests that this approach makes kids less psychologically stable. By over-sheltering kids, we end up exposing them to more serious harm.
Making Kids Fragile
Consider the story of one of our children, Max Haidt, on his first day of preschool in 2009. Max was 3 at the time, and before he was allowed to take the first step on his 18-year journey to a college degree, his parents, Jon and Jayne, had to attend a mandatory orientation session where Max's teacher explained the school's rules and procedures.
The most important rule, judging by the time spent discussing it, was: No nuts. Because of the risk to children with peanut allergies, there was an absolute prohibition on bringing anything containing nuts into the building. Of course, peanuts are legumes, not nuts, but some kids have allergies to tree nuts, too, so along with peanuts and peanut butter, all nuts and nut products were banned. And to be extra safe, the school also barred anything produced in a factory that processes nuts—a category that includes many kinds of dried fruits and other snacks.
As the list of prohibited substances grew, and as the clock ticked on, Max's dad asked the assembled group of parents what he thought was a helpful question: "Does anyone here have a child with any kind of nut allergy? If we know about the kids' actual allergies, I'm sure we'll all do everything we can to avoid risk. But if there's no kid in the class with such an allergy, then maybe we can lighten up a bit and instead of banning all those things, just ban peanuts?"
The teacher was visibly annoyed by the question, and she moved rapidly to stop any parent from responding. Don't put anyone on the spot, she said. Don't make any parent feel uncomfortable. Regardless of whether anyone in the class is affected, these are the school's rules.
You can't blame the school for being cautious. Peanut allergies were rare among American children up until the mid-1990s, when one study found that only four out of every 1,000 children under the age of 8 were affected—meaning probably nobody in Max's entire preschool of about 100 kids.
But by 2008, according to the same survey using the same measures, the rate had more than tripled, to 14 out of 1,000—meaning probably one or two kids in Max's school. Nobody knew why American children were suddenly becoming more allergic to peanuts, but the logical and compassionate response was obvious: Kids are vulnerable. Protect them from peanuts, peanut products, and anything that has been in contact with nuts of any kind. Why not? What's the harm, other than some inconvenience to parents preparing lunches?
It turns out, though, that the harm was severe. It was later discovered that allergies were surging precisely because parents and teachers had started protecting children from exposure to peanuts back in the 1990s.
In February 2015, an authoritative study known as LEAP (Learning Early About Peanut Allergy) was published. It tested the hypothesis that "regular eating of peanut-containing products, when started during infancy, will elicit a protective immune response instead of an allergic immune reaction." The researchers recruited the parents of 640 infants 4 to 11 months old who, because they had severe eczema or had tested positive for another allergy, were at high risk of developing a peanut allergy. Half the parents were instructed to follow the standard advice for high-risk kids, which was to avoid all exposure to peanuts and peanut products. The other half were given a supply of a snack made from peanut butter and puffed corn and were told to give some to their child at least three times a week. The researchers followed all the families carefully, and when the children turned 5 years old, they were tested for an allergic reaction to peanuts.
The results were stunning. Among the children who had been "protected" from exposure, 17 percent had developed a peanut allergy. In the group that had been deliberately exposed to peanut products, the number was only 3 percent. As one of the researchers said in an interview, "For decades allergists have been recommending that young infants avoid consuming allergenic foods such as peanut to prevent food allergies. Our findings suggest that this advice was incorrect and may have contributed to the rise in the peanut and other food allergies."
In fact, it makes perfect sense. The immune system is a miracle of evolutionary engineering. It can't possibly anticipate all the pathogens and parasites a child will encounter—especially in a mobile and omnivorous species such as ours—so it's "designed" to learn rapidly from early experience. As a complex, dynamic system that is able to adapt to and evolve with a changing environment, it requires exposure to a range of foods, bacteria, and even parasitic worms in order to develop its ability to mount an immune response to real threats, such as the bacterium that causes strep throat, while ignoring nonthreats such as peanut proteins.
This is the underlying rationale for what is called the hygiene hypothesis, the leading explanation for why allergy rates generally go up as countries get wealthier and cleaner. As developmental psychologist Alison Gopnik has observed, children today play outside less than they used to, and so get less exposure to microbes, which can lead them to develop immune systems that overreact to substances that pose no real threat.
That phenomenon isn't limited to physiological development. "In the same way," Gopnik wrote in The Wall Street Journal, "by shielding children from every possible risk, we may lead them to react with exaggerated fear to situations that aren't risky at all and isolate them from the adult skills that they will one day have to master."
Sometimes children do need protection from real dangers. But teaching kids that failures, insults, and painful experiences will do lasting damage is harmful in and of itself. Human beings need physical and mental challenges and stressors, or we deteriorate.
To understand why this overprotective approach is so foolish, it helps to understand the concept of "antifragility," which New York University risk engineering professor Nassim Nicholas Taleb has explained by distinguishing between three kinds of things.
Some, like china teacups, are fragile: They break easily and cannot heal themselves, so you must handle them gently and keep them away from toddlers. Other things are resilient: They can withstand shocks without being permanently damaged. Parents usually give their toddlers plastic cups precisely because plastic can survive repeated falls to the floor (though the cups obviously do not benefit from such falls).
Taleb asks us to look beyond the overused word resilience, however, and recognize that some things are antifragile: They require stressors in order to learn, adapt, and grow. Many of the important systems in our economic and political life are like our immune systems in this way. Things that are antifragile become rigid, weak, and inefficient when nothing challenges them.
That is exactly what is happening on many college campuses.
The Rise of Safetyism
In the 20th century, as the United States was becoming less dangerous for children, the word "safety" was generally understood to mean physical safety. Yet in the 21st century, especially on some college campuses, the meaning of "safety" has undergone a gradual conceptual expansion to include emotional safety.
In 2014, Oberlin College posted guidelines for faculty, urging them to use "trigger warnings"—advance notice that certain kinds of ideas are likely to arise in a class—to "show students that you care about their safety." The rest of the memo makes it clear that what the college was really telling its faculty was: Show students that you care about their feelings.
You can see the conflation of safety and feelings in another part of the memo, which urged faculty to use students' preferred gender pronouns (for example, "zhe" or "they" for individuals who don't want to be referred to as "he" or "she"). The reason given was not that this was respectful or appropriately sensitive but that a professor who uses an incorrect pronoun "prevents or impairs [the student's] safety in a classroom."
If students have been told that they can request gender-neutral pronouns and then a professor fails to use those pronouns, they may well be disappointed or upset. But are these students unsafe? Are they in any danger in the classroom? Professors should indeed be mindful of their students' feelings, but how does it change the nature of class discussions—and students themselves—when the community is told repeatedly that speech should be judged in terms of safety and danger?
Why might an Oberlin administrator have chosen those particular words? In a 2016 article titled "Concept Creep: Psychology's Expanding Concepts of Harm and Pathology," the Australian psychologist Nick Haslam examined a variety of key concepts in clinical and social psychology—including abuse, bullying, trauma, and prejudice—to determine how their usage had changed since the 1980s. He found that their scope had expanded in two directions: The concepts had crept "downward," to apply to less severe situations, and "outward," to encompass new but conceptually related phenomena.
Take the word trauma. In the early versions of the Diagnostic and Statistical Manual of Mental Disorders (DSM), psychiatrists used the word "trauma" only to describe a physical agent causing physical damage, as in the case of what we now call traumatic brain injury. In the 1980 revision, however, the DSM-III recognized "post-traumatic stress disorder" (PTSD) as a mental disorder, the first acknowledgment that a traumatic injury could be nonphysical.
PTSD is caused by an extraordinary and terrifying experience, and the criteria for a traumatic event that warrants a diagnosis of PTSD were (and are) strict: To qualify, something would have to "evoke significant symptoms of distress in almost everyone" and be "outside the range of usual human experience." The DSM-III emphasized that this was an objective standard. It had to be something that would cause most people to have a severe reaction. War, rape, and torture were included in this category. Divorce and simple bereavement (as in the death of a spouse due to natural causes) were not, because they are normal, even if unexpected, parts of life.
These latter experiences are painful, to be sure, but pain is not the same thing as trauma. People coping with losses that fall outside the "trauma" category might benefit from counseling, but they generally recover without any therapeutic intervention. In fact, even most people who do have traumatic experiences recover completely without intervention.
By the early 2000s, however, the concept of "trauma" within parts of the therapeutic community had crept down so far that it included anything "experienced by an individual as physically or emotionally harmful…with lasting adverse effects on the individual's functioning and mental, physical, social, emotional, or spiritual well-being." The subjective experience of harm became definitional in assessing trauma. As a result, the word became much more widely used, not just by mental health professionals but by their clients and patients—including an increasing number of college students.
As with trauma, a key change for most of the concepts Haslam examined was the shift to a subjective standard. It was not for anyone else to decide what counted as trauma, bullying, or abuse; if it felt like that to you, trust your feelings. If a person reported that an event was traumatic or bullying or abusive, his or her assessment would increasingly be taken as sufficient evidence. And as the subjective standard spread, a rapidly growing number of students were diagnosed with mental disorders, which in turn created a rapidly growing demand that the campus community protect them.
Safe Spaces on Campus
Few Americans had ever heard of a "safe space" in an academic sense until March 2015, when The New York Times published an essay by Judith Shulevitz about students at Brown University preparing for an event on campus. Two feminist authors, Wendy McElroy and Jessica Valenti, were scheduled to debate "rape culture," the idea that "prevailing social attitudes have the effect of normalizing or trivializing sexual assault and abuse."
Proponents of the idea, such as Valenti, argue that misogyny is endemic to American culture, and that in such a world, sexual assault is considered less serious than other crimes. It's clear, especially in the #MeToo era, that sexual abuse is far too common. But does that make for a rape culture? It seemed an idea worthy of debate.
McElroy disputes the claim that America is a rape culture, and to illustrate her argument, she contrasts the United States with countries in which rape is truly tolerated. In parts of Afghanistan, for example, "women are married against their will, they are murdered for men's honor, they are raped. And when they are raped they are arrested for it, and they are shunned by their family afterward," she said at the debate. "Now that's a rape culture."
McElroy has firsthand experience of sexual violence: She told the audience at Brown that she was raped as a teenager, and that as an adult she was so badly beaten by a boyfriend that it left her blind in one eye. Nonetheless, she thinks it is untrue and unhelpful to tell American women that they live in a rape culture.
But what if some Brown students believe that they do? Should McElroy be allowed to challenge that belief, or would doing so put them in danger? "Bringing in a speaker like that could serve to invalidate people's experiences," one Brown student told Shulevitz, and that could be "damaging."
The logic seems to be that some Brown students' belief in the existence of a rape culture in America is based, at least in part, on their own lived experience of sexual assault. If, during the debate, McElroy were to tell them that America is not a rape culture, she could be taken to be saying that their personal experiences are "invalid" as grounds for their assertion.
In an illustration of concept creep and of the expansion of "safety" to include emotional comfort, the student quoted above and some classmates attempted to get McElroy disinvited from the debate in order to protect their peers. That effort failed, but Brown President Christina Paxson announced that she disagreed with McElroy and that, at the same time as the debate, the college would hold a competing talk where students could hear about how America is a rape culture without being confronted by different views.
The competing talk didn't entirely solve the problem, however. Because students could still be retraumatized by McElroy's presence on campus, the same student worked with classmates to create a "safe space" where anyone who felt "triggered" could recuperate and get help. The room was equipped with cookies, coloring books, bubbles, Play-Doh, calming music, pillows, blankets, and a video of frolicking puppies, as well as students and staff members purportedly trained to deal with trauma.
The threat wasn't just that painful personal memories might be reactivated; it was also that people's opinions would be challenged. One student who sought out the safe space put it this way: "I was feeling bombarded by a lot of viewpoints that really go against my dearly and closely held beliefs."
The general reaction to Shulevitz's article was incredulity. Many Americans—and surely many Brown students—could not understand why such extreme measures were needed to keep college kids "safe" from ideas. Couldn't they do that by simply not going to the talk?
But if you understand the fragile-student model—the belief that many college students are fragile in Taleb's sense of the word—it makes sense that all members of a community should work together to protect those students from reminders of past trauma. In this case, members of the Brown community should demand that the president (or somebody) prevent the threatening speaker from setting foot on campus. If you see yourself or your fellow students as flickering candles, you'll want to make your campus a wind-free zone. And if the president won't protect the students, the students themselves must care for one another, which seems to have been the positive motivation for creating the safe space.
But young adults are not candles. They are antifragile, not fragile. Research shows that this is true even of victims of violence and of those who suffer from PTSD: Studies find that most people report becoming stronger, or better in some way, after suffering through a traumatic experience.
That obviously doesn't mean we should stop protecting young people from potential trauma, but it does mean that the culture of safetyism is based on a fundamental misunderstanding of human nature and of the dynamics of trauma and recovery. It is vital that people who have survived violence become habituated to the ordinary cues and reminders that are inevitably woven into the fabric of daily life. Avoiding triggers is a symptom of PTSD, not a treatment for it.
Cognitive behavioral therapists treat trauma patients by exposing them to the things they find upsetting—at first in small ways, such as imagining them or looking at pictures. By activating their fears, they help their patients grow accustomed to the stimuli. In fact, the reactivation of anxiety is so important to recovery that some therapists advise their patients to avoid using anti-anxiety medication while undertaking exposure therapy.
For a student who truly suffers from PTSD, appropriate treatment is necessary. But well-meaning friends and professors who coordinate to hide potential reminders of painful experiences, or who repeatedly warn the student about the possible reminders he or she might encounter, could be impeding the person's recovery. A culture that allows the concept of "safety" to creep so far that it equates emotional discomfort with physical danger is a culture that encourages people to systematically protect one another from the very experiences they need to have in order to become strong and healthy.
Safety is good, and keeping others safe from harm is virtuous, but virtues can become vices when carried to extremes. When safety becomes a sacred value, people can become unwilling to make trade-offs demanded by other practical and moral concerns. "Safety" trumps everything else, no matter how unlikely or trivial the potential danger.
When children are raised in a culture of safetyism, which teaches them to stay "emotionally safe" while protecting them from every imaginable danger, it may set up a feedback loop: Kids become more fragile and less resilient, which signals to adults that they need additional protection, which makes them even more fragile and even less resilient. The result may be similar to what happened when we tried to keep kids safe from exposure to peanuts: a widespread backfiring effect in which the "cure" turns out to be a primary cause of the disease.
This article is adapted from The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure, by arrangement with Penguin Press, a member of Penguin Random House LLC. Copyright © 2018 by Greg Lukianoff and Jonathan Haidt.