Mental Health’s Stalled (Biological) Revolution: Its Origins, Aftermath & Future Opportunities
The 1980s, by common consensus, saw a dramatic and remarkably rapid pivot away from previously dominant psychoanalytic and social science perspectives in American psychiatry and toward a so-called medical model foregrounding biology and the brain. The standard understanding is that this happened because, after years of wandering lost in a Freudian desert, the field had finally gained some fundamental new biological understandings of mental illness. The standard understanding is wrong. Nothing of sudden significance had happened on the biological front. There had been no major scientific or therapeutic breakthroughs. Why, then, did the field really pivot? This essay aims to explain. The answer is important, not least because choices made back then have directly shaped the fraught world of psychiatry with which we live today.
In the 1980s, the field of American psychiatry pivoted suddenly and decisively away from previously dominant psychotherapeutic, social scientific, and psychoanalytic approaches to mental disorder, and instead embraced biological, brain-based, and pharmaceutical approaches. Why did all this happen?
For decades, the answer seemed clear: Before the 1980s, American psychiatry was lost in a Freudian wilderness. It had turned its back on all the fundamental principles of medical practice. It had lost interest in rigorous scientific research. It was hobbled by an incredibly sloppy approach to diagnostics. It was in the thrall of fantastical theories and of interminable, ineffective treatment practices. Then, sometime in the early 1980s, just when it seemed things could hardly get worse, some heroes arrived: biochemistry and neuroscience researchers armed with new science and new treatments. They made clear that the Freudian dinosaurs had to go. And the Freudians, now outed as the charlatans they were, left. The world celebrated, and psychiatry has never looked back. As journalist Jon Franklin put the matter in his Pulitzer Prize–winning series, “The Mind Fixers”:
Since the days of Sigmund Freud, the practice of psychiatry has been more art than science. Surrounded by an aura of witchcraft, proceeding on impression and hunch, often ineffective, it was the bumbling and sometimes humorous stepchild of modern science. But for a decade and more, research psychiatrists have been working quietly in laboratories, dissecting the brains of mice and men and teasing out the chemical formulas that unlock the secrets of the mind. Now, in the 1980s, their work is paying off.1
In the years since Franklin’s series, that basic story has continued to make the rounds in textbooks and popular writing alike. With time, it took on new elements, such as an insistence that the true father of modern psychiatry was the German psychiatrist and diagnostician Emil Kraepelin, not Sigmund Freud. By way of example, Richard Noll’s The Encyclopedia of Schizophrenia and Other Psychotic Disorders told the updated story this way:
It took major advances in medical technology, specifically the computer revolution and the rise of new techniques in neuroimaging, genetics research and psychopharmacology to swing the pendulum back to Kraepelin’s search for the biological causes of psychotic disorders. Historians of science now regard psychoanalysis as a pseudoscience that inexplicably dominated a subfield of medicine—psychiatry.2
Let us start by conceding the obvious: we have here a great and bracing story, a story with a strong moral message, a story with clear heroes and villains. We also have a story with a purpose: to be inspiring to researchers and members of the general public alike. The only problem with the story is that it is wrong. And not just a little wrong, but wrong in almost all its particulars. And this matters beyond the obvious reason that we should do right by the facts of history. It also matters because it implies that psychiatry, having shaken off the errors of the past, must be today in a stable and upward-trending space, steadily harvesting the fruits of its investments in biological research.
Psychiatry, however, is not in such a space. It is instead in a place of stalemate and uncertainty. On April 1, 2021—in his final essay prior to retiring from The New York Times—long-serving science journalist Benedict Carey told a different story about the state of the field, as he had experienced it over the decades. “When I joined the Science staff in 2004,” he reflected, “reporters in the department had a saying, a reassuring mantra of sorts: ‘People will always come to the science section, if only to read about progress.’ I think about that a lot as I say goodbye to my job, covering psychiatry, psychology, brain biology and big-data social science, as if they were all somehow related.” The truth was, he said, “during my tenure, the science informing mental health care did not proceed smoothly along any trajectory.” It did chalk up the occasional significant discovery (for example, identifying levels of consciousness in brain-injured patients who appear unresponsive), but “almost every measure of our collective mental health—suicide rate, anxiety, depression, addiction—went in the wrong direction.”3 In his 2022 book Healing, Thomas Insel, former director of the National Institute of Mental Health, told a similar story from the vantage point of a long-serving scientific leader in the field:
The scientific progress in our field was stunning, but while we studied the risk factors for suicide, the death rate had climbed 33 percent. While we identified the neuroanatomy of addiction, overdose deaths had increased by threefold. While we mapped the genes for schizophrenia, people with this disease were still chronically unemployed and dying 20 years early.4
The conclusion is obvious: the field is being called to update its image of itself and to forge a path to a different future. To do that successfully, however, it also needs to begin by shedding its attachment to self-serving origin myths and start on a more honest path to understanding how it has arrived in its present state.
When the field declared its liberation from Freud and announced a biological revolution was at hand, nothing of sudden significance had happened on the biological front. There had been no new treatments. All the treatments that were extolled in those years, especially drugs, were thirty years old, products of the 1950s, when the field was supposedly stalled and in the thrall of the Freudians. There had also been no major scientific breakthroughs. The most significant scientific advances in the field, such as they were, had also happened more than a generation earlier, during the alleged Freudian dark ages. In the 1950s and early 1960s, scientists, largely working at the NIH, had shown that different drugs can act to raise or lower levels of various newly discovered neurochemicals, with names like dopamine, norepinephrine, and serotonin. At the time, no one had used that work as the basis for declaring a wholesale revolution in mental health care or treatment.
Why then did the field really pivot? The short answer is: not because of science, but because of complacency, arrogance, and professional overreach that led to an internal revolt. The long answer, however, is more illuminating and worth taking time to understand.
In the decades just before World War II, American psychiatry was an eclectic patchwork of practices and perspectives, some biological and some more environmental. The biologically oriented psychiatrists worked mostly in state hospitals and looked after the severely and chronically mentally ill. While there had been a tendency since the early twentieth century to see hospital psychiatry as a backwater branch of medicine, the 1930s had also seen a modest rise in its public reputation, as new somatic interventions like shock and surgical treatments were introduced.5 Even lobotomies, today remembered as one of the most barbaric and ill-considered technologies ever employed in the history of psychiatry, were back then often discussed by the press in relatively optimistic ways.6
The more environmentally oriented psychiatrists, working largely outside the hospital system, were meanwhile focused on a very different kind of mission: to identify and treat people who were not yet truly mentally ill, but who were also not quite right: troubled people, nervous people, neurotic people, maladjusted people. Virtually everyone admitted that some of these people might be incorrigibly defective, and therefore best handled through institutionalization in a colony of the “feeble-minded” or through more radical measures like sterilization.7
Nevertheless, there was a general view that, for many others, the roots of their troubles lay not in some biological defect but in bad habits, bad neighborhoods, and bad families. This suggested that many might still be salvageable. To rescue them, this branch of psychiatry invented a wide range of new institutions and programs: new kinds of public education efforts, new forms of outreach into schools and communities, new professions like psychiatric social work, and new institutions like child-guidance centers and psychiatric outpatient clinics. By the 1930s, many of the psychiatrists involved in these programs had also discovered psychoanalysis and were incorporating Freudian ideas about unconscious conflict, fantasy, and early childhood trauma into the ways they thought about their patients.8
Through the 1920s and 1930s, the biological and environmental approaches to managing mental distress, disorder, and deficiency coexisted, more or less equitably if a bit uneasily. World War II changed that dynamic. When the war came, it was primarily the psychiatrists focused on “nearly normal” populations of patients who stepped up. Their tools and approaches seemed far better suited to treating the epidemic of traumatized soldiers, patching them back together using techniques they had used on their neurotic and maladjusted patients back home, such as quick psychotherapy and suggestive therapy. They were sent into the field, and many documented the impressive results of their techniques: “The stuporous become alert, the mute can talk, the deaf can hear, the paralyzed can move, and the terror-stricken psychotics become well-organized individuals.”9
Widely seen as a team that had gotten the job done—even as it was quietly recognized internally that they had fallen short in many ways—the Freudian-leaning contingent of psychiatry next took the position that, because they had helped win the war in ways that their biological colleagues had not, it was they who were now best placed to maintain the peace.10 The battle mentality that had served them so well during World War II now had to be applied to the urgent mental health needs of civilians in a dangerous postwar world, they said. In May 1948, William Menninger—who had served during the war as the Chief Psychiatric Consultant to the Surgeon General of the Army—met with President Harry Truman, and asked if he would be willing to send “a message of greeting” to be read at the upcoming annual meeting of the American Psychiatric Association. Truman approved the following statement—probably written by Menninger himself:
Never have we had a more pressing need for experts in human engineering. The greatest prerequisite for peace, which is uppermost in the minds and hearts of all of us, must be sanity—sanity in its broadest sense, which permits clear thinking on the part of all citizens. We must continue to look to the experts in the field of psychiatry and other mental sciences for guidance in the evaluation of our mental health resources.11
“The greatest prerequisite for peace . . . must be sanity.” This hardly seems like a medical project in the ways that most people would understand the term—because it really wasn’t. It was a political project. Building on the environmentalist thinking of the interwar years that had produced social workers and child-guidance clinics, Menninger and many of his colleagues had come to believe that most social problems had their origins in individual psychological deficits. For this reason, psychiatry in the postwar era was crucial for any and all efforts to tackle the great social and political threats of the age: the allure of authoritarian governments, the persistence of anti-Semitism, and the scourge of chronic poverty, social deviance, crime, and social unrest. In 1946, a group of bold psychiatrists headed by Menninger fashioned themselves into an organization called the Group for the Advancement of Psychiatry (GAP) to map out a new and expansive agenda for their field.12
As they shored up their authority, GAP’s leadership also went to the trouble of explicitly attacking the treatments within biological psychiatry that had once won them some claims to respectability: shock and surgical treatments. Their very first white paper targeted electroshock treatment, warned against its “reported promiscuous and indiscriminate use,” and insisted that it should never be seen as a primary treatment in its own right, but employed, if at all, only as an “adjuvant in a total psychiatric treatment program” that centered psychotherapy and other psychosocial interventions.13
Also in 1946, Truman was persuaded to sign legislation that would establish the very first federal agency devoted to psychiatry. Tellingly, the decision was made to call the agency not the National Institute of Mental Illness or the National Institute for Insanity, but the National Institute of Mental Health (NIMH). The choice of name was intended to signal that the institute was charged to extend beyond a focus on disease, beyond a conventional medical agenda.14 The first director of the NIMH, Robert Felix, had a primary background in public health and a keen interest in the psychosocial causes of drug addiction. As he explained, “I was interested in the stories I was getting from these people about why they relapsed to drugs or why they got on drugs in the first place. I’d get stories like bad companions, disappointment with life, I couldn’t stand the pressure.”15
Felix’s disciplinary leanings helped ensure that, from the beginning, the new NIMH prioritized a community-minded, social science-inflected approach to mental health and illness above the somatic concerns of the old hospital-based psychiatry (though the older concerns were not wholly absent). In 1952, Felix asked a psychoanalyst named Robert Cohen to take charge of developing the NIMH intramural research portfolio. Cohen brought an expansive, interdisciplinary vision to the charge, with lots of space for social science, developmental, and psychoanalytic perspectives, including a laboratory of socioenvironmental studies.16
It was obvious which way the winds were blowing. Already, by 1947, more than half of all American psychiatrists (the elite half) worked in private practice or at outpatient clinics. By 1958, only about 16 percent of psychiatrists—many of them foreign nationals—were working in state hospitals.17 Two years later, 95 percent of medical schools reported teaching psychoanalytic and psychodynamic methods, and virtually every departmental chairperson affirmed that psychodynamic approaches dominated the field.18
Contrary to what many of us today might suppose, the arrival of antipsychotics, anxiolytics, and antidepressants in the 1950s was not widely perceived as a threat to any of this. All products of clinical serendipity rather than biological research, the drugs were, to be sure, almost immediately embraced by clinicians (including general practitioners) for their practical benefits. Within psychiatry, hospital administrators welcomed especially the ability of the class of drugs then known as “major tranquilizers” to manage people with agitated psychoses, and speculated that their existence might even allow the hospitals to begin to discharge more patients.19
Nevertheless, the intellectual leadership within psychiatry was reluctant to pronounce the drugs to be some kind of game-changer for the field. Looking back in 1975, Robert Felix explained the position he had taken at the time. Electroconvulsive treatment, insulin shock therapy, and lobotomy, he recalled, had also once been hyped as game-changers, only to fall short of expectations and cause more harm than good. What reason was there to think that the drugs would be any different?
We had all been praying for the pill or a draught of medicine or whatnot which would cure the madman. Well, we would sit, and over and over again, something would come up, and it was the answer. Shock was. Insulin was. Lobotomy was another one. One thing after another was going to cure all kinds of ills. . . . [For this reason] I wanted to approach [the new drugs] a little more conservatively and I think I was wrong.20
Nevertheless, some mental health activists at the time (led by journalist-turned-lobbyist Michael Gorman) began to put pressure on Congress to allocate funds to the NIMH so its researchers could study these drugs more systematically. And, under pressure, Felix finally agreed in 1956 to create a new research unit within the NIMH: the Psychopharmacology Service Center (PSC). The purpose of this center was to figure out strategies for evaluating the efficacy of the drugs. Did your study need drug-naive subjects? Did you need a placebo in your control group? How long would you look for possible improvement, and what measures would you use to assess it? All these questions needed to be answered, and a young psychiatrist named Jonathan Cole was hired to spearhead the effort.21 The upshot was that not only was the staff at the PSC able to demonstrate that new drugs like chlorpromazine worked better than placebos, but along the way, they also largely invented the toolkit for a new field called clinical psychopharmacology.
By the mid-1950s, some of the new antidepressant drugs had begun to inspire new kinds of laboratory research. More specifically, physiologists at the National Heart Institute of the NIH (not the NIMH itself) had begun experimenting on the behavior and physiology of laboratory animals, first dosing the animals with reserpine (one of the new major tranquilizers) and then injecting them with one of the new antidepressants. They found that a protocol like this first sedated and then energized the animals, while simultaneously altering levels of newly discovered chemicals in their nervous systems (serotonin and norepinephrine). The ongoing efforts to figure out the mechanism responsible for these changes led to Julius Axelrod being awarded a Nobel Prize in 1970 for his work on the ways antidepressants act to inhibit the reuptake of certain neurotransmitters in the synapse.22
Even with these developments, Freudian and psychosocial ideas still dominated both research and practice. Few if any drew the conclusion, at least publicly, that psychopharmaceutical researchers’ wins justified calling for a radical changing of the guard. Quite the contrary, in the years following President Johnson’s declaration of a “war on poverty” in 1964, the NIMH itself doubled down on its commitment to psychosocial research, investing in projects like ongoing outreach for troubled children; understanding the effects of poverty, social isolation, and racism on mental health; and addressing social ills such as juvenile delinquency and violence.
Among their many projects in these years, however, none was more consequential than the so-called community mental health initiative. It envisioned a dramatic recentering of the nation’s care of the severely mentally ill away from the century-old state hospital system and toward community-based care that would allow patients to live among ordinary people in the neighborhoods from which they came.
Discontent with the state mental hospital system went back to at least the immediate postwar years when conscientious objectors undertook a campaign to expose the hospitals’ appalling conditions.23 The most famous of the exposés was a Life magazine spread called “Bedlam 1946.” The photographs in this spread had self-consciously aimed to remind people of other images recently seared in their imaginations: Nazi concentration camps.
Thousands spend their days—often for weeks at a stretch—locked in devices euphemistically called “restraints”: thick leather handcuffs, great canvas camisoles, “muffs,” “mitts,” wristlets, locks and straps, and restraining sheets. Hundreds are confined in “lodges”—bare, bedless rooms reeking with filth and feces—by day lit only through half-inch holes in steel-plated windows, by night merely black tombs in which the cries of the insane echo unheard from the peeling plaster of the walls.24
The idea that mental health care was most successful when carried out in the community was also not new. It had its origins in so-called “first-aid” psychiatry: early-intervention care for soldiers during World War II carried out in settings that kept the men close to their platoons and friends. After the war, when psychiatry began to turn its attention to the mental health challenges found in the civilian population, many remembered these wartime experiences and wondered if there were lessons for the postwar era. Should psychiatry still privilege an approach to care that involved shipping mentally ill people away to remote hospitals, disconnecting them from familiar communities and neighborhoods? Was there possibly another way forward?
Even with all this restless desire for change, no one had been able to imagine a workable alternative to the mental hospital for the seriously or chronically mentally ill. For decades, it was simply assumed that such people could not care for themselves outside of an institutional setting, that they would pose a risk to society if they lived in the community, or both.
What was different now? Drugs. Not because the leaders in the field believed that the drugs were key to a new biologically based approach to mental health care, but because they were persuaded that the drugs were critical managerial tools for realizing their bold policy goals. The argument was that even if the drugs did not cure any ailment, they might nevertheless be able to stabilize many patients to the point at which they could be discharged to the community. In the optimistic words of John F. Kennedy when he announced his hopes for a new community mental health care program in February 1963: “This approach relies primarily upon the new knowledge and new drugs acquired and developed in recent years which make it possible for most of the mentally ill to be successfully and quickly treated in their own communities and returned to a useful place in society.”25
By October 1963, Kennedy had signed the relevant legislation, and the NIMH began to hand out grants for states to build community mental health centers. The centers started to get built, though not as many as had been expected, and with staffing levels that often fell far short of need. The states nevertheless began to release the patients from their hospitals in great numbers. To get a sense of the scale of the shift: In 1955, there were 350 state hospitals with a resident population of about 560,000. By 1977, there were 160,000 patients in public mental hospitals, a drop of 400,000 (71 percent) in just two decades. By 1994, there were only about 70,000 patients being treated in mental hospitals around the country—and this during a period when the U.S. population as a whole grew by nearly three-quarters (from about 150 million to about 260 million). The state governors embraced these changes as an opportunity to slash budgets. The hospitals had always cost too much anyway.26
The drugs were supposed to stabilize all these people sufficiently to make it possible for them to be looked after in the community, but it soon became clear that the drugs achieved this imperfectly. Medicated patients were still often unwell on many levels: they lacked motivation, they still acted in ways that discomfited their neighbors, and they failed to keep appointments. Moreover, because the drugs also produced significant unpleasant side effects, many patients, once they were released from the hospital, stopped taking them. By the late 1970s, countless mentally ill people who had previously lived in hospitals were now living instead in dreary for-profit boarding houses with little health care, on the streets, or in jails. Or, if they were lucky, they were living with their aging parents, who felt betrayed by the system, were desperate for better care and resources, and were becoming increasingly angry.27
Trouble started to brew for the psychiatrists driving all of these programs, and the increasingly recognized failures of deinstitutionalization were only part of the reason. The 1970s brought a perfect storm of crises that shook the foundations of their authority. Protests against the Vietnam War began to target not just the government but also psychiatry, as clinicians working in the VA hospitals found themselves accused of covering up for the government’s failings by withholding the truth about what the war was doing to soldiers’ mental health.28 Feminism was on the rise, and in that context, psychoanalysts found themselves accused of covering up the scandalous truth of childhood sexual abuse.29 Gay, lesbian, and bisexual activists began to picket outside meetings of the American Psychiatric Association, insisting that they were sick and tired of having their love lives made into a sign of disease.30 Multiple critics associated with a movement sometimes called “antipsychiatry” argued that psychiatry did not seem to be very interested in conventional medical issues, and that the field only cared about managing social deviance.31 As a recession hit the American economy in the mid-1970s, with all these critiques in the air, health insurance companies began to ask why they should reimburse clinicians who didn’t seem to practice medicine, and didn’t seem to know or care much about disease.
As the storms whipped around psychiatry, the out-of-power biological wing of the field sensed an opportunity and, perhaps, some responsibility to step up. Enough was enough. The field had gotten itself into its present troubles by being both unscientific and hubristic. It was time to pull back and get down to brass tacks—become “medical” once more. Or to put the matter more bluntly, it was time for biologists to be in charge. As Samuel Guze, one of these biologists, mused in 1994: “One of the things we began to realize is that there were people around the country who felt that they wanted something different and were looking for someplace to take the lead.”32
How did they make their case? Tellingly, while they gestured to the research from the 1950s and 1960s, their arguments were largely waged on a platform of common sense. Of course psychiatry is a branch of medicine! Of course mental illnesses are real diseases with real biology! Of course the field should respect scientific methods! Of course exact diagnosis is important! How could we have ever let the situation degenerate to the point where such things could be questioned?33
In 1978, Gerald Klerman, administrator of the Alcohol, Drug Abuse, and Mental Health Administration (which at the time oversaw the NIMH along with the federal institutes on alcohol abuse and drug abuse), appointed Herbert Pardes as director of the NIMH and charged him with turning the institute around. The organization needed to shed its long-standing psychosocial activist mission, and align itself with the medical mission of the NIH. In pursuing this project, Pardes found an unexpected but ultimately very powerful ally: families of schizophrenic patients. Families who had lived through the traumas of deinstitutionalization and the chronic stresses of trying to navigate a community-based mental health system that generally failed to deliver adequate services. Families who, at the same time, had been told by psychoanalytic psychiatrists that they—and especially the mothers—were responsible for making their children sick in the first place.
In 1983, a young psychiatrist named E. Fuller Torrey published a book titled Surviving Schizophrenia. The audience for the book was not patients or doctors but families. They too needed a manual to help them “survive” the disorder, he said, especially in light of the enormous burden now being placed on them. Surviving Schizophrenia opened by making perfectly clear that these families were as much victims as their offspring. Schizophrenia, Torrey told them, was “now definitively known” to be a “brain disease,” and they could best help both themselves and their children by working to persuade the government and the profession to acknowledge this fact and commit to biological solutions for a biological problem.34
They took this advice to heart. Taking the name of NAMI—the National Alliance for the Mentally Ill—these families embarked on a stunningly successful media, fundraising, and governmental pressure campaign to redirect psychiatry along biological lines. “Remedicalization is what we families want,” declared one of them in 1979.35 Pardes, who attended their first meeting that same year, marveled at their energy and effectiveness.36 One anonymous NIMH official later called NAMI, ferocious as they were, “the barracuda that laid the golden egg.”37 It was perhaps an unlikely partnership, but it worked because both families and a profession in crisis had decided, for different reasons, that biology was a road to redemption for the profession and a fresh start for patients.
And so it went that biology won the day—partly with the help of those activists and partly because Freudian psychiatry proved unable to recover from all of the self-inflicted wounds of the 1970s. In 1980, an initially humdrum project to revise the profession’s diagnostic and statistical manual turned into an opportunity to expunge virtually all psychoanalytic language and concepts from the universe of psychiatric diagnostic categories, and (in the eyes of many) to set the field up for a new era of rigorous, biological practice and research.38 In 1997, Edward Shorter summed up the 1980s consensus (as well as his own view at the time):
The appearance of DSM-III was . . . an event of capital importance not just for American but for world psychiatry, a turning of the page on psychodynamics, a redirection of the discipline towards a scientific course, a reembrace of the positivistic principles of the 19th century, a denial of the antipsychiatric doctrine of the myth of mental illness. . . . Freud’s ideas, which dominated the history of psychiatry for the past half century, are now vanishing like the last snows of winter.39
The biological psychiatrists had declared victory, but had done so in the absence of any new radical breakthroughs in biological understanding or treatment. Their next task was to deliver on the promises that most people thought they had already kept. Reality needed to catch up with rhetoric. Initially, some felt that the 1990s would be the decade when it would all come together. Biological research would finally get the money it had been starved of for so many decades, and new insights and evidence-based treatments would follow in short order.40
Early on, the field was particularly bullish about the potential of new brain imaging technologies (both PET and fMRI) to be a game-changer. The hope was that, in due course, technologies like these would allow psychiatrists to look at the brains of their patients in the same way that a cardiologist looks at the heart of patients using an angiogram—in order to “see” what is wrong. Intensive investment in these technologies failed, however, to move knowledge of mental illness forward in the definitive ways that so many psychiatrists had hoped. There were plenty of findings, but they varied across studies and proved hard to replicate and interpret.41 Above all, the new neuroimaging work failed to have any appreciable impact on how the overwhelming majority of patients were diagnosed and treated. As Thomas Insel, director of NIMH, soberly concluded in 2010: “During the so-called Decade of the Brain, there was neither a marked increase in the rate of recovery from mental illness, nor a detectable decrease in suicide or homelessness—each of which is associated with a failure to recover from mental illness.”42
What about genetic research? In the late 1980s, it briefly looked as if there had been a decisive breakthrough, when the claim was made that a certain segment of DNA on a particular chromosome was found in some 80 percent of people suffering from manic depression—at least, in a particular community of Amish people, where the work had been carried out.43 But that turned out to be a false lead, and the original hope that there would be a “bipolar gene” was deemed naive and gave way to a hunt for multiple genes.44 This was followed by a recognition that genetic risk factors might be shared across disorders. It all led to a growing, if reluctant, understanding that research into the genetics of mental disorders was going to be very complicated, and that it could be not years but decades before any of the work yielded practical results for patients. In 2001, David Dunner, a leading researcher on mood disorders, reflected wistfully on this period of recalibration:
I am disappointed that we have never identified the “bipolar gene.” . . . I realize now how complicated it is and how naïve we were. Very good people are now looking for the genes, not a single gene. I am not going to be the one to find them, but it would be nice to know that there really are genes when patients ask, “Is this a genetic disorder?” and I can only say, “Well, we think so.”45
There were also no fundamental breakthroughs in drug development. New variants on older drugs—like the SSRI (selective serotonin reuptake inhibitor) antidepressants and the new antipsychotics like clozapine—were an improvement in the sense that they caused fewer acute side effects than their predecessors—no small thing. That side-effect profile also meant they tended to be far more widely prescribed than the older drugs had been. But they generally did not work better than the older drugs, they did not work for everyone, and over time their own long-term health consequences became clearer.46
Nevertheless, and rather paradoxically, this was still the era when drugs began to dominate virtually all conversations about how to handle mental suffering, certainly among psychiatrists (as opposed to psychologists and social workers). This new consensus, however, did not happen simply because everyone now “believed” in the medical model, or because prescribing privileges were one of the few things that still allowed psychiatrists to assert their identity as physicians, or because in the 1990s, psychoanalysis continued to suffer a steady onslaught of blows to its reputation. All these factors were real and relevant, but by the late 1980s, they were dramatically amplified by a critical mass of clinicians and researchers who had aligned their professional interests with the commercial interests of the pharmaceutical industry. Feeling like the poor relations of the medical world—and financially pinched by the incursion of psychology and social work onto their turf—many psychiatrists found the siren call of consulting work difficult to resist. In 2008, disclosure reports filed by 273 speakers at the annual meeting of the American Psychiatric Association revealed that, among them, the speakers had signed 888 consulting contracts and 483 contracts to serve on so-called speakers’ bureaus for drug companies.47
None of these developments, though, changed the bottom line: there had been no significant scientific advances to guide drug development since the 1960s. In spite of what the public believed, the period from the 1990s through 2010, when drugs dominated conversations about mental health, was in fact, as one article in Nature Reviews Drug Discovery admitted, “a barren time for the discovery of novel drugs for psychiatric disorders.”48 As their patents ran out, as they struggled with a growing and puzzling placebo-effect problem, and as nothing genuinely new seemed to be coming through the pipeline, the drug companies began to abandon the field. They just couldn’t figure out any new ways to make big money anymore.49
And then came one final blow. Psychiatry’s diagnostic manual, the so-called DSM, once hailed as a foundational text for a new, medically minded psychiatry, came under public attack—not just by disgruntled outsiders (that had been happening since the 1990s), but by informed insiders. More specifically, in 2013, Insel, director of the NIMH, declared that the DSM had not only failed to deliver on its promise to drive biological research but had actually impeded such research, adding: “Biology never read that book.” He announced that the NIMH would no longer be using it as a basis for any of its research initiatives. It was an amazing slap-down. This, after all, was the book that was supposed to act as the foundation for psychiatry’s biological mission.50
The DSM rebuke came in 2013. Two years later, in 2015, Insel made another move, one that suggested how pervasive the malaise within the field had become. He declared that he was resigning from the directorship of the NIMH and abandoning biological research because, despite billions of dollars in investment, it just hadn’t been able to deliver on its promises. A year or two later, he told a journalist what had driven his thinking at the time.
I spent 13 years at NIMH. . . . I succeeded at getting lots of really cool papers published by cool scientists at fairly large costs—I think $20 billion—I don’t think we moved the needle in reducing suicide, reducing hospitalizations, improving recovery for the tens of millions of people who have mental illness. . . . I hold myself accountable for that.51
The conclusion seems clear. The “revolutionary” biological psychiatry that was born in the 1980s had, by 2017 or so, largely run into the sand. It simply had not been able to advance at the pace needed to maintain its relevance in the face of the urgent mental health needs of the times.
A year or two after that moment of confession, though, there were some signs that the story around drugs might be shifting for the first time in years. In 2019, the FDA approved Janssen Pharmaceuticals’ request to market what some hailed as the first truly new kind of antidepressant in decades: esketamine, a reworked version of ketamine, an older anesthetic (long used in both human and veterinary medicine) better known to many as the trance-inducing party drug Special K. Later that same year, in November, the FDA designated the psychedelic psilocybin (magic mushrooms) a breakthrough therapy for severe depression. The “breakthrough therapy” category is used for drugs deemed to have so much promise that the FDA wants to expedite the process of bringing them to market.52 In July 2022, the U.S. Department of Health and Human Services under the Biden administration indicated that the FDA was also now on track to approve, within two years, not just psilocybin but also MDMA (ecstasy) as treatments for depression and post-traumatic stress disorder, respectively.
Both psilocybin and MDMA are currently classified as Schedule I drugs under the Controlled Substances Act, meaning they have been deemed to have no recognized medical use and a high potential for abuse. The new drive to reframe them as promising psychotherapeutic tools is of course partly a response to the flight of the pharmaceutical industry from the mental health sector, and the sense that something has to be done.53 But we also need to understand these developments as part of a larger political story: the growing backlash against the legacies of the 1970s and 1980s War on Drugs, a phenomenon that became shamefully racialized, especially in the United States. In that context, some have already begun to call attention to the ongoing if quieter racial politics operating behind the partial rehabilitation of the psychedelics. Efforts to decriminalize psychedelics in the absence of a more wholesale review of the relationship between currently illegal drug use and our carceral system, they say, represent a kind of “psychedelic exceptionalism” that implicitly privileges the experiences of the wealthy and the white.54
Both hope and hype seem to have returned, at least in this one modest sector of the field. For the first time in decades, we see newspapers announcing a new “revolution” in mental health care.55 We see investors getting excited: the market for psychedelic substances has been projected to grow from $2 billion in 2020 to $10.75 billion by 2027.56 We learn from a new generation of company websites that we are no longer dealing with the psychopharmaceutical industry of our parents’ or grandparents’ generation. This new version of pharma is no longer big but intimate. It is no longer run by middle-aged white men but by a new generation of diverse visionaries. It “thinks differently” than the industry that failed patients for so long, and is “redefining” the field so that “unmet needs” can finally be addressed.57
The story here is unfinished, but there is good reason to think that future scholars will go far if they focus on following the money. It is notable, for example, that Compass Pathways came under scrutiny in 2021 for its allegedly “scorched earth” approach to the filing of international patents for multiple aspects of its treatment protocols and target disorders.58 Meanwhile, even as the therapeutic benefits of these developments for patients remain unclear, the turn to psychedelics does not represent an obvious professional win for biological psychiatry, at least not the kind of biological psychiatry that has dominated the field for the past forty or more years. On the contrary, the psychedelic therapies together challenge a basic assumption of conventional biological psychiatry: namely, that the way to address symptoms of depression or anxiety is to take a pill and wait for one’s symptoms to improve. The model here is different: to ingest a substance in order to create a mind-altering experience—supported by one or more trained psychotherapists—that is supposed to result in new and enduring insights and emotional recalibrations. At a 2017 conference on the promise of psychedelics, Insel noted that he was struck by the way that people involved in this new work emphasized that it was “psychedelic-assisted psychotherapy.” In all his years as a psychiatrist and as director of the NIMH, he commented wryly, he had never heard anyone talk about “antidepressant-assisted psychotherapy.”59
Back in the 1980s, biological psychiatry was largely successful in stepping in and setting the agenda and funding priorities for the field of mental health care as a whole. It could do so because the field was at risk of losing its medical identity, as well as its credibility, and there was little perceived room for compromise. But it is no longer the 1980s. The field no longer needs to protect itself from imagined powerful rivals. There is an opportunity now for a reset, in which the field locates itself not at the top of the hierarchy but within a larger and more collaborative ecosystem of mental health research and care. Embedded in such an ecosystem, biological psychiatry will come to discern when its approaches should dominate and when they should play a smaller role.
Here is just one recent example of when its approaches should not dominate. In May 2021, responding to the nationwide reckonings with racial inequity triggered by the murder of George Floyd, the American Psychiatric Association declared that the theme of its annual meeting would be “Finding Equity through Advances in Mind and Brain in Unsettled Times.” It was a remarkably unstable title, one that seemed still to be trying to hold onto a conventional medical research mission (“advances in mind and brain”), even as it acknowledged the “unsettled times” in which the field now had to pursue that mission.60 There is little reason to suppose that a conventional research strategy focused on “advances in mind and brain” will help the field “find equity.” Brain scientists and geneticists can be as committed to a social and political mission of reform as anyone else, but they do not possess the tools or expertise to lead the way. Something different is needed, and, if this point gets made more and more plainly, we are likely to see the emergence of new kinds of leaders who will insist on funding priorities, research questions, and forms of training for clinicians that will have little to do with advancing conventional biological research. And that is okay. Knowing when to step up and when to step back is arguably one of the most powerful acts of leadership that any discipline or field can offer. This is the kind of future I wish for the field of American psychiatry.
Endnotes
- 1. Jon Franklin, “The Mind-Fixers,” The Evening Sun, July 23, 1984.
- 2. Richard Noll, The Encyclopedia of Schizophrenia and Other Psychotic Disorders, 3rd ed. (New York: Infobase Publishing, 2009). See also the preface to Michael R. Trimble and Mark George, Biological Psychiatry, 3rd ed. (New York: Wiley, 2010).
- 3. Benedict Carey, “Science Plays the Long Game. But People Have Mental Health Issues Now,” The New York Times, April 1, 2021.
- 4. Thomas Insel, Healing: Our Path from Mental Illness to Mental Health (New York: Penguin Press, 2022), xvii.
- 5. “New Hope for Mental Patients: Insulin ‘Shock Treatment’: Remarkable Cures of the ‘Split Mind,’” The Manchester Guardian, July 13, 1938, 9; John J. O’Neill, “Insane Now Get Chance to Live Life Over Again: Insulin Shock Rolls Back Time for Their Mentality While They Are in Coma They Wake to New World Then Re-Evolve, Minus the Previous Burden of Errors,” New York Herald Tribune, May 15, 1937; and Jonathan Sadowsky, “Beyond the Metaphor of the Pendulum: Electroconvulsive Therapy, Psychoanalysis, and the Styles of American Psychiatry,” Journal of the History of Medicine and Allied Sciences 61 (1) (2006): 1–25.
- 6. Gretchen J. Diefenbach, Donald Diefenbach, Alan Baumeister, and Mark West, “Portrayal of Lobotomy in the Popular Press: 1935–1960,” Journal of the History of the Neurosciences 8 (1) (1999): 60–69.
- 7. Eugenic Archives, “The Burden of the Feebleminded,” American Philosophical Society, 1913 (accessed July 24, 2019); Henry Herbert Goddard, The Kallikak Family: A Study in the Heredity of Feeble-Mindedness (New York: Macmillan, 1912); and Elwood Street, “State Makes Forward Step in Mental Health: Sterilization of Feeble Minded, Colonies, Institutions Aimed at Ending Pauper Idiot Situation in Kentucky,” The Courier-Journal, May 24, 1920.
- 8. Raymond Gosselin, “Mental Hygiene in the Community: By Clara Bassett. New York: The Macmillan Company, 1934,” Journal of Mental Science 80 (330) (1934): 571–573; Porter Raymond Lee, Marion E. Kenworthy, Sarah Col Ivins, et al., Mental Hygiene and Social Work (New York: Commonwealth Fund, Division of Publications, 1939); Kathleen W. Jones, Taming the Troublesome Child: American Families, Child Guidance, and the Limits of Psychiatric Authority (Cambridge, Mass.: Harvard University Press, 1999); and Sol Cohen, “The Mental Hygiene Movement, the Development of Personality and the School: The Medicalization of American Education,” History of Education Quarterly 23 (2) (1983): 123–149.
- 9. Roy Richard Grinker and John P. Spiegel, War Neuroses (Philadelphia: Blakiston Company, 1945); Hans Pols, “The Tunisian Campaign, War Neuroses, and the Reorientation of American Psychiatry During World War II,” Harvard Review of Psychiatry 19 (6) (2011): 313–320; and Hans Pols and Stephanie Oak, “War and Military Mental Health,” American Journal of Public Health 97 (12) (2007): 2132–2142.
- 10. Andrew Scull, “The Mental Health Sector and the Social Sciences in Post-World War II USA. Part 1: Total War and Its Aftermath,” History of Psychiatry 22 (1) (2011): 3–19.
- 11. See “Proceedings of Societies,” The American Journal of Psychiatry 105 (1949): 851.
- 12. Albert Deutsch, The Story of GAP: Relating to the Origins, Goals, and Activities of a Unique Medical Organization, the Group for the Advancement of Psychiatry, and Its Contributions to Professional and Social Progress (New York: Group for the Advancement of Psychiatry, 1959).
- 13. Group for the Advancement of Psychiatry, “Report No. 1: Shock Therapy,” September 15, 1947.
- 14. Robert H. Felix, “Psychiatry Comes of Age,” The Science News-Letter 54 (6) (1948): 90–92.
- 15. “Interview, Robert Hanna Felix, May 27, 1975,” National Institute of Mental Health Oral History Collection, sound recording and text, produced 1975–1978, 24 (accessed November 5, 2022).
- 16. Ingrid G. Farreras, Caroline Hannaway, and Victoria Angela Harden, Mind, Brain, Body, and Behavior: Foundations of Neuroscience and Behavioral Research at the National Institutes of Health (Amsterdam: IOS Press, 2004).
- 17. Nathan G. Hale, Jr., The Rise and Crisis of Psychoanalysis in the United States: Freud and the Americans, 1917–1985 (Replica Books, 2001), 246–248.
- 18. Gerald N. Grob, From Asylum to Community: Mental Health Policy in Modern America (Princeton, N.J.: Princeton University Press, 1991).
- 19. Addison M. Duval and Douglas Goldman, “The New Drugs (Chlorpromazine and Reserpine): Administrative Aspects,” Psychiatric Services 51 (3) (2000): 327–331; and Kenneth S. Smith, “Mental Ill Weapons,” The Wall Street Journal, December 29, 1954.
- 20. Felix, “Psychiatry Comes of Age,” 84.
- 21. Jonathan Cole became the first director of psychopharmacology research at the NIMH, establishing protocols and supporting foundational research that led to his being widely credited as the “father of clinical psychopharmacology.” He was chief of psychopharmacology at McLean Hospital in Belmont, Massachusetts, where he had a particular interest in treatments for depression and bipolar disorder. The Jonathan O. Cole Mental Health Consumer Resource Center, based in Waverley, Massachusetts, is named in his honor.
- 22. Max Chessin, Edward R. Kramer, and Charles C. Scott, “Modifications of the Pharmacology of Reserpine by Iproniazid,” Journal of Pharmacology and Experimental Therapeutics 119 (4) (1957): 453–460; Alfred Pletscher, Parkhurst A. Shore, and Bernard B. Brodie, “Serotonin Release as a Possible Mechanism of Reserpine Action,” Science 122 (3165) (1955): 374–375; Erminio Costa, Alexander G. Karczmar, and Elliot S. Vesell, “Bernard B. Brodie and the Rise of Chemical Pharmacology,” Annual Review of Pharmacology and Toxicology 29 (1) (1989): 1–22; and Julius Axelrod, “An Unexpected Life in Research,” Annual Review of Pharmacology and Toxicology 28 (1) (1988): 1–24.
- 23. Steven J. Taylor, Acts of Conscience: World War II, Mental Institutions, and Religious Objectors (Ithaca, N.Y.: Syracuse University Press, 2009).
- 24. Albert Q. Maisel, “Bedlam 1946: Most U.S. Mental Hospitals Are a Shame and a Disgrace,” Life Magazine 20 (18) (1946): 102–118.
- 25. John F. Kennedy, “Special Message to the Congress on Mental Illness and Mental Retardation,” February 5, 1963, The American Presidency Project.
- 26. E. Fuller Torrey, “Deinstitutionalization: A Psychiatric ‘Titanic,’” Frontline (accessed November 17, 2022).
- 27. Edward H. Thompson Jr. and William Doll, “The Burden of Families Coping with the Mentally Ill: An Invisible Crisis,” Family Relations 31 (3) (1982): 379–388; and Phyllis Vine, Families in Pain: Children, Siblings, Spouses, and Parents of the Mentally Ill Speak Out (New York: Pantheon Books, 1982).
- 28. Chaim F. Shatan, “Post-Vietnam Syndrome,” The New York Times, May 6, 1972; and Robert Jay Lifton, Home from the War: Vietnam Veterans: Neither Victims Nor Executioners (New York: Simon and Schuster, 1973).
- 29. Florence Rush, “The Sexual Abuse of Children: A Feminist Point of View,” in Rape: The First Sourcebook for Women by New York Radical Feminists, ed. Noreen Connell and Cassandra Wilson (New York: Plume Books, 1974), 64–75; and Florence Rush, “Freud and the Sexual Abuse of Children,” Chrysalis 1 (1) (1977): 31–45.
- 30. Ronald Bayer, Homosexuality and American Psychiatry: The Politics of Diagnosis (Princeton, N.J.: Princeton University Press, 1981); and Jack Drescher and Joseph P. Merlino, eds., American Psychiatry and Homosexuality: An Oral History (New York: Harrington Park Press, 2007).
- 31. Peter Sedgwick, Psycho Politics: Laing, Foucault, Goffman, Szasz, and the Future of Mass Psychiatry (New York: Harper & Row, 1982); and Norman Dain, “Antipsychiatry,” in American Psychiatry after World War II, ed. Roy W. Menninger and John C. Nemiah (Washington, D.C.: American Psychiatric Press, 2000), 277–342.
- 32. “Samuel B. Guze Interviewed by Marion Hunt,” Washington University School of Medicine Oral History Interview, 1994.
- 33. Gerald L. Klerman, “The Evolution of a Scientific Nosology,” in Schizophrenia: Science and Practice, ed. John C. Shershow (Cambridge, Mass.: Harvard University Press, 1978), 99; and Samuel B. Guze, “Nature of Psychiatric Illness: Why Psychiatry Is a Branch of Medicine,” Comprehensive Psychiatry 19 (4) (1978): 295–307.
- 34. Edwin Fuller Torrey, Surviving Schizophrenia: A Family Manual (New York: Harper & Row, 1983).
- 35. Athena McLean, “Contradictions in the Social Production of Clinical Knowledge: The Case of Schizophrenia,” Social Science & Medicine 30 (9) (1990): 974.
- 36. Herbert Pardes, “Citizens: A New Ally for Research,” Psychiatric Services 37 (12) (1986): 1193.
- 37. McLean, “Contradictions in the Social Production of Clinical Knowledge,” 976.
- 38. Hannah S. Decker, The Making of DSM-III: A Diagnostic Manual’s Conquest of American Psychiatry (New York: Oxford University Press, 2013).
- 39. Edward Shorter, “Preface,” A History of Psychiatry: From the Era of the Asylum to the Age of Prozac (New York: John Wiley & Sons, 1997), vii.
- 40. Murray Goldstein, “The Decade of the Brain,” Neurology 40 (2) (1990): 321.
- 41. André Zugman, João R. Sato, and Andrea P. Jackowski, “Crisis in Neuroimaging: Is Neuroimaging Failing 15 Years after the Decade of the Brain?” Brazilian Journal of Psychiatry 38 (4) (2016): 267–269.
- 42. Thomas R. Insel, “Understanding Mental Disorders as Circuit Disorders,” in A Decade after The Decade of the Brain, Cerebrum: The Dana Foundation, February 26, 2010 (accessed August 25, 2023).
- 43. Ronald Kotulak, “Dark Heritage: Amish Study Shows Mental Illness Isn’t All in the Mind,” Chicago Tribune, May 10, 1988.
- 44. John R. Kelsoe, Edward I. Ginns, Janice A. Egeland, et al., “Re-Evaluation of the Linkage Relationship between Chromosome 11p Loci and the Gene for Bipolar Affective Disorder in the Old Order Amish,” Nature 342 (6247) (1989): 238–243.
- 45. Peter R. Martin, ed., “Interview with David Dunner, 2001,” in Recollections of the History of Neuropharmacology through Interviews, Conducted by Thomas A. Ban (Córdoba, Argentina: International Network for the History of Neuropsychopharmacology, 2014).
- 46. John Crilly, “The History of Clozapine and Its Emergence in the U.S. Market: A Review and Analysis,” History of Psychiatry 18 (1) (2007): 39–60; and Rachel Liebert and Nicola Gavey, “‘There Are Always Two Sides to These Things’: Managing the Dilemma of Serious Adverse Effects from SSRIs,” Social Science and Medicine 68 (10) (2009): 1882–1891.
- 47. Lisa Cosgrove, Harold J. Bursztajn, David J. Kupfer, and Darrel A. Regier, “Toward Credible Conflict of Interest Policies in Clinical Psychiatry,” Psychiatric Times 26 (1) (2009); and Ray Moynihan, “Key Opinion Leaders: Independent Experts or Drug Representatives in Disguise?” The BMJ 336 (7658) (2008): 1402–1403.
- 48. “Psychiatric Drug Discovery on the Couch,” Nature Reviews Drug Discovery 6 (3) (2007): nrd2268.
- 49. Greg Miller, “Is Pharma Running Out of Brainy Ideas?” Science 329 (5991) (2010): 502–504.
- 50. Christopher Lane, “The NIMH Withdraws Support for DSM-5,” Psychology Today, May 4, 2013.
- 51. David Dobbs, “The Smartphone Psychiatrist,” The Atlantic, July 1, 2017.
- 52. Yasemin Saplakoglu, “FDA Calls Psychedelic Psilocybin a ‘Breakthrough Therapy’ for Severe Depression,” Live Science, November 25, 2019.
- 53. Mattha Busby, “Biden Administration Plans for Legal Psychedelic Therapies within Two Years,” The Intercept, July 26, 2022.
- 54. Elijah C. Watson, “Black Americans Are Building a Space in Psychedelic Drug Culture after Being Ignored for Decades,” Okayplayer, October 30, 2018; and Tehseen Noorani, “Making Psychedelics into Medicines: The Politics and Paradoxes of Medicalization,” Journal of Psychedelic Studies 4 (1) (2020): 34–39.
- 55. Andrew Jacobs, “The Psychedelic Revolution Is Coming. Psychiatry May Never Be the Same,” The New York Times, May 9, 2021.
- 56. Joshua Phelps, Ravi N. Shah, and Jeffrey A. Lieberman, “The Rapid Rise in Investment in Psychedelics—Cart before the Horse,” JAMA Psychiatry 79 (3) (2022): 189–190; and Shayla Love, “Get Ready for Pharmaceutical-Grade Magic Mushroom Pills,” Vice, May 26, 2020.
- 57. For example, see Compass Pathways, a company focused on psilocybin treatments, founded in 2016; and ATAI Life Sciences, founded in 2018 to test and develop psychedelic treatments for mental illnesses.
- 58. Russell Hausfeld and David Nickles, “Compass Pathways Is Trying to Patent Psilocybin for More Mental Health Conditions Than You Can Name,” Psymposia, March 18, 2021.
- 59. “Paul Summergrad and Thomas Insel: Future of Psychedelic Psychiatry,” Multidisciplinary Association for Psychedelic Studies (MAPS), filmed April 26, 2017, at Psychedelic Science 2017, Oakland, California, video.
- 60. “American Psychiatry Association 2021 Kicks Off,” Physician’s Weekly, April 30, 2021.