Sara Bhatia | Washington Monthly

Hitting His Stride

Nick Thompson was an above-average runner who suddenly, in middle age, started breaking world records—a mysterious success inspired by a complicated relationship with his father. 


Lanky and awkward, Nicholas Thompson joined his high school’s indoor track team in his sophomore year. At first, he was just an average runner, but he trained hard, eager to improve. That winter, to his surprise, the coach entered him in a two-mile race at the New England Prep School Championships. Thompson did not anticipate being a top finisher. Indeed, expectations for his performance were so low that no one had bothered to tell him that the dimensions of the course were different from those at his own high school’s track; mid-race, he was puzzled by his own split times, even as he noticed that he was lapping other, more accomplished runners.  

The Running Ground: A Father, a Son, and the Simplest of Sports 
by Nicholas Thompson 
Random House, 272 pp. 

To his astonishment, Thompson set a school record. For the first time, he had not allowed his expectations to determine his performance. Years later, Thompson reflected, “If I had understood how fast I was running, I wouldn’t have been able to run that fast. Because I didn’t know the track, because I didn’t know how long the laps were, I didn’t get scared and shut down my body. I just kept going. To do it, I had to first forget that I couldn’t do it.” 

Today, Nick Thompson is a trailblazer in the worlds of technology journalism and magazine publishing. A former editor at the Washington Monthly, Thompson oversaw The New Yorker’s website before becoming editor in chief of Wired. Now, as the CEO of The Atlantic, he has engineered a remarkable turnaround, steering the magazine to profitability, growing its subscriber base to more than 1 million, and overseeing a hiring spree of Pulitzer Prize–winning writers.  

He is also an exceptional, record-holding long-distance runner who has achieved his greatest success in his 40s, long after most athletes have passed their prime. At age 44, he completed the Chicago Marathon in 2:29, a time that elevated him to elite status, ranking him among the world’s fastest runners in his age group. Having achieved his goals as a marathoner, he set his sights on ultramarathons—races longer than the 26.2-mile marathon distance. At 46, he set an American age-group record for the 50K, and then became the world’s top-ranked runner in his age group at the 50-mile distance. 

Thompson’s athletic life—and the way it has fueled his professional success and shaped his personal life—is the focus of The Running Ground, an engrossing, unconventional memoir. The book traces a serpentine course, simultaneously a family history, an autobiography, an inspirational guide to middle age, and, most meaningfully, a meditation on running and its lessons for a life fully lived.  

Thompson frames the book, subtitled “A Father, a Son, and the Simplest of Sports,” around his relationship with his father, W. Scott Thompson, an avid runner and occasional marathoner, who introduced Nick to the sport. One of Thompson’s earliest memories is from the age of seven, when he stood in the shadow of the Queensboro Bridge clutching a bottle of orange juice and a fresh pair of sneakers to hand to his dad, who was competing in the New York City Marathon.

In a poignant moment of reflection, Thompson writes, “I run because of my father. Running connects me to my father; it reminds me of my father; and it gives me a way to avoid becoming my father.” 

Thompson paints a vivid and compassionate portrait of Scott Thompson, a complex, flamboyant, inexhaustible figure of tremendous talent and intellect, who emerged from a hardscrabble childhood in rural Oklahoma to become a Rhodes scholar, a White House fellow, and a celebrated academic. But at midlife, the elder Thompson’s life careened off track. He came out of the closet and walked out on his family, including seven-year-old Nick. Thompson writes with remarkable frankness about his father’s foibles during the subsequent decades—Scott was an alcoholic and a self-proclaimed sex addict with a proclivity for very young men. He grew unable to hold a job, and spent his later years living in Asia, where he had fled to avoid the IRS and an unpaid tax bill of over $300,000.  

In Nick’s 20s, the two lived together—more like roommates than father and son—and even collaborated on a book. Their home was a Washington salon, with raucous parties filled with diplomats, congressmen, and young journalists. The two shared professional interests in foreign policy and politics, a passion for music and, most meaningfully, running. Thompson writes, “My father led a deeply complicated and broken life. But he gave me many things, including the gift of running—a gift that opens the world to anyone who accepts it.” 

A love of running connects The Running Ground’s two primary narratives—the father-son memoir, and the story of Thompson’s athletic life. Little in Thompson’s early life foreshadowed the great success he has achieved as a marathoner and ultramarathoner in his 40s. At Stanford, a preseason stress fracture derailed his college running career. After graduation, he returned to the sport, flirting with longer distances, including the occasional marathon. Throughout his 20s, Thompson writes, “running was my unrequited crush. I trained like a dilettante and searched for physiological shortcuts that don’t exist. I humiliated myself in races.” Likewise, Thompson comments wryly that his “professional life was the same goat rodeo as my running. I had fallen in love with journalism. But journalism hadn’t fallen in love with me.” 

At 29, he got serious about both running and his career. On the brink of quitting journalism and starting law school, he applied for a job as an editor at the technology magazine Wired. A week before enrolling, he took a grueling 20-mile predawn run up Cadillac Mountain in Maine, and returned with renewed focus. “I had just done the hard thing of running up a mountain,” Thompson told his wife. “And it convinced me that I could do this much harder thing of betting on myself. If I didn’t get the job at Wired, I’d write a book.” He de-matriculated from law school, got the job at Wired, then wrote a book. Then he found a coach, established a training regimen, and focused on a major goal: breaking three hours at the New York City Marathon. But just one year later, two weeks after smashing that goal with a 2:43 time (finishing in 146th place out of 37,000 entrants), he was diagnosed with thyroid cancer. Upon recovery, he was determined to repeat that finish, and did so triumphantly two years later, shaving 13 seconds off his personal record. 

Approaching 40, profoundly grateful for his health, and as a busy professional and devoted husband and father, Thompson had little additional time for training and assumed that he had reached his potential as a runner. Even so, he maintained his fitness and speed with remarkable consistency throughout the next decade, completing eight marathons within a minute or two of his pre-cancer time. 

In 2018, at age 43, Thompson received an email from a team at Nike, inviting him to participate in an experimental program to pair “regular” runners with elite coaches, to maximize performance. The email had landed at an opportune moment—Thompson was grieving the death of his father and contemplating the meaning of middle age. 

Thompson’s father had warned him repeatedly that his life would fracture at around age 40. His paternal grandfather’s life had also splintered in middle age, when a scandal derailed his career as a minister. For Thompson, the pattern was a cautionary tale—if middle age was a point of inflection, how might he avoid the fate of his father and grandfather before him?  

Running, he thought, might be the key. Scott Thompson had run his fastest race at age 40, before his life spun out of control. In his 40s and 50s, he continued to run, but sporadically, for shorter distances, and at slower speeds. Still, it was a healthy habit in an increasingly unhealthy life, and offered structure and discipline. Nick recalls, “As my father descended into mania, the days when he ran were the days he kept everything else in control. If he had run more, could he have done more?” 

Thompson committed to the training program. The Nike coaches challenged common assumptions about the inevitability of runners’ declines in their 30s and 40s, pointing to certain biological advantages that come with age, like the strengthening of tendons and the trade-off of speed for endurance. They offered new technologies that sharpened Thompson’s understanding of his gait, and pushed him to collect data that informed his training. They stressed the need for more intense practices—time spent running fast—rather than additional mileage, and the importance of key metrics Thompson had long ignored, along with recommendations for a healthier diet and a nonnegotiable eight hours of sleep.  

Thompson also benefited from the psychological insights of the training program, particularly a theory proposed by the sports physiologist Tim Noakes, the “Central Governor Model,” which posits that pain and fatigue can be psychological phenomena, with the unconscious mind seeking to protect the body. The theory explains racers’ ability to sprint at the end of a long race despite physical exhaustion. Thompson had learned a similar lesson in his high school race 30 years earlier: “We all can go faster. We just need to persuade our brains not to start the subconscious shutdown process right away. But the only thing we can use to trick our brains is our brains. Training becomes a hide-and-seek with oneself.” 

Thompson came to realize that his relatively modest running goals had held him back. Reflecting on the decade following his cancer recovery, he recalled that all he had wanted to do was “to match the Nick I had been before the diagnosis.” The goal was to maintain his prior speed, not exceed it. He recalls, “I hadn’t been able to run a fast marathon in the past because I hadn’t wanted to. Or, more precisely, I hadn’t really cared about going that fast because all I really wanted was something else.”  

Within a year, Thompson had broken his own record in the New York City Marathon by five minutes and then exceeded his highest expectations for the next seven races. He has done so despite training “only” 65 to 70 miles a week, far less than the mileage of a professional marathoner.  

Thompson muses about the reasons for his success—perhaps his body responds to training better than others’, and he has been remarkably free of injuries—but it is hard to escape the conclusion that he simply works harder and smarter than most. To lean on a cliché, Thompson reminds us that we can do hard things. He runs even in the most miserable weather, and regardless of the location—he has run through Times Square at midnight, through cities to the airport, and to black-tie events with a tuxedo tucked in his backpack. He runs despite aches and pains, nausea and fatigue. “The deeper truth,” he reminds readers, is that “you have to learn to run when you hurt, and you have to learn to hurt when you run.”  

Thompson’s pre-race rituals and preparation offer a window into his intensity and the arcana of the sport, in which an improvement of just a few seconds can be meaningful. For instance, before each race, Thompson pays careful attention to his feet, clipping his toenails, shaving the hairs on his toes, and applying Vaseline. As an ultramarathoner, he has taught himself to urinate while running. 

The reader is left craving more such details, both about the sport—for instance, that it is tradition for a record-breaker to drink champagne from his sweaty running shoe—and about the ways it has shaped Thompson’s professional life. Readers who come to this memoir familiar with Thompson’s storied career, his reputation for a relentless work ethic, and his talent for untangling knotty problems will be disappointed by the virtual absence of workplace anecdotes. While he describes his major career pivots (and the long, contemplative runs he often takes while weighing his options), Thompson writes only in broad strokes about the ways in which running has improved his professional life. For example, he muses, “I had learned that our minds create limits for us when we’re afraid of failure, not because it’s actually time to slow or stop.” The memoir is filled with similar axioms about the instructive lessons of running, like concentration, discipline, and the need to set goals, but is disappointingly light on specifics about how those lessons have played out in his own professional life. 

Thompson’s prose is lean and spare, like the strides of a runner. At times, he veers into inspirational cliché, but at its best, the writing is almost Zen-like, capturing the quality of running in nature, perfectly in sync with the rhythm of each step. He describes being so in touch with his body’s rhythms that he can run a mile and, without glancing at his watch, predict the time within a second or two. While most of Thompson’s training is on mundane urban courses, including his daily eight-mile round-trip commute, his descriptions of runs among the mountains of New England exude sheer joy: “To run through the Andover bird sanctuary in October is to cross into a Winslow Homer painting. The palette changes subtly each day as the maple trees flip from green to scarlet while the oak trees stubbornly hold on to their russet leaves.” 

Thompson intersperses his own narrative with five excellent chapters profiling other exceptional long-distance runners. The profiles interrupt the biographical flow of the memoir, but they are among the most compelling stories in the book and serve as a reminder that not all runners are motivated by a competitive drive.  

The most interesting of those profiled is Suprabha Beckjord, a world-record-holding ultramarathoner. For 13 years, Beckjord completed the 3,100-mile Self-Transcendence Race, organized by the Indian guru Sri Chinmoy. This astonishing race—its distance greater than that from San Francisco to New York—consists simply of circling a public high school occupying a single square block in Queens, New York. Successful runners complete approximately 60 miles per day—day after day, for nearly two months—throughout the hot New York City summer. For Beckjord, the race is one of spiritual transcendence and self-awareness, and the mundane course offers an opportunity to notice the tiniest variations in one’s surroundings—an insect on a tree, a subtle change in the weather, a chip in the sidewalk. 

The Running Ground crackles with big ideas about intergenerational inheritance, the power of love and forgiveness, the inevitability of aging, the mind-body connection, and the value of hard work. The memoir’s intertwined stories—Thompson’s relationship with his father alongside his own journey as a marathoner hitting his stride in midlife—are compelling narratives. There is so much of interest in this slim memoir. The downside is that Thompson races toward the finish line without taking sufficient time to fully explore each of these themes. He writes, “One can run as a way to seek spiritual awakening, and one can run to fulfill ambition. It’s often hard to do both.” Perhaps a memoir, too, is best written as a journey of spiritual awakening, a meandering path of self-knowledge, rather than a sprint to the conclusion. 

Clever the Twain Shall Meet

An epic new biography of Samuel Clemens confirms the Missourian’s literary mastery but contends that the most important character he ever created was his own.

In March, the John F. Kennedy Center for the Performing Arts bestowed its annual Mark Twain Prize for American Humor on the former late-night talk show host Conan O’Brien. At the awards ceremony, there was a frisson of tension in the audience. Just one month earlier, President Donald Trump had attacked the center’s programming as too “woke,” dismissed its leadership, and installed himself as chairman of the board, sparking widespread protest in the arts community.  

Mark Twain by Ron Chernow 
Penguin Press, 1,200 pp. 

On the Kennedy Center’s stage, many of the comedians roasting O’Brien also took aim at the president. But O’Brien, who has built his reputation as a nonpartisan observer, seemingly kept his powder dry. Instead, he spoke of the legacy of Mark Twain and the profound honor of receiving the award.  

“Don’t be distracted by the white suit and the cigar and the riverboat,” O’Brien chided. “Twain is alive, vibrant, and vitally relevant today.” He spoke of Twain’s hatred of bullies, his support for underdogs ranging from the formerly enslaved to Chinese immigrants—“he punched up, not down”—along with his hatred of intolerance, racism, and anti-Semitism, and his suspicion of populism and jingoism. Though O’Brien never mentioned Trump, the audience slowly awakened to the comedian’s subversive subtext. He concluded to sustained applause, “Twain wrote, ‘Patriotism is supporting your country all of the time, and your government when it deserves it.’” 

And thus with just a few spare sentences, Conan O’Brien made Mark Twain—the mustachioed, wisecracking author of America’s Gilded Age—once again relevant to American politics.  

So closely did O’Brien echo the praise of the writer’s “core principles” and irreverent wit that I wondered if someone had slipped him the advance galleys of Ron Chernow’s sparkling new biography. In Mark Twain, the acclaimed biographer takes a sledgehammer to the mythology of the quintessential American author. Like O’Brien, Chernow challenges the “sanitized view of a humorous man in a white suit, dispensing witticisms with a twinkling eye,” to demonstrate that Twain was among our nation’s most trenchant and biting social critics. Chernow asserts that “far from being a soft-shoe, cracker-barrel philosopher, he was a waspish man of decided opinions delivering hard and uncomfortable truths. His wit was laced with vinegar, not oil.”  

In his personal life, too, Twain belied his deliberately crafted, jovial public persona. Chernow wryly notes, “Mark Twain could serve as both a social critic of something and an exemplar of the very thing he criticized.” Indeed, the author who charmed audiences with his folksy demeanor and sought to create “a new democratic literature for ordinary people” while skewering elites and their institutions was exceptionally well read and cosmopolitan. He had lived for more than a decade in Europe, residing in grand châteaus and villas, and traveled the world, crossing the Atlantic 29 times. Twain’s barefoot boyhood on the banks of the Mississippi River was the stuff of legend, but the author spent most of his adulthood in New England, in a 25-room mansion with a fleet of servants, purchased with cash from his wealthy wife.  

Chernow, the Pulitzer Prize– and National Book Award–winning writer of popular biographies of Ulysses S. Grant, George Washington, and Alexander Hamilton, tackles his complicated, often contradictory subject with nuance and exhaustive research. Chernow explores the author’s enormous oeuvre—a gratifying surprise for those whose familiarity with Twain resides in hazy middle school memories of The Adventures of Tom Sawyer.  

But this is no literary critique. Chernow asserts that Twain was “the most original character in American history,” and he is fascinated by him more as a man than as an author, reveling in his theatricality, both on the stage and off. He writes,  

Mark Twain discarded the image of the writer as a contemplative being, living a cloistered existence, and thrust himself into the hurly-burly of American culture, capturing the wild, uproarious energy throbbing in the heartland. Probably no other American author has led such an eventful life.  

Mark Twain is a massive brick of a book, comprising more than a thousand pages, and it is the mining of Twain’s private life and its intertwining with his public image that lends the book its physical heft and its most surprising and compelling content. Chernow concludes that “Mark Twain’s foremost creation—his richest and most complex gift to posterity—may well have been his own inimitable personality, the largest literary personality that America has produced.” 

Twain was easily the most famous writer in Gilded Age America, an era whose name was coined by Twain himself. He was the nation’s first celebrity author, a consummate storyteller, the nation’s most quoted person, and for many outside the U.S., the archetypal American. He mastered a vast array of literary formats, including travelogues, novels, essays, political tracts, plays, and historical romances. He created a uniquely American voice that captured the vernacular speech of the young nation. As famous an orator as a writer, Twain elevated storytelling into a wholly original theatrical genre, conducting speaking tours that attracted massive crowds and took him around the world, from Hawaii to Australia.  

Despite his myriad achievements, Twain felt unappreciated by the literary establishment, and chafed at the label “humorist,” fearing that audiences saw him as little more than a vaudevillian. In 1907, Oxford University presented him with an honorary degree. For a man of humble origins who had left school at 12, the diploma was the pinnacle of his career, and he proudly wore the resplendent scarlet graduation gown at formal events for the remainder of his life—including, charmingly, his daughter’s wedding. 

The broad outlines of Twain’s formative years are generally well known. Born Samuel Clemens to a downwardly mobile, slave-owning family in 1835, he was raised in the bustling river town of Hannibal, Missouri. It was a nostalgic setting the author returned to time and again in his writing, but rarely in person. Following his father’s death in 1847, Clemens went to work as a printer’s apprentice, and later, as a riverboat pilot, a job that fed his appetite for adventure and provided an endless stream of amusing anecdotes harvested for literary purposes. Clemens essentially sat out the Civil War—after a two-week stint as a Confederate soldier, he fled to the Nevada Territory, where he launched his writing career at the Territorial Enterprise, a newspaper catering to silver miners more interested in entertainment than reporting, and, as Chernow writes, “an ideal home for someone with Sam’s outsize powers of invention and casual relationship with facts.”  

In Nevada, Clemens adopted the nom de plume “Mark Twain,” a wink at his stint as a river pilot: On the Mississippi, a leadsman would yell out “mark twain!” upon lowering a weighted rope measuring 12 feet, to ascertain depth for safe passage. Twain’s comical sketches of the Far West and accounts of his journey established him as a travel writer. The Innocents Abroad, a humorous and irreverent travelogue of a five-month organized excursion to Europe and the Holy Land, became Twain’s best-selling book during his lifetime.  

Today, of course, Twain is revered as an iconic American novelist, whose books Tom Sawyer and Adventures of Huckleberry Finn are literary mainstays. Yet in his own time, these novels received mixed critical responses, with some reviewers troubled by their groundbreaking use of vernacular speech and questionable morality. (Little Women’s Louisa May Alcott scolded, “If Mr. Clemens cannot think of something better to tell our pure-minded lads and lasses, he had best stop writing for them.”) Twain was so discouraged by the modest sales and lukewarm reaction to Tom Sawyer that he briefly swore off writing fiction. While modern readers might assume that these works constituted the apex of Twain’s career, Chernow covers their publication in the first third of his book, leaving the bulk of the biography to discuss Twain’s lesser-known writings and his personal dramas. 

Twain reached the peak of his celebrity long after his fiction career had largely ended. A master of self-promotion, Twain tightly controlled the marketing of his books and lent his name and likeness to cigars, whiskey, and shirt collars. He created his own brand identity, with his shock of white hair, moustache, and signature white suits—he purportedly owned 14—and delighted in public recognition. For his massive speaking tours, he printed his own witty signage, with a trademark kicker that read, “Doors open at 7 o’clock[.] The trouble to begin at 8 o’clock.”  

Twain was his own best character. He sought the spotlight, and relished interactions with the press and the public. On the road, he held court with a gaggle of reporters from his hotel bed, clad in a nightshirt while puffing on a cigar. (He reportedly smoked 40 cigars a day and quipped, “It has always been my rule never to smoke when asleep, and never to refrain when awake.”) To his family’s mortification, the flamboyant Twain ofttimes created a spectacle. Once while in London, Twain strolled from his hotel to a public bath club in a state of undress, attracting a throng of spectators, including the press corps. After reading the London Times’ coverage, one of Twain’s daughters cabled from Connecticut, scolding, “Much worried remember proprieties.” 

Twain’s literary reputation rested on his masterful ability to mask deeper messages with a light veneer. The British playwright George Bernard Shaw riffed that Twain “has to put matters in such a way as to make people who would otherwise hang him believe he is joking.” Like Twain’s most beloved characters, the author himself often portrayed this same duality—a humorous facade that disguised a darker, more introspective core. It is at this crossroads—the intersection of Twain’s lighthearted persona and the darker underbelly—that the biography is most engaging.  

Despite his jovial front, Twain could be mercurial, petulant, and demanding. He had an explosive temper and was notoriously vindictive, holding grudges for decades. He was litigious, filing lawsuits against anyone he believed had crossed him, including his publisher, business associates, and family members. Twain filed three lawsuits, two civil and one criminal, against a woman he dubbed “the reptile”—the titled landlady of his 60-room rented Italian villa. After a farcical series of events involving leaking sewage, severed telephone lines, and a rabid donkey, Twain smirked, “I was losing my belief in hell until I got acquainted with the Countess Massiglia.”   

Haunted by his financially precarious childhood, Twain was a compulsive speculator, relentlessly chasing get-rich-quick schemes, with dismal results. He was fascinated by technology, and dreamed up endless inventions, including a bed clamp to prevent kicking off blankets and a self-pasting scrapbook. He lost a fortune—nearly $6 million in today’s money—investing in a failed typesetting machine. Certain that his publisher was cheating him, Twain founded a rival publishing house, and managed it into bankruptcy. The weight of debt hung over the Clemens family for decades, pushing them into European exile in the belief that it would be cheaper to maintain a household on the continent than in Connecticut, and forcing Twain to accept creatively unsatisfying but lucrative writing and speaking gigs. He was such a notoriously awful businessman that The Washington Post opined, “One good way to locate an unsafe investment is to find out whether Mark Twain has been permitted to get in on the ground floor.” 

Twain’s private life, too, was more complex than it appeared. Twain was a fiercely devoted husband to his wife, Livy, who served as both personal and professional partner. There was not, Chernow notes, “the least hint of scandal” in their marriage. Yet after her death, Twain pursued cringey friendships with dozens of adolescent girls he termed his “angelfish.” While Chernow stipulates that there is no evidence of sexual impropriety, the biographer openly struggles with reconciling these relationships, which involved young girls visiting the lonely widower for a week at a time, often unchaperoned. In Twain’s later years, letters exchanged with the angelfish comprised fully half of the esteemed author’s correspondence.  

Twain was a doting father to his three daughters when they were young, but grew stern and overprotective as they matured, reluctant to allow them to marry and have independent lives of their own. Later, as he pursued his angelfish, he became neglectful of his daughters’ escalating needs.  

Angelfish aside, the Clemens women deserve their own biography. Livy was an heiress to a coal fortune; Twain squandered much of her inheritance on poor investments. An invalid for much of her married life, Livy was periodically forbidden by her doctor from seeing Twain in person (but not other household members) because his manner was “too excitable” and thought to precipitate her heart palpitations. Banished from the bedroom, Twain slipped love notes to his wife throughout the day. As for the Clemens daughters, Twain plucked his eldest, Susy, from Bryn Mawr after her freshman year, likely in response to a romantic entanglement with a female classmate; she later died of meningitis after refusing conventional medical care, at her father’s direction. The two younger daughters, Clara and Jean—the youngest suffering from epilepsy—spent years of their lives in sanatoriums, receiving “rest cures” that limited intellectual stimulation and contact with the outside world. Twain’s devoted, besotted secretary Isabel Lyon also occupied the dysfunctional familial orbit. Her own mental health travails, her fraught relationship with the Clemens daughters, her intimate (albeit asexual) codependence with the widowed Twain, and her ultimate betrayal of the author form a major subplot in the final third of the biography. 

Most readers will naturally be drawn to Chernow’s narratives of Twain’s writerly life, as he seeks connections between the author’s personal views and his large body of work. Mark Twain is not scholarly literary analysis, but there’s plenty of discussion of the author’s most familiar texts as well as dozens of lesser-known and unpublished writings to satisfy most.  

Chernow is particularly interested in tracing Twain’s growth in racial tolerance from the raw bigotry of his youth (in New York City for the first time, he was struck by the “mass of human vermin”), to his bold critique of slavery in Huckleberry Finn, to his vocal defense of Black, Jewish, and Indigenous people in his later career (even as he used vocabulary and stereotypical tropes that trouble the modern ear). Serendipitously, Chernow’s Mark Twain hit bookshelves the same week that James, Percival Everett’s revisionist take on Huckleberry Finn—retold from the perspective of Jim, an escaped enslaved man—won the Pulitzer Prize. For a current-day reader, when compared side by side, James is electrifying and Huckleberry Finn feels dated. Chernow acknowledges the challenges of reading not just Huckleberry Finn but also much of Twain’s writing with a 21st-century sensibility, even as he reminds readers of how radical Twain was for his time, and the ways in which his literature can inform our understanding of the past.  

In addition to race, Twain was remarkably progressive on a range of political issues—he was an early advocate for women’s suffrage, and spoke out against anti-Semitism, municipal corruption, and colonialism. But he also pulled his punches, ever conscious of the fine line he walked as a southerner-turned-Yankee confidant (and publisher) of former Union generals in post–Civil War America. While he boldly confronted slavery’s evils in Huckleberry Finn, he “shamefully ducked” the contemporary concerns of its aftermath, including Reconstruction and the rise of the Ku Klux Klan. Fervently opposed to lynching, Twain began work on an entire book on the subject, but ultimately abandoned the project, concluding, “I shouldn’t have even half a friend left, down there [the South], after it issued from the press.”  

As he grew older, and with his reputation secured, Twain felt emboldened to opine on current affairs. His interests were eclectic, ranging from alternative medicine to Philippine independence, and they provide us with a particular sight line on a moment in American history when the nation was moving from the travails of Reconstruction into the explosive growth of industrialization and nationalism. Twain had always evinced a harsh and bitter critique of society and its institutions in his fiction, but in his later years his work grew darker, drifting from humor and fiction toward essays lacking the softening mask of humor. He turned his pen against missionaries, Russian czars, the Catholic Church, and, particularly, imperialism. Not everyone was pleased. In response to Twain’s calls for the American withdrawal from the Philippines, Teddy Roosevelt called him a “prize idiot,” and The New York Times scolded Twain for “disregarding the grin of the funny man for the sour visage of the austere moralist.”   

More than a century after his death, Twain remains a mainstay of the literary canon, even as fewer of his books remain in circulation, and his most famous works are banned from many secondary schools. But Chernow is persuasive in his argument that even in his own lifetime, Twain was a larger-than-life character, “embodying something more than a great writer, that he had come to personify, at home and abroad, the country that had spawned him of which he stood as such a unique specimen.” 

Hillbilly Legacy

In December 1763, the “Paxton Boys,” a vigilante group of white settlers on the Pennsylvania frontier, killed and scalped six Conestoga Indians living under the colonial government’s protection on a contested tract of land. Two weeks later, the vigilantes rode into neighboring Lancaster and shot, scalped, and dismembered 14 more Conestoga—three elderly men, three women, and eight children—living in a provincial workhouse under the auspices of the local government. The men celebrated in town, “hooping and hallowing” and firing their guns. There were dozens of witnesses, but no one interfered; one resident opined that “too many approved of the massacre.” In February, 200 Paxton Boys and their allies marched to Philadelphia, a three-day journey, to protest the colonial government’s failure to defend them against Indian attacks, rebuking the governor, “You peacefully drink your tea and coffee etc., live carefree, and we have to stand constantly at the ready on the borders expecting to be destroyed by Indians.”

Hard Neighbors: The Scotch-Irish Invasion of Native America and the Making of an American Identity
by Colin G. Calloway
Oxford University Press, 528 pp.

Colin G. Calloway, a history professor at Dartmouth, has written extensively about the tangled relations between Native Americans and European colonists. In his aptly titled Hard Neighbors, Calloway turns his gaze toward the violent, contested borderlands of the early American frontier and, in particular, the Native people’s primary antagonists, the Scotch-Irish, “a population of white frontiersmen who cut a bloody swath through Indian country and were often the cutting edge of the colonial dispossession of Native people.” The Paxton Boys, Calloway writes, saw themselves as “the front line of frontier defense and the last line of frontier justice.” The so-called Paxton riots were not merely violent retribution toward Native Americans but also a protest against provincial elites in Philadelphia, and against the imperial government, which they believed pandered to their adversaries, even as it failed to protect settlers from Indian attacks. 

At its core, Hard Neighbors is a story about a collision of cultures, and a government’s failure to defend its citizenry. Calloway explores the ways in which border conflicts not only forged American history but also shaped the Scotch-Irish themselves, whose cultural and political influence continues to this day.

Hard Neighbors depicts a harsh and gruesome history, mythologized in popular culture and broadly familiar to even casual armchair historians: tribal abductions of white women; the burning of Indian villages in retribution; the torching of settlers’ crops; the scalping and dismemberment of victims for bounties; and the signing and then betrayal of peace treaties. Calloway fills in these broad outlines with deeply researched histories, offering a densely detailed chronology of violent encounters and shifting alliances, set against a backdrop of clashing colonial empires and the territorial sprawl of a young nation. 

At the center of the conflict were the Scotch-Irish, Presbyterians of meager means who emigrated to America in the 18th century, lured by the promise of free land. Colonial governments throughout North America sought out immigrants—notably Germans, as well as the more numerous and bellicose Scotch-Irish—whom they explicitly intended to settle along their colonies’ loosely defined western borders, thereby creating a buffer zone between Indian tribes and the settled interior and coastal towns. The Scotch-Irish were thought to be particularly desirable as border settlers, since their Scottish forefathers had a history of acting as buffers—they themselves had been lured to Ireland in the 17th century by Queen Elizabeth I and King James I, in an attempt to quell the Catholic “barbaric Irish.” 

Some colonies offered generous enticements. South Carolina, for example, offered free passage, 50 acres of land, tools, temporary tax exemptions, and a year’s provisions. The campaign was remarkably successful—by the eve of the American Revolution, Scotch-Irish colonists had created 500 settlements tracing the Appalachian Mountains, extending 1,500 miles from Maine to Georgia.

Contrary to governmental promises, much of the frontier was not “free” but contested, and new arrivals squatted on property claimed by others, ignoring legal claims and surveyors’ studies, and antagonizing local Indian tribes. James Logan, a colonial official in Pennsylvania, came to rue his role in encouraging Scotch-Irish settlement on the frontier, writing in 1731, “Great Numbers of wilful people from the North of Ireland [have] over-run all the back parts of the Province as far as the Susquehannah and are now to the further disaffection of the Indians, passing over it.” They were, he regretted, “troublesome to the government and hard neighbors to the Indians.” 

Nor were they particularly good settlers—the Scotch-Irish proved itinerant, journeying along river valleys and reversing course when they encountered Indigenous resistance. Many arrived in the backcountry so impoverished that they had no resources to invest in housing, tools, seed, or livestock. 

In the first decades of the 18th century, Native Americans and Scotch-Irish interests often converged and the communities coexisted, trading, working side by side, and intermarrying. The Scotch-Irish learned Indian methods of hunting and agriculture, and paddled canoes. Some adopted Native dress and lived in crude bark or log houses. 

Eastern elites looked down on the Scotch-Irish, viewing them as excessively assimilated with their Native neighbors, and routinely decried them as “white Natives.” George Washington, who earned his military bona fides on the frontier, described the Scotch-Irish as “a Parcel of Barbarians and an Uncouth Set of People.” Provincial Secretary Richard Peters mused that the frontier might be more safely inhabited by the colony’s Iroquois allies than by “the lower sort of People who are exceedingly loose and ungovernable.”

Proximity led to alliances and cooperation, but also conflict, particularly over land. Calloway notes, “Borderlands and frontiers may be zones of flux where identities become blurred, but they can also be places where identities form and harden.” The Scotch-Irish were expected to serve as “expendable defensive barriers,” protecting eastern elites and settled farming and coastal communities from Indian attacks and, over time, from French Catholics and slave rebellions. 

This settler colonialism served the distant agendas of eastern elites and the British monarchy, rather than the immediate interests of Scotch-Irish communities. Frontiersmen were expected to serve in militias to fight in Indian wars but frequently deserted, fearful of leaving their homes and families unprotected. Governments, Calloway notes wryly, “outsourced the dirty work of empire building” to the Scotch-Irish.

Global forces far beyond the frontier brought Native American and settler relations to the boiling point. The clash between the French and British colonial powers that ignited the French and Indian War (1754–63) began on the North American frontier. Indian tribes allied with the French raided British settlements, burning houses, torturing settlers, and kidnapping women and children. By 1756, Indian raiders had killed more than 1,000 colonial soldiers and frontiersmen, and settlers had fled nearly 30,000 square miles of territory.

Backcountry settlers like the Paxton Boys fought back, sometimes adopting Native styles of warfare, including scalping and the use of tomahawks. Outflanked, settlers begged for assistance, sending petitions to their governors asking for arms and ammunition. Settlers in Pennsylvania described their situation as “Lamentably Dangerous.” They feared for their lives, “being in such imminent Peril of being inhumanely Butchered by our Savage neighbours.” When the Quaker legislature responded that the Scotch-Irish must defend themselves, 100 settlers, “naked and defenceless,” petitioned the king, to no avail. Instead, the governor placed bounties on Indian scalps as a way to encourage disheartened settlers to remain in the backcountry and fight, rather than flee the contested territory. A group of frontiersmen drove a wagonload of frozen corpses—among the 47 settlers slaughtered by Delaware warriors—and laid them before the statehouse in protest. Calloway explains, “They saw themselves as a beleaguered people left to fight for themselves by a distant government.”

For the Scotch-Irish, who served as the tip of the spear, the French and Indian War was a formative experience. Years of coexistence with local tribes, dotted with periodic conflict, were replaced by an era of unrelenting violence and a hostility toward Native Americans. The shared atrocities of warfare and fear of Indian attacks bonded communities, created tight kinship networks, and built a folk culture in which they viewed themselves as both victims and heroes. After a decade bearing the military burden on the western frontier with little support from colony or king, the Scotch-Irish settlers developed a suspicion of authority and outsiders, antipathy toward government, and enthusiasm for the mounting demands for independence. 

In the early years of the new republic, Federalist administrations endeavored to restore peace on the frontier, penning treaties to establish relations with Native nations, and planning for an orderly, titled national expansion. But for the Scotch-Irish, who had spent decades embroiled in violent conflict, the political exigencies of the French and Indian and Revolutionary Wars had hardened into a culture of hatred for Indians and a squatters’ sense of frontier justice and entitlement to Native lands. Calloway writes, “The language of savagism that angry Scotch-Irish on the mid-Atlantic frontier had employed in the midst of a brutal war became part of the everyday talk of white Americans, justifying dispossession.” 

Scotch-Irish hostility toward elites and authority persisted, exemplified in the Whiskey Rebellion, a series of skirmishes in response to an excise tax intended to help pay for Revolutionary War debts. Along the frontier, where whiskey was the most important item of trade, the tax infuriated backcountry settlers as both an attack on their income and a reminder of the government’s ongoing failure to protect them against the Indians. In 1794, 31 years after the Paxton Boys had marched on Philadelphia, 7,000 settlers from throughout Appalachia marched on Pittsburgh, threatening to burn the city occupied by cultural elites, wealthy merchants, and land speculators. In a massive show of force, President George Washington sent an army of 13,000 militia to quell the rebellion. 

By the early 19th century, the descendants of the Scotch-Irish had found a political and cultural home with the Jeffersonian Republicans and, later, Jacksonian Democrats, who embraced settlers’ role as active participants in national expansion and land appropriation, and hostility toward Native tribes.

Hard Neighbors is a scholarly book, well researched, deeply documented, and set in the colonial and early American past. The author’s explicit aim—which he achieves admirably—is to detail the complexity of relations between Native Americans and the Scotch-Irish, and to break down monolithic notions of “white colonists” and “European settlers.” Calloway makes only the most glancing allusions to current events, and yet, in the shadow of the 2024 election, it’s hard not to hear echoes of present-day politics. For amateur readers of history, probing this sort of connective tissue can prove deeply satisfying.

Calloway’s Scotch-Irish settlers are the literal forefathers of J. D. Vance’s Appalachian hillbillies, eulogized in his best-selling memoir. When Calloway describes the personal characteristics of the Scotch-Irish, borne from years of hard frontier living under threat of Indian attack and abandoned by the provincial government, they sound curiously like the cultural strain of MAGA: “a reputation for fierce independence, clannishness, a touchy sense of honor, eye-for-an-eye standards of justice, defiance of authority, dislike of elitism, a populist version of democracy, a strong military tradition, and general combativeness.” 

The vigilante Paxton Boys bear more than a passing resemblance to today’s Proud Boys, and the events of January 6 recall settlers’ own theatrical demands for provincial support and defense on the grounds of the Pennsylvania statehouse. When the Scotch-Irish expressed terror in the face of gruesome Indian attacks and resentment over contested hunting and farmlands, it is easy to hear echoes of Americans today along the U.S.-Mexico border, begging for government aid in enforcing safety, providing essential public services, and supporting economic well-being. Indeed, Governor Greg Abbott’s busing of 100,000 migrants from Texas border towns to northern sanctuary cities mirrors settlers’ angry refusals to serve as their government’s “tip of the spear.”

It’s not a perfect analogy, of course—today the much-maligned “incursion” is by Latin Americans and other immigrants of color, while in the nation’s formative years, white European settlers were the immigrants, invading Native lands. And while Donald Trump has made the invidious criminality of immigrants central to his political identity (even as data on crimes committed by immigrant groups belies the argument), the comparison is a poor parallel to the colonial and early American frontier, where white settlers faced genuine peril from Indian tribes. But while the actual threats may not be equivalent, the rhetoric surrounding the existential threat of the “other” is remarkably similar. Calloway’s history serves as a cautionary tale of how fractious relations with other ethnic and racial groups—especially on the border—can boomerang, breeding resentment toward the government for failing to protect its citizens.  

The post Hillbilly Legacy  appeared first on Washington Monthly.

The Regressive Era  https://washingtonmonthly.com/2024/10/29/the-regressive-era/ Tue, 29 Oct 2024 23:17:26 +0000 https://washingtonmonthly.com/?p=155869

A new biography of Woodrow Wilson puts the 28th president’s racism and sexism at the center of its narrative—and his world-historic domestic and international achievements on the periphery.

The post The Regressive Era  appeared first on Washington Monthly.

In February 1915, President Woodrow Wilson hosted the first screening of a motion picture at the White House. It was a gala affair, and VIPs clad in formal evening wear gathered in the East Room, where President Abraham Lincoln had once lain in state. The movie, Birth of a Nation, an incendiary film glorifying the Ku Klux Klan, had opened in Los Angeles two weeks before, where it was met with both critical acclaim and scathing public protest. 

Woodrow Wilson: The Light Withdrawn by Christopher Cox Simon & Schuster, 640 pp.

The movie was not a random Hollywood selection. Rather, the film was based on an equally inflammatory best-selling novel, The Clansman, written by one of Wilson’s oldest and most intimate friends, Thomas Dixon. And Wilson didn’t merely endorse the movie—his own academic writings as a scholar of American history had provided the film’s (and the book’s) historical framework. One intertitle card that accompanied the silent film quoted Wilson’s description of Reconstruction in his History of the American People as a misbegotten scheme to “put the white south under the heel of the black south.” As white-sheeted Klansmen gathered on the screen, a second intertitle quoted Wilson’s celebration of the rise of white supremacy: “At last there had sprung into existence a great Ku Klux Klan, a veritable empire of the South, to protect the Southern country.” 

Wilson was delighted by Birth of a Nation, and discussed with the director, D. W. Griffith, how the administration might use the new medium of motion pictures to sway public opinion. He volunteered to assist Griffith in future historical projects. Only months later, after Griffith and Dixon had publicly touted the White House’s implicit approval and after the movie sparked protests throughout the Northeast and Midwest (even while setting box office records that would persist for 25 years), did Wilson, under pressure from his aides, implausibly claim to have been “entirely unaware of the character” of the film. 

Today, Birth of a Nation is widely credited with normalizing the Klan and rekindling its long-dormant membership, and the White House event is often cited as a stain on the Wilson administration. But as Christopher Cox demonstrates in his deeply researched, important new biography, Woodrow Wilson: The Light Withdrawn, this event was neither an aberration in Wilson’s life nor simply a reflection of the casual racism typical of the time. Rather, Cox argues, white supremacist ideology and the related theme of the protection of white womanhood were central to Wilson’s life’s work, both in academia and in public office. Among contemporary scholars, the implicit racism of Wilson’s administration is widely acknowledged, but Cox’s biography is richly detailed and provides an array of shocking examples that might be new to armchair historians.

In recent years, cancel culture has come for Woodrow Wilson, with activists citing his academic writings and federal policies implemented during his presidency as evidence of overt racism. In 2020, in the wake of Black Lives Matter protests, Princeton—where Wilson served as a professor and university president—removed his name from a residential college and from its school of policy and international affairs. Monmouth University also removed Wilson’s name from a marquee building, and in 2022, Washington, D.C., renamed Woodrow Wilson Senior High School, the city’s largest public high school, Jackson-Reed. 

Cox’s biography offers a scholarly justification for this denouncement of the 28th president and a vigorous counterargument to the generations of Wilson biographers who unequivocally celebrated their subject as a liberal hero. While such biographers have often lauded Wilson as the first modern president, Cox argues that Wilson had more connective tissue with the Confederate past than with the future. Examining Wilson primarily through the lens of racial equality and gender, Cox assembles a convincing body of evidence that Wilson was committed to white supremacy as a matter of public policy manifest not just in his racial politics but also in his hostility toward women’s suffrage. As Cox—a Republican who served as a U.S. congressman from California for 17 years—writes, “As the first southern Democrat to occupy the White House since the Civil War era, he was superbly unsuited for the moment.” 

The Light Withdrawn is an important, long-overdue complement to the existing literature. But the hefty volume is a narrow study of the 28th president, with a particular focus on Wilson’s lifelong opposition to racial equality, and how this ideology affected the federal campaign for women’s suffrage, which forms the book’s narrative heart. 

In 1948, the venerated historian Arthur Schlesinger listed Wilson among the six greatest presidents of American history. Like Schlesinger, most Wilson biographers have pointed to their subject’s many progressive reforms, ranging from the creation of a progressive income tax to the birth of government agencies such as the National Park Service, the Federal Reserve, and the Federal Trade Commission. Wilson burnished his legacy as an eloquent champion of democratic ideals during World War I and as an architect of the League of Nations, which established the principle of collective security among allies that has guided American foreign policy for a century. 

Cox treats Wilson’s many achievements—and, in particular, his domestic policies—as peripheral to his main narrative, and the imbalance takes some of the force out of the book. A biographer’s responsibility, after all, is to paint as complete a portrait of their subject as possible, and a more ambitious biography would concern itself with the moral tension in Wilson’s legacy. Detailing his achievements would not serve to mitigate Wilson’s racism, but would provide a more robust, complete accounting of his profound influence on the 20th century. 

Born in Virginia in 1856, Wilson carried the racial prejudices of his southern upbringing for his whole life. His father was a former Confederate officer, and Wilson’s childhood home was staffed by enslaved people. Wilson spent his formative teenage years in South Carolina during Reconstruction, which shaped his worldview. 

As a professor, first at Bryn Mawr—where he expressed open contempt for the women’s college’s formidable president, Martha Carey Thomas—and then at Princeton, Wilson wrote textbooks on U.S. history and government that reflected his commitment to white supremacy. In his textbook The State, for example, Wilson constructed a racial hierarchy with Aryans at the top, and “primitive” and “savage” races, comprising most of the world’s population, at the bottom. He disparaged eastern European immigrants, describing them as “shiftless,” and supported the exclusion of immigrants from China and Japan. In the classroom, he asserted that slavery “had done more for the negro in two hundred and fifty years than African freedom had done since the building of the pyramids.” In his early writings, Wilson described universal suffrage as “the foundation of every evil in this country.” On campus, Wilson was well known for his exaggerated imitations of Black dialect and his racial jokes.

When he became president of Princeton in 1902, Wilson put his ideology into practice, squashing discussions of racial integration and musing that it would be “extremely unlikely” that admissions of Black students would “ever assume a practical form.” A 1910 research report comparing 14 elite universities noted that Princeton alone refused to admit Black students; the school was also strikingly anti-Semitic. “Harvard’s ideal is diversity,” the researcher pointedly concluded, while “the aim of Princeton is homogeneity.” 

Wilson entered politics the same year, winning his race for the New Jersey governorship, which served as a stepping-stone to the presidency. Wilson was elected president in 1912 in a fluke election, thanks to Theodore Roosevelt, whose third-party candidacy split the Republican vote. Nominated at a contested convention on the 46th ballot, Wilson was a compromise candidate for a Democratic Party divided between its northern and southern leadership and all but shut out of presidential politics since the Civil War. 

Wilson brought his racial politics with him to Washington. Within weeks of his inauguration, his cabinet began implementing Jim Crow policies in the previously integrated federal government. Segregation soon marked the entire federal civil service, with separate office spaces, cafeterias, and bathrooms designated by race. Wilson replaced senior Black appointees hired by the Taft administration with white men. When challenged, Wilson defended the segregation of the civil service as in “the best interests of both races in order to overcome friction.” Wilson’s actions cast a long shadow: The federal government remained segregated until 1948.

Wilson’s fraught relationship with the women’s suffrage movement comprises much of The Light Withdrawn’s central narrative. Cox offers a rarely told, behind-the-scenes account of the fight from the perspective of both lawmakers and suffragists. 

Wilson assumed the presidency in 1913, just as the movement was gaining critical momentum. Most histories depict him as a lukewarm proponent of suffrage, unwilling to expend much political capital on the issue, but an eventual convert and essential advocate. Cox, however, argues that Wilson deserves little credit for the passage of the Nineteenth Amendment. Rather, the president spent years trying to foil suffragists’ demands, first by ignoring them, then by censoring them, and finally by denying protesters’ civil liberties. Wilson ultimately supported women’s suffrage when doing so was politically expedient and the passage of the amendment became inevitable late in his presidency.

Until Wilson reached office, women’s suffrage had been an issue left to the states. Although a proposal for a constitutional amendment had been submitted to Congress each session since 1878, it had never received serious consideration. A bipartisan anti-suffrage coalition in Congress and throughout the country had long opposed the women’s vote because of traditional beliefs in feminine purity and the ideology of separate spheres. Wilson shared these beliefs, musing that if women were granted the vote, “it is the home that will be disastrously affected.” 

But Wilson’s opposition to women’s suffrage for most of his presidency rested on more than idealized gender roles. Like many southerners, his opposition was deeply entangled with white supremacy. 

For decades, white southerners had successfully limited Black male suffrage, relying on Jim Crow laws that restricted voting and mandated whites-only primary elections, which effectively blocked the power of Black men’s votes. But the so-called Susan B. Anthony amendment would guarantee suffrage to all citizens, including Black women, and the right would be enforceable by the federal government. White southerners considered this an existential threat. It would be “absolutely intolerable,” a Tennessee congressman asserted, “to double the number of ignorant voters by giving the colored woman the right to vote.”

Wilson understood that a race-based argument against women’s suffrage was unpalatable for a national audience. He was remarkably successful in evading the subject, even as the proposal for a constitutional amendment to guarantee women’s suffrage became the nation’s most contentious domestic issue. Year after year, he declined to mention it in his annual address to Congress. When pushed, Wilson continued to argue that the decision should remain with the states, even as he confided to the suffragist Harriot Stanton Blatch that the states’ rights argument was simply a facade. “Dismiss from your minds the idea that my party or I are concerned about states’ rights,” Wilson told her. “It is the negro question, Mrs. Blatch, that keeps my party from doing as you wish.” Meanwhile, behind the scenes, he encouraged Democrats in Congress to do what they could to block what would become the Nineteenth Amendment, and privately supported altering its language to allow states the right to control enforcement, effectively permitting racial voting restrictions.

The suffragists were a perennial thorn in Wilson’s side. In 1916, Alice Paul’s militant National Woman’s Party urged already enfranchised women to vote Wilson and his party out of office, as punishment for failing to support the cause. In January 1917, after he won reelection by a whisker, the party began a campaign of quiet protest, with “silent sentinels” picketing at the gates of the White House. The protests, which continued for a year and a half, involved thousands of suffragists and initially attracted much press attention. Embarrassed by the picketers’ lingering presence, Wilson intervened, suppressing press coverage and, after the U.S. entered World War I, directing the wartime propaganda bureau to label the protests as unpatriotic. He ordered surveillance of suffrage leaders, and condoned police harassment of the pickets. 

Chillingly, Wilson was complicit in the arrest of hundreds of protesters on the trumped-up charge of obstructing the sidewalk. Suffragists were sentenced to up to seven months in squalid prisons and workhouses, where they were denied adequate food and water, legal representation, and communication with their families. When some protesters began a hunger strike, prison guards—under Wilson’s direction—commenced force-feedings, while Wilson directed the head of his propaganda agency to deny maltreatment of the prisoners and to assert that “the treatment of the women picketers has been grossly exaggerated and distorted.”

During the war, Wilson issued an executive order permitting government officials to restrict international travel to anyone deemed a threat to public safety; at the war’s conclusion, the administration extended the ban to deny passports to virtually all Black applicants, along with members of the National Woman’s Party.

Suffragists and civil rights leaders pointed to the hypocrisy of Wilson’s soaring rhetoric extolling American democracy while denying its fruits to all Americans. At the conclusion of the war, one suffragist decried Wilson’s lofty evangelism for democratic ideals. “While President Wilson has sailed away to Europe to obtain democracy for the world,” she bemoaned, “American women, after six years, know how hollow his words are.” 

Cox notes in his introduction that more than 2,000 English-language books have been written about Woodrow Wilson, but until Arthur Walworth’s Pulitzer Prize–winning two-volume study in 1958, not one had mentioned either the women’s suffrage movement or the racial segregation of the federal government. Wilson’s exalted status as a progressive titan was seldom challenged before the public reckoning of recent years. By thoroughly excavating the president’s racial and gender ideology, Cox’s book is an important contribution to the scholarship. But it has limits, too, as a corrective. 

The Light Withdrawn does not ignore Wilson’s formidable achievements altogether; indeed, Cox praises Wilson in his introduction as “enormously consequential” for a laundry list of reforms ranging from the progressive income tax to the Clayton Antitrust Act. He explains that Wilson was not simply a reactionary and takes pains to show how political alignments in Wilson’s day didn’t fit neatly into contemporary categories. Today, left-leaning economic policies often go hand in hand with calls for racial equality, but in the early 20th century, white supremacy was consistent with—even foundational to—white southern progressivism. Like other progressives, Wilson was concerned with ridding government of corruption, breaking up concentrations of financial and corporate power, and empowering democracy through political reform—even as he introduced racial segregation into the civil service. 

But because Wilson’s domestic and international achievements fall outside the central narrative of The Light Withdrawn, the results feel curiously reductive, as if Wilson’s life and role in history can be distilled to his white supremacy and sexism. Conveying the full breadth of Wilson’s achievements wouldn’t balance the moral scales, but they are nonetheless fundamental to his complicated, contradictory, often infuriating story. Which is also, of course, the history of the United States. As a result, Cox’s biography feels both politically charged and incomplete, even as more traditional biographies, which venerate Wilson but ignore his racism, likewise fall short. 

Despite his damning narrative, Cox labels Wilson merely a disappointment—to the suffragists and civil rights leaders who had trusted in his democratic ideals, and to contemporary students of history disenchanted by the president’s many shortcomings. Cox points to lesser-known historical figures like Alice Paul, the civil rights leader William Monroe Trotter, the presidential appointee and confidant Dudley Field Malone, and Representative Frank Mondell as the true heroes in the realization of women’s suffrage. 

The author shows considerable restraint in his conclusions about Wilson. Cox writes, “As the poet Whittier teaches, all of us who are Woodrow Wilson’s heirs owe it to ourselves to remember the man in full, and to ‘pay the reverence of old days to his dead fame.’ ” Cox’s thoughtful, deeply researched biography goes a long way toward stripping away the hero worship; perhaps the next biographer will build on this scholarship to offer the more comprehensive treatment this complex historical figure merits, and readers deserve.

The post The Regressive Era  appeared first on Washington Monthly.

A New Look at the Feminist Earthquake https://washingtonmonthly.com/2024/08/25/a-new-look-at-the-feminist-earthquake/ Sun, 25 Aug 2024 22:05:00 +0000 https://washingtonmonthly.com/?p=154470

Clara Bingham's masterful "The Movement" shows how women's liberation transformed America and why our understanding of 1963-1973 needs to include more voices.

The post A New Look at the Feminist Earthquake appeared first on Washington Monthly.

On August 26, 1970, the 50th anniversary of women’s suffrage, an estimated 50,000 women marched down New York City’s Fifth Avenue as part of a daylong general strike. Betty Friedan, the author of the seminal work The Feminine Mystique, which had sparked the so-called second wave of feminism, had called for the strike as a way to focus media attention on the nascent women’s movement. She declared the day “a resistance both passive and active, of all women in America against the concrete conditions of their oppression.” Never one to shun the spotlight, Friedan led the brigade, flanked by a gaggle of former suffragists, veterans of feminism’s first wave now in their 70s and 80s. Women chanted, held signs, and cheered, “Liberté, égalité, sororité,” “Don’t iron while the strike is hot,” and “Uppity women unite.”

The city permit restricted the march to a single lane of traffic, but the teeming crowd soon flooded the street. Friedan instructed protesters to “Lock arms, sidewalk to sidewalk!” “I never saw so many women; they stretched back for so many blocks you couldn’t see the end,” she recalled. “There were so many of us they couldn’t stop us; they didn’t even try.”

The Movement: How Women’s Liberation Transformed America, 1963–1973 by Clara Bingham, Simon & Schuster, 576 pp.

Friedan’s group, the National Organization for Women, had coordinated the strike, but dozens of disparate organizations gathered under NOW’s tattered umbrella, including the radical Redstockings, the Professional Women’s Caucus, the National Coalition of American Nuns, Black Women’s Liberation, the lesbian group Daughters of Bilitis, and the League of Women Voters. Friedan had originally conceived of NOW as a civil rights organization, essentially an “NAACP for women,” working incrementally within existing legal and governmental structures to expand women’s access to jobs, equal pay, financial tools like mortgages and credit cards, and single-sex institutions. But in recent years, younger, more radical women had joined the movement. Many were alumnae of the civil rights, antiwar, and youth movements who brought with them new, more confrontational tactics, including guerrilla actions, and a broader vision for feminism, one that critiqued traditional families and gender hierarchies and embraced sexual freedom.

But on the day of the strike, such differences were cast aside. Time magazine—which featured the event on its cover—described the movement as “diffuse, divided, but grimly determined.” Nearly everyone involved viewed it as a watershed event. Mary Jean Collins, the president of Chicago NOW, recalled, “It was the best idea ever in the history of the world, because it doubled the size of the women’s movement.” Friedan boasted, “On August 26, it suddenly became both political and glamorous to be a feminist.”


This watershed moment of unity is recounted at length in Clara Bingham’s The Movement, a compulsively readable oral history and a timely addition to the historical literature. The Movement spans the first 10 years of second-wave feminism, bookended by the publication of Friedan’s The Feminine Mystique and the Supreme Court decision Roe v. Wade, which legalized abortion.

Bingham’s title—The Movement—might more accurately include a pair of air quotes, because the singular women’s movement was, in fact, an amalgam of dozens of distinct factions with varied demographics, beliefs, and goals—a diversity vividly on display at the 1970 strike. Writing for The Village Voice, the journalist Vivian Gornick summarized the movement as “running the entire political spectrum from conservative reform to visionary radicalism, and powered by an emotional conviction rooted in undeniable experience, and fed by a determination that is irreversible.”

As is typical of oral histories, The Movement incorporates minimal expository or analytical text, relying instead on the voices of the interviewees and news clippings to create a narrative. Bingham places her subjects in conversation with each other, documentary style. The effect is utterly engrossing. It is easy to forget that Bingham’s subjects are women in their 70s and 80s recalling events from more than a half century ago. The most riveting interviews are those that divulge intimate, even womanly, details, like the political operative Esther Newberg’s recollection of her job interview with Bella Abzug while the New York congresswoman changed outfits, shimmying into an old-fashioned girdle. Or the vivid recollections of Judith Arcana, an unlicensed medical practitioner and young mother who expressed her own breast milk into a filthy jailhouse sink after her arrest for performing illegal abortions.


Not surprisingly, The Movement includes all the principal architects of second-wave feminism. Bingham intertwines her own interviews with celebrated leaders like Gloria Steinem and Susan Brownmiller with archive-sourced commentary from deceased luminaries like Friedan, Abzug, and Shirley Chisholm. But delightfully, the great majority of Bingham’s 120 interviewees will be unfamiliar to the average reader. These interviews make The Movement sparkle, and include congressional staffers, flight attendants, artists, athletes, office workers, labor organizers, academics, plaintiffs in precedent-setting court cases, and the “Janes,” an underground network of women who trained one another to perform abortions.

Each speaker is introduced simply by name, with no title or description, although the reader can find short bios at the back of the book. More than just an editorial quirk, this democratization of voices serves as a metaphor for a movement that challenged traditional male power hierarchies and favored collective action and decision-making. Rita Mae Brown, an author and radical lesbian activist, once famously proclaimed, “We don’t need spokespeople and we don’t need leaders. All women can speak, and all women can write.” Indeed, in Bingham’s telling, all voices carry an equal weight. Bingham’s inclusion of lesser-known actors effectively evokes a largely grassroots movement that included thousands of activists in a vast array of professional fields, along with the millions fighting gender battles on the home front.

Mindful of the persistent critique that the most celebrated leaders of the women’s movement represented the interests of educated, straight, middle-class white women at the expense of working-class women, women of color, and lesbians—a refrain ever since Friedan’s The Feminine Mystique essentially ignored all questions of class, race, and sexual preference—Bingham has worked hard to rebalance the historical record. Bingham’s inclusion of a greater range of voices is not some dutiful box-ticking exercise in political correctness: She is offering up a richer, more accurate encapsulation of the movement than most prior histories.

To do so, Bingham places government bureaucrats committed to enforcement of Title VII side by side with counterculture hippies exploring the boundaries of sexual freedom and deftly explores Black women’s complicated relationship with the women’s movement. For many Black women, the concerns of the women’s movement felt less pressing than civil rights. Barbara Smith, a writer and early founder of Black women’s studies as an academic discipline, recalled of the movement’s early days: “I could not even wrap my mind around [women’s liberation] because, it’s like, white women? What do white women have to complain about?” Smith later became an early member of the National Black Feminist Organization, which spoke to issues of both racial and gender inequality. For others, misogyny in the Black Power movement awakened their feminism. “The men were having a revolution that was not going to include us except in a subsidiary, docile, baby-having way,” the cultural critic Michele Wallace recalled. “That’s when I became a feminist.”


With so many competing agendas and diverse voices, the women’s movement was famously quarrelsome. Bingham gamely wades into the messy brew of personal attacks, factionalism, and intra-movement politics. Marilyn Webb, the cofounder of the first women’s consciousness-raising group in Chicago, recalled the collective eye roll by her fellow younger activists over the relatively modest ambitions of Friedan’s generation. “The women in NOW essentially accepted the patriarchy,” she told Bingham. “We wanted to talk about how their roles are defined in marriage, family and social living, how women are treated as a colonized class.” NOW, composed primarily of middle-aged, professional women, likewise labeled the theatrical tactics of younger, radical activists—like crowning a sheep “Miss America”—“a little nutsy.”

Indeed, at the 1970 “Congress to Unite Women”—which Brownmiller wryly dubbed “the Congress to Divide Women”—NOW nearly combusted over its public embrace of lesbians, who Friedan famously termed the “lavender menace.” Looking back, some interviewees saw the conflict as inevitable. “A certain amount of cannibalizing seems to go with the territory whenever activists gather to promote social change,” Brownmiller reflected. Bingham goes further, seemingly celebrating the competing interest groups as a sign of breadth, diversity, and ideological vigor in the movement. Gornick wrote cheekily in The Village Voice, “If five feminists fall out with six groups within half an hour they’ll all find each other … within 48 hours a new splinter faction will have announced its existence, and within two weeks the manifesto is being mailed out. It’s the mark of a true movement.”

Not surprisingly, for a movement that gave birth to the phrase “The personal is political,” many women were spurred to activism by events in their own lives. Bingham includes confessional stories of how interviewees’ personal journeys—an illegal abortion, a denied promotion, a rape—led them to the movement. Sally Roesch Wagner, who later founded one of the nation’s first women’s studies programs, recalled being forced to forego college as a pregnant teenager. “I was literally transferred from my father to my husband,” she told Bingham. “I wept because all my possibilities were gone.”

Some events, like the fight for the Equal Rights Amendment, Billie Jean King’s “Battle of the Sexes,” and the founding of Ms. magazine, have been well documented elsewhere, although they are much enriched by Bingham’s oral history. But there are plenty of lesser-known stories, too, like that of Bobbi Gibb, the first woman to run the Boston Marathon, in 1966. After her registration form was rejected, Gibb ran anyway, disguised in a hooded sweatshirt. As she raced past onlookers at Wellesley, a women’s college, she cast off the sweatshirt to applause and cheers. “They just went crazy. They were crying and leaping in the air and screaming. And one woman was going, ‘Ave Maria, Ave Maria,’” Gibb recalled. “I really felt at that moment that the world would never be the same again.”

The Movement’s tone is triumphant, and Bingham’s interviewees understandably bask in the glow of victory. In 1973, the FBI officially closed its files on the women’s liberation movement, a remarkable admission that its essential principles, once perceived as radical and threatening, had become mainstream. It is a testament to the enduring success of the movement that it’s almost impossible for a young woman today to imagine a world in which whole career paths were inaccessible, women couldn’t apply for a mortgage or credit card without their husband or father’s signature, and an unmarried woman could not get a prescription for birth control. Today, women represent the majority of students receiving undergraduate, law, and medical degrees. The legalization of gay marriage and broad recognition of LGBTQ rights reflect a revolutionary transformation in the acceptance of nontraditional families. A half century after the passage of Title IX, women’s athletic scholarships now add up to more than $200 million, and for the first time this year, the NCAA women’s basketball championship game attracted more viewers than the men’s.

Yet it’s impossible to read this book today without reflecting on how one of the movement’s crowning achievements—the right to abortion access—was overturned in 2022 and on how Republican-led state governments have criminalized abortion care in the wake of the decision. Bingham and her interviewees mostly refrain from weighing in on our current state of affairs—this is a history book with a narrow focus on a particular decade. But one of the lessons of The Movement is that the tenets of feminism suffused and reconstituted American culture so completely that its right-wing opponents today are fighting a deeply unpopular battle. Conservatives paid a heavy political price in the 2022 midterms for attacking reproductive rights—though what happens to those and other rights after the November elections remains to be seen. The era that The Movement documents “was not just political or legal, social or cultural disruption—it was all that and more,” Bingham writes. “It was a bedroom and a boardroom and an assembly-line revolution—a restructuring of how women and men in America saw each other, a reinvention of roles and a fundamental identity shift.” That fundamental identity shift endures, and the weight of history may be against those who want to reverse it.

The post A New Look at the Feminist Earthquake appeared first on Washington Monthly.

The Tastemaker https://washingtonmonthly.com/2024/06/23/the-tastemaker/ Sun, 23 Jun 2024 22:55:00 +0000 https://washingtonmonthly.com/?p=153801

The literary editor Judith Jones made the celebration of food part of America’s multicultural identity. A new biography restores her place in publishing history.

The post The Tastemaker appeared first on Washington Monthly.

On a dreary afternoon in 1949, Judith Jones perched at her typewriter in Doubleday’s Paris office. As she composed rejection letters for unsolicited manuscripts, a slim volume buried in the slush pile caught the 25-year-old secretary’s eye. The cover photo was a haunting image of a young girl with a searching gaze and dark, wavy hair. The book, an advance copy slated for a limited print run in France, was a diary by a 13-year-old German Jewish girl named Anne Frank who had spent years hiding in an Amsterdam attic before her death in a Nazi concentration camp. Jones read the entire book in one sitting, transfixed by the intimacy of the diary entries and the author’s singular voice.

The Editor: How Publishing Legend Judith Jones Shaped Culture in America by Sara B. Franklin, Atria Books, 330 pp.

Stunned that the book had been relegated to the reject pile, Jones pleaded with a senior editor to buy the English-language rights and publish the diary for the American market. As Jones later recalled, he asked, “What, that book by that kid?” Jones persisted, urging him to reconsider. It was a tough sell: In the postwar environment, publishers were reluctant to revisit the dark days of the Holocaust. But it was good—and lucrative—advice. When Doubleday finally published the U.S. version of Anne Frank: The Diary of a Young Girl in 1952, it was an overnight phenomenon, soon dubbed a “classic” by The New York Times. Anne Frank became one of the best-selling books in history and, in 2000, was ranked number two on the Boston Public Library’s “100 Most Influential Books of the Century.”

One might assume that it was Jones’s discovery of Anne Frank that launched her luminous, era-defining career. But as the historian Sara B. Franklin writes in The Editor, the first book-length biography of Jones, that’s not what happened. Indeed, Jones’s pivotal role in Anne Frank’s U.S. publication remained publicly unacknowledged for decades. In the almost uniformly male world of 1950s book publishing, Jones was still treated as little more than a secretary for years after the memoir’s publication, without standing to acquire her own authors or attend editorial meetings.

It would take her improbable discovery of a second overlooked manuscript, one that would also later appear on the “100 Most Influential Books” list, to finally change that. Nearly a decade after the U.S. publication of Anne Frank, Jones was back in her hometown of New York City and working at Alfred A. Knopf when she came across the manuscript of Mastering the Art of French Cooking, a 750-page tome of authentic recipes written by an unknown American expat named Julia Child and two French colleagues. 

The story of her discovery of Mastering, and her four-decade partnership with Child, provides the spine of this delightful biography that explores Jones’s instrumental role in fostering a literature dedicated to the celebration and culture of food in America, while simultaneously contributing to the American canon of literary fiction and poetry. Franklin convincingly argues that Jones’s impact was expansive, shaping “what we cook and eat, the stories we read and the ones we tell.” And yet Jones remains relatively unknown to the public. This biography, Franklin writes, “is my attempt to give the editor, the woman, her due.”

By the time Mastering’s massive manuscript arrived at Knopf, it had been rejected by multiple publishers. It was perceived to be too long, too technical, too expensive to print, and, in 1950s America, where postwar cooking was characterized by canned ingredients and prepackaged foods, too old-fashioned. And yet Jones, by then a junior editor, was instantly charmed. During her years in Paris, she had become a proficient home cook, even briefly cohosting a pop-up restaurant. Back in New York, she struggled to re-create beloved French dishes. Jones took the manuscript home, and over the course of months, tested dozens of recipes. 

At the time, cookbooks were little more than dull manuals. Mastering seemed a marvel—exquisitely detailed, with step-by-step instructions, tailored to home cooks lacking professional skills, and with an eye to ingredients available in American supermarkets. “Reading and studying this book seems to me as good as taking a basic course at the Cordon Bleu,” Jones told her fellow editors. And the cookbook was more than a how-to book—it teemed with discussion of French culture and history, and practical, even cheeky, advice. A section on making quenelles, for example, was labeled “In Case of Disaster,” and counseled the reader that if a prepared cream paste turned out to be too soft, “it will taste every bit as good if you declare it to be a mousse.” 

Still, there were formidable hurdles to publication, beginning with Jones herself. While she had edited some of Knopf’s most promising authors, including John Updike and Elizabeth Bowen, she did so anonymously, rarely working directly with the writers. With limited standing at the publisher, Jones worried that her own reputation might suffer—that she might be “pigeonholed as unserious and unduly interested in ‘women’s stuff,’ ” Franklin writes. Equally important, cookbooks were well outside Knopf’s usual literary wheelhouse and of little interest to the predominantly male editorial team.

Without a seat at the editorial table, Jones worked behind the scenes, trumpeting the manuscript to her colleagues. To her delight, a senior editor greenlit the manuscript and it was assigned to Jones. With no house style for cookbooks, Jones created her own.

Editing Mastering took two years. Jones helped the authors “organize, clarify, and finesse” their manuscript, and gave substantive advice, like adding a chapter on hearty peasant dishes. Julia Child “used me as a guinea pig,” Jones recalled, sending her to scout American grocery stores for “obscure” ingredients, including, astonishingly, mushrooms. Jones tested hundreds of recipes in her own home kitchen.

When First Lady Jacqueline Kennedy hired a French chef for the White House, the timing for Mastering’s release in October 1961 seemed propitious. After Craig Claiborne at The New York Times praised the cookbook as “the most comprehensive, laudable and monumental work on the subject,” Jones booked an appearance on NBC’s Today Show, where Child and coauthor Simone Beck prepared a French omelet live on the air. Jones scheduled a book signing and demonstration at Bloomingdale’s for the next morning; the authors were mobbed by crowds. By the end of the first week, Knopf had doubled the initial press run of 10,000 books and arranged for a third. Jones hurriedly organized a West Coast tour with live demonstrations and book signings. In an era in which virtually all the publicity done for a new release consisted of courting book reviewers, Jones was an early innovator in using promotional campaigns to market books.

Jones continued to build on Mastering’s momentum. That winter, a guest spot on the Boston public television station, WGBH, to promote the book led to an invitation for Child to pilot a live cooking show, The French Chef. National syndication followed, along with surging book sales. Jones capitalized on the synergy between Child’s cookbook and her television program, with the show driving book sales and the book attracting viewers. She wrote to Child, “I am flabbergasted at the way you seem to have catapulted into fame overnight.” Buoyed by the success of the TV show, Knopf published a follow-up called The French Chef Cookbook; it sold 200,000 copies.

The extraordinary, unprecedented success of Mastering ignited Jones’s interest in cookbooks of all kinds, and bolstered Knopf’s confidence in their marketability. Following Mastering’s blueprint, Jones pioneered a whole new genre, elevating cookbooks to a literary form. 

Jones ambitiously aimed to expand the nation’s palate. In an age in which the most outré “ethnic” dish Americans were comfortable with was chicken chow mein, she published Claudia Roden’s A Book of Middle Eastern Food, Madhur Jaffrey’s An Invitation to Indian Cooking, and Irene Kuo’s The Key to Chinese Cooking. Jones was drawn to each author not merely for their culinary expertise, but also for their ability to write. Kuo “had a gift with language,” Jones recalled. “She talked about ‘velveting’ little pieces of chicken, ‘slippery coating’ them. You could practically eat the words, they were so seductive.”

Jones actively sought out many of her cookbook authors, ignoring celebrity chefs in favor of authentic regional and ethnic cooks like Edna Lewis, the granddaughter of an emancipated slave raised in a town in Tidewater Virginia founded by freedmen. Lewis’s The Taste of Country Cooking was as genuinely vernacular as American jazz, Franklin notes, blending local ingredients like catfish with the French haute cuisine enslaved cooks had prepared for the planter elite. Lewis’s book, which combined recipes with memoirs and history, marked a turning point for Jones, who embarked on a sort of culinary anthropology tour with her husband, traveling across the U.S. and sampling regional foods. Jones published previously unknown American cooks like Bill Neal (Biscuits, Spoonbread, and Sweet Potato Pie) and Himilce Novas (Latin American Cooking Across the U.S.A.), as well as more established names, like Joan Nathan (Jewish Cooking in America) under a new imprint, “Knopf Cooks American.” The series helped popularize the idea of food as culture. The elevation of diverse voices contributed to a celebration of a rich multicultural American identity that today seems self-evident. 

Jones was as responsible for the culinary transformation of 20th-century America as anyone. “Food started getting serious respect largely because of her,” Ruth Reichl, former editor of Gourmet magazine, once commented. “When you talk about the cookbook revolution, she was the revolution.” But while Jones’s legacy rests primarily on her work with cookbooks, her career was equally notable for its extraordinary breadth. She spent more than 50 years at Knopf, the apogee of literary publishing, working well into her 80s, and her impeccable taste made her a legend in her industry. She shared in Knopf’s erudite, prodigious talent pool, working with authors like Updike, Anne Tyler, John Hersey, and Peter Mayle, whose A Year in Provence sold 6 million copies and bridged the gap between literary memoir and food writing. Jones’s true love was poetry, and she edited collections by Sylvia Plath, Langston Hughes, and Sharon Olds.

Part of Jones’s success was her ability to identify large cultural shifts and take advantage of emerging markets. She did so in the world of literature, editing Updike’s Couples, an early literary exploration of sexual freedom published after the “summer of love,” and in the world of cooking, capitalizing on the back-to-the-land movement and vegetarianism, with Kit Foster’s The Organic Gardener and Anna Thomas’s The Vegetarian Epicure.

Jones was a hands-on editor, beloved by her authors. Those who required constant attention—like Updike, who “knew everything that he wanted,” Jones said—received daily calls and letters. The food writer Phyllis Richman praised Jones’s restrained editorial style, saying, “The gentle handling that a soufflé requires is nothing compared to the handling of an author.” With no budget for recipe testing, Jones joined her food writers in their busy kitchens, taking copious notes and sampling unfamiliar ingredients. Jones was “there all the time,” Madhur Jaffrey recalled, and grew to “know the book intimately, just as intimately as I know the book. How rare is that?” 

Jones nurtured collegial, collaborative relationships among her authors, encouraging them to share publicity strategies and contacts, and to host events for one another. She gathered her cookbook authors together in her own kitchen and joined them as they prepared meals. And she distributed copies of their books among her other writers. Updike remarked in one letter that the newly published Anne Tyler was “not merely good, but wickedly good.” 

“The work of editors,” Franklin notes, is “inconspicuous by design.” To read Jones’s authors is to encounter “the invisible hand of an extraordinary editor,” as Julia Child’s own biographer, Laura Shapiro, put it. “Of course you don’t see her. That’s why she was great.” But while invisibility might be a desirable trait in an editor, it can be befuddling for a biographer. Reading The Editor, one can struggle to discern the exact nature of Jones’s contributions to the novels and collections she edited. Franklin endeavors to meet the challenge, drawing on a rich archive of oral histories and decades of personal and professional correspondence. The result is a joyful feast for students of literary history. Still, more serious readers of Updike or Tyler might be stymied in their desire to understand more explicitly how Jones’s invisible hand shaped beloved characters or specific themes.


Franklin links Jones’s editorial invisibility to a second kind of invisibility—her virtual absence from popular recognition. She argues that Jones’s “veiled historical import owes much to the fact that she is a woman; misogyny shaped the arc of her life and career and continues to diminish her legacy today.” Indeed, as Franklin illustrates, Jones’s professional colleagues at Knopf—unlike those in the culinary world, who lauded Jones as a visionary—were slower to recognize her legacy. Throughout her career, she was woefully underpaid and underappreciated, a fact Franklin attributes to her gender. 

Most significantly, Franklin asserts, “Judith’s impact on American culture and literature has been further muted due to the genre with which, to the extent she is known at all, she is associated: cookbooks.” Perhaps no longer. With Franklin’s sterling new biography, Judith Jones finally takes the seat of honor at the head of the table.

The Art of the “Get” https://washingtonmonthly.com/2024/04/03/the-art-of-the-get/ Wed, 03 Apr 2024 09:00:00 +0000 https://washingtonmonthly.com/?p=152306

Barbara Walters hawked dog food, hid in bathrooms, and flirted with Castro to get what she wanted. And because of her, newswomen today have far better options.


In 1976, ABC named Barbara Walters co-anchor of the network’s nightly news, with an unprecedented salary of $1 million. In the golden era of television news, Walters was the first woman to hold the high-profile anchor seat on a major network. Predictably, the announcement set off a cascade of tremors through the industry. The press mocked Walters’s appointment with headlines like “Doll Barbie to Learn Her ABCs” and “Barbara Walters: Million-Dollar Baby?” 

The Rulebreaker: The Life and Times of Barbara Walters by Susan Page Simon & Schuster, 464 pp.

Walters was no baby doll. She was a twice-divorced, 47-year-old single mother who had spent a quarter century building a broadcasting career, including a dozen years at NBC’s top-rated Today show, ultimately serving as its first female cohost. Walters’s charisma and incisive interviews helped increase Today’s viewership and NBC’s profits. Earlier that year, Gallup listed Walters among the 10 most admired women in America. Still, at a time when even the veteran CBS anchorman Walter Cronkite—dubbed “the most trusted man in America”—earned only $400,000 a year, Walters’s million-dollar salary was eye-popping. And controversial. 

Certainly, the media mockery and Walters’s icy reception from male colleagues reflected the casual sexism of the era. But the scorn did not stem merely from doubts about a woman claiming the anchor chair, a vaunted position of public trust. Even in 1976, years before she perfected the hourlong, one-on-one TV interview, before the eponymous news specials and much-criticized intimacy with celebrities, Barbara Walters posed a threat to traditional journalism, thanks to her deliberate blurring of the lines between news and entertainment. Cronkite himself greeted the announcement of Walters’s promotion with a “wave of nausea, the sickening sensation that perhaps we were all going under, that all of our efforts to hold network television news aloof from show business had failed.” The Washington Post opined, “The line between the news business and show business has been erased forever.”

The Rulebreaker, Susan Page’s excellent new biography of Barbara Walters, is a nuanced, deeply researched history of the legendary newswoman, who died in 2022 at the age of 93. The book ticks all the boxes for a compulsively readable celebrity biography, relating Walters’s improbable rise to fame, her tumultuous personal life, and plenty of juicy gossip featuring a veritable who’s who of the past 70 years. But Page, the author of best-selling biographies of former House Speaker Nancy Pelosi and First Lady Barbara Bush, rightly keeps the focus on Walters’s propulsive career, her groundbreaking role as a woman in news media, and her controversial legacy in transforming television journalism.

Nothing in Barbara Walters’s adolescence or early adulthood foreshadowed her future success. An average student at Sarah Lawrence College, she demonstrated no particular interest in current events or journalism. For a decade, she held a series of entry-level jobs in public relations and local television. But after her brief first marriage ended and her father’s fortune evaporated, Walters found herself a divorced woman with no safety net in need of a well-paying career. In 1961, Walters was hired as “the girl writer” on NBC’s Today show, but soon began strategizing to join the broadcast. At the time, there were only token opportunities available for a woman, typically reserved for a former actress or beauty queen. Though Walters later modestly claimed that “it never occurred to me that I would ever have a regular on-air role myself,” her colleagues recalled otherwise. A former production assistant remembered, “Barbara was nagging to be on the air … She was just badgering everyone half to death.” Walters distinguished herself through a willingness to take on all tasks, from hawking a sponsor’s dog food to narrating fashion shows. She became a reporter, first in the field, then on set, eventually rising to cohost. 


The Rulebreaker teems with examples of sexism in the workplace. For years on Today, Walters was relegated to the “girlie” stories—puff pieces—rather than hard news. Today’s host, Frank McGee, prohibited her from speaking to guests until he had asked three uninterrupted questions. Walters circumvented the rules by booking her own interviews with newsmakers offsite.

But Walters is a complicated icon for feminists, and a reminder that a pioneer is not necessarily an instigator for change. Her own comments about women in media sometimes echoed sexist tropes. For example, she told a reporter, “I think that a little of a woman goes a long way on television … For one thing, our voices are different and can easily become tiresome.” In the 1970s, as the women’s movement became pervasive, Walters held herself apart. When her female colleagues at NBC spoke out against discrimination in the workplace, she did not join them. “Barbara was determined to win the game, not change its rules,” Page concludes. “The path she ended up clearing for the women who followed her was, first and foremost, one that she was cutting for herself.” 

At Today, Walters eventually earned the right to cover hard news. In a four-hour interview with Secretary of State Dean Rusk that “helped give her an imprimatur as a serious journalist,” Page writes, Walters pushed Rusk to discuss the public’s critique of the Vietnam War, and his response made headlines and the evening news. But she became best known for her exhilarating interviews with entertainers, and her interviewing portfolio grew to include everyone from Prince Philip to Judy Garland. Her father had been a nightclub impresario, and Barbara spent her adolescence rubbing shoulders with celebrities, alongside gamblers and mobsters. This early exposure gave her an ease among the famous and powerful. 

In network television, the seamless transition from “women’s stories” to hard news was part of the morning show’s appeal. Not so for the evening news. When Walters was named the co-anchor of ABC News, the network’s flagship evening newscast, her critics cast her as solely responsible for threatening the separation of news and entertainment. But the truth is more complicated. As Page illustrates, even before Walters’s arrival, William Sheehan, the president of ABC News, had called for more news stories “dealing with the pop people. The fashionable people. The new fads. Bright ideas. Changing mores and moralities.” Or, as Page notes, “the sort of lifestyle topics that had always been in Barbara’s wheelhouse.” Tellingly, her initial ABC contract was shared between the entertainment and news divisions, a legal acknowledgement of her blended role.

Walters transformed the content of the newscast. For her debut, ABC led with hard news—Walters’s taped interview with Egyptian President Anwar Sadat—followed by her interview with Israeli Prime Minister Golda Meir the next day. But Walters also addressed viewers directly, saying, “I’d like to pause from time to time as we show news items to you and say, ‘Wait a minute. What does this mean to my life and yours?’ ” Today, bringing attention to the everyday impact of seemingly distant events is commonplace in broadcast news, Page notes, but in 1976, it was revolutionary. 

But Walters struggled in her new job—the stiff, sober reading of news reports written by others did not play to her strengths, and on-air tensions with her co-anchor, Harry Reasoner, a self-proclaimed male chauvinist, were so palpable that the news director instructed the cameraman to avoid angles that captured both anchors in a single shot. She lasted just two years in the role, but the changes she introduced to the show remained.

Once released from the constraints of a nightly news broadcast, Walters moved to ABC’s newsmagazine show 20/20 and focused on her specialty, the long-form interview. Blockbuster prime-time interviews, including her annual Oscar night series and her “10 Most Fascinating People” franchise, became Walters’s signature events. She was a ruthless competitor and indefatigable in her pursuit of a subject, throwing pebbles at Sadat’s window at 11 p.m. to capture his attention after a security guard refused to deliver a message. At Camp David, she was once discovered hiding in the women’s bathroom, hoping to grab an exclusive, after the rest of the press pool had boarded the bus home. 

To achieve success, she used all the tools in her toolbox. She wore short skirts, flaunting her “good legs,” and flirted with her subjects, including Cuban President Fidel Castro, with whom she was rumored to have had an affair. In a comment that would make feminists wince, she wrote in her 2008 memoir, “Sex rears its happy little head, and a sought-after male subject chooses you to do the interview in the hope that somewhere along the line, the romantic side—or at least the flirtatious side—will surpass the professional.” At the first-ever joint interview with Sadat and Israeli Prime Minister Menachem Begin, Begin opened by cooing, “Mr. President, don’t you think she’s the prettiest reporter you’ve ever seen?” 

In an era when news organizations made room for—at most—a single token woman, Walters jealously guarded her turf from female colleagues. Her rivalry with Diane Sawyer, a co-anchor at ABC, was legendary. “Diane will stab you in the back,” an ABC veteran recalled, “[but] Barbara will stab you in the front.” Walters resented the women who followed her, who “had it easy,” Page writes. In one telling anecdote, Walters was surprised to see that ABC had provided a room for the Good Morning America co-host Joan Lunden’s toddlers, noting that no one had done that for her when she was a new mother. Walters had built her own career without the benefit of female role models or mentors, and only relatively late in her career did she relish that role for herself, nurturing young women on her staff and emerging journalists. 

Page dissects Walters’s mastery of the television interview, describing how she often built to a climax with a final short, direct question crafted to grab headlines. Walters “raised the art of the ‘get’ to a contact sport,” Page writes. Landing interviews with Castro, Vladimir Putin, and Palestine Liberation Organization leader Yasser Arafat, Walters earned the grudging respect of traditional journalists. In 2011, at age 82, she snagged a coveted sit-down with Syrian President Bashar al-Assad. She posed tough questions and held him to account for wartime atrocities. David Kenner of Foreign Policy wrote, “Everyone who made snarky questions about Walters’ lack of qualifications to conduct this interview should be eating crow (and that includes me).” During the course of her career, Walters interviewed every sitting president and first lady from Richard Nixon through Barack Obama, and moderated two presidential debates.

And yet, she also earned a reputation for asking softball questions, cozying up to celebrities, and making her subjects cry. (Vanity Fair wrote, “Almost single-handedly, Barbara Walters turned TV interviewing into the weepily empathetic kudzu that has swamped broadcast journalism.”) More traditional journalists dismissed her subjects—convicted murderers, crime victims, movie stars, athletes—as too lowbrow, but Walters “saw it as a brag,” Page writes. She had an uncanny ability to tap into the public’s curiosity, serving up the questions everyone wanted answered—asking the singer Ricky Martin if he was gay, Putin whether he had ever ordered anyone killed, and Nixon whether he regretted not burning the White House tapes. 

In 1997, Walters launched The View, “a floating focus group” featuring women of different backgrounds and generations that fluidly combined entertainment news and political commentary. The New York Times later dubbed it “the most important political TV show in America.” Her primetime specials were ratings juggernauts—her 1999 interview with Monica Lewinsky attracted 70 million viewers, at the time a record for any news program.

As Walters cemented her superstardom in the 1990s, the industry was changing. News programming scarcely resembled the days when Cronkite, Chet Huntley, and David Brinkley held 40 million viewers’ daily attention for 15 minutes of carefully scripted, soberly recited hard news. The networks had traditionally run their news divisions at a loss and made their revenue from entertainment programming, but news was becoming an important profit center, sometimes compromising the quality of journalism. Questionably newsworthy true crime programs like Inside Edition and Hard Copy proliferated. 

How much of this change—the transformation of television journalism, the elevation of sensational content, the pandering to prurient interests—can we pin on Walters? Page is reluctant to pass judgment. As the Washington bureau chief of USA Today, a newspaper partly responsible for the death of local newspapers and the quintessential example of dumbing down the news, she is not exactly a disinterested observer. Page concludes The Rulebreaker with a recitation of the myriad ways Walters was a pioneer for women in journalism, and adds her to a short list of luminaries who have shaped television news, including Edward R. Murrow, Mike Wallace, Cronkite, and Roger Ailes. Other observers haven’t been so kind. The media critic Eric Boehlert once wrote that Walters “pioneered the transformation of television news into, literally, parlor gossip.”


In 2014, on what was to be Walters’s final episode of The View, a parade of more than two dozen of the nation’s most famous women in broadcasting assembled to honor her storied career. Surrounded by Jane Pauley, Katie Couric, Diane Sawyer, Lunden, and other luminaries, Oprah Winfrey spoke for them all, saying, “I want to thank you for being a pioneer, and everything that that word means. It means being the first, the first in the room, to knock down the door, to break down the barriers, to pave the road that we all walk on.” 

Walters basked in the praise, momentarily speechless, and pointed to the accomplished women surrounding her. “These are my legacy,” she said. Indeed, it’s a legacy worth celebrating. But in an industry refashioned in her image, it’s far from her only one.

The Lost Mystique of Betty Friedan https://washingtonmonthly.com/2023/10/29/the-lost-mystique-of-betty-friedan/ Mon, 30 Oct 2023 00:05:00 +0000 https://washingtonmonthly.com/?p=149812

Later waves of feminists assailed the pioneering author and activist for focusing on women’s legal and economic rights rather than sexual liberation. Her reputation is due for a revival.


In February 1969, Betty Friedan, president and cofounder of the National Organization for Women and best-selling author of the feminist manifesto The Feminine Mystique, led a protest of 30 women at Manhattan’s storied Plaza Hotel. Since 1907, the Plaza’s elegant wood-paneled Oak Room and adjacent bar had excluded women from its weekday lunch service. Clad in a mink coat, the 48-year-old Friedan addressed the press gathered in the gilded lobby. Drawing parallels to the sit-ins of the civil rights movement, Friedan argued that the Oak Room’s exclusion of women violated state law, asserting, “This is the only kind of discrimination that’s considered moral, or, if you will, a joke.” 

Betty Friedan: Magnificent Disrupter by Rachel Shteir Yale University Press, 384 pp.

Indeed, the media mocked the “phalanx of feminists” and their theatrics. “For a woman to stroll into a men’s bar at lunchtime and demand service seems to me as preposterous as a woman marching into a barbershop and demanding a hot towel and a haircut,” the New York Post chided. Though the small protest, like hundreds of others staged by NOW, was ultimately successful, resulting in the hotel’s reversal of its men-only policy, it became an object of derision within the movement, too. Younger, more radical feminists like the journalist Gloria Steinem “felt that the Oak Room sexgregation action proved yet again that the organization was too white, too middle class,” as Rachel Shteir writes in her new biography, Betty Friedan: Magnificent Disrupter. In 1963, the explosive publication of The Feminine Mystique, Friedan’s siren call for women trapped in the mind-numbing drudgery of housework and the glorification of motherhood, had lit the fuse of the second-wave feminist movement. But just six years after becoming a household name, Friedan was on the verge of being eclipsed by the movement she had created, dismissed by her critics as a relic of a stodgy feminism too narrowly focused on legal and economic equality.

Shteir’s book grapples with the complex legacy of the mother of mid-20th-century feminism, and, by extension, the women’s rights movement of the 1960s and ’70s. This new biography is animated by a desire to restore Friedan’s reputation, which Shteir describes as marred by highly publicized quarrels within the women’s movement, and by disparaging historical treatments. Shteir portrays Friedan as misunderstood, both in her time and today: “Since Friedan’s death [in 2006], the practice of either ridiculing her or making her disappear continues, carrying forward the portrait cemented twenty years ago in the last round of full-length biographies.” In 2020, a biopic about Steinem (The Glorias) and a miniseries about the conservative activist Phyllis Schlafly and the defeat of the Equal Rights Amendment (FX’s Mrs. America) introduced the women’s liberation movement to a new generation of young women. Friedan fares poorly in both cinematic histories, coming across as shrill, out of touch, and self-absorbed.

Shteir’s rehabilitation of her subject rests on Friedan’s undeniable achievements. The Feminine Mystique is regularly listed among the most influential non-fiction books of the 20th century, alongside classics like the conservationist Rachel Carson’s Silent Spring. The first paperback edition sold 1.4 million copies. The futurist Alvin Toffler proclaimed that “it pulled the trigger on history.” It’s difficult to think of a book published in the past 25 years that has had a comparable cultural and political impact. Validating many women’s dissatisfaction with their lives—a phenomenon she dubbed “the problem that has no name”—Friedan wrote, “Each suburban wife struggled with it alone. As she made the beds, shopped for groceries, matched slipcover material, ate peanut butter sandwiches with her children, chauffeured Cub Scouts and Brownies, lay beside her husband at night—she was afraid to ask even of herself the silent question—‘Is this all?’ ” 

The Feminine Mystique launched Friedan’s public career. For the next decade, she was everywhere—in magazine profiles, with Johnny Carson on The Tonight Show, leading marches, speaking at civic organizations, and meeting with elected officials. But critics noted that the book spoke primarily to white, college-educated, suburban women, virtually ignoring Black and working-class women. Others questioned the originality of Friedan’s ideas and deemed the book derivative, particularly of Simone de Beauvoir’s The Second Sex, published more than a decade earlier. These criticisms—too white, too derivative, too middle class—followed Friedan for decades. Feminist theorists like bell hooks derided The Feminine Mystique as “a case study of narcissism, insensitivity, sentimentality, and self-indulgence.” Friedan often cast herself in heroic terms, musing, “The reactions to my book have been most satisfying, even the violence of the attacks … Writing this book seems to have catapulted me into a movement of history.”


While Shteir acknowledges the narrow scope of The Feminine Mystique, she endeavors to rescue Friedan from charges of classism and racism. By 1963, Shteir argues, Friedan had earned her left-wing bona fides. She was quick to join a picket line and had logged two decades of writing for labor publications, publishing critiques of capitalism, conspicuous consumption, and income inequality. After the book’s publication, despite viewing herself as not “an organization woman” but “a writer, a loner,” Friedan parlayed her celebrity into cofounding NOW to confront bread-and-butter issues of legal and workplace inequality, and to lobby for the passage of the Equal Rights Amendment and the expansion of the Civil Rights Act. Shteir writes that Friedan actively recruited Black luminaries like Coretta Scott King and Fannie Lou Hamer onto the boards of her organizations. Friedan drew frequent parallels between the civil rights and women’s movements, taking ideological and tactical inspiration from the former and referring to NOW as “the NAACP for women.” 

NOW was remarkably effective in raising awareness of structural inequalities in every sector of American life, many of which are unimaginable today: prohibitions against unaccompanied women being served liquor at a bar; United Airlines’ men-only “executive flights”; and newspaper classified ads divided by sex. In 1969, Friedan built on NOW’s success by cofounding NARAL (the National Association for the Repeal of Abortion Laws) and, in 1971, the National Women’s Political Caucus, to elevate women’s voices in the political process. 

Despite organizational successes, fissures emerged around substantive ideological disagreements, separating feminists from would-be allies in the labor and civil rights movements. For example, NOW split with unions over the ERA, which some feared would undercut hard-earned protections intended to shield women workers from long hours and dangerous work. And early Black allies like Pauli Murray abandoned NOW, frustrated with its ongoing preoccupation with the ERA at the expense of issues more directly impacting poor Black women. 

By the late 1960s, a clear schism had emerged between centrist feminists like Friedan and a growing women’s liberation movement, which included disparate radical feminist groups—many of them composed of younger, unmarried women—advocating female separatism and sexual freedom. This strain of the movement, shaped by the Black Power movement, the student movement, antiwar protests, and the counterculture, was represented by Steinem, Friedan’s younger and more charismatic rival.

The two camps disagreed on fundamental matters, including attitudes toward the nuclear family—Friedan argued that gender equality was compatible with marriage and motherhood, and she rejected radical feminists’ vilification of men. Friedan shied away from portraying women as victims or members of an oppressed class. Influenced by the counterculture’s celebration of sexual freedom, some feminists drew connections between their own sexuality and feminism, embracing the female orgasm and advocating alternatives to heterosexuality. Friedan quipped that lesbians in the movement constituted a “lavender menace” and feared that they would scare off the middle-class suburban housewives she needed to rally support for the ERA. In response to Kate Millett’s Sexual Politics, which focused on sexual oppression, Friedan griped, “Young women only need a little more experience to understand that the gut issues of this revolution involve employment and education not sexual fantasy.” Even as Friedan enthusiastically led “guerrilla” actions like a 1967 protest in which NOW members threw typewriters and aprons at the White House gates, she eschewed those targeting beauty culture, like the Miss America protest where women tossed bras into a trash can and hanged the pageant host Bert Parks in effigy.  

Differences came to a head in 1968, when Valerie Solanas, the author of SCUM (Society for Cutting Up Men) Manifesto, shot the artist Andy Warhol. Ti-Grace Atkinson and Flo Kennedy, leaders of the NY NOW chapter, rushed to Solanas’s defense, with Kennedy describing her as a hero of Black Power and “one of the most important spokeswomen of the feminist movement.” Friedan was appalled and fired off a telegram: “Desist immediately from linking NOW in any way with Valerie Solanas. Miss Solanas’s motives in Warhol case entirely irrelevant to NOW’s goals of full equality for women in truly equal partnership with men.” When Atkinson, a former Friedan protégée, ran for reelection as president of NY NOW, Friedan rallied the opposition; Atkinson and Kennedy defected to found the Feminists, an egalitarian, radical organization. 

By the 1970s, Friedan was increasingly marginalized within the movement she had birthed. Her attempts to make common cause with other factions could be cringingly tone deaf—she organized a truck bearing watermelon and fried chicken (a “Traveling Watermelon Feast”) in support of Black Congresswoman Shirley Chisholm’s 1972 presidential campaign. But Friedan continued to work on behalf of women’s equality. Shteir notes that in many ways, Friedan was ahead of the culture, writing about the “double shift,” paid maternity leave, universal child care, and the pressure to choose between family and career. In later years, she wrote about menopause, women’s right to love and sexual satisfaction, and aging. But on many issues, Shteir concedes, Friedan was on the wrong side of history. She viewed rape, domestic abuse, sexual liberation, pornography, and abortion as distractions from the fundamental fight for gender equality. 

Could a different, more flexible leader have navigated the transition from the early women’s movement, which emphasized legal and political strategies, to women’s cultural liberation? Perhaps. Shteir blames Friedan’s centrism and incrementalism for accelerating a mass defection of young women from NOW into radical feminism. But it’s hard not to see Friedan’s limitations as those of personality. Obliquely referring to Friedan, Steinem told a reporter, “I know other women with whom I have the same ideological differences with whom I can work.” 

Friedan was, by all accounts, difficult. Shteir’s interviews with Friedan’s former colleagues and family members provide some of the most biting commentary in the biography. Her own brother described her as “a cross I had to bear.” Within the movement, she turned on former allies, maligning them behind their backs. Friedan had a fierce temper, was imperious and demanding, and insisted that she receive proper deference. In a pointed obituary, Germaine Greer noted wryly of Friedan, widely acclaimed as the mother of second-wave feminism, “She thought she was the wave.” 


Shteir shares the catty comments from Friedan’s fellow feminists, and the jaw-droppingly hostile press coverage, which skewered Friedan’s clothing, hairstyle, weight, and facial features. In a Philadelphia Inquirer profile, a sympathetic female reporter offered a backhanded compliment, writing, “[Friedan] is not as grotesque as the press and many photographs would have you believe.” There is more than a tinge of antisemitism to many attacks—her “long nose” and “bulging” eyes—and in a movement filled with Jewish activists, Friedan seemed uniquely targeted. 

The discussion of how a subject is perceived by colleagues and family is fair game in a biography, but there’s something that feels cruel, almost—dare I say?—antifeminist about Shteir’s ample attention to these personal flaws. For a movement that trumpeted that “the personal is political,” Shteir’s repetition of the slurs—even as a form of reporting—feels gratuitous, even as it conveys the hostile climate in which feminism flourished. And one wonders whether all of Friedan’s negative attributes—her bluntness, bossy demeanor, and assertiveness—might have been viewed as virtues in a male counterpart.

Ultimately, Shteir successfully argues that Friedan’s legacy rests on the work itself, rather than on her character, an assessment Friedan herself would have found gratifying. Through her writing, her organizations, and her unrelenting prodding at social norms, Friedan transformed the way women viewed themselves, even as true equality remains unrealized. Shteir concludes, “Friedan was no saint. But she was an oracle and an iconoclast, ahead of her time … She imagined herself under the shadows of history and eternity, acting with remorseless courage.” A fairly magnificent legacy, indeed.

Indian Country https://washingtonmonthly.com/2023/08/27/indian-country/ Sun, 27 Aug 2023 22:10:00 +0000 https://washingtonmonthly.com/?p=148758

A new history shows how the interactions between European colonists and Native peoples helped shape the foundations of American government.


In September 1862, Confederate forces met Union troops in Sharpsburg, Maryland. The ensuing Battle of Antietam would prove to be the bloodiest single day of the Civil War. Within weeks, President Abraham Lincoln issued his preliminary Emancipation Proclamation, decreeing that if the Confederate states did not rejoin the Union by January 1, 1863, enslaved people in the rebellious states would be freed. 

The Rediscovery of America: Native Peoples and the Unmaking of U.S. History by Ned Blackhawk Yale University Press, 616 pp.

Twelve hundred miles away, in the recently admitted state of Minnesota, the U.S. military was engaged in a different kind of war, a campaign of ethnic cleansing to remove Native peoples from their lands. This crusade was fueled by a demand for farmland by white homesteaders. In just 10 years, the population of settlers in Minnesota had soared from under 5,000 to 150,000. 

In a brazen violation of existing treaties, the U.S. government ceased paying promised annuities and permitted homesteaders to squat and graze cattle on tribal lands. Tensions between settlers and Native Americans turned violent, sparking the six-week-long Dakota War, in which 1,000 settlers, Indigenous people, and U.S. soldiers died. General John Pope, commander of the U.S. forces, wrote, “It is my purpose utterly to exterminate the Sioux if I have the power to do so … Destroy everything belonging to them and force them out to the plains.” 

Violence against Native people was justified by racism: “They are to be treated as maniacs or wild beasts, and by no means as people with whom treaties and compromises can be made,”  Pope wrote. Three months later, with the express approval of Lincoln, the military conducted the largest mass execution in U.S. history, hanging 38 Indigenous soldiers in Mankato for their part in the Dakota War, one of more than a hundred campaigns against Native people fought in the West during the Civil War and Reconstruction.

These parallel histories—the first, a climactic event in virtually all Civil War narratives, and the second, a lesser-known story rarely linked to the larger context of the Civil War and its themes of dispossession and freedom—illustrate the argument at the heart of Ned Blackhawk’s The Rediscovery of America, a sweeping, even audacious, retelling of U.S. history centered on the Native American experience. Blackhawk asserts, “It is impossible to understand the United States without understanding its Indigenous history.” 

Blackhawk—a professor of history and American studies at Yale University and a member of the Te-Moak Tribe of Western Shoshone Indians of Nevada—brings decades of academic bona fides to the task of synthesizing the deluge of recent scholarship on Indigenous Americans into a single, comprehensive volume. This achievement alone would make The Rediscovery of America a notable and important book. 

But for those outside academia, Blackhawk’s more interesting accomplishment is not the comprehensiveness of this deeply researched narrative. Rather, this impressive tome offers a bold new framework for understanding U.S. history. 

Since the 1970s, a generation of historians has pushed for the study of “forgotten” Americans. Blackhawk goes beyond a call for inclusion, arguing instead for a whole new paradigm, an “alternate American story that is not trapped in the framework of European discovery and European ‘greatness.’ ” Like The New York Times Magazine’s controversial “1619 Project,” which places African American slavery at the epicenter of the American story, Blackhawk’s book prods readers to rethink our collective historical narrative, but with Native Americans at the hub. This is not just a question of focus, but also one of empowerment, elevating the continent’s Indigenous people from passive victims to actors in a centuries-long struggle over land and sovereignty. 


For the armchair historian, the book offers an exciting, even disorienting narrative. The experience is similar to viewing a world map drawn on the so-called Peters projection, which corrects the size distortions of conventional maps to reflect countries’ true relative areas; the essential facts are the same, but the overall effect is disarmingly different.

The raw, violent outlines of Native American history are generally well known: the brutal warfare of the colonial era, the settler colonialism of the 19th century, the forced assimilation that followed, and the eventual resurgence of cultural pride and identity politics. Blackhawk fills in this familiar framework with lesser-known histories. The result is a new chronology along with a new geographic focus, which shifts attention away from the urban East Coast to the nation’s interior. 

While The Rediscovery of America’s exhaustive narrative covers the full span of history, Blackhawk’s thesis—that “American Indians were central to every century of U.S. historical developments”—is most persuasive in his retelling of the American origin story.

Appropriately, the author devotes the first half of the book to the colonial era, when daily encounters between Indigenous people and the colonists were commonplace, and the dominance of the European empires was not preordained. It’s a complex story that expands well beyond simplistic descriptions of Spanish missionaries, French trappers, and British settlers. Instead, Blackhawk portrays a web of interdependence, with Native people and Europeans deeply entangled with one another through trade and commerce, military encounters, diplomacy, and social affairs. 

These relationships defy simple description—at times, they were amicable and mutually beneficial; at others, antagonistic. Perhaps the most apt word would be opportunistic—both Native tribes and Europeans benefited from trade and the exchange of knowledge. Tribes and colonists forged diplomatic and military alliances to protect shared interests and to confront common foes. Such relationships were fluid, and at times deeply personal—rape and kidnapping occurred along the same continuum as conversion and intermarriage.

Connections between Europeans and Native people shaped daily life, patterns of settlement, and economic development in America, with global consequences. The Spanish, for example, built their American empire around silver mining enabled through Indigenous labor, an enterprise so successful that it transformed European economies by providing a universal currency, facilitating the mercantile and commercial revolutions. 

Blackhawk depicts Indian nations during the colonial period as strategic actors: some pitted colonial powers against one another, while others, like the Iroquois, successfully brokered a balance of power among the European empires in North America that lasted 50 years. 

Blackhawk’s portrayal of the Seven Years’ War (1756–1763), a global event he boldly labels “the principal conflict in American history,” offers an interesting case study in how a Native American reframing can alter our understanding of the past. Most historians emphasize the importance of the Seven Years’ War as a prelude to independence, pointing to Britain’s burdensome war debt, which prompted the taxation of its colonies, a key factor in sparking the Revolutionary War. 

Blackhawk focuses instead on a second major outcome of the war: France’s cession to Britain of its expansive but sparsely settled lands east of the Mississippi River. The addition of this massive, arable territory was a boon for homesteaders ranging from small farmers to plantation owners like George Washington, who were eager to expand and diversify their holdings. More than 100,000 settlers flooded the Ohio River Valley between 1770 and 1790, raiding Indian villages and squatting on Native lands. 

The author argues that decades of instability made this frontier territory a flashpoint for “an elemental struggle” between Native Americans and white settlers. Blackhawk concludes, “This interior world … would determine much of the history of the new Republic. From Indian resistance to the enforcement of new national laws and policies, struggles over interior lands shaped the contours and eventual structures of the new American government.”

Managing chaos on the frontier proved an ongoing headache, first for the British Crown and then for the nascent U.S. government. Indian leaders complained that settlers were “like a plague of locusts.” Eager to expand their states’ borders and influence, governors from Kentucky, Pennsylvania, and South Carolina encouraged settlement and condoned violence, offering $100 bounties for Indian scalps. 

Blackhawk argues that the British government’s inability to control violence in the interior helped shape anti-monarchical attitudes among settlers, contributing to the Revolution. Indeed, the Declaration of Independence listed the Crown’s failure to protect frontier settlers from “merciless Indian Savages” among the colonists’ grievances. 

The ripple effects of the Seven Years’ War and the ensuing instability of the interior continued after independence. Blackhawk asserts that the Articles of Confederation failed in part because the weak national government could not raise an army, staff military forts, or confront Indigenous tribes on the frontier. Lacking a strong national defense, settlers in Kentucky and Ohio called for citizen militias, birthing a political culture of skepticism toward national authority that echoes today. And border states, including Virginia, Pennsylvania, and New York, seized Indian lands for themselves in violation of existing treaties, provoking Native Americans and undercutting the weak national government.

The call for a single, coordinated diplomatic and military response to the Native nations and for a centralized authority to regulate and tax interior lands helped, in turn, to make the case for a strong federal government embodied in the Constitution. And it was with the Indigenous people, Blackhawk argues, that the new republic honed its expertise with the tools of government and diplomacy, signing nearly 400 treaties with Native nations in the years between independence and the Civil War. 

The book’s timeline is, appropriately, heavily weighted toward the colonial era. However, it is difficult to argue that the events of the 20th and 21st centuries conform to Blackhawk’s thesis concerning the centrality of Indigenous history. 

Native Americans are largely absent from post–World War II narratives, despite their ubiquity in mid-century popular culture. That’s no surprise—centuries of violence have pushed the lives (and stories) of Native people into the margins. But Blackhawk makes the provocative point that their absence became a self-fulfilling prophecy, not only shaping our understanding of American history but also “inform[ing] policies toward Native nations aimed to assimilate them into American society.” 

Blackhawk’s final chapter includes a disturbing section on cultural erasure—a pattern of post–World War II policies that included the forcible removal of one-third of Indigenous children into white foster care or adoption, along with housing, job, and education incentives intended to encourage Native peoples to urbanize and abandon reservations. Additionally, he outlines a series of U.S. “termination” policies in the decades following the war, aimed at ending tribal recognition, privatizing tribal lands through claims settlements, and eliminating federal responsibility for Native Americans. Blackhawk makes the compelling argument that these policies should be understood within the context of Cold War ideology and the clash between American individualism and Indigenous traditions of communal governance and land ownership. As one Cold War–era South Dakota congressman fumed, “Socialist Democrats are making much ado about fighting Communists and Communism throughout the world, and yet the same Administration … [is] bringing it right to America and Communizing the Indians just as thoroughly as if they were citizens of Russia.”

The Rediscovery of America is a dense narrative, brimming with unfamiliar histories and big, expansive themes. Among the book’s most interesting contributions is its legal history. For 250 years, the courts have struggled to define what it means to be a nation (or, more accurately, nations) within a nation, and to work out its myriad implications, including land ownership and legal jurisdiction in civil and criminal cases. 

Blackhawk traces the Supreme Court’s narrowing interpretation of Native sovereignty, from the earliest days of the republic, when Indigenous tribes were treated as separate nations accorded legal rights and diplomatic status, to the 1830s, when the Court redefined Indian tribes as “domestic dependent nations” whose relationship to the United States resembled “that of a ward to his guardian.” By the late 19th century, the Court had given Congress power to supersede existing treaties, with full administrative power over tribal lands. Blackhawk places each of these judicial shifts in historical context, explaining how changes in the interpretation of Native nations’ legal status served as pretexts for policies that accommodated corporations’ and white Americans’ rapacious demands for Indian lands. 

Twentieth-century policies encouraging Native American assimilation were based on a contrary assumption—that Indigenous people are a race, not a nation. Race-based policies aimed to weaken or eradicate reservations by providing socioeconomic benefits to Indigenous people individually rather than collectively. Throughout the book, Blackhawk explores the idea of racial identity, contrasting depictions of Native Americans with those of Black Americans and other ethnic groups, and exploring how racial ideologies justified policy shifts.

Today, the question of whether Native Americans are a nation or an ethnic group remains unsettled, in both policy and law. On one hand, there is a general legal consensus supporting a narrow notion of Native sovereignty—for example, tribes have the right to sell water and mineral rights and to build casinos on their reservations. However, it is also implicit that Native American sovereignty is not the same as the sovereignty attached to other countries—in other words, the U.S. government treats the Iroquois differently than it treats France. On the other hand, Native Americans are listed among racial groups on the census, and are at times beneficiaries of race-based policies like affirmative action.

Threads of this centuries-long debate came to the fore this year, in a Supreme Court case challenging a 1978 federal law that gives preference to intra-tribal adoptions of Indigenous children. The plaintiffs, who included a white foster couple from Texas seeking to adopt a Native American child, argued that the law violated the equal protection clause of the Constitution. In June, in a 7–2 decision, the Court upheld the law, affirming the federal government’s right to make laws concerning Native American tribes and to protect child welfare, while acknowledging that it is unusual for Congress to wade into the area of family law. Citing the plaintiffs’ lack of standing, the Court skirted the question of whether the law, with its explicit racial preferences, violated the equal protection clause, but at least one justice—Brett Kavanaugh, who voted with the majority—explicitly welcomed the opportunity to examine the issue in a future case. 

Constitutional scholars and tribal advocates agree that the issue at stake—an affirmation of Native American sovereignty—undergirds a wide array of established legal rights on Indian reservations, including land ownership, water and mineral rights, certain forms of judicial authority, exemption from certain taxes, and gaming rights. If courts were to redefine Native Americans as a racial or ethnic group—like Blacks or Latinos—all such rights would be at risk.


The Court’s majority opinion makes clear that the justices were swayed not merely by questions of constitutionality but also by deference to Native Americans’ troubled history. In a concurring opinion, Justice Neil Gorsuch wrote that the Court’s decision upheld three promises: “the right of Indian parents to raise their families as they please; the right of Indian children to grow in their culture; and the right of Indian communities to resist fading into the twilight of history.” 

The Rediscovery of America is an important, possibly even a landmark, book. Ned Blackhawk persuasively argues that the histories of Native America and the United States are inextricably intertwined and, more controversially, asserts that an entirely new paradigm—complete with new themes, geographies, and chronologies—is necessary to create a balanced, comprehensive American history. The book is academically rigorous and exhaustively researched.

It is, however, an academic book. It is dense, laden with facts and events. Blackhawk is scrupulous in his attention to detail, acknowledging changes over time, distinctions among tribes, geographies, and cultures. To his credit, he has written a nuanced, expansive history. But for the average, non-scholarly reader, it’s a lot to digest.

The history of Indigenous peoples is, at times, appalling and violent. And yet the narrative is often surprisingly clinical. Perhaps because Blackhawk writes in academic prose, or because the book is so packed with facts and timelines, it is less emotionally compelling than one might expect. There’s also a lost opportunity for images to carry some of the weight. The book does include 10 maps and about a dozen photos and illustrations. But more recent images, like photos of the occupation of Alcatraz in 1969 or the politically charged artwork of T. C. Cannon, could have been deeply affecting. 

In comparison with the 200 pages that cover the colonial era, Blackhawk’s treatment of the 20th and 21st centuries feels light. Discussion of the four decades since 1980—an era characterized by the resurgence of cultural identity, the reassertion of tribal sovereignty, and the expansion of economic opportunities through the gaming industry—is particularly cursory, summarized in just five pages. That’s a shame, because recent events offer an opportunity to connect historical threads to current controversies, like the disputed Dakota Access pipeline. 

With The Rediscovery of America, Ned Blackhawk has opened the door to a national conversation. Blackhawk sets the tone with his opening line, asking rhetorically, “How can a nation founded on the homelands of dispossessed Indigenous peoples be the world’s most exemplary democracy?” 

Blackhawk is calling for a revolution in the way American history is conceptualized, studied, and taught. In doing so, he takes aim at old myths about American exceptionalism and the democratic experiment. Scholars have long debunked these narratives, although they still retain some hold in the popular imagination. While academics and educators can argue about whether Blackhawk’s new paradigm should replace existing frameworks for understanding American history, he has succeeded in demonstrating that a deeper knowledge of Native American history should supplement (if not supplant) our understanding of our collective national experience. 

This book arrives at a particular cultural moment, in which certain state officials have meddled with the curricula of high school civics and history classes, pressured the College Board to reconsider the content of its AP African American Studies course, and banned the teaching of critical race theory. In the context of these culture wars, Blackhawk’s book and his insistence on the centrality of the Native American experience are tinder. But it’s a conversation worth having. And long overdue.
