The Great Reordering

There can be no doubt now that an epochal shift is underway in how the economy—in America and across the globe—is governed. The mystery is how a moderate, conventional politician like Joe Biden engineered it.

Reading lists say a lot about a person, or at least about what they care to spend their time thinking about. Ben Harris, who served as chief economic adviser to Joe Biden when Biden was still the vice president, remembers prepping for his first day on the job in 2014. The vice president’s policy staff had sent Harris a large pile of documents designed to get him into Biden’s headspace. It was filled with esoteric papers on corporate governance, financial market short-termism, and labor policy. Still, Harris wanted to know more about the personality traits of his new boss. When he asked his predecessor, Sarah Bianchi, about Biden’s character, Bianchi said, “What can I tell you? This guy is the vice president of the United States, but he still gets up on a ladder and cleans his own gutters.” 

He also stands in picket lines with UAW members. Biden is, of course, an excellent politician, and he’s long been a friend to labor. Still, few people would have expected, when he entered the White House, that his administration would herald the beginning of a sea change in America’s political economy, from trickle down to bottom up, or, as the president’s campaign slogan put it, to a core emphasis on “work, not wealth.” 

The record on that score is unequivocal. His COVID-19 stimulus bailed out people, not banks. His domestic economic policy has been about curbing giant corporations and promoting income growth. His infrastructure bills invested in America in a way not seen since the Eisenhower administration. He has taken commerce back to an earlier era in which it was broadly understood that trade needed to serve domestic interests before those of international markets. 

The contrast with the so-called neoliberal economics of recent decades, in which it was presumed that markets always know best, and particularly the Clintonian idea that “free” trade and globalization were inevitable, could not be starker. With a few notable exceptions (Joseph E. Stiglitz, Jared Bernstein), Bill Clinton’s administration, like Barack Obama’s, was filled with neoliberal technocrats who bought fully into the idea of the inherent efficiency of markets. Although they might have occasionally looked to tweak the system, many of the academic economists running policy basically believed that capital, goods, and people would ultimately end up where it was best and most productive for them to be without the sort of public-sector intervention you’ve seen during the Biden administration. 

In this world, so long as stock prices were going up and consumer prices were going down, all was well. Monetary policy trumped fiscal stimulus. And if the latter had to be used, it should be, in the words of the economist Larry Summers, “timely, targeted, and temporary.” (The Biden stimulus, by contrast, is designed to be broad based and long term.) In this political economy, outsourcing wasn’t a bad thing. China would get freer as it got richer. Americans should aim to be bankers and software engineers, not manufacturers. 

“The conventional wisdom was, ‘We don’t need to make T-shirts here,’” remembers Beth Baltzan, a career trade staffer who has served under several administrations and is now senior adviser to U.S. Trade Representative Katherine Tai, one of the numerous Biden appointees who are taking a fundamentally different economic tack than Democrats of the past. There was, of course, almost no air between this view and the Republican take that it doesn’t matter for national competitiveness whether a country makes “computer chips or potato chips,” as an economic adviser to George H. W. Bush once quipped. 

The pandemic, the war in Ukraine, and the U.S.-China conflict have changed all that, of course. But so has Biden, who has led a kind of stealth revolution, the depth of which has yet to be fully understood by the media, the public, or, indeed, many elites in Washington, D.C. This is perhaps because we haven’t had a true economic paradigm shift in nearly half a century, since the era of Ronald Reagan and Margaret Thatcher overturned the New Deal/Keynesian paradigm that had reigned in the United States and much of the Western world for decades before. As Franklin Foer writes in his recent Biden biography, The Last Politician, “Where the past generation of Democratic presidents was deferential to markets, reluctant to challenge monopoly, indifferent to unions, and generally encouraging of globalization, Biden went in a different direction.” Rather than speaking to Goldman Sachs, Biden spoke to autoworkers. 

While paradigm shifts take years, indeed decades, to play out, there’s no question that one is underway: A massive boom in manufacturing engineered with federal dollars. Aggressive antitrust lawsuits brought against the biggest tech behemoths. (See “Winning the Anti-monopoly Game” by Will Norris.) International agreements on corporate tax evasion, and an even tougher stance on Chinese mercantilism than we saw during the Trump administration. Beyond this, the White House has begun laying out a powerful new post-neoliberal narrative. From Biden’s April 2021 address to Congress announcing the end of trickle-down economics, through to National Security Adviser Jake Sullivan’s April 2023 speech on building back better abroad and the call from USTR Tai last May for a “postcolonial” trade paradigm, a new political economy in America is taking shape. You can call it Bidenomics. You can call it a post-neoliberal world. You can call it “the new economics,” as some progressives who want to separate the changes that are afoot from a single president are inclined to do. But whatever you call it, it’s an epochal shift in how America—and possibly the world—works. 

Whether this shift continues past 2024 is, of course, an unknown. What’s even more mysterious, and worth explaining, is how it is being engineered by perhaps the last national leader you’d expect: the “moderate” and “conventional” Joe Biden.

The Road to a Post-Neoliberal World

For the past four decades, under both Republican and Democratic administrations, America bought wholesale into the idea that the World Was Flat. As long as markets were free, individuals—at home and abroad—would eventually prosper. Some would get rich faster than others, but eventually, wealth would trickle down to the masses. Pulling emerging market nations, most particularly China, into the global trading system would benefit all, as those nations would eventually become more democratic and embrace free trade. 

But there was a chink in the “Washington consensus”: capital always moved faster than people. The global economy as a whole grew at its fastest rate since the 1980s in the period from 2003 to 2007. But inequality within individual countries grew in most places, as capital—meaning multinational companies, financial institutions, and the people who ran them—flew 35,000 feet above the problems of the nation-state. Wall Street pulled away from Main Street, as a group of cosmocratic elites working in global service industries—the sort of people whom the British writer David Goodhart calls “Anywheres”—left the “Somewheres,” meaning those tethered to place, behind. This happened not just in the United States, but in many other developed countries as well. But while markets, particularly financial markets, are global, politics still happens at the level of the nation-state. As voters in many places became convinced that the political economy was no longer being crafted for them, liberal democracy itself was put in jeopardy. 

The seeds of this were already in evidence as far back as 1999, during the “Battle in Seattle” protests of a World Trade Organization meeting where young progressives and environmental groups were beginning to question the costs of free trade and globalization, for both people and planet. Nobel Laureate Joseph E. Stiglitz wrote his first challenge to the economic status quo, Globalization and Its Discontents, a few years later. Naomi Klein was writing about the disproportionate power of global corporations. In 2005, the same year that Thomas Friedman wrote The World Is Flat, the journalist Barry Lynn, who now runs the Open Markets Institute, wrote a very different and, as it turned out, quite prescient take on the global economy, End of the Line: The Rise and Coming Fall of the Global Corporation. In it, he laid out the vulnerabilities of the highly concentrated global supply chains, and the “just in time” efficiency models driven by Wall Street, that would become so obvious during the coronavirus pandemic and the war in Ukraine. 

But most people in government at this point, even on the left, didn’t really have a deep understanding of the complexities of globalization. Nobody yet knew how risk could ricochet from Iceland to Iowa via a highly concentrated financial system dominated by a handful of big Western institutions. Nor did they understand that no amount of cheap stuff made in China would paper over the fact that the “labor share”—meaning the amount of gross domestic product paid out to workers, in wages and benefits—has been declining in the United States and many other developed countries since the 1980s. The fall since 2000 has been particularly precipitous, leading to stagnant pay, growing inequality, and a loss of consumer purchasing power. While the fortunes of the country, companies, and citizens used to rise in tandem, those links have weakened over time, as productivity and pay have diverged. Meanwhile, the costs of key goods and services that determine entry into the middle class—housing, health care, and education—have been rising much faster than the core inflation rate. 

The fall of Lehman Brothers and the great financial crisis in 2008 made it clear that neoliberal economic models didn’t always work as they were supposed to. Banks were bailed out, and homeowners were left holding the bag of exploding debt. Private equity swooped in to buy up foreclosed homes on courthouse steps, and Blackstone eventually became the country’s largest landlord. Main Street suffered while Wall Street enjoyed record growth thanks to a flood of easy money from the Federal Reserve. All of this eventually gave rise to the Occupy movement, although it took years—until 2011, really—for the left to turn that felt experience into a slogan: “We are the 99 percent.” 

That slogan carried the seeds of solidarity and a new political narrative. Meanwhile, there was a new crop of policy makers and academics, like the former Harvard professor Elizabeth Warren, talking about predatory loans and the increasing inability of working Americans to make ends meet. The Institute for New Economic Thinking (INET) was funded by George Soros to challenge neoliberal economics in academia and the media. Rob Johnson, the new institution’s president and a former portfolio manager for Soros’s Quantum Fund, supported a rising generation of thinkers challenging the old economic paradigms. (Johnson was a key source for my 2016 book, Makers and Takers: How Wall Street Destroyed Main Street.) Johnson, Soros, Warren, and a host of other key academics, policy makers, and financial market participants spoke in 2010 at a Roosevelt Institute conference, “Make Markets Be Markets,” which looked at how to bring integrity back to the U.S. financial system after the crisis. 

Meanwhile, one of the biggest drivers of inequality and market dysfunction went largely unrecognized even by most progressive economists: the growing monopolization of industry after industry thanks to the boom in corporate mergers enabled by the gutting of antitrust enforcement that began under Reagan. That started to change in the early 2010s with a raft of investigative exposés, primarily in the Washington Monthly, by journalists including Barry Lynn, Phillip Longman, and Lina Khan, who is now chair of the Federal Trade Commission. Like the “muckrakers” who took on corporate monopolies in the early 20th century, these writers brought to light how the workings of cornered markets in everything from airlines to agriculture to tech to health care were driving down wages and job growth, stifling innovation and entrepreneurship, and widening geographic inequality. 

In 2013, the French academic Thomas Piketty published his book Capital in the Twenty-First Century, using hundreds of years of global tax records to show quantitatively what everyone suspected—the rich were getting richer, and far from trickling down, wealth was actually trickling up. In fact, absent wars or government interventions like the New Deal, the rich would inevitably take a greater share of global wealth, since asset values grew so much faster than incomes. (In the U.S., 89 percent of stock assets are owned by the top 10 percent of the population.) 
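
Piketty’s mechanism reduces to the simple inequality at the heart of his book:

    r > g

where r is the average annual rate of return on capital and g is the growth rate of the economy as a whole. So long as r exceeds g, wealth accumulated in the past compounds faster than wages and output grow, and the owners of capital claim an ever-larger share of total wealth.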

By this point, certain Washington think tanks were starting to spread this new economic gospel. John Podesta and Heather Boushey cofounded the Washington Center for Equitable Growth to study inequality and its effects. In 2015, the Roosevelt Institute, led by Felicia Wong, published “Rewriting the Rules,” a paper by Joe Stiglitz, which essentially laid out all the ways in which the policy status quo resulted in a rigged system. The think tank Demos took on the issue of growing private debt and the way in which it penalized the poor. Barry Lynn launched the Open Markets program at the New America Foundation to study monopoly issues, later spinning it off into the Open Markets Institute. Others, like Michael Wessel, senior trade strategist and counselor to the United Steelworkers and a member of the U.S.-China Economic and Security Review Commission, became more vocal about how the outsourcing of American capital and multinational supply chains to lower-cost countries like China supported autocracy at the cost of U.S. labor.

Slowly but surely, it was becoming more widely understood (at least among the progressive chattering classes) that neoliberal policies, including financial deregulation and trade deals that looked good on paper but didn’t take into account the human cost of joblessness, had created huge pockets of pain in many rich countries, not just the United States. That pain had resulted in voters pulling away from the mainstream of both American parties. Capital did travel freely—the world’s financial assets were six times larger than the real economy as of 2020. Goods were relatively mobile. But most people, and jobs, were not. The problem for policy makers is that people vote. And in 2016, they voted for Donald Trump, who made hay with the Clinton and Obama legacies of free trade deals on the campaign trail. 

Trump would pull the greatest political con job of all time by self-interestedly leveraging a true felt experience among most Americans that there is, in fact, a smoky back room in Washington where powerful people make decisions for their own benefit. Unlike any other politician before him, Trump actually said this One True Thing aloud—embedding a single truth in a welter of lies is the gift of the con man. But then, being who he is, he took a whole host of metaphorical cigars into the Oval Office and stank it up to high heaven. 

A New Narrative

A surface read of Joe Biden’s record in public life would not suggest someone eager to usher in a new economic paradigm. As a senator, his passions were foreign affairs, oversight of the judiciary, and protecting the interests of his home state corporations. He voted for NAFTA and normalizing trade relations with China, and as vice president supported trade deals like the never-completed Trans-Pacific Partnership.

But those who have worked with Biden say he was never fully on board with the arguments for globalization and full trust in markets—nor comfortable with the brainy meritocratic set that made them. One manifestation of this was his long, heartfelt support for organized labor—something the last three Democratic presidents never quite conveyed. Another was a certain skepticism of elite groupthink, a trait that biographers of Biden, from Richard Ben Cramer to Franklin Foer, have tied to his peculiar mix of class resentment and anxiety—his pride in his Scranton roots and state school education, his desire to be taken seriously by Ivy League–educated colleagues, his keen sense that many don’t. That distrust increased after the 2008 financial crisis, when centrists in his own party put limits on fiscal stimulus, which led to extended unemployment and pain for average families during the latter Obama years and, ultimately, to greater support for Trump.

As vice president, Biden never openly broke with the administration’s policies. But he gave several speeches that hinted at his growing concerns, such as one in 2017 at the Century Foundation that focused on building a high-wage America. He even talked about the need to ban things like noncompete clauses, which penalized labor relative to capital. 

Biden wasn’t the only establishment Democrat who was having doubts about the wisdom of the reigning neoliberal order. So were many of his closest advisers, including his long-serving aide Ted Kaufman. Often described as Biden’s closest friend in Washington, Kaufman was appointed to serve out the last two years of his boss’s Senate term when Biden became vice president. He used that time to become an expert on the dangers that megabanks posed to the country’s financial and economic health and to craft legislation with Democratic Senator Sherrod Brown limiting their size. The measure died in 2010, thanks in part to hostility from Obama’s own Wall Street–oriented economic consiglieri.

Another veteran Biden adviser whose views on modern capitalism were changing was Bruce Reed. As head of domestic policy for Bill Clinton and, later, president and CEO of the centrist Democratic Leadership Council, Reed was a driving force in the party’s shift away from traditional liberalism. But after leaving the job of Biden’s chief of staff in the Obama White House, Reed went on to advise a nonprofit that was attempting to protect children from the predatory behavior of the entertainment industry. That group’s successful effort to negotiate a strong privacy law in California in 2018 brought him into contact with social media companies like Facebook, and, according to Politico, he didn’t like what he saw. “Reed had begun to read about the history of the anti-monopoly tradition in America, which stretched back to Thomas Jefferson,” Foer writes in The Last Politician, and “to wonder if the nation had strayed too far from that tradition.” (Reed was also a reader of, and, as Foer notes, an occasional writer for, the Washington Monthly.) 

Jake Sullivan, Biden’s current national security adviser, also experienced a dark night of the soul during these years. Sullivan was the senior policy adviser to Hillary Clinton when she ran for president in 2016. The shock of losing that election launched him on a path toward recognizing how modern capitalism had gone awry and was threatening democracy—a journey he recounted in a 2018 article in, appropriately, Democracy: A Journal of Ideas. He went on to argue, in a 2020 Foreign Policy article coauthored with Jennifer Harris, that America’s place on the world stage depended on getting economic policy right, at home and abroad. Other top Biden advisers, including Ron Klain and John Podesta, went on similar intellectual voyages during the Trump years.

When the 2020 presidential primaries began, Biden staked out relatively moderate positions on most issues compared to his more progressive rivals such as Bernie Sanders. But after securing the nomination that summer, he moved left on some key economic ones. Reporters at the time chalked this up to the need to win Sanders’s endorsement. But while that was true, it was also the case that Biden and his inner circle were surprisingly agreeable to the shift.

One illustrative moment, according to several insiders, was when Biden decided not to use taxes as a lever for issues like climate. “I called Ben Harris that summer [before the election] and asked why we didn’t have a carbon tax proposal,” says Boushey, who was at the time volunteering economic advice to the Biden campaign. “And he said, ‘We just don’t think the markets will deliver what we need here.’ ” “That was a big moment for me,” Boushey says. “It was clear that they understood we’d had 50 years of letting the markets take care of climate, and they hadn’t.” Now a member of the White House Council of Economic Advisers, Boushey was one of many younger, progressive wonks the Biden team recruited to fill key economic policy positions in the government after the election. (Not coincidentally, Biden put Ted Kaufman in charge of the transition.) 

Once in the White House, the new administration lost no time rolling out major policies that defied the old neoliberal order—beginning with the American Rescue Plan, a massive economic stimulus bill. Biden and his top advisers, all veterans of the Obama White House, reasoned that it was better to spend big on pandemic relief to keep the economy and employment afloat, even at the risk of inflation, than to underspend and leave average Americans unemployed, as the government did after the 2008 financial crisis. 

Another momentous break with the old order was Biden’s revival of anti-monopoly policy. The administration put several younger progressive appointees on the front lines of this new battle, including Lina Khan at the Federal Trade Commission, Jonathan Kanter at the Department of Justice, and Tim Wu, a Columbia University law professor who until recently served as Biden’s special assistant for technology and competition policy. More experienced hands, however, were needed to help the president become fully comfortable with the policy shift. 

In The Last Politician, Foer tells the story of Biden, in the summer of 2021, reading through the draft of a sweeping executive order written by Wu with Reed’s support that would direct a dozen federal agencies to rein in anticompetitive corporate practices. A creature of the Senate, Biden didn’t approve of presidents overstepping their designated powers, and, pen in hand, he searched through the text for worrying passages. “But Reed had spent so many years working over Biden’s shoulder that he knew how to shape policy to avoid his peeves and how to calm his anxieties,” Foer writes, “by reminding Biden of the specifics that mattered to him the most.” Reed highlighted a provision outlawing the sort of noncompete clauses that Biden had long condemned. Such specifics “appealed to Biden’s political instincts. That made for a good narrative. Everyday folks could relate to that.” Biden signed the order.

Economic Statecraft

Meanwhile, the pandemic and the war in Ukraine made the vulnerabilities of the old economics impossible to miss. Suddenly average Americans understood that cheap Chinese PPE masks could disappear when Beijing needed them for its own citizens, forcing U.S. apparel companies to try to make home-grown versions again (it turns out it did matter if the U.S. made T-shirts, or at least cotton cloth). The war in Ukraine and trade tensions with China made it clear that it mattered whether countries got their energy from an autocrat, or depended on a strategic adversary for crucial pharmaceutical inputs or the minerals needed for the green transition. The White House plans for reindustrialization and place-based economics fit well in this new world, where resiliency mattered as much as, if not more than, efficiency. 

It was a big shift away from the old system of globalization that had emerged from Bretton Woods, which established the post–World War II framework for global commerce via institutions like the International Monetary Fund, the World Bank, and what eventually became the World Trade Organization. Purpose-built for its time, the system was born out of war-torn Europe and focused more on connecting global capital and business to avoid future conflict than on raising the fortunes of workers in individual nation-states. 

It was also because, as Beth Baltzan wrote in the Washington Monthly in 2019 and 2020, congressional Republicans killed FDR’s and Harry Truman’s plan for an International Trade Organization that would have required that lower tariffs be combined with worker rights, anti-monopoly provisions, and controls on capital flows. In another influential Monthly article earlier this year, Barry Lynn detailed how presidents from Thomas Jefferson through Dwight Eisenhower wielded federal power to ensure that the U.S. military and the American economy would not be vulnerable to industrial chokepoints caused by monopoly suppliers at home and abroad. Some of what the Biden administration is attempting to do on trade can best be understood as a revival of this older American vision of industrial policy.

USTR Tai is attempting to knit together a Bidenomics approach to both competition and trade. In a May speech at the Open Markets Institute, she spoke about chokepoints that needed to be addressed and broken up, regardless of whether they were the result of Chinese mercantilism (in the case of rare-earth minerals), Russian aggression (food crops and fertilizer), or multinational corporate power in areas such as digital trade. She stressed the need to move away from traditional free trade agreements, which “reinforce existing supply chains that are fragile and make us vulnerable. This does not make sense at a moment in history when we are trying to diversify and make them more resilient.” 

Perhaps most provocatively, she said the Biden administration wanted to “turn the colonial mindset on its head”—by partnering with emerging markets and allies as part of what Treasury Secretary Janet Yellen has called “friend-shoring.” Rather than allowing big companies to put jobs and investments wherever it was cheapest for them, Tai wanted to put a floor, rather than a ceiling, on labor and environmental standards while building new supply chains. “The key is to offer economies a spot in vertical integration so that developing countries are not perpetually trapped in an exploitative cycle,” she said. Of course, the devil will be in the details, and Tai’s speech was short on those. Still, paradigm shifts begin with narrative shifts. And the USTR’s intervention was the latest proof that the thinking around free trade is changing profoundly.

This new thinking is a clear rejection of the approach from the 1990s and 2000s, in which the United States tried to negotiate broad trade agreements by breaking down regulatory barriers to advance the interests of large American corporations. It is also a rejection of the Trump approach, which was to unilaterally bludgeon other countries—allies in Europe as well as adversaries like China—with tariffs until they bought more American goods (which they did not). Instead, the Biden administration, while keeping some of the Trump-era tariffs as leverage, is trying to negotiate a series of multilateral agreements in discrete areas—such as supply chains, labor and environmental standards, antitrust, privacy, data security, technology transfer, and so on—while sidestepping the more contentious issues, like the restrictions Europe imposes on agricultural imports to protect its small farms. 

The administration has already racked up a few successes, including a U.S.–European Union agreement to measure the carbon content of steel and aluminum and an OECD-brokered agreement on a minimum 15 percent tax on global corporations. The aim is not just economic; it’s also geostrategic. It is to raise middle- and working-class wages in countries that sign on to the agreements—the better to undermine the rise of illiberal politics—while challenging the economic predations of Russia and China. Under the new steel and aluminum agreement, for instance, China will not be able to sell into the American and European markets until it moves away from forging its aluminum and steel with polluting (but cheaper) energy sources like coal.

Many conventional economists are trying to pretend nothing has shifted in the past 20 years, and that driving down prices should still be the ultimate goal for society. Trickle-down economics has simplicity, if not truth, going for it. “The objective has to be buying as cheaply as possible,” said former Treasury Secretary Larry Summers, in reference to procurement for the Biden administration’s $1.2 trillion infrastructure program. It’s an alluring argument in a more inflationary era. Still, most people seem to understand that cheap isn’t cheap if you tally up the true cost of labor and carbon. Ever-cheaper goods have raised wages in some parts of Asia, and created incredible profits for big companies, but they haven’t led to a healthier and more sustainable form of market capitalism. Liberal democracy hasn’t fared well, either. 

The new world is, admittedly, messier, and it will come with some downsides in the short term. Take inflation, for example. There’s no doubt that products made by big companies using job-replacing technology or labor from autocratic states that suppress wages are cheaper. Moving from a globalized, concentrated economy to one in which production and consumption are more tightly geographically connected, and in which stakeholders, not just shareholders, have a voice, may come with some short- to mid-term inflationary pressures. But the costs of the old paradigm—environmental degradation, labor abuses, rising inequality, and toxic politics—were high, too. 

As the Inflation Reduction Act (IRA) rolls out, Bidenomics will have to address tough questions: What is the right balance between, say, foreign and domestic concerns when we’re thinking about trade policy? Is it in the national interest to push for more domestically produced solar panels that will raise prices in the short term, even if doing so brings back the industrial commons in the long term? Or is it better to use cheap equipment from China as quickly as possible? Does this even make sense if you factor in whether those Chinese panels were made with coal-powered electricity? What are the new metrics for measuring inclusive, sustainable growth? How does one measure the political risk inherent in far-flung globalized supply chains that run through politically vulnerable countries? And what should be done at home to build a stronger, better workforce? How might better education and competition policy mitigate the downsides of our new era? 

April 2023 marked the 10th anniversary of the Rana Plaza factory collapse in Bangladesh, in which 1,100 garment workers were killed when the shoddily constructed building came down on top of them. It turned out that the factory was making goods for major global brands. The company managers who made the decision to outsource to unknown individuals way down the production line were just doing what Finance 101 would tell them to do: Move expense off the balance sheet, and treat labor like a cost, not an asset. Never mind the risks hidden in plain sight, even those that result in death and despair. 

That kind of thinking has dominated the global economy for decades: Let capital, goods, and labor move where they will, even if that results in human suffering and the degradation of the planet. Chinese labor camps in Xinjiang are perhaps the apex of this sort of thinking. How can any country, or company, compete with state-subsidized operations with few environmental safeguards that are accused of forcing slaves to dig for silica, which is then used in solar panels, electronics, and other types of goods dumped into the world at below-market rates? Answer: You can’t, unless you change the economic rules of the game. 

In what may be the most impactful foreign policy speech of the Biden era, U.S. National Security Adviser Jake Sullivan laid out the beginnings of a new rule book at the Brookings Institution in April 2023, connecting American domestic plans with foreign policy. He made it clear that the old “Washington consensus” was over—in part because it had not been able to manage the challenges of a more vulnerable financial system, fragile supply chains, and working-class job losses (with the subsequent blows to democracy). 

Embedded in the old system, as Sullivan put it, was an assumption “that the type of growth did not matter. All growth was good growth. So, various reforms combined and came together to privilege some sectors of the economy, such as finance, while other essential sectors, like semiconductors and infrastructure, atrophied. Our industrial capacity—which is crucial to any country’s ability to continue to innovate—took a real hit.” 

Sullivan’s speech was an attempt to reassure allies that the new economics isn’t about “America alone,” or even primarily about containing China (in truth, the very notion that any nation could contain China is a fiction). Rather, it’s about working with allies—which are being more broadly defined to include parts of the Global South—to create a system built on the assumption that power exists and can’t be economically modeled, and that not all growth is the same. “Our objective isn’t autarky,” Sullivan said in his speech. “It’s resilience and security in our supply chains.” 

The challenges to that are, of course, immense, and not everyone within the Biden administration is on the same page. The Commerce and Treasury Departments, as well as some in the White House, have been more eager to soft-shoe it with China, particularly at a time of rising inflation. The administration is under pressure to settle for less strict labor and climate provisions in negotiations currently underway to create new strategic alliances in Asia and the rest of the Global South. Some worry about the inflation and market effects of decoupling too quickly from China. Ambitious plans like the global minimum tax have yet to be enforced. Priorities like child and senior care, which were carved out of the first stimulus packages, risk going by the wayside. The president is walking a fine line in supporting union workers in Detroit while much of the IRA and CHIPS stimulus is pouring into right-to-work red states. It’s unclear whether Europe—in particular Germany, which depends on exports to China—will come to a shared view about the new world. Will allies support shared purchasing agreements around critical minerals, even if they violate WTO rules? Will Washington and Brussels—not to mention India, South Africa, Malaysia, and other such nations—be able to agree on new trade rules of the road? The U.S. can’t go it alone in a post-neoliberal world. It will take all of Biden’s political acumen to balance the needs of his two favorite interest groups: workers and allies. 

But despite all this, it is impossible to ignore that this administration has already marked a sea change in economic history. Not only is there increasing overlap between the progressive left and parts of the right on issues like industrial strategy and trade, but it’s also impossible to imagine anyone running for president and winning with a neoliberal message. Americans used to believe that the rules of capitalism were handed down on stone tablets. This White House has made it clear that they can be rewritten. As the president put it when he signed his landmark July 2021 executive order on fighting monopoly power, “Capitalism without competition is exploitation.” When a guy like Joe Biden is using language like that, you know there are big changes underway.

Winning the Anti-monopoly Game

Despite press accounts to the contrary, the Biden administration’s revival of antitrust policy isn’t failing. It’s just getting started.

In the early summer of 2023, a consensus about President Joe Biden’s revival of antitrust enforcement took hold in the media: It had flopped. After the Federal Trade Commission, led by Lina Khan, failed in its attempts to stop two high-profile mergers—Meta’s acquisition of the virtual reality firm Within and Microsoft’s acquisition of the video game company Activision Blizzard—the obituaries poured in. “Joe Biden’s trustbusters have fallen short of their ambitions,” The Economist declared. “The defeats raise questions about Ms. Khan’s ability to carry out her ambitious goal of reversing decades of weak antitrust enforcement, as political pressure mounts and patience wanes,” The New York Times wrote. The Wall Street Journal has published an attack on Khan approximately once every 11 days. 

But if the boos from the peanut gallery fazed the Biden administration’s trustbusters, they haven’t shown it. In September, the Department of Justice went to trial against Google over its deals with smartphone companies to crowd out search engine competitors; another suit against Google, over its dominance of online advertising, is set to begin next spring. Later in September, Khan announced a major lawsuit against Amazon, charging the behemoth with engaging in “unfair methods of competition” such as forbidding merchants on its sites from offering lower prices on other sites. 

The audacity of the suit, and Khan’s abiding criticism of the company’s behavior—she wrote a now-famous treatise on Amazon’s stranglehold on the American economy as a law student in 2017—won her praise from some knowledgeable observers. “By sheer force of intellect,” wrote the New York Times tech reporter David Streitfeld, who has covered Amazon for decades, “she is opening up a conversation about how companies are allowed to behave.” But other media outlets covered the cases in the style of political campaign reporting, in which momentary wins and losses are treated as immensely important. “The spate of high-profile losses has amped up pressure on the FTC to bring a successful case against Amazon,” Politico speculated, as if recent “pressure” is what’s driving a case that Khan has been plotting for years. 

Hot takes like these reflect a spoon-fed PR narrative from the tech platforms in the government’s cross hairs. But a quick perusal of history books, or even a glance at Wikipedia, should dispel the notion that a few adverse rulings mean curtains for the revival of antitrust enforcement. The first efforts to check corporate “trusts” in the 1890s also faced major reversals in court, but in little more than a decade the federal government had broken up some of that era’s biggest monopolies, including Standard Oil. And over the subsequent decades, Washington would put together a robust regulatory regime that maintained a competitive economy until the 1980s, when it was dismantled by the Reagan administration. 

Yet even on the myopic terms of debate set by the media, the story that Biden’s revival of anti-monopoly policy is failing is not supported by the facts. “That’s a complete bullshit narrative,” Matt Stoller, the author of Goliath: The 100-Year War Between Monopoly Power and Democracy, told me. “They’ve actually been really successful at deterring mergers, winning cases, and changing some aspects of the law.” Biden’s team has had a slew of important victories in court. They include a landmark win blocking the merger of Simon & Schuster and Penguin Random House, as well as a series of proposed mergers in health care, energy, and tech that were abandoned under the threat of litigation. This has had a demonstrable deterrent effect: So far in 2023, the total value of successful mergers is down 40 percent compared to the averages over the past five years.

But more than just litigating antitrust cases, Biden’s administration has reoriented the entire government toward making the economy fairer and more competitive. Biden’s policy program is designed to check monopoly and restructure competition in labor and other markets to public purposes. His administration has made stamping out anti-worker and anti-consumer practices in the business world a government-wide imperative, directing agencies to use their full regulatory powers and issue new rules. Results include a crackdown on junk fees, a breakup of the private hearing aid cartel, and new regulations on broadband and rail companies, to name a few early victories. 

While the media and many political commentators remain largely oblivious to the larger vision behind these changes, Biden has been explicit about how competition policy unites his administration’s directives across areas that have long been treated as distinct policy silos, including trade, national security, antitrust, labor law, industrial policy, and public investment in infrastructure, as Rana Foroohar explains elsewhere in this issue (see “The Great Reordering”). And while Biden’s revival of antitrust enforcement and competition policy is a sharp break with the past few decades, there’s reason to think it will endure for years to come, regardless of what happens next November.

Biden’s vision is deeply informed by the largely forgotten history of how America once used a broad range of public policies and institutions to contain corporate monopolies and channel competition to productive, equitable ends. This history includes the 1887 Interstate Commerce Act and subsequent amendments, which tamed the power of railroad barons and ensured that different shippers, towns, cities, and regions enjoyed equal access to the dominant networked industry of the era. It includes the Sherman Act of 1890 and subsequent amendments that would eventually come to contain the monopoly power of trusts controlled by colluding Wall Street banks and financiers like Jay Gould and J. P. Morgan. It includes Progressive Era institutions like the Federal Trade Commission, armed with the statutory power to police unfair business practices and combinations wherever they occur. And it includes state and federal laws like the Robinson-Patman Act of 1936 that restricted giant retailers like Woolworth’s and the A&P grocery chain from abusing their market power over suppliers and customers.

By the 1950s and ’60s, this broad competition policy regime had led to market structures that were well balanced compared to today and consistent with both innovation and the growth of a broad middle class. In sectors of the economy like chemicals or auto manufacturing, where there were large economies of scale and deep capital needs, giant corporations like DuPont or General Motors were allowed, but they were bound by labor laws that effectively forced them to share their profits with their workers, and by codes of corporate governance that made them responsible to stakeholders beyond just their stockholders. In other realms, corporate concentration was allowed only to the point that it was necessary to accommodate progress. Farms got bigger and more mechanized, and so did food processors, but they were not allowed to become concentrated agribusinesses like today’s industrial-scale confined animal feeding operations, monopolized meat-packers, or international fertilizer and “biotech” cartels. Modern supermarkets replaced many local butchers or bakers, but chain stores were prohibited from approaching anything like the market dominance of today’s Walmart, let alone Amazon, and in most American towns and cities, Main Street merchants still had a chance. 

All this changed during and after the 1980s. Ronald Reagan effectively ended antitrust enforcement, except in cases of proven collusion and egregious monopoly pricing. Restraint on price discrimination by retailers also went by the wayside. This set off a merger and acquisition boom that, when combined with broad deregulation of financial institutions, gave Wall Street financiers increasing dominance over the whole economy and led to the loss of millions of middle-class jobs. Meanwhile, free trade policies and lax antitrust enforcement embraced by both Republican and Democratic administrations eroded much of the country’s industrial production and led to dangerous dependencies on foreign-made computer chips, pharmaceuticals, and key minerals. 

Most of this sea change in policy occurred not by repealing the laws that had long channeled and balanced market competition in America, but by policy makers in both parties just failing to enforce them. The result was the growth of corporations of unprecedented size and power in every sector—from media and communications to retail, banking, health care, energy, and food production—that hollowed out local communities and vastly increased racial, regional, generational, and other forms of inequality. 

For years, the connection between these baleful economic trends and growing market concentration went largely unrecognized by leading policy makers and economists. But beginning in the mid-2000s and increasingly in the early 2010s, writers and thinkers such as Barry Lynn, Phillip Longman, and Lina Khan (who worked together at the think tank New America) began making these causal links in a series of major exposés, mostly in the Washington Monthly and Harper’s. The national press and established politicians in both parties were slow to take note. Even those who faulted the American economy on other grounds typically still believed that it was marked by entrepreneurial dynamism and robust competition. 

After the Great Recession, the rise of the Tea Party on the right and Occupy Wall Street on the left revealed the country’s disillusionment with the deeply unequal and precarious economy that consolidation had created. Further evidence came in 2015, with the ascendant presidential candidacies of Donald Trump and Bernie Sanders, who channeled the country’s fury at big banks and billionaires. None of this populist activity focused much on monopoly power, but its effect was to finally make some leaders in Washington start listening to antitrust reformers. In early 2016, Senator Elizabeth Warren met with Lynn, Khan, Jonathan Kanter, another strong anti-monopoly advocate, and Ted Downey, the executive editor of The Capitol Forum, which reports on antitrust issues. A few months later, Warren delivered a speech in which she warned that “concentration threatens our markets, threatens our economy, and threatens our democracy.” That fall, Hillary Clinton gave a speech on the need for greater antitrust enforcement, the first major presidential candidate to do so in decades.

It was a breakthrough moment for the Democrats. During Trump’s term in office, lawmakers like Warren, Sanders, and Amy Klobuchar began talking regularly about the dangers of monopolies. In June 2019, the House Judiciary antitrust subcommittee opened an investigation led by Democrat David Cicilline into Amazon, Apple, Facebook, and Google, and Cicilline recruited Khan as counsel for the committee. Summoning the CEOs of those companies for testimony, Cicilline framed them as modern-day robber barons. The investigation, he said, “goes to the heart of whether we as a people govern ourselves, or whether we let ourselves be governed by private monopolies.” Ahead of the 2020 elections, regulating Big Tech became a major issue in the Democratic primary. 

For decades, Biden went along with his party’s general retreat from antitrust enforcement and tolerance of growing corporate concentration. But by 2020, his party’s neoliberal consensus was cracking under the country’s obvious disaffection with the economic status quo. When he won the nomination, he spoke of the need for deep structural change—“an FDR-sized presidency,” as he put it. 

Under the guidance of senior advisers Ron Klain and Bruce Reed, he appointed Tim Wu, a Columbia Law School professor and the author of The Curse of Bigness: Antitrust in the New Gilded Age, to a newly created White House economic advisory position. Wu helped impress on him the importance of antitrust enforcement. “The president really liked the idea of basically doing what FDR had done,” Wu told me. “What did FDR do? What FDR did was reinvigorate antitrust.” 

Biden chose Khan to lead the FTC, at 32 the youngest chair in its history, and Kanter to head the antitrust division of the DOJ. Both are members of the “New Brandeis” school of antitrust theory, which, in the tradition of the Progressive Era jurist and reformer Louis Brandeis, holds that consolidation threatens the economic and social conditions of democracy and which advocates for the full use of the antitrust regulatory powers of the government. 

Wu, Klain, and Reed, together with other senior staffers like the economic adviser Brian Deese, National Security Adviser Jake Sullivan, and U.S. Trade Representative Katherine Tai—some veterans of the Clinton and Obama administrations, others newcomers—devised a policy agenda that exchanged the free market austerity of Reaganomics for new regulations and investments in climate and infrastructure. The group shared an expansive view of competition policy that went beyond antitrust enforcement to include “every law and policy that promotes the distribution of power, of opportunity, of risk,” as Lynn told me. The Biden staff’s conceptual embrace of competition policy, he said, “is far better than anything we could have ever imagined.” 

After years of fruitless advocacy, Wu told me, “all of a sudden, things start to move very quickly” with Biden’s senior staff in place. While Congress began to move fitfully on new antitrust legislation, Biden used his own authority to reorient the federal agencies toward protecting workers and promoting competition, as the government did in the New Deal era, with a “whole-of-government” executive order, issued on July 9, 2021, that was principally authored by Wu. The order called on 17 different government agencies to take a laundry list of actions to address “some of the most pressing competition problems” in the economy. “Capitalism without competition isn’t capitalism; it’s exploitation,” Biden declared in his announcement. “Over time, we’ve lost the fundamental American idea that true capitalism depends on fair and open competition.” 

More than two years later, here’s what Biden’s anti-monopoly push has accomplished. 

The most important dimension of Biden’s whole-of-government antitrust agenda is an aggressive approach to litigating cases. Last October, the DOJ secured the most important antitrust victory of the Biden era when it successfully blocked the merger of Simon & Schuster and Penguin Random House, which would have cut the number of major publishers in the U.S. from five to four. The “theory of harm” prosecutors presented was novel for the modern era. The DOJ did not allege harm to consumers—i.e., rely on the “consumer welfare” standard—and instead focused its arguments on how authors suffer from consolidation in publishing. By winning the case on this basis, the government set an important new precedent for future litigation: Antitrust cases can be argued, and won, on the basis of harms to independent contractors and businesses, like Uber drivers and merchants selling on Amazon. Another key victory came in May 2023, when the DOJ successfully sued to stop an anticompetitive regional partnership between JetBlue and American Airlines. “I think we can call airline consolidation dead for the moment,” Matt Stoller wrote in his Substack, BIG. Kanter also has pending investigations into Visa, Ticketmaster, and Apple. 

And it’s not just merger cases. The department announced its intention to more aggressively enforce the Clayton Act’s prohibition on directors serving simultaneously on the boards of competitors, leading to the resignation of board directors in several industries. “The DOJ making this a priority is a big step and consistent with the broader theme of reviving dormant legal powers,” Sandeep Vaheesan of the Open Markets Institute (OMI) said. During Obama’s first term, the USDA unsuccessfully tried to challenge the poultry industry’s abusive “tournament system,” which pits chicken farmers against each other to compete for bids and undercuts their bargaining power. But last summer, a DOJ lawsuit against the food production conglomerate Cargill, several competitors, and a data consulting firm alleging years of collusion and wage suppression resulted in an $85 million settlement, effectively ending the practice across much of the industry.

Kanter has also initiated a half-dozen criminal investigations against no-poach and wage-fixing agreements, which illegally block workers from changing jobs in their industry. In October 2022, the DOJ won the first-ever conviction under antitrust law for employer collusion when the staffing firm VDA pleaded guilty to a conspiracy with another firm to refrain from recruiting or hiring each other’s nurses. Several other cases have ended in acquittals, but as with lawsuits against mergers, the message is being heard in the business world: Antitrust enforcers are taking labor violations seriously again. 

The FTC likewise has filed far more ambitious antitrust lawsuits than it did under previous administrations. Khan has said that to win, the government must be willing to lose, eschewing the extreme caution of her predecessors. But there’s been less losing than media coverage would lead you to believe. Khan’s two oft-referenced defeats in court—the failed lawsuits against Microsoft’s acquisition of Activision and Meta’s acquisition of Within—are in fact the FTC’s only clear losses. The agency has secured important wins in court, including a recent ruling against Intuit for falsely advertising its signature product, TurboTax, as free. The FTC has also extracted settlements that prevent abusive business practices, such as one with the health information technology company Surescripts that prohibits the firm from excluding competitors from e-prescribing markets. And many cases have concluded in companies dropping a merger after being sued by the FTC. Under the threat of litigation, the computer chip maker Nvidia called off its $40 billion acquisition of the chip design firm Arm; Lockheed Martin dropped its $4.4 billion purchase of the engine maker Aerojet Rocketdyne; and Berkshire Hathaway called off its $1.7 billion purchase of a pipeline in Utah, to name a few. 

The credible threat of prosecution has proven to have a deterrent effect on mergers and acquisitions across the economy. Though other factors may be at work as well in driving down mergers, merger filings fell by roughly 40 percent in the year after Biden put his new competition policy team in place. In 2023, the total value of successful mergers is also down 40 percent. The business world is feeling the heat: In recent months, the supermarkets Kroger and Albertsons have agreed to sell more than 400 stores to try to avoid a merger challenge.

What’s more, antitrust advocates hailed the unsuccessful challenge to Meta’s purchase of Within as a sneaky victory for changing case law. The FTC argued that a monopolist’s acquisition of a nascent company in the market, rather than a mature competitor, can violate antitrust law. The FTC also maintained that a company like Meta can even hurt competition in an industry, like VR fitness apps, in which it’s not yet operating. In his decision, the judge affirmed both of the FTC’s arguments as valid in principle—the first time a court has accepted such arguments since the 1980s. This, like the Penguin Random House case, sets a precedent for future litigation.

Reestablishing a broader interpretation of antitrust this way is an essential part of Khan and Kanter’s strategy. “One of the challenges that we face is that antitrust law has, in certain areas, calcified because cases haven’t been brought in new contexts,” Khan told me. Pursuing only classic “rivals buying rivals” cases would do little to expand what’s possible through litigation in an increasingly complex economy, where anticompetitive behavior can take many forms. 

Breaking the judiciary’s narrow reliance on the consumer welfare standard will be especially important for tech regulation. Effects on consumer prices alone are a poor measure of the monopoly power of companies like Google and Facebook, which offer free products to consumers even as they charge monopolistic prices to the businesses that rely on their platforms. Case in point: In its lawsuit against Google’s dominance over online advertising—the heart of Google’s business—the DOJ alleges that the company’s anti-competitive behavior lowers ad revenues for websites and publishers and hikes ad costs for marketers. And in its lawsuit against Amazon, the FTC points specifically to the goliath’s monopolistic abuse of online stores doing business on its platform. 

This past July, the DOJ and the FTC published draft guidelines that laid out the more expansive standards by which they’re prosecuting anticompetitive behavior. The merger guidelines are a sort of open letter to judges, and while they have no power to compel judges to rule a certain way, courts have given such guidelines significant deference under every presidency in which they’ve been issued. As major cases loom, the new guidelines may help prosecutors turn the tide. 

Lawsuits by the FTC and the DOJ are only the most high-profile weapon in the Biden administration’s war against concentrated economic power. Less prominent but equally potent are the writing and enforcement of regulations based on statutory powers that previous administrations have neglected. For instance, when Trump took office, enforcement actions by the Consumer Financial Protection Bureau fell by 75 percent, but the bureau has been revitalized under the Biden appointee Rohit Chopra. The CFPB has gone after companies for price gouging in captive markets like prison financial services, is working on rules that would make it easier for consumers to change banks, and is cracking down on “junk fees”—deceptive charges layered onto services—in banking, part of an administration-wide fight against such fees. 

New pro-competition, pro-consumer rules are popping up across the government. The Food and Drug Administration finalized a rule to foster greater competition within the hearing aid industry, which is dominated by four companies that together control 85 percent of U.S. sales. With a new Democratic majority, the Federal Communications Commission is preparing to reinstate net neutrality rules to stop broadband providers like AT&T, Comcast, and Verizon from speeding up connections to favored websites and slowing service to others. And the Surface Transportation Board is developing rules that would inject more competition into railroading, one of the most consolidated industries in America, by giving shippers currently served by only one railroad more options for routing their freight on other lines. 

The most important rule changes have been at the FTC. Under Section 5 of the Federal Trade Commission Act, the agency has broad authority to make new rules to stop “unfair methods of competition,” but previous administrations failed to use that power. That’s changed under Biden. In January, the FTC proposed a ban on noncompete agreements, which rob more than a quarter of private-sector workers of the basic right to freely switch jobs within the same industry. Under this same authority, the agency is developing rules to crack down on personal data collection and has voted to boost “right to repair” enforcement, prompting Microsoft and Apple to change their policies to allow consumers to repair their own electronic equipment. 

Is an earth-shattering win against a tech giant possible in the near term? It’s hard to say. But if major breakups like that of Standard Oil in 1911 are what antitrust is best known for, enforcement in the New Deal era was built on the humbler work of reaching favorable settlements and deterring mergers through the threat of litigation. Bringing corporate power to heel has always been achieved through “1,000 nibbles,” as Barry Lynn told The American Prospect. The Google search case the DOJ is currently litigating, for example, is simply “one of those nibbles,” he said.

In 1964, the historian Richard Hofstadter famously observed that “the antitrust enterprise, as an institutional reality, now runs its quiet course without much public attention.” In other words, as enforcement became routinized, the specter of litigation was enough to keep corporations in line. In the 1960s and ’70s, about 70 percent of antitrust lawsuits concluded in a court-ordered settlement.

Such agreements were the backbone of midcentury antitrust enforcement. As Lynn wrote in the Monthly, Thurman Arnold, the architect of America’s 20th-century antitrust regime, established the “government’s general approach” to enforcement, which “was to start by bringing an antitrust suit against a firm that had captured undue control of some sector of the economy. It would then accept a settlement (in the form of a consent decree)” and extract meaningful concessions, such as requiring the company to share patents with competitors for free. 

The history of antitrust is only in small part the history of winning big cases. “People think antitrust is very effective when the government is bringing and winning big cases, and my view of that is it’s wrong,” OMI’s Vaheesan said. “When agencies are bringing cases, they’re also deterring a lot of bad conduct from being pursued in the first place, certain mergers aren’t proposed, and certain competitive practices aren’t being used. I think those are the defining features of the successful postwar antitrust system.” 

This deterrence effect has important implications for the cases against Google, Amazon, and others that Biden has brought, even if his administration isn’t around to see them through. In 1998, the Clinton administration sued Microsoft over its attempts to monopolize the web browser market, the last major antitrust action against a tech giant before the Biden era. A trial court ordered the company to be split in two, but an appeals court overturned the breakup order, and the George W. Bush administration settled the case rather than press for a breakup. Still, the company was chastened enough to allow competitors—including Google—to emerge and thrive in Silicon Valley. Even if a future administration lets Biden’s major cases fizzle, this wave of lawsuits will likely induce lingering caution.

There’s reason to hope that the revival of anti-monopoly policy will not end after Biden leaves office, even if he’s replaced by a Republican. After all, the Trump administration also made some halting moves toward restoring antitrust enforcement, most notably bringing the Google search suit in the waning months of Trump’s presidency before Biden’s team took the case over. Republican Senator Josh Hawley, who voted for Lina Khan’s confirmation in 2021, recently introduced legislation to break up meat-packing and poultry monopolies. In the September Republican debate, Ron DeSantis called Meta and Google “monopolies.” The writing is on the wall: Slamming corporate power is good politics. “There are such things as ideological and intellectual trends,” Wu told me. “A return to antitrust is one of them.” 

That’s not to say that this new antitrust trend is fated to triumph. The sway of corporate power in both parties remains formidable. The federal bench is rife with judges who’ve spent their careers waving through mergers and who can be counted on to be skeptical of cases brought by federal trustbusters. It’s foolish to discount the difficulty of bringing to heel the most powerful corporations the world has ever known. 

But it’s even more foolish to write off the antitrust efforts of the Biden administration after a couple of court losses, as the press has been doing. The truth is that the fight for a fair economy isn’t failing. It’s just getting started.

The Prosecutor Who Blew the Whistle on Barr and Durham https://washingtonmonthly.com/2023/10/29/the-prosecutor-who-blew-the-whistle-on-barr-and-durham/ Mon, 30 Oct 2023 00:40:00 +0000 https://washingtonmonthly.com/?p=149822

The inspiring tale of Nora Dannehy, Connecticut’s newest Supreme Court justice.

At a time when the Supreme Court of the United States is at its lowest level of popular support in at least 50 years—when it overturns long-standing precedents on abortion, affirmative action, and environmental regulations and divides a riven nation still further—it’s reassuring that a remarkably intelligent and ethical public servant can be named by a Democratic governor to a state supreme court, sail through its judiciary committee without drama or acrimony, and be overwhelmingly confirmed by the general assembly in less than a month. 

The state is Connecticut, and the nominee is Nora Dannehy. If you haven’t heard of her, not only is that okay, it’s also intentional. The 62-year-old career prosecutor and Harvard Law School graduate has never been a showboat, even as she played a critical role in bringing down a corrupt governor, Republican John Rowland, back in 2005. In 2008, Dannehy was the first woman to be named U.S. attorney for Connecticut, and in 2019, she came to Washington with John Durham, the Trump-appointed U.S. attorney in Connecticut and the special counsel probing the origins of the Russian collusion investigation. 

Durham’s investigation was a disaster. Once considered a respectable Republican prosecutor, Durham managed to lose the cases he brought to trial and uncovered nothing of substance that would undercut the original narrative that the FBI was right to investigate Trump’s 2016 campaign. Moreover, instead of avoiding politics—which is why special counsels get named in the first place—Durham and Attorney General Bill Barr, who appointed him, traveled together and constantly conferred while administration officials and the president cried that any Russia scrutiny was a witch hunt. 

Dannehy resigned in protest but didn’t write a tell-all. Only last month, as her nomination to the state supreme court was being considered, did she speak out, in answer to a direct question about her sudden resignation. “I simply couldn’t be part of it. So I resigned,” she told Connecticut state legislators during her confirmation hearing. “In the spring and summer of 2020, I had growing concerns that this Russia investigation was not being conducted in [an objective and apolitical] way. Attorney General Barr began to speak more publicly and specifically about the ongoing criminal investigation. I thought these public comments violated DOJ guidelines.” All true, and an understatement, given the former attorney general’s conduct. 

Dannehy’s common sense and restraint stand in sharp contrast to the conduct of the U.S. Supreme Court justices who accept lavish gifts from billionaire benefactors, also known as “dear friends,” and, like Justice Samuel Alito, give interviews to skewer their opponents and even other justices. It’s also a reminder that at a time when every Supreme Court justice save one (Elena Kagan) has served as a federal judge, there’s a place on the bench for a career prosecutor who has led anything but a cloistered life. 

Dannehy hails from a family of reformers. Her father was a highly respected lawyer and later jurist who became a state supreme court justice after serving on every other court in the Connecticut judicial system. The family lived in Willimantic, a blue-collar mill town where Nora went to public school. She did well enough to be admitted to Wellesley and, from there, to Harvard Law School. Her husband, Leonard Boyle, investigated corruption as the deputy chief state’s attorney and in 2021 became interim U.S. attorney. Her brother is a retired judge.

It was back in 2003 when Dannehy, then a federal prosecutor in Hartford, Connecticut, began angling for the biggest catch in the Nutmeg State: Governor John Rowland, a whale living far beyond his means. Rowland was a political wunderkind. In 1985, he became the nation’s youngest congressman at 27; in 1994, Connecticut’s youngest-ever governor at 37; and in 2001, chair of the Republican Governors Association—confirming his status as a GOP star who could win in the blue Northeast. But the golden boy was reading his press clippings and flying too close to the sun, taking expensive vacations ($90,000 in flights to Las Vegas and Florida), renovating his vacation home (adding cathedral ceilings and a hot tub), and pocketing rolls of cash from builders to whom he steered hundreds of millions of dollars in no-bid state contracts. Dannehy pulled his star from the sky. 

In 2005, Rowland was convicted on political corruption charges and sentenced to one year and a day, and his downfall put Dannehy on the map. (It also helped cement the reputation of Dannehy’s boss Durham as an apolitical straight-shooter.) It was the flagship case in a convoy of prosecutions that, besides Rowland, brought down the chief of staff, the deputy chief of staff (who—shades of U.S. Senator Robert Menendez—buried gold coins in his backyard), a state treasurer, the mayors of Bridgeport and Waterbury, and some 20 others. Dannehy’s targets never saw her coming. Her gender and self-effacing manner proved potent weapons against male opponents who invariably underestimated her.

Dannehy built quite a record, but unlike Rudy Giuliani and other prosecutors who do double duty as their own publicists, she never so much as took a bow. Braggadocio might have helped Dannehy in her interview for a judgeship after convicting Rowland. But she didn’t brag, and in a place as small as Connecticut, where politicians sympathize with other politicians and are chummy across party lines, Dannehy was passed over. 

Although denied the family tradition of donning a black judicial robe, Dannehy was still the Woman to See, a DOJ careerist with admirers at “Main Justice” in Washington, D.C. In 2008, the department tapped her to lead a politically sensitive inquiry into whether President George W. Bush had wrongly dismissed nine U.S. attorneys. In less than two years—a world land-speed record in the annals of special counsels—Dannehy wrapped up her work by finding insufficient evidence of criminal conduct in the firings. In part because it was Dannehy, partisans on both sides accepted her conclusion.

In the insular world of Hartford, Bush, who had gotten wind of the governor’s troubles, was being pressed to name a Rowland pal as U.S. attorney. The pal didn’t get the nomination; Bush instead appointed another Republican, Kevin O’Connor, who promised to recuse himself (and did). That meant that in 2003, Durham, whose career at Justice spanned nearly four decades, was the senior attorney in the office overseeing Dannehy’s prosecution of a governor from his own party. Even when Durham wasn’t her boss, he was her mentor, going back to 1991, when she joined the U.S. attorney’s office. That made her later resignation from the Russia investigation carry enormous personal weight. 

All that history is one reason Durham turned to her in March 2019 to go to Washington with him to probe whether the first Russia investigation was a deep-state plot to sabotage Donald Trump. Durham had legitimate credentials. He’d won national recognition prosecuting the mobster Whitey Bulger’s infiltration of the Boston FBI office. Other aspects of Durham’s work got less recognition, such as when he was tapped to investigate the destruction of videotapes of CIA detainee interrogations in 2008 and, in a similar vein, in 2009, probed alleged criminality in so-called enhanced interrogations said to include waterboarding of suspected terrorists. He brought no charges in either inquiry, asking only for further investigation of a handful of cases, a conclusion at odds with those of other investigators, including the Senate Select Committee on Intelligence. The committee found that 119 detainees were held by the CIA, and 39 were subjected to enhanced interrogation techniques that included sleep deprivation, prolonged standing, exposure to cold, and waterboarding. At least 26 of them were held “wrongfully.”

Finding nothing prosecutable in the enhanced interrogation program made Durham the Man to See when Barr, who had served in the CIA for four years, was looking for someone outside Washington to investigate the Robert Mueller investigation. This was an opportunity for Dannehy as well as Durham. He had always given Dannehy an open field, even when she prosecuted Republicans. For Dannehy’s part, she was happy to stay in the background and give Durham credit for operations she drove across the finish line. It was a perfect working relationship.

But this was the Trump administration, where reputations went to die. Almost as soon as Durham got to Washington in 2019, the bald-and-bearded prosecutor took off for Europe with the boss, Barr, from whom he was supposed to be independent, in search of anyone who could back up Trump’s claim that the FBI’s investigation was “the crime of the century.”

As weeks passed, the two septuagenarian Republicans found no one fitting that description but persisted as if the FBI, of all places, were brimming with Republican-hating rabble-rousers. Dannehy found herself riding shotgun, if not riding in the back seat, in an Edsel. Durham followed Barr protocols, not the department strictures that she’d observed during her Bush inquiry. Contrary to long-established department conventions, first Trump and then Barr, with Durham’s acquiescence, began talking up alleged findings. Dannehy asked Durham nicely to ask Barr to stop going on Fox News and hyping their findings. He didn’t. In the months before Election Day 2020, Barr even asked Durham to draft an interim report to be released early, which the team began doing without Dannehy’s knowledge. The move, which would have violated Justice Department guidelines, made her furious. 

The two old friends argued, but Durham was her boss, Barr was Durham’s, and Trump was Barr’s. After detailing her concerns and sending a brief farewell message to staff, Dannehy walked out the door. Three more prosecutors followed. She’d been there 11 months. Durham would stay another two and a half years, drawing out a labored investigation that dwarfed the Watergate prosecutors’ tenures.

Dannehy completed both her prosecution of Rowland and her investigation of the Bush White House in under two years. In less than three years, Robert Mueller, who ran the inquiry Durham was seeking to discredit, produced seven guilty pleas, two convictions after trial, and several indictments of Russian nationals. Durham’s inquisition lasted nearly four years, during which he secured one guilty plea and lost the two cases that went to trial. 

After walking away from the Durham investigation, Dannehy returned to Connecticut, where she became counsel to Democratic Governor Ned Lamont. In a reproach to politicians who pass over prosecutors who prosecute politicians, the governor nominated her for the state supreme court in September. Called to the podium at the press conference announcing her appointment, Dannehy was happy but unassuming. The Hartford Courant noted that she took less than 90 seconds to say thank you. She mentioned that she had attended Windham High School. She made no mention of Wellesley or Harvard.

Here we are, 20 years after Dannehy was passed over for a federal judgeship, and now she’s following her father to the state’s highest bench. All bets are that she’ll make a fine judge, which makes for a happy ending. But it isn’t the end of public corruption, in which sometimes the once-good guys are on the take. We shouldn’t forget the Durham inquiry. A once-good man lost his way and tried to take his loyal lieutenant with him at the bidding of the president and his submissive attorney general, who deployed the federal government as their private law firm to disprove the findings of a legitimate investigation they didn’t like. 

There’s a lesson here for Washington. When it comes to the U.S. Supreme Court, the Beltway graybeards of both parties often overlook career prosecutors (like Earl Warren) and, for that matter, civil rights lawyers (Thurgood Marshall, Ruth Bader Ginsburg). One current exception is the Barack Obama–appointed Justice Sonia Sotomayor—once an editor of the Yale Law Journal, often a path straight to chambers—who was recruited by Manhattan District Attorney Robert Morgenthau. After six months, she was trying felonies, busting a child pornography ring, and serving as second chair in the 1982 trial of the “Tarzan” burglar, who swung on ropes to break into apartments in Harlem, murdering three people. Alito was also a prosecutor (for the feds in New Jersey), which is a reminder that while putting criminals behind bars is a valuable experience, it’s not dispositive.

With its approval rating at record lows, the Supreme Court could use a few good prosecutors. The justices need only look to the legislature in Connecticut. They just overwhelmingly confirmed one of the best.

Show Me the Ballots!  https://washingtonmonthly.com/2023/10/29/show-me-the-ballots/ Mon, 30 Oct 2023 00:35:00 +0000 https://washingtonmonthly.com/?p=149826

To combat rampant disinformation and public distrust in elections, a bipartisan group of politicians is advocating for a radical form of transparency: post all cast ballots online.

In 2016, when Adrian Fontes became the first Democrat in a half century to oversee elections in Maricopa County, Arizona—a giant jurisdiction with more voters than two dozen states—he had no idea what he was in for. 

Over the course of the next four years as county recorder, the former Marine and prosecutor would spearhead a project to mail a ballot to every registered voter, navigate a global pandemic, and face down a mob of Donald Trump’s supporters, who had convened, guns in tow, outside of the county’s election headquarters in 2020 after Fox News called the state for Joe Biden. At one point, when conspiracists, including InfoWars’s Alex Jones, began attacking Fontes by name, he and his wife packed “go bags” and briefly evacuated their children from their home. 

But if those years were bracing, Fontes, who became Arizona’s secretary of state earlier this year, is not cowed. If anything, his experience on the chaotic front lines of election management has underscored his long-standing belief that the best way to revive public confidence in elections in this age of distrust is by embracing a position of radical transparency. 

One of Fontes’s most revolutionary policy pursuits is straightforward: He believes that in every election, during the most disinformation-prone stage—immediately after the votes are counted but before the winner has been certified—election officials should publish digital images of every ballot cast. Those images should be uploaded in a form that’s easy to read and sort, so rival campaigns, grassroots groups, press outlets, academics, and individual citizens could judge the results of an election for themselves, rather than having to rely on an official’s word. Those who continued to spout groundless theories of malfeasance or fraud could also then be checked by other outside groups and individuals with direct access to the same ballot images.

“[Ballot images are] where we actually glean the data that gives us the results,” Fontes told me last spring. He explained that most voters nationwide hand-mark paper ballots; officials then use scanners to create digital images of those paper ballots, then employ software to count votes and detect inconsistencies. Sharing those digital ballot images as widely as possible is the best way to illustrate how officials arrive at a final count, he said. “There is no better tool. It doesn’t exist.” 

A handful of counties, including Dane County, Wisconsin, where the state capital, Madison, is located, and San Francisco, have already implemented this audacious idea. Election officials in both counties say the move has improved voter confidence. Other jurisdictions, including Maryland and parts of Florida, have been using ballot images in recent years to verify results before certifying winners. Officials there, too, say the practice has helped cool political temperatures, especially when a losing campaign can scrutinize the images itself and determine, in a matter of hours, how many ballots are contestable and whether a recount or litigation would be fruitful. 

Making ballot images publicly available, Fontes argues, is also relatively simple technologically, thanks to recent changes in election infrastructure and procedures. After Russian attempts to hack the 2016 election, municipal and state governments purchased new voting systems for polling places that produce a paper record, while the coronavirus pandemic led to a vast expansion of voting by mail. As a result, nearly all ballots today are on paper—either hand-marked directly by voters or printed out by voting machines. Scanning those paper ballots, then publishing the images in a user-friendly form online, isn’t rocket science. 

But if Fontes and a growing group of Republican election officials who deal day to day with election deniers see public ballot images as a policy no-brainer, the idea has faced strong headwinds in state legislatures. Progressives argue that making ballot images public could compromise voter anonymity and invite violence against individuals. They also say the images could be hacked, further eroding public trust. Meanwhile, county election managers of both parties worry that this and other transparency reforms could create an outsized burden on their operations, which already must function under tight deadlines and shoestring budgets. 

The biggest critique of this idea, however, is philosophical. Many argue that presenting the public with evidence of this kind is pointless. The tin-hatted Big Lie conspiracists are not interested in evidence at all, the argument goes—much less evidence produced by the same government officials they accuse of engaging in vast and convoluted conspiracies. And these propagandists wield enough influence already: Nearly 70 percent of Republicans and Republican-leaning independents were still telling pollsters, in July 2023, that Joe Biden was not legitimately elected. 

But Fontes and his allies say that critique misses the mark. The point of making ballot images public is not to convince the die-hard Trumpers, says Ken Bennett, a mild-mannered Republican Arizona state senator and former secretary of state, who sponsored state legislation in 2023 to make ballot images public. It is, instead, to empower those rank-and-file conservatives who have been conditioned to distrust elections but are open to believing otherwise if given persuasive proof. Such voters exist, Fontes and Bennett argue, and perhaps in considerable numbers. The share of Republicans who believe there’s “solid evidence” proving that Joe Biden didn’t win the 2020 election has fallen in the past two years, according to a March CNN/SSRS poll. The Republican pollster Sarah Longwell, who publishes The Bulwark, says the “move on from Trump” bloc comprises 30 percent of the GOP—a group sizeable enough to swing competitive races. 

Bennett, who is Fontes’s key Republican partner on this issue in Arizona, told fellow lawmakers earlier this year that while making ballot images public won’t solve the problem of distrust in elections, it’s a crucial start. “We’ve got to do something,” he said in an Arizona senate hearing in March. “If we want a different result, I think we have to do something different than what exists right now.”

Fontes was having a drink after a conference for state and local election officials in January 2017 when he first heard about giving the public access to images of every ballot cast. Fontes, then the newly elected Maricopa County recorder, was talking to a former state election director and UN election observer, who was praising, of all places, Mongolia. The man was saying, Fontes recalled, that in Mongolia 98 percent of voters report very high confidence in the outcome of their elections. Fontes asked the guy what the country’s trick was. “He said, ‘Well, this place actually publishes their ballots online, by precinct, so that the population can go count it their doggone selves,’ ” Fontes said.  

Fontes was gobsmacked. When he got home, he began researching and found that, unlike most election records, which are kept in databases, spreadsheets, or coded computer logs, ballot images are visceral documents that reflect human errors—like errant pen marks or smears, or when someone circles an oval rather than filling it in. But voters almost never see such documents; they only hear the results on the nightly news, which seem to have emerged by some unknown process performed by shadowy forces elsewhere. To most voters, the logistics of how ballots are counted are, at best, opaque. So the idea of taking a page out of Mongolia’s book, and letting voters in on the ground level—inviting them to examine, sort, and scrutinize the same ballot images that election officials are using—seemed like an “incredibly powerful” idea, Fontes said. “We’ve gotten to a place where I think it’s high time that we have that public verifiability.”

This push toward radical transparency represents a sea change for U.S. election management. For most of the last century, officials have simply asked citizens to trust that a count is correct, that the systems in place were credible. That era is over, Fontes said. Public trust in government institutions is now at a near-historic low; just 20 percent of Americans say they’re very confident in the country’s elections, according to a January 2022 ABC/Ipsos poll. The modern American public needs more proof, Fontes said, and ballot images are the key. “The principle that a person can walk into a county treasurer’s office and ask for the budget and be shown where the money from the county is actually being spent is almost exactly the same,” Fontes said. “Except that this is actually better, because this is the real representation. It’s as if you walked into the county treasurer’s office and got copies of all of the canceled checks.”

A few years ago, I saw up close how this works in practice. In 2018, I was in Leon County, Florida, home to Tallahassee, when election workers were performing the recount of what turned out to be a contentious election involving three statewide races, including those for governor and U.S. Senate. Before the recount began, Mark Earley, the elected supervisor of elections and a Democrat, asked the representatives of the candidates and parties to follow him to his computer. He showed them a new election audit technology that he wished he could use in the recount but that was not yet officially approved. The tool was from Clear Ballot, a company founded by Larry Moore that a handful of Florida counties were using as part of a pilot program. Clear Ballot’s technology rescans every paper ballot cast in an election to create digital ballot images, then analyzes the ink and white space inside each marked oval and ranks the sloppiest, and therefore most contestable, ovals in each race. Election workers—that is, humans—then scrutinize each of those messy ballots individually to determine a voter’s intent, and to find and fix any mistakes made by the official voting system. 

“I went through this four times that day and said, ‘After we go through this official process for an entire week, here’s what you’re going to see,’ ” Earley told me. He showed the representatives images of the most messily marked ballots in each race to illustrate what the ensuing manual recount would likely look like. “They were thrilled to be able to see it so transparently from a different system,” he said. 

While Earley’s team could only use Clear Ballot’s technology informally—the state had not yet approved image-based recounts and audits—it was clear that the software changed the game. “The fact that you could search through all of these images so quickly and find where there were not votes, or overvotes [more than one ink mark], or where voter intent was misinterpreted by a machine” was transformational, Earley said. “If there’s a mistake, it’s right there. It’s apparent. Nobody’s hiding anything. It’s so easy to understand.”
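To make concrete what this kind of oval analysis involves, here is a minimal sketch in Python. It is emphatically not Clear Ballot’s actual algorithm; the ballot names, array sizes, and 50 percent threshold are illustrative assumptions. The idea is simply to score how much ink falls inside each oval and surface the most ambiguous marks for human review.

```python
import numpy as np

def fill_fraction(oval: np.ndarray) -> float:
    """Share of dark pixels inside a grayscale oval crop (0.0 = blank, 1.0 = solid ink)."""
    return float((oval < 128).mean())  # pixels: 0 = black ink, 255 = white paper

def rank_contestable(ovals: dict) -> list:
    """Sort oval crops by ambiguity: fills closest to 50 percent are the sloppiest."""
    scored = [(name, fill_fraction(pixels)) for name, pixels in ovals.items()]
    return sorted(scored, key=lambda pair: abs(pair[1] - 0.5))

# Synthetic stand-ins for scanned oval crops (names are hypothetical).
rng = np.random.default_rng(seed=1)
ovals = {
    "ballot-0001/governor": np.full((20, 30), 255),               # clearly blank
    "ballot-0002/governor": np.full((20, 30), 10),                # clearly filled
    "ballot-0003/governor": rng.integers(0, 256, size=(20, 30)),  # messy half-mark
}

for name, score in rank_contestable(ovals):
    print(f"{name}: fill fraction {score:.2f}")
# The messy mark sorts first; that is the ballot a human reviewer would inspect.
```

In a real system, the crops would of course come from the scanned ballot images themselves, keyed to each race’s oval positions.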

A couple of years later, another effort, led by the San Diego County–based progressive nonprofit Citizens’ Oversight, used ballot images and its technology, Audit Engine, to verify the results of the 2020 contest in central Florida’s Volusia County, where Trump had won by 14 points. Lisa Lewis, Volusia County’s supervisor of elections, who is a Republican, told me she invited the scrutiny. “We were able to tell people, ‘Look, we didn’t do an audit. It was an independent auditor. And the results are the results,’ ” Lewis said. “There are always going to be people who are skeptical about the election … But it does relieve a lot of people’s [minds]. The ballot speaks for itself.” 

Maryland, which began the practice in 2016, remains the first and only state to use images to verify the results of every contest before certification. Elsewhere, the use of ballot images is slowly increasing behind the scenes. Voters hand-mark their ballots in two-thirds of the country’s jurisdictions, and the latest software allows election workers to review digital images to double-check, correct, or reject votes—which some jurisdictions now do. (Partisan observers are usually present.) 

In 2020, Florida passed a law allowing officials to use ballot images for recounts, thanks in part to Earley and his Republican colleagues’ enthusiasm about the idea. But opposition within the state bureaucracy to Clear Ballot’s monopoly as the only state-certified ballot image technology has meant that image-based recounts will not be in place for the 2024 election. Transparency activists and Florida’s Democratic Party have also sued to preserve the images, since the new recount law didn’t make retaining them mandatory. 

The ongoing state-level debate over whether to make ballot images public is rarely covered by national media outlets, but the fights are heating up as more jurisdictions hire Clear Ballot. If Fontes and Bennett get their way, Arizona will be next.

Earlier this year, Fontes, in his new position as Arizona’s secretary of state, and Bennett, who was reelected as state senator in 2022 after nearly 15 years out of office, emerged as a powerful but unlikely duo on Arizona’s public stage. Fontes, a Democrat, is pugnacious and blunt (he calls election deniers “MAGA fascists”), while Bennett, a conservative Republican, is soft-spoken and conciliatory. But the men are bound by their shared belief that ballot images must be made publicly available statewide. 

Bennett first heard about the idea a few years after Fontes. It was the spring of 2021 and Arizona was in the throes of an ugly battle over the results of the 2020 election. Bennett was not in public office at the time, but the state senate had asked him to be its liaison at the Cyber Ninjas’ audit of Maricopa County’s ballots. It was there that Bennett met John Brakey, a gregarious progressive transparency activist from Tucson, whom the Ninjas had allowed to observe, to claim bipartisanship. Bennett and Brakey soon bonded over the idea of making ballot images public and together pushed Cyber Ninjas CEO Doug Logan to employ them as a tool in the audit. Bennett went so far as to promise Logan that a GOP donor would pay for a ballot image review if he agreed to it. (Logan demurred, but he eventually gave a contract to an election denier who never finished the audit.)

Bennett emerged from the experience a fervent convert to the idea of radical transparency, and, upon taking office in January 2023, he drafted a bill—his office’s top priority—to make public four sets of election records immediately after Election Day: registered voters; people who voted; digital ballot images; and cast-vote records. Fontes immediately threw his support behind the bill. But opposition came fast and furious. Progressive advocates argued that making voter lists and ballot images public could be dangerous, especially in primary elections in small precincts, where someone could link a ballot to an individual, compromising a voter’s privacy and right to a secret ballot and possibly exposing that voter to right-wing vigilantes. “Bad actors could easily abuse this data,” Alex Gulotta, Arizona state director of All Voting is Local Action, and Jenny Guzman, program director of Common Cause Arizona, wrote in an Arizona Republic commentary. 

Bennett, Fontes, and other supporters scrambled to address the critics’ objections, pointing out that reams of voter information are publicly available. Political parties and campaigns already have access to lists of registered voters, as well as who voted—data they use to turn out voters. Bennett also agreed to reduce the bill’s scope to protect voters’ identities in low-traffic precincts, while Brakey tried to persuade his fellow progressives, including Gulotta and Guzman. It was tough going. 

Experts in academic and advocacy circles, many of whom have long opposed anything electronic in the vote-counting process, worried that ballot images could be hacked. Practicing election officials, including Earley, pushed back against those concerns. Forging hundreds—and sometimes thousands—of ballot styles, which change in every jurisdiction to reflect local races, would be an extraordinarily complex undertaking, and any attempt to do so on a large scale would likely be detected by security protocols, he explained. There is also no evidence of any statewide or federal election ever having been hacked in this way. 

Other critics’ concerns were logistical. Figuring out how to publish ballot images online in a user-friendly format would require technological investment. After all, simply uploading tens of thousands of JPEGs wouldn’t do the trick. To be persuaded, nontechnical people would need to be able to sift through ballots and compare what they saw to a final tally. To that end, state officials would need to either build new software in-house for local jurisdictions or contract with an election systems firm. 
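As a rough illustration of what a user-friendly format could look like, here is a minimal Python sketch. The file names and fields are hypothetical, not any state’s actual schema; the point is that pairing each published image with the machine’s reading of it, in a sortable index, lets anyone re-tally the results themselves.

```python
import csv
from collections import Counter

# Hypothetical records pairing each published ballot image with the
# tabulator's reading of it. Names and fields are illustrative only.
records = [
    {"image": "precinct-12/ballot-0001.png", "race": "Governor", "machine_read": "Candidate A"},
    {"image": "precinct-12/ballot-0002.png", "race": "Governor", "machine_read": "Candidate B"},
    {"image": "precinct-12/ballot-0003.png", "race": "Governor", "machine_read": "overvote"},
]

# Publish a sortable index alongside the image files.
with open("ballot_index.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["image", "race", "machine_read"])
    writer.writeheader()
    writer.writerows(records)

# Any member of the public can now re-tally the index, compare the totals
# to the official count, and pull up the image behind any entry they doubt.
with open("ballot_index.csv") as f:
    tally = Counter(row["machine_read"] for row in csv.DictReader(f))
print(dict(tally))  # {'Candidate A': 1, 'Candidate B': 1, 'overvote': 1}
```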

Still others objected to the premise of Bennett’s bill: More public data, they argued, would not only fail to convince post-truthers enthralled by Trump’s Big Lie, it could also fuel more propaganda and frivolous litigation. They rejected Bennett’s theory of change—that sizeable numbers of non-Trump Republicans wanted to see the evidence for themselves.

After a contentious, 15-week battle, in mid-May, Bennett’s bill passed on party lines, but four days later it was dead. Democratic Governor Katie Hobbs, who had been Fontes’s predecessor as secretary of state, vetoed it, citing many of the progressives’ concerns.

Fontes and Bennett were disappointed but resolute. Bennett told me he plans to introduce a new version of the bill first thing in the 2024 session, this time addressing some of his critics’ primary fears head on. The new bill will apply only to general elections, where everybody is voting on an identical ballot, so there would be no way to identify a specific person in a precinct. And it will give counties the option to decide whether to release ballot images. “So if Pima County doesn’t want to do it, but Maricopa and Yavapai do, fine,” Bennett said. Maricopa County, with nearly two-thirds of Arizona’s voters, is already on board with the new version of his bill, he added. 

Fontes too remains committed to the legislation, echoing a growing number of election officials nationwide. “We don’t have publicly verifiable elections if we don’t have those images out there,” he warned. In an age of widespread public distrust, it’s radical transparency or bust.

My Parkinson’s Crisis—And Ours https://washingtonmonthly.com/2023/10/29/my-parkinsons-crisis-and-ours/ Mon, 30 Oct 2023 00:30:00 +0000 https://washingtonmonthly.com/?p=149838

The still-mysterious disease is spreading wildly, and Washington isn’t doing enough. As a physician and a sufferer, I should know.

Parkinson’s disease is a puzzle. I know because I have had it for more than a decade. Some of its symptoms, such as tremors, are easy to understand, but others are weird. For example, turning my body is difficult, and it’s even more difficult if I try to turn it clockwise rather than counterclockwise. I have lost my ability to swim. And what happens when I suddenly freeze, as if my feet were glued to the floor? My brain has sent a message to my feet to step forward. Did the message not arrive, or did my feet simply ignore it? It’s impossible to know. (This article has a shared byline, but the “I” refers to Torrey.) 

Parkinson’s disease is not just a puzzle; it’s an expensive one. A recent detailed study, based on 2017 data, reported that just over 1 million individuals in the United States were living with Parkinson’s. The disease costs our health care system $51.9 billion annually—and that price is expected to balloon to $79.1 billion by 2037, an increase of roughly $1.36 billion a year. Since 90 percent of these patients are 65 or older, they place a particularly heavy burden on Medicare. 
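The growth rate implied by those two estimates is easy to check with back-of-the-envelope arithmetic:

$$\frac{\$79.1\ \text{billion} - \$51.9\ \text{billion}}{2037 - 2017} = \frac{\$27.2\ \text{billion}}{20\ \text{years}} \approx \$1.36\ \text{billion per year}$$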

Even more alarming, the researchers estimated that by 2037 an additional 600,000 people will be diagnosed with Parkinson’s. Such projections are consistent with other studies showing that Parkinson’s is the fastest-growing neurological disease globally, increasing even faster than Alzheimer’s disease. Indeed, an editorial in The Lancet Neurology reported that “the prevalence, burden of disability, and number of deaths associated with Parkinson’s disease all more than doubled between 1990 and 2016.” Some observers call this a “Parkinson’s pandemic.”

Part of this increase is attributable to people living longer and to the large, aging Baby Boomer population. But that is only a partial explanation. Another part might be due to factors related to the causes of this disease. Studies have shown that you are more likely to get Parkinson’s if you have red hair or melanoma, or if you still have your appendix. Other studies have reported that having numerous dental amalgam fillings or living downwind from a golf course are risk factors. Perhaps strangest is data showing that drinking large amounts of milk or never having smoked tobacco increases your chances of getting Parkinson’s disease. Examining some of the leading theories of causation—genetics, infection and inflammation, toxic metals, and pesticides—can illuminate this puzzling potpourri of claims and determine whether research dollars are being effectively deployed in halting the rise of the disease.

Genetics

Genetic research on Parkinson’s has identified more than 20 genes that can potentially cause the disease. However, most of them rarely do. Some genes were identified only in single families, and several only in individuals with early disease onset. The relative lack of importance of genes as a primary cause of Parkinson’s has been confirmed by studies in which identical twins, who share 100 percent of their genes, are compared to fraternal twins, who share only 50 percent. If a disease has a genetic cause, one expects both identical twins to be affected significantly more often than both fraternal twins; the rate at which both twins are affected is called the “concordance rate.” Large studies of twins and Parkinson’s disease, however, have reported that the concordance rates for identical and fraternal twins are not significantly different. Twin studies thus support researchers’ estimates that genes cause only 5 to 15 percent of cases. 
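A standard rule of thumb in twin research, Falconer’s formula, makes the logic explicit; it is a rough approximation and is not drawn from the studies above:

$$h^2 \approx 2\,(c_{\text{identical}} - c_{\text{fraternal}})$$

where h² is heritability and c is each twin type’s concordance rate. When the two rates are statistically indistinguishable, the estimated genetic contribution is close to zero, consistent with the low estimates researchers have reached.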

In addition to genes that directly cause diseases, there are dozens, and perhaps hundreds, of so-called risk genes for each disease. Risk genes do not cause the illness but, rather, predispose individuals to it or protect them from it. For example, scientists know that a particular kind of Mycobacterium can cause tuberculosis. Once M. tuberculosis enters a body, risk genes determine whether it will cause clinical symptoms and, if so, how severe they will be.

Scientists have identified more than 90 risk genes for Parkinson’s disease. One of these also controls the quantity and distribution of melanin, which determines hair color. Redheads have almost double the chance of developing Parkinson’s compared to people with black hair; blondes and brunettes have intermediate risks. This genetic association also explains why someone with Parkinson’s has an almost fourfold increased probability of developing melanoma and why someone with melanoma has a fourfold increased likelihood of developing Parkinson’s. 

Infection and Inflammation

Infectious agents are also potential causes of Parkinson’s disease. Many viruses that affect the brain can cause symptoms of Parkinson’s disease, such as tremor and stiffness. Research suggests that the influenza virus, for example, caused the epidemic of encephalitis lethargica—a neurological syndrome—in the 1920s, which followed the 1918 influenza pandemic and resulted in thousands of cases of parkinsonism. (When individuals have symptoms of Parkinson’s disease, but it is unclear whether they have the full disease, it is referred to as “parkinsonism.”) The residual cases from that epidemic became the subject of Oliver Sacks’s Awakenings, subsequently made into a movie starring Robin Williams as the doctor giving his patients a new medication that produces a dramatic but temporary improvement in their symptoms. The possible role of the influenza virus in causing Parkinson’s disease is a subject of ongoing debate among researchers. Other viruses that have been shown to cause parkinsonism include Coxsackie, Japanese encephalitis, western equine encephalitis, herpes simplex, hepatitis C, and the virus that causes acquired immune deficiency syndrome. How often such viruses actually do so is not yet known.

Other researchers take a broader approach to the relationship between infections and Parkinson’s disease. A large Swedish study identified individuals who had been hospitalized for any infection of the central nervous system. They found that individuals with Parkinson’s were 50 percent more likely than controls to have previously been hospitalized for a CNS infection. The researchers reasoned that it was not the specific infectious agent that was the problem but that all the infections produced inflammation in the brain. 

In support of this theory, many studies have reported increased levels of inflammatory markers in the blood of individuals with Parkinson’s disease. This has led to speculation that the disease is not a brain disease but a disease of the immune system. 

Some studies show a relationship between inflammatory bowel disease and Parkinson’s disease. Additional evidence linking the gastrointestinal tract to Parkinson’s is that the wall of the intestine contains alpha-synuclein, a protein also found in the brain of individuals with Parkinson’s. Alpha-synuclein is especially prominent in the wall of the appendix. An international consortium of researchers extensively studied the appendix in individuals with Parkinson’s disease. Looking at the records of more than 1.6 million individuals over 52 years, they found that patients who had had their appendix removed had a modest but statistically significant 19 percent reduction in their chances of developing Parkinson’s. Furthermore, those who had had their appendix removed at least 30 years previously but still developed the disease did so almost four years later than those who still had their appendix. 

Toxic Metals

Researchers have also noted that exposure to high levels of certain metals—among them aluminum, bismuth, copper, iron, lead, manganese, mercury, and zinc—causes parkinsonism. (Workers at a manganese ore–crushing facility, for example, reported parkinsonian symptoms.) Other studies show geographic associations, such as a higher prevalence of Parkinson’s disease in urban counties that also report a higher industrial release of copper or manganese. 

Among the most promising research involving metals and Parkinson’s are studies of mercury. Most human exposure to mercury comes from amalgam dental fillings and eating fish. Amalgam fillings, which consist of 50 percent mercury and a 50 percent mix of silver, copper, zinc, and other metals, were introduced almost 200 years ago. Individuals with amalgam fillings have between two and 12 times more mercury in their bodies than those without them. An autopsy study of 34 individuals reported a statistically significant correlation between the number of amalgam fillings and the mercury level in the brain’s occipital lobe. Studies have also shown that amalgam fillings slowly leak mercury vapor; when inhaled, it can easily pass through the blood-brain barrier. (Based on toxicity studies, amalgam fillings have been banned in Denmark, Norway, and Sweden. They are still used in the United States, where 58 percent of adults have them.)

Studies of mercury focused specifically on Parkinson’s patients have reinforced this link. A study of 54 Parkinson’s patients and 95 controls reported a significant association between blood mercury levels and the diagnosis. A large study from Taiwan found that amalgam fillings significantly increased the risk of a subsequent diagnosis of Parkinson’s disease. One study used data from the Faroe Islands, a Danish territory with a high prevalence of Parkinson’s, to examine the dietary history of 79 individuals with the disease and 154 matched controls. A statistically significant association was found between eating more whale meat, which is high in mercury, and having the disease. Most recently, an Australian study compared the distribution of mercury in the autopsied brains of two individuals who had died with Parkinson’s disease and 12 who had not. Some mercury was found in all the brains, but only in the brains of the Parkinson’s patients was it found in neurons in the substantia nigra, striatum, and thalamus, areas associated with this disease. The researchers often found Lewy bodies—abnormal deposits of alpha-synuclein and one of the hallmarks of Parkinson’s disease—along with the mercury. 

Pesticides 

In the 1980s, an unusual outbreak of Parkinson’s disease occurred among young adults in California. It turned out that all had used a designer street drug, MPTP, that was chemically similar to a widely used pesticide called paraquat. This caused researchers to wonder whether pesticides or other chemicals might be causes of Parkinson’s disease.

Four decades later, research suggests that the answer is yes. A 2017 analysis of 23 such studies concluded that pesticide exposure increases the risk of an individual developing Parkinson’s by 50 percent or greater. This was true for pesticides in general and also for each class of pesticides—insecticides, herbicides, and fungicides—examined individually. Some pesticides seem worse than others; for example, a meta-analysis of 13 case-control studies of paraquat alone demonstrates its association with Parkinson’s disease. Paraquat has already been banned in more than 30 countries, but it is still widely available in the United States, and according to data from the U.S. Geological Survey, its use more than doubled between 2008 and 2018. 

The assessment of pesticide exposure differs widely in these studies. For example, a study in Iowa and North Carolina determined the incidence of Parkinson’s disease in individuals who worked in agriculture as professional pesticide applicators. By contrast, a study in Nebraska reported a geographic association between the incidence of Parkinson’s disease and the use of pesticides by counties. Several studies have examined whether rural residents who live next to fields on which pesticides have been used have a higher incidence of Parkinson’s. The results have been mixed. In 2012, researchers in Raleigh, North Carolina, published a letter in a neurology journal asking, “Is Living Downwind of a Golf Course a Risk Factor for Parkinsonism?” They observed that among 26 cases of parkinsonism, 19 individuals lived on or within two miles of a golf course. Furthermore, 16 of the 19 lived downwind from the course, and two others were said to have had additional golf course exposure. The researchers invited readers to confirm their findings with a larger sample size, but we could not ascertain whether anyone had done so.

Other Theories

Many Americans associate Parkinson’s disease with head trauma. The boxer Muhammad Ali was diagnosed with Parkinson’s in 1984, at the age of 42. Studies were subsequently done asking people with Parkinson’s about their history of head trauma. Such studies were subject to recall bias because subjects might have been more likely than controls to remember such incidents. A meta-analysis of 22 such studies, all done since 1984, reported a significant association with Parkinson’s disease, but only for head trauma that resulted in a loss of consciousness or a concussion. Recent studies have also emphasized that sports-related head injuries are more likely to result in chronic traumatic encephalopathy, with symptoms such as depression and cognitive deficits; some subjects will also have tremors, but other symptoms of parkinsonism are not prominent in such cases.

Clues to the causes of Parkinson’s disease have also been sought in prospective, longitudinal health studies in which data on dietary habits and smoking is collected on large groups who are followed for years. A meta-analysis of five longitudinal studies unexpectedly identified milk—but not cheese, yogurt, or butter—as a risk factor for Parkinson’s disease. The more milk people drank, the greater the risk. In a study of men in Hawaii, those who consumed the most milk doubled their chances of developing Parkinson’s disease later in life. The risk does not appear to be related to milk’s calcium; one hypothesis is that milk lowers blood levels of urate, a chemical thought to be protective against Parkinson’s.

Even stranger than the milk story is the nicotine story. Everyone knows smoking is bad for your health and is associated with several cancers, chronic obstructive pulmonary disease (COPD), heart disease, and stroke. However, several longitudinal studies have identified nicotine, whether from cigarettes, cigars, pipes, or chewing tobacco, as one of the strongest protective factors for Parkinson’s disease, reducing the risk of developing it by more than half. Current smokers have the lowest risk, followed by past smokers and then by people who have never smoked. The risk is inversely related to how long and how heavily a person smoked. The decrease in smoking among men in recent decades has been cited as one possible reason Parkinson’s disease is increasing in prevalence. It is possible that nicotine has neuroprotective effects on the brain.

These are all important clues to ultimately discovering the causes and better treatments for Parkinson’s. Given the disease’s increasing prevalence, and the rising cost of caring for Parkinson’s patients, more research needs to be done. 

How much federally funded research on Parkinson’s disease is currently being carried out? In 2021, the National Institutes of Health supported 526 research projects on Parkinson’s, totaling $254 million. Based on the titles of the 526 projects, it appears that 58 focus on genetic causes; 17 on infectious and inflammatory causes; 14 on pesticides (including paraquat); and only seven on toxic metals, none of which include mercury. Given the limited number of cases of Parkinson’s disease known to be caused by genes, genetic research is probably being adequately covered. However, the other possible causes all appear to be disappointingly under-researched.

How much should the NIH be spending? One way to assess this is to compare it to research expenditures for Alzheimer’s disease. In 2021, the NIH spent $254 million on Parkinson’s research and roughly $3.1 billion on Alzheimer’s research. There are 6.5 million people with Alzheimer’s, compared to the approximately 1 million with Parkinson’s, meaning that the NIH spent roughly $254 per Parkinson’s patient and more than $470 per Alzheimer’s patient. If we use this comparison, Parkinson’s is being underfunded by the NIH by about $216 million a year. Perhaps it is time to review the NIH research portfolio. This may lead to a better understanding of the causes of Parkinson’s disease, leading to better treatments and control of Medicare costs.

Give ‘Em Hell, Joe https://washingtonmonthly.com/2023/10/29/give-em-hell-joe/

President Biden has been compared, and has compared himself, to FDR. But the real similarity is to Harry S. Truman.

Joe Biden has often been compared, and compared himself, to Franklin Delano Roosevelt. This was happening even before he won the presidency. “I’m kind of in a position that FDR was in,” he told The New Yorker in August 2020, explaining that the unprecedented crises he would face as president would require New Deal levels of government resources and activism.

The analogy seemed apt enough in the first months after Biden’s inauguration. He signed a $1.9 trillion COVID-19 stimulus, the American Rescue Plan; signed a bipartisan $1.2 trillion infrastructure bill, the largest since Dwight Eisenhower; and was rallying support for a multitrillion-dollar Build Back Better bill, his signature social welfare and economy-restructuring package. 

FDR, however, was elected with large Democratic majorities in Congress. Biden’s legislative ambitions had to squeeze through the tiny hole of a 50-50 Senate. In December 2021, one senator from his own party, Joe Manchin, decided to plug the opening, and Build Back Better suffocated. Instead of FDR, pundits began likening Biden to Jimmy Carter.

Manchin then returned to the bargaining table in the summer of 2022 and supported the passage of the $740 billion Inflation Reduction Act for green energy and the $280 billion CHIPS Act for semiconductors. Suddenly, the Roosevelt comparisons came roaring back—and for good reason. 

As Rana Foroohar and Will Norris report in their twin cover stories in this issue, Biden’s legislative wins, combined with his reviving of antitrust enforcement and related shift in trade strategy, amount to a renewed—though not fully appreciated—vision of the economy and government’s role in it. It is nothing short of an overthrow of the free market “neoliberalism” that has guided the economic policies of every president since Ronald Reagan—who himself jettisoned the Keynesian–New Deal paradigm that had dominated economic thinking since the Roosevelt administration. 

But while FDR won widespread public support for his revolutionary actions during his first term, Biden so far has not. His approval ratings for handling the economy are so low that recent surveys have him trailing the likely GOP presidential nominee, the four-times-indicted Donald Trump. 

That’s why I think the former president whom Biden most closely resembles is not FDR but Harry S. Truman. Indeed, the parallels between the two men are uncanny. 

Biden, the muscle-car-loving, state-school-educated man from Scranton, has the same feel for and connection to the middle America of his day as did Truman, a former farmer and haberdasher from Missouri who could tell the age of a draft horse by inspecting its teeth. Like Truman, Biden loved serving in the U.S. Senate but has a chip on his shoulder about Ivy League types. Like Truman, Biden suffers the endless derision of insiders about not being up to the job and lacking the charisma and oratorical gifts of the president he served as VP. Both men were stymied after their first midterm by do-nothing Republican House majorities (and, for Truman, a recalcitrant GOP Senate as well). Both were strong supporters of organized labor who overrode union strikes when they threatened key industries—mining and others for Truman, railroads for Biden. Both lost public support over inflation—driven, in Biden’s case, by pandemic disruptions of brittle supply chains exacerbated by monopoly rent-seeking, and, in Truman’s, by Congress’s lifting of World War II price controls. During his reelection campaign, Truman faced third-party challenges from Strom Thurmond and Henry Wallace, just as Biden needs to worry about Robert F. Kennedy Jr., Cornel West, and whichever candidate No Labels recruits.

The comparisons extend to foreign policy. Biden and Truman both faced new threats from Russia that they countered through alliances—Truman by founding NATO, Biden by reinvigorating and expanding it. Truman executed the Berlin airlift, saving an outpost of democracy from Moscow’s control without directly engaging with it militarily. Biden has so far done the same in Ukraine. Truman recognized Israel against the advice of many in his inner circle and party. Biden unconditionally supported the Jewish state after the brutal Hamas incursion despite dissent from some progressives.

The most striking parallel, however, is the two men’s work to build a global order. Truman is most remembered today for the doctrine that bears his name: committing U.S. power to contain communism and support democracies. This involved military actions, as in Korea, and new military alliances like NATO, but also vast economic interventions and investments that integrated the world’s democracies into a common trading system that could withstand the threat of totalitarianism while building a broad middle class. As Foroohar details, Biden is attempting something similar: He is pursuing economic agreements with U.S. allies to boost average wages in their home countries and ours. The goal is to undercut support for political illiberalism and challenge the economic predations of Russia and China.

Truman didn’t receive full recognition for his world-historic accomplishments until many years after he left the White House. Similarly, the scope and profundity of what Biden is attempting are currently lost on the American public. Most plugged-in Washington insiders don’t have a clue, either. The aim of the two cover stories in this issue of the Washington Monthly is to enlighten those insiders about what’s really going on so they can bring the news to the public. That, in turn, might give Biden a fighting chance to defy expectations in 2024, as Truman did in 1948.

How the Founders Overcame Partisan Dysfunction https://washingtonmonthly.com/2023/10/29/how-the-founders-overcame-partisan-dysfunction/

At the dawn of the republic, ideological divisions and personal hatreds were, if anything, worse than they are today. The nation survived because its warring leaders compromised.

On an overcast June day in 1788, delegates to Virginia’s ratification convention listened raptly to Patrick Henry, fiery patriot and five-term governor, rail against the proposed Constitution. It went too far in replacing the Articles of Confederation with a “consolidated” national government, he charged. “The rights of conscience, trial by jury, liberty of the press, all your immunities and franchises, all pretensions to human rights and privileges, are rendered insecure, if not lost, by this change.” The room was hushed as Henry proceeded to invoke the Almighty.

When I see beyond the horizon that binds human eyes, and look at the final consummation of all human things, and see those intelligent beings which inhabit the ethereal mansion reviewing the political decisions and revolutions which in the progress of time will happen in America, and the consequent happiness or misery of mankind …

As he spoke, the sky grew ominously dark. Thunderclouds unleashed a furious storm, lighting the sky and shaking the entire building. 

Founding Partisans: Hamilton, Madison, Jefferson, Adams and the Brawling Birth of American Politics by H. W. Brands; Doubleday, 464 pp.

“The spirits whom he had called,” one delegate wrote, “seemed to have come at his bidding.” 

Relentlessly, Henry escalated his appeal. “Availing himself of the incident, with a master’s art … he seemed … to seize upon the artillery of Heaven, and directed its fiercest thunders against the heads of his adversaries.” 

Stirred by Henry’s theatrics and frightened by the storm, the delegates rose in confusion and rushed from their seats. 

As the controversy over ratifying the Constitution exemplifies, things did not go smoothly as the embryonic alliance of former British colonies in America tried to invent a new country. The times were a stew of clashing principles, ideas, fears, and egos. The Founding Fathers rarely saw eye to eye. Alexander Hamilton’s ambition irritated everyone. Thomas Jefferson, an aloof intellectual, was hard to pin down. John Adams wore everyone out with his complaining. James Madison, bookish and quiet, the youngest of the four, had to turn heated oratory into a blueprint for a new kind of government. It wasn’t easy. As Madison later put it, “We are in a wilderness without a single footstep to guide us.” 

By the time the Constitutional Convention ended in Philadelphia, the fissures of earlier days had started to cleave into partisan divisions—Federalists versus Anti-Federalists. A central fault line in American politics was already clear: the authority of the national government versus the states. Hamilton and Adams were Federalists; Jefferson was in Europe but disfavored strong central government; Madison’s views were more complicated: a protégé and longtime ally of Jefferson’s, he also drafted much of the Constitution, wrote many of the Federalist Papers, and authored the Bill of Rights. The convention, through bitter compromises, papered over differences temporarily, but the fissures would recur and sometimes widen into chasms in the decades ahead.

Founding Partisans, by the historian H. W. Brands, traces the early origins of these fractures in American politics. The Founders, as he portrays them, were willing to endure a “brawling birth” and agree to compromises they hated because they shared an overarching aim: to unify the congeries of weak and independent states into a single country that would one day rival the great powers of Europe.

Brands’s book joins a rich literature on the period of the founding and the men who led it, including The Age of Federalism: The Early American Republic, 1788–1800, by Stanley Elkins and Eric McKitrick; Ratification: The People Debate the Constitution, 1787–1788, by Pauline Maier; Plain, Honest Men: The Making of the American Constitution, by Richard Beeman; The Antifederalists: Critics of the Constitution, 1781–1788, by Jackson Turner Main; and other works. Brands adds to the historical conversation a sharp focus on the emerging battle lines of partisanship, drawn from the Founders’ writings. He shows, in their own words, how their thinking evolved with each new crisis, and what, when, and how they chose to negotiate through their intense conflicts to find a way forward.

Brands moves steadily through territory that will be familiar to readers of American history: the Continental Congress; the Constitutional Convention in 1787; the ratification of the Constitution; the First Federal Congress; George Washington’s presidency; John Adams’s foreign policy and crackdown on dissent; and the deadlocked election of 1800, in which Republicans, led by Jefferson, at last triumphed over the Federalists. Given today’s intense partisanship, no doubt Brands hopes revisiting this history through the Founders’ words will reveal a path out of our own political wilderness. 

Major flashpoints flared up in each of the periods he covers, challenging this first generation of leaders to improvise and agree over and over to bargains they abhorred, at times sacrificing not only policy preferences but even principle to the larger goal of a United States grounded in both stability and liberty. 

Of course, they could not have predicted how the deals they made then would play out over more than 200 years. Take the knotty problem of representation. The Constitutional Convention deadlocked over it, and the impasse dragged on for weeks. The large states, led by Madison and the Virginians, wanted to overthrow the Articles of Confederation and replace them with a new charter that fairly reflected the greater wealth and population of the large states. The small states clung to the one-state, one-vote model of the articles. Matters such as the power of Congress and the executive were delayed until representation was resolved.  

Delegates were eager to go home. Benjamin Franklin, not a religious man, surprisingly suggested they pray about it. Hamilton objected, saying the public would interpret that as an act of desperation. Finally, Roger Sherman and Oliver Ellsworth of Connecticut proposed creating a House of Representatives with members apportioned by population and a Senate with each state represented equally. This, Ellsworth argued, “will secure tranquility.” Legislation would have to pass both houses of Congress. Ellsworth raised the specter of dissolution. “If the great states refuse this plan, we will be forever separated,” he said. 

“The diversity of opinions turns on two points,” Franklin declared. “If a proportional representation takes place, the small states contend that their liberties will be in danger. If an equality of votes is to be put in its place, the large states say their money will be in danger. When a broad table is to be made, and the edges of planks do not fit, the artist takes a little from both, and makes a good joint. In like manner here both sides must part with some of their demands, in order that they may join in some accommodating proposition.” 

The so-called Connecticut Compromise kept the states together, but it was harsh medicine. Madison even voted against it. He had wanted “a government wholly national. But when the convention accepted the Connecticut compromise, he reconciled himself to getting half what he’d aimed for,” says Brands. 

Almost 250 years later, Madison’s fears seem justified. Brands’s historical narrative stays firmly in the late 18th and early 19th centuries, not venturing to address present-day ramifications of the Connecticut Compromise or other events he writes about. But as Steven Levitsky and Daniel Ziblatt write in their new book, Tyranny of the Minority, the Connecticut Compromise “was a ‘second-best’ solution to an intractable standoff that threatened to derail the convention and perhaps destroy the young nation.” Over time, the malapportionment of the Senate relative to population has escalated into massive structural inequality. In 1790, Virginia, the largest state, had a population 13 times that of the smallest state, Delaware. But by 2000, California, the state with the greatest population, was 70 times larger than Wyoming, the smallest. This also skews the Electoral College, another invention of the Founders, in which the number of electors for each state is the total of its representatives and senators. Until the 21st century, small rural states elected Democrats and Republicans alike, but in the past 20 years, partisan sorting into rural and urban areas has created a significant national advantage for Republicans. This realignment has intensified partisan politics in Washington and in the states, potentially threatening a fundamental principle of democracy—majority rule combined with minority rights.

Brands writes at length about the fraught process of the Constitution’s ratification. Nine of the 13 states had to ratify for it to take effect, but all of them needed to sign on to ensure that the new structure would work. Five states quickly approved it, and gradually four more. But Virginia and New York held back—their state conventions were riven by partisanship and a sense that, as larger states, they had given up too much. 

In Virginia, former Governor Patrick Henry, who had refused to attend the national constitutional convention in Philadelphia, opposed the Constitution for a host of reasons. Madison, meanwhile, did not want any amendments to the document he had largely drafted. During heated debate in Richmond, the majority of delegates started to lean toward ratification. What closed the deal was a proposal that accommodated both sides—ratification accompanied by appointment of a committee that incorporated the objections of Henry and other anti-Federalists into 40 amendments, 20 specifying rights reserved to the people, and 20 changing the structure of the new government. Virginia forwarded both certification of ratification and the proposed amendments to Congress. During the First Federal Congress, Madison drafted 19 amendments, drawn from more than 200 proposals demanded by the states. Eventually 10 of them were adopted and are known today as the Bill of Rights.

In New York, state interests and sharp differences between factions almost derailed the Constitution. Governor George Clinton opposed it, for example, for giving the federal government a monopoly over trade, when tariffs were the state’s principal source of revenue and political leverage. The Anti-Federalists stalled for time. Then news arrived of ratification by New Hampshire, the ninth state, clinching adoption. A few days later, on July 2, a dispatch from Richmond announced Virginia’s ratification on June 25. With the Constitution technically in force, New York narrowly agreed to ratify it as well, though it also circulated a letter to the other states recommending changes.

Conflicts that had existed since the beginning repeatedly threatened to derail the emerging republic. By the late 1790s, Adams and Jefferson, once close friends, had become rivals and leaders of bitterly opposed political parties. There were few policies that the two men agreed on, which made for a rocky relationship between president and vice president from 1797 to 1801. Jefferson loved France and supported the French Revolution, while Adams brought the U.S. to the brink of war with France.

Adams also locked horns with Jefferson’s Republicans over freedom of the press. Enraged by the attacks on him and his administration by Republican newspapers, Adams and the Federalist majority in Congress passed the deeply controversial Sedition Act of 1798. Some partisans even called for separation of the Federalist North and the Republican South. But Jefferson defused what might have become a shattering split. “In every free and deliberating society, there must, from the nature of man, be opposite parties, and violent dissensions and discords; and one of these, for the most part, must prevail over the other for a longer or shorter time,” Jefferson wrote. “Seeing, therefore, that an association of men who will not quarrel with one another is a thing which never yet existed … I had rather keep our New England associates for that purpose, than to see our bickerings transferred to others.” 

The book effectively ends with Jefferson’s inaugural speech in March 1801, in which he once again touched on an issue that has remained sensitive to the present day. “Though the will of the majority is in all cases to prevail, that will, to be rightful, must be reasonable,” he said. “The minority possess their equal rights, which equal laws must protect, and to violate would be oppression.” And he encouraged reconciliation: “Every difference of opinion is not a difference of principle. We have called by different names brethren of the same principle. We are all Republicans; we are all Federalists.” 

Throughout this richly sourced book, Brands hesitates to draw direct parallels to our “brawling” politics today, leaving readers to make their own inferences. He relies perhaps too heavily at times on lengthy quotations from his characters. Some readers may enjoy the ornate language from that era, while others may find it an obstacle. In addition, Brands scants the clear chronology that could help guide a reader who does not already know the period well. Still, the book provides intimate access to the Founders’ thinking as it developed and an insightful synthesis of complex events. Indeed, Brands may be trying to show us, through language that sounds archaic to modern ears, how our leaders—even now—can place unity above deep differences on policy and even principle, ahead of personal discord and dislike, and find a path forward—not an easy one, not one that is ideal or even acceptable to all, but one that holds together a country that is essential to world stability, prosperity, and freedom.

Madison, not surprisingly, had the final word. He was the last to die, a decade after Adams and Jefferson passed away on the 50th anniversary of the Declaration of Independence. He lived to see the emergence first of the Republican and Federalist Parties, and then the Democrats and the Whigs. Once fiercely partisan himself, at the end he warned against the dangers of excessive partisanship: 

The advice nearest to my heart and deepest in my convictions is that the Union of the states be cherished and perpetuated. Let the open enemy to it be regarded as a Pandora with her box opened, and the disguised one as the serpent creeping with his deadly wiles into Paradise.

The Supreme Court’s World War II Battles https://washingtonmonthly.com/2023/10/29/the-supreme-courts-world-war-ii-battles/

Cliff Sloan’s lively new book explains how the Franklin Roosevelt-shaped Court wrestled with individual rights as the nation fought to save itself and the world.

On October 22, 1935, a middle school student in Minersville, Pennsylvania, named William Gobitas refused to say the Pledge of Allegiance; his sister Lillian followed suit the next day. As Jehovah’s Witnesses, they believed that saluting the flag amounted to idolatry. School officials expelled the children, and other kids threw rocks at them. In 1940, writing for an 8–1 majority in Minersville School District v. Gobitis (the Gobitas family name was misspelled), Supreme Court Justice Felix Frankfurter rejected the family’s free exercise objection to the coerced salute. Frankfurter did not merely defer to the wisdom of local school boards; he also extolled the virtues of “cohesive sentiment” as the “ultimate foundation of a free society.” A wave of violent assaults on Jehovah’s Witnesses and suspensions of students soon followed.

But within three years, the Gobitis consensus crumbled. In 1943, the Supreme Court revisited the issue. Once again, two schoolchildren—Gaithie and Marie Barnett, also Jehovah’s Witnesses—refused to salute the flag and were sent home. Robert Jackson wrote for a new 6–3 majority in West Virginia State Board of Education v. Barnette (in an odd twist of fate, court clerks also misspelled their name), vindicating the dissenters’ rights and systematically taking apart Gobitis—over Frankfurter’s pained dissent. What could have caused such a complete and abrupt about-face?

The Court at War by Cliff Sloan. September 19, 2023.

The answer lies in three factors: Franklin Delano Roosevelt, his court-packing fight, and the Second World War. There is no shortage of books on FDR’s battle with the Supreme Court over his administration’s Depression-era policies, which culminated in an attempt to stack the Court with liberal justices. Even though his court reform proposal failed in Congress, the 32nd president eventually was able to appoint eight new justices over his four terms, capturing the judiciary for the Democratic Party for decades. FDR’s justices—nearly all drawn from a close circle of friends and allies—not only gave his party a supermajority on the nation’s apex court, they also helped chart the path for liberal constitutionalism for more than a generation.

At a time when the constitutional order feels archaic, and the Supreme Court is once again firmly in the hands of appointees from a single party, many observers are mining past periods of consensus and progress for understanding—and perhaps inspiration. A new book, The Court at War, is a highly readable contribution to this trend. Its author, Cliff Sloan, teaches constitutional law at Georgetown and is a former law clerk to Justice John Paul Stevens. Sloan does not focus on the economic constitutional issues that were at the heart of FDR’s clash with the Court during his early terms, but rather on a subset of controversies over individual rights, from the United States’ entry into the global war in the winter of 1941 until roughly FDR’s death in 1945. These cases included equality-based challenges to the internment of Japanese Americans; a fight over the government’s effort to strip a Communist Party member of his U.S. citizenship; and an assortment of other free speech and association cases, including those of Jehovah’s Witnesses who objected to saluting the American flag. Partway through the book, a 35-year-old Thurgood Marshall appears before the Supreme Court to argue Smith v. Allwright, challenging a Texas Democratic Party rule that barred Black people from voting in the primary election. Behind the scenes, Marshall and the NAACP lobbied the administration ferociously to back their legal argument that the white primary violated the Constitution’s equal protection clause.

Sloan’s central thesis is that close examination of this cluster of legal controversies against the backdrop of the country’s war mobilization will reveal that “World War II was interwoven with every ruling.” He not only describes key events on the war front happening at the same time the major rights cases were being litigated, but also contends that the war environment influenced how elite actors understood the law. His account positively brims with admiration for the Supreme Court. This can be jarring because many ordinary citizens wonder daily whether the justices really have their best interests in mind—or the interests of a narrower slice of society—when they interpret the Constitution’s majestic provisions.

The historical figures in The Court at War are colorfully rendered; the action moves briskly. This is especially true in Sloan’s retelling of the cases challenging the wartime internment of Japanese Americans. He skillfully probes the personal and professional motivations of their major players: Government lawyers concealed evidence to defend the evacuation plan; a general spouted anti-Japanese rhetoric as he implemented the policy; and Supreme Court justices who were close to that general defended internment as a necessary, and not racist, defense measure.

Even so, The Court at War’s greatest achievement—stripping the narrative of methodological and theoretical scaffolding—also represents its greatest weakness as a work of legal and political history. Sloan never commits to any particular explanation of how the war mobilization shaped constitutional decision-making, beyond the observation that “the historical reality is more complicated.”

But larger societal forces may have influenced the War Court’s decisions, and there were long-term consequences—including the troubling possibility that the FDR Court’s muscular liberalism, at first used in defense of individual rights, has bred a culture of judicial supremacy that now risks trampling our liberties and collective capacity to solve societal problems.


Because Sloan focuses on the Washington insiders who jockeyed for influence within the Supreme Court and often beyond its marble walls, The Court at War is not a social or intellectual history. Instead, it is, in the parlance of the presidential scholar Richard Neustadt, an account of judges who take into account a president’s “professional reputation” and “public prestige.” Here and there a dissenter or civic group might appear as a plaintiff in a lawsuit, but readers won’t learn much about the roiling social landscape beyond Washington, D.C., the pockets of people deeply unhappy with Democratic governance, or those who viewed the judiciary’s growing influence on American life in darker terms.

Above all, one comes away from this insider account with a stunning sense of the porousness of the Supreme Court to other elite actors within the Beltway. Justices not only gave public speeches but also mingled freely with political figures. During these years, several of the justices nurtured professional aspirations beyond serving on the nation’s highest court, including Frank Murphy, Hugo Black, William O. Douglas, and James Byrnes (who actually was lured off the bench to help lead the war effort). At one point, Sloan writes, FDR brazenly suggested to Frankfurter that the Court’s opinions needed to be “more dramatic,” and offered his speechwriter to help spice up draft opinions. A revelation of this sort today would surely shock the average citizen who has come to believe that equal justice requires a greater degree of separation between political patrons and truly independent jurists. But FDR’s comment showcased how the president saw the justices’ rulings as important vehicles for legitimating his policies and expanding his party’s influence.

Sloan also engagingly recounts Douglas’s machinations to become FDR’s running mate in 1944 once he learned that party elders wished to drop Henry Wallace from the ticket. Wallace, an outspoken but eclectic progressive, made some business leaders and southern Democrats nervous as the election approached. The president took steps both to stoke Douglas’s ambition and to keep things fluid—almost certainly to maximize his own place at the center of power. In Sloan’s telling, FDR was genuinely interested in Douglas as a running mate but, as with most matters, was willing to pivot on a dime. Such political flexibility made sense when trying to manage a burgeoning, fractious coalition. When FDR finally raised with party leaders the prospect of nominating Douglas as his running mate, the idea was met with embarrassing silence. Harry Truman was eventually selected by the party on the second ballot with 1,031 votes, while Douglas wound up with a humiliating four. Sloan finds Douglas’s continued participation in cases before the Court while he maneuvered for the VP slot “a problematic immersion in electoral politics” that “undermines the institution.”

Ethical issues aside, Sloan’s portrayal of these episodes underscores the dangers of what he calls “excessive closeness” between presidents and sitting judges. Certainly a president had every reason to nurture proximity. FDR used every arrow in his quiver to ensure that his agenda remained unimpeded. Neustadt once wrote that “the power to persuade is the power to bargain.” By manipulating the egos and aspirations of the justices, FDR kept the justices in their roles as well as receptive to the administration’s entreaties. 

The Korematsu case was another example of such political intimacy. As Sloan writes, Earl Warren, then attorney general of California, supported mass evacuation of the Japanese from the state; Warren would later gain a reputation as a liberal stalwart once he became chief justice. Lieutenant General John DeWitt, who drove much of the policy, ominously called people of Japanese ancestry “an enemy race” regardless of their citizenship status or how long they had lived in the U.S. Such a sweeping group-threat stereotype dehumanized a complex community and, in so doing, paved the way for indefinite mass confinement at gunpoint. Sloan reveals that DeWitt was close to Justices Hugo Black and William O. Douglas, and surmises that Fred Korematsu’s lawyers may have hurt their cause by focusing so much attention on DeWitt. And indeed, the Court ultimately upheld the race-based removal policy in an opinion by Black that, in a defensive tone, brushed aside allegations of racial hostility. Frankfurter, who considered himself “a member of the President’s war team,” also voted to reject Korematsu’s equality challenge.

Justice Department lawyers in Korematsu, including Solicitor General Charles Fahy and Edward Ennis, struggled with their ethical obligations to serve their client while being sufficiently honest in characterizing the degree of any actual threat to national security posed by Japanese Americans. Notably, the government attorneys failed to disclose a key intelligence report indicating that fears of espionage were overblown and that it was entirely possible to separate the loyal from the disloyal. Fahy and Ennis disagreed about what to do, and language in the government’s brief was massaged, but the intelligence contradicting its legal position was never mentioned. Sloan endorses the revisionist account that treats this as a breach of legal ethics, but does not give us any new reason to think that, even if the missing reports had been more forthrightly discussed in the government’s legal briefs, the justices would have been willing to disappoint a president of their own party on a grave matter of national security.

Yet successful history is more than scintillating reportage of facts. And there are three possible ways to read The Court at War. One is that partisan processes served as the engine of legal change: A charismatic leader built a large coalition by casting monied elites and guardians of older ways as the enemies of security and prosperity. As Sloan aptly notes, FDR stocked the courts with judges who felt “deep allegiance” to him, all of them loyal Democrats and New Dealers. These justices, who were grateful to an effective party leader and agreed with the wartime president’s general agenda, rendered decisions that gave the affiliated party the legal foundation for its policies.

Another reading of Sloan’s book is what we might call the “gestalt” explanation, which is seemingly evoked when he observes that “every Justice felt intimately connected to the nation’s existential fight against the Nazis and the Japanese Empire.” Unlike the partisan explanation, the gestalt thesis portrays the justices as a part of the broader political community to which every citizen belongs, and unable to resist the fear of a shared external threat or the pull of a national project worthy of assent.

Still a third possibility is a grassroots reading of the past, in which the law changes in large part because an outside community forces elites in charge of institutions to rethink governing principles. A number of the constitutional challenges were brought by despised minorities—Jehovah’s Witnesses with unconventional religious beliefs, people of Japanese ancestry suspected of disloyalty en masse after Pearl Harbor. In this view, outsiders who were unlikely to kowtow to elite priorities nevertheless were willing to exploit a president’s words and deeds to advance their own ends. Quoting FDR’s wartime speeches extolling the value of freedom of speech and religion invited the justices to see dissent in ways that might be compatible with the president’s vision of democracy.

Sloan does not choose among these possibilities, but his implicit decision to keep insiders on center stage makes the grassroots reading the least likely way to take The Court at War.

So what is really going on with these legal controversies? Why did most of them end up vindicating rights? When the Department of Justice defended laws enacted by Democratic majorities or new bureaucracies as essential to America’s economic recovery before the courts, partisan convergence as an explanation rings true. On civil rights or liberties issues, however, where the administration might be internally divided or political payoff could be uncertain, the calculus became more complicated and the partisan explanation works less well.

Whether the ongoing conflict raging across the globe was a reason to vindicate rights or curb them required reaching a view about war-fighting needs or the legacy of war. Was U.S. involvement in World War II about the overriding value of national unity during a time of crisis or did fighting totalitarianism abroad mean that Americans themselves had to internalize anti-authoritarianism at home? And if the answer was the latter, should judges play an active role in inculcating such values by blocking otherwise valid policies? The legal battle over the coerced flag salute raised these very issues, showcasing sharply conflicting notions of what the war’s legacy would mean for individual rights and state power. It also illustrated how a nation’s leaders—presidents and judges alike—must manage hope and act with principle, or else watch as the seams of political community start to burst.

The second Jehovah’s Witness flag case, in 1943, treated freedom of conscience and dissent as a better legacy of the war against “our present totalitarian enemies” than shared sentiment and ritualistic displays of unity. Perhaps more than any other decision from this period, Barnette represented the soaring aspirations of legal liberalism, a philosophy grounded in the belief that individual rights must be withdrawn “from the vicissitudes of political controversy,” and primarily enforced by unelected judges. Whether judges truly make for reliable defenders of rights was another matter.

The partisan perspective could explain the first decision on the flag salute, given that Frankfurter made so much of the need for national unity during wartime; it has a harder time explaining the second. New additions to the Supreme Court, namely Jackson and Wiley Rutledge, believed Gobitis to be wrongly decided; Stone, elevated to chief justice in the interim, had been the lone dissenter in the first case. Most glaring, the authors of the two opinions, taking exactly opposite positions on the constitutional question, each believed that he—and not the other—was on the right side of the “Four Freedoms” vision articulated by the president. These disagreements over what it meant to fight the war extended throughout the administration.

And, of course, the oppressed also had a role to play in this reversal of fortune by continuing to resist the Court’s initial pronouncements. If Jehovah’s Witnesses had simply accepted that first ruling and tolerated their fate, the justices would never have had a chance to reconsider. But as this fascinating conflict also reminds us, the Supreme Court’s pronouncements on the meaning of the Constitution are never the last word.

Even so, the question for a historian remains: Was this dramatic turnabout caused by partisan processes, domination of the freedom-inflected version of the war’s legacy, or the persistence of a despised minority? Drawing on existing scholarship (including mine), Sloan is content to relay key facts. But because he does not choose between possible explanations for what happened (full disclosure: I am a proponent of the elite, war-inflected explanation), we get an uncertain understanding of what he believes to be the war’s actual impact on the Supreme Court’s deliberations. It’s possible to be left thinking that, while the Court’s wartime decisions were the product of collective reason, they were also, in Sloan’s telling, sometimes “idiosyncratic.”


When we pan out even further, it seems more obvious that we don’t get a full appreciation of the price society might be paying for the path that the Roosevelt Court put us on. The Court at War ends with FDR’s death in 1945 and Truman’s appointment of Fred Vinson as the new chief after Stone’s sudden demise a year later. Sloan believes that these events mark the close of the War Court. On the other hand, if we accept that America has been almost constantly engaged in armed conflict since those days, then it would be more accurate to say that there has not been just a single War Court but several overlapping ones—including the Court that reviewed War on Terror policies—which have shaped the country’s fundamental law. That Court has now frowned on explicitly bigoted laws, but upheld a cleverly rewritten travel ban on several Muslim-majority countries; it has also approved sweeping detention and surveillance policies so long as there is some minimal role for federal judges to poke and prod individual decisions, even if they do nothing to disturb the basic contours of the modern national security state.

Looking back on the early 1940s, Sloan sounds a triumphalist note. With the exception of Korematsu, which he labels “an indelible stain on American history and on the Supreme Court,” Sloan underscores the “pathbreaking” quality of most of the Roosevelt Court’s civil liberties decisions and insists that they “lit the path for the great achievements of three-quarters of a century after the war.”

But such a characterization of the War Court’s legacy paints an incomplete picture. Sloan is certainly correct that liberal jurists later built on these precedents, but so did conservative ones. The rise of the Supreme Court to its current privileged place of making constitutional policy for the entire country surely represents a mixed bag. In recent years, bouts of judicial supremacy when it comes to enforcing gun rights, corporate speech rights, and the right of religious groups to opt out of equality norms have been followed by judicial rulings that take the federal judiciary largely off the playing field when it comes to voting rights or abortion rights.

FDR’s successful redirection of constitutional law inspired presidents with very different values and priorities, such as Richard Nixon and Ronald Reagan, to attempt a similar feat. Partisan control of the courts eventually gave way to other strategies for altering the ideological tenor of judge-made law, including turning over the selection of Supreme Court justices to key figures within legal and social restoration movements. The rise of social movements that identify—or even try to cultivate—ideologically friendly jurists is one such adaptation; wealthy benefactors who develop their own cozy relationships with Supreme Court justices represent another. If anything, the stakes of political control of the courts have intensified, with the result that energy is siphoned away from worthy reform projects. It may no longer be sufficient to focus on the behavior of individual justices and politicians; it may be more crucial to rethink, more broadly, a political culture that valorizes judicial policy making when it comes to individual rights.

For while FDR’s justices were brought together for the project of clearing roadblocks to democratic decision-making, their ultimate choice to plunge into a broad range of social questions using the framework of individual rights did just the opposite. Sometimes for the better, but sometimes for the worse, their actions would thrust federal judges into a wider circle of cultural conflict. Their precedents opened the door to muscular rights-based policy making by judges more generally: to create, prioritize, and, most visibly in recent months, even eliminate cherished constitutional rights.

A final, disquieting, possibility should also be put on the table: By wrapping a president’s political agenda in grand constitutional principles, the Roosevelt justices increased the probability of public officials giving idealistic, rights-based reasons for going to war. Such an uptick in Constitution-laden war justifications and intensive societal reliance on judges to make policy should all be treated as legacies of the War Court—and not just the rulings that happen to align with progressive sensibilities. 

An apex court that has become a powerful national policy maker rather than “the least dangerous branch” that the Constitution’s Framers originally envisioned can intervene selectively in favor of some rights and not others and partner consistently with a favorite political party or social movement, while taking actions that undermine the political achievements of those the justices do not support. Judicial decisions that weaken federal and local civil rights laws, erase abortion rights, and hobble campaign finance laws in the name of protecting constitutional rights may be just a prelude for what’s to come.

Sloan has written an eminently readable book. He succeeds in showing us that in the early 1940s war mobilization loomed over everything. During these years, constitutional ideals were consciously reshaped with the war effort in mind. At times, that redounded to the benefit of those affected by government policy; at other times, it did not. In turn, that fusion of war making and constitutional principle became potent stuff for later generations. That’s why it is hard to escape the sense that The Court at War misses an opportunity to show us all the ways in which the imperative to go to war presented both opportunity and peril. We still live with the momentous choices made by Roosevelt’s justices.

How America Bungled the Pandemic https://washingtonmonthly.com/2023/10/29/how-america-bungled-the-pandemic/

Why did the world's richest nation, with some of the most advanced health care, respond so poorly to COVID-19?

By every objective measure, the U.S. mounted one of the world’s worst responses to the COVID-19 pandemic. Its 1.1 million deaths left it with a mortality rate that exceeded that of every other advanced industrial nation except the United Kingdom. Official explanations for this catastrophe are in as short supply as ventilators, masks, and hospital gowns were during the pandemic’s first wave. Neither Congress nor the White House has appointed an independent commission to document what went wrong. Federal and state public health officials have offered few recommendations on how the nation could be better prepared for the next pandemic when it strikes, as it certainly will in this crowded and warming world. Even the usually hyperactive network of think tanks and academicians engaged in public health has been relatively silent about the need for changes in U.S. policy to correct the gaps in pandemic preparedness revealed by COVID-19.

Why did a country with the most expensive health care system in the world, an enviable scientific capacity, and a deep bench of public health expertise perform so miserably when confronted by this unique and dangerous pathogen? The short answer, according to a new book by the award-winning journalists Joe Nocera and Bethany McLean, is that it was inevitable given the decades-long trends in every sector of society that must be mobilized to successfully combat a new threat to public health.

Their review of the actions of elected leaders, the government’s health-related bureaucracies, corporate America, health care institutions, and a substantial fraction of the general public claims that each responded in a self-interested manner. A collective response, which requires a commitment by individuals, corporations, and institutions to preserve life (as necessary in combating a public health threat as it is in wartime), never took hold in the U.S. A few countries succeeded in mobilizing their societies around a joint response. Ours did not, in spectacular fashion.

The Big Fail: What the Pandemic Revealed About Who America Protects and Who It Leaves Behind, by Joe Nocera and Bethany McLean

In The Big Fail, the authors provide a comprehensive catalog of the institutional and leadership failures that led to America’s bungled response. Each failure they document reflected organizational and individual behaviors that had been decades in the making. “A central tenet of this book is that we could not have done better, and pretending differently is a dangerous fiction, one that prevents us from taking a much-needed look in the mirror,” the authors write.

They begin by documenting the missteps of Donald Trump’s administration and the president’s antiscientific pronouncements. Trump’s early embrace of unproven and dangerous cures was contagious. In the midst of his reelection campaign, he shoved Health and Human Services Secretary Alex Azar to the sidelines. His replacement as head of the government task force, Vice President Mike Pence, promptly took to the pages of The Wall Street Journal to confidently predict that there would be no second wave—which broke with ferocity just after the election.

The authors only briefly mention the prior decade’s defunding of the nation’s pandemic preparedness infrastructure. But those cuts, demanded by the Republican-run Congress in its dealings with Barack Obama’s administration, contributed to the chaos at the outset of the pandemic. Corporations that supplied personal protective equipment had been outsourcing their manufacturing capacity, largely to China, for decades. Their hospital customers helped drive the trend by demanding ever lower prices for PPE in the name of maximizing their own profits. The result? The government’s stockpile—hoarded by the Trump administration—was inadequate. And supply closets were thinly stocked everywhere. The field was ripe for profiteering and fraud when demand exploded at the outset of the pandemic.

Nocera and McLean provide an important history of the growth of antivaccine sentiment over the previous two decades. When the vaccine finally arrived—a joint government–private sector endeavor that receives generous praise in the book—once-niche antivaxxer sentiment grew to one in seven Americans, one reason why nearly a third of the population remains less than fully vaccinated. The country that helped invent the mRNA vaccine failed to take full advantage of its medical benefits. (Two scientists from the University of Pennsylvania just won this year’s Nobel Prize in Medicine for their work on the vaccine.)

Private equity’s incursion into the health care industry comes in for repeated criticism in the book. More than a fifth of all deaths took place among residents and staff in nursing homes, which private equity firms had purchased in large numbers early in the 2000s but largely abandoned after extracting short-term profits. Those Medicaid-dependent institutions have never been properly funded by Congress, nor have regulators adopted standards for operators that might have protected patients. “Once the pandemic arrived, it was too late,” Nocera and McLean write.

The hospital industry’s inadequate response to COVID was similarly skewed by inadequate funding—for some hospitals, not all. People who are poor or low-income are more likely to suffer from one or more chronic medical conditions and therefore were the ones most vulnerable to serious consequences when stricken with COVID. They were more likely to wind up in one of the nation’s safety net hospitals, which get most of their funding from Medicare and Medicaid, both of which pay less than private insurance. Hospitals in well-off neighborhoods, meanwhile, took care of fewer COVID patients, yet they received a disproportionate share of hospital emergency funds, which were distributed based on pre-COVID revenue. As a result, hospitals with the least resources bore the brunt of the fight against the disease.

The authors aim their fire for this sorry situation at privately owned chains like HCA Healthcare; at private equity’s incursion into the hospital sector (still a very small share of hospitals); and at the outsized salaries of top hospital officials. It’s important to note that major nonprofit chains, often religiously affiliated, benefited just as much during COVID from the government’s failure to channel most of its emergency aid to frontline institutions.

While their far-ranging critique may sound Pogo-esque—the 20th-century newspaper cartoon character’s most famous aphorism was “We have met the enemy and he is us”—Nocera and McLean repeatedly cite the politicization of science as a major cause of the U.S.’s pathetic performance. That’s true. But rather than lambast opportunistic politicians or the growth of antiscience among the general public, they reserve their sharpest barbs for the arrogance of career government officials at the National Institutes of Health and the Centers for Disease Control and Prevention. This may come as a shock to those (like me) who saw the primary threats to a coherent, science-based response coming from followers of Donald Trump, who early on championed quack cures like hydroxychloroquine and ivermectin; from antivaxxers; and from conservative politicians like the governors of Florida and Texas, who actively encouraged resistance to masking, social distancing, and economic lockdowns.

Nocera and McLean instead focus on the flip-flopping by the government’s physician-leaders, who were the trusted figures to whom most Americans initially turned for advice. Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases, comes in for repeated opprobrium. He first said there was no reason for people to be walking around with masks. A month later, the CDC recommended masking and social distancing, which Fauci promptly endorsed. “Follow the science,” he said repeatedly during his television appearances. After Joe Biden’s new administration installed Rochelle Walensky as head of the CDC, she switched her position on social distancing without explaining why six feet of separation was now required instead of the three feet she had recommended when she was a hospital official in Massachusetts. She and Fauci initially encouraged using cloth masks to protect against catching or spreading the disease, but a year later, the CDC admitted that cloth masks without an N95 rating afforded little protection.

“That kind of grudging change didn’t inspire confidence,” the authors write. “The problem with ‘following the science’ is that science, particularly in the early stages of discovery, is not an immutable thing. It rarely offers certainties. It offers theories and models and probabilities, which are then supposed to be tested against real-world evidence. But self-righteousness does not easily acknowledge uncertainty.”

Acknowledging uncertainty and seeking real-world evidence are sound principles. One wishes that Fauci and others had been more forthright about their recommendations being based on the best available science at the time; that researchers were learning more every day about what worked and what didn’t; and that there would inevitably be twists and turns in recommended actions.

But the authors should have followed their own advice as they issued a harsh indictment of the scientists and state public health officials who supported the federal government’s endorsement of economic and school lockdowns. Yes, those actions had serious consequences for small businesses, schoolchildren, and any working American not a member of the educated Zoom class that could work from home. But the goal of public health officials is preserving human life. They judge the value of lockdowns, like all prevention strategies, with that measure.

Nocera and McLean make the incorrect claim that Fauci "had to know that lockdowns as a mitigation measure had no basis in science." They follow the lead of the scientists Martin Kulldorff at Harvard and Jay Bhattacharya at Stanford, who are portrayed in near-heroic terms in the book. Both were fierce opponents of lockdowns. They first articulated their position in an op-ed in The Wall Street Journal, and then helped organize the Great Barrington Declaration of October 2020, which eventually gathered nearly a million online signatures. The declaration emphasized the enormous collateral damage of shutting down a society: Kids fall behind in school; domestic violence soars; businesses fail; people die when hospitals stop performing surgeries and patients postpone routine preventive care.

However, the declaration drew immediate fire from the more than 7,000 scientists, physicians, nurses, health care executives, and others who signed the John Snow Memorandum (named after the founding father of epidemiology, who identified the well-water source of a mid-19th-century cholera outbreak in London). The memorandum is never mentioned in The Big Fail, an inexcusable oversight. Its signatories argued that lockdowns were "essential to reduce mortality, prevent health-care services from being overwhelmed, and buy time to set up pandemic response systems to suppress transmission following lockdown … In the absence of adequate provisions to manage the pandemic and its societal impacts, these countries have faced continuing restrictions." That's exactly what happened in the U.S.

But rather than report both sides of a debate then raging in the medical literature and in the press, Nocera and McLean take the side of those pushing what amounted to a herd immunity strategy. What does the science say? To this day, the academic literature is filled with conflicting studies of the impact of economic lockdowns. One study rejecting their use relied on cost-benefit analysis, which places a lesser value on the lives of seniors because they have fewer "quality-adjusted" years left to live. But another study, which looked primarily at the mortality benefit rather than the economic cost, conservatively estimated that lockdowns saved 1 million lives, outweighing the lives lost to economic dislocation by a factor of four to one.

School lockdowns, and the teachers' unions that supported them, are given similarly one-sided treatment. The risk to kids was "minuscule" and the harms were without historical precedent, they write, quoting at length from an Atlantic magazine article. "The fight over schools was an early sign of how stupidly polarized the country had become, and in this case it wasn't the red states refusing to follow the science," Nocera and McLean conclude. "It was blue state Democrats who valued their political affiliation over common sense."

There's no doubt educational achievement suffered during the pandemic. Parents were stressed, low-income parents most of all. But you will not learn from this account that more than 800 children under 18 died from COVID in the year in which most of that age group became eligible for vaccination, a toll amounting to two-thirds of all the children who died from the disease during the pandemic. Moreover, student suicides dropped, and a recent review of studies found "both school closures and in-school mitigations (like masking) were associated with reduced COVID-19 transmission, morbidity and mortality in the community." Was there no validity to teachers' concern about their own and their pupils' safety? And were the school lockdowns ordered by mostly Democratic politicians mere catering to a favored constituency? This issue deserved a more evenhanded evaluation.

Near the end of this overly long book (do readers really need eight pages on the failed campaign to recall California Governor Gavin Newsom, whose outcome even the authors admit did not hinge on the state's pandemic response?), Nocera and McLean wonder whether modern communication, with its reliance on sound bites, is ill suited to getting people through a pandemic. "When 'maybe' or 'we don't know' isn't allowed; when reputable scientists who hold dissenting views are banned from social media and described as 'fringe'; when error is never acknowledged; and when the lived experience of people is ignored—it is inevitable that people will lose faith in experts telling them how to behave."

Some people in the public health community did act dismissively toward their critics. But a fairer account would have put those attitudes, especially during the pandemic’s first year, in the context of a presidential election where the incumbent embraced fake science and stoked resistance to the recommendations of the agencies he ostensibly led.

Should lost faith in public officials be ranked among the major causes of the U.S.’s world-worst response to COVID-19? What is missing from that conclusion is any recognition of what it takes for any strategy designed to prevent the spread of disease to succeed. It requires broad public acceptance, which in turn requires social solidarity with those most harmed by the outbreak. It requires a common commitment to the idea that while the science may be uncertain, there is such a thing as the best available evidence. It requires an understanding that society-wide actions may have to shift as more evidence is acquired.

Social solidarity is in short supply in 21st-century America. In a book that seeks to document the multiple causes of America’s failed response to the COVID-19 pandemic, I wouldn’t give the missteps of harried civil servants and government scientists trying to instruct a divided, misinformed, and politically manipulated public a prominent place on the list.


The Big Fail: What the Pandemic Revealed About Who America Protects and Who It Leaves Behind, by Joe Nocera and Bethany McLean
The Lost Mystique of Betty Friedan https://washingtonmonthly.com/2023/10/29/the-lost-mystique-of-betty-friedan/ Mon, 30 Oct 2023 00:05:00 +0000 https://washingtonmonthly.com/?p=149812

Later waves of feminists assailed the pioneering author and activist for focusing on women’s legal and economic rights rather than sexual liberation. Her reputation is due for a revival.



In February 1969, Betty Friedan, president and cofounder of the National Organization for Women and best-selling author of the feminist manifesto The Feminine Mystique, led a protest of 30 women at Manhattan's storied Plaza Hotel. Since 1907, the Plaza's elegant wood-paneled Oak Room and adjacent bar had excluded women from its weekday lunch service. Clad in a mink coat, the 48-year-old Friedan addressed the press gathered in the gilded lobby. Drawing parallels to the sit-ins of the civil rights movement, Friedan argued that the Oak Room's exclusion of women violated state law, asserting, "This is the only kind of discrimination that's considered moral, or, if you will, a joke."

Betty Friedan: Magnificent Disrupter, by Rachel Shteir, Yale University Press, 384 pp.

Indeed, the media mocked the "phalanx of feminists" and their theatrics. "For a woman to stroll into a men's bar at lunchtime and demand service seems to me as preposterous as a woman marching into a barbershop and demanding a hot towel and a haircut," the New York Post chided. Though the small protest, like hundreds of others staged by NOW, was ultimately successful, resulting in the hotel's reversal of its men-only policy, it became an object of derision within the movement, too. Younger, more radical feminists like the journalist Gloria Steinem "felt that the Oak Room sexgregation action proved yet again that the organization was too white, too middle class," as Rachel Shteir writes in her new biography, Betty Friedan: Magnificent Disrupter. In 1963, the explosive publication of The Feminine Mystique, Friedan's siren call for women trapped in the mind-numbing drudgery of housework and the glorification of motherhood, had lit the fuse of the second-wave feminist movement. But just six years after becoming a household name, Friedan was on the verge of being eclipsed by the movement she had created, dismissed by her critics as a relic of a stodgy feminism too narrowly focused on legal and economic equality.

Shteir's book grapples with the complex legacy of the mother of mid-20th-century feminism, and, by extension, the women's rights movement of the 1960s and '70s. The new biography is animated by a desire to restore Friedan's reputation, which Shteir describes as marred by highly publicized quarrels within the women's movement and by disparaging historical treatments. Shteir portrays Friedan as misunderstood, both in her time and today: "Since Friedan's death [in 2006], the practice of either ridiculing her or making her disappear continues, carrying forward the portrait cemented twenty years ago in the last round of full-length biographies." In 2020, a biopic about Steinem (The Glorias) and a miniseries about the conservative activist Phyllis Schlafly and the defeat of the Equal Rights Amendment (FX's Mrs. America) introduced the women's liberation movement to a new generation of young women. Friedan fares poorly in both cinematic histories, coming across as shrill, out of touch, and self-absorbed.

Shteir’s rehabilitation of her subject rests on Friedan’s undeniable achievements. The Feminine Mystique is regularly listed among the most influential non-fiction books of the 20th century, alongside classics like the conservationist Rachel Carson’s Silent Spring. The first paperback edition sold 1.4 million copies. The futurist Alvin Toffler proclaimed that “it pulled the trigger on history.” It’s difficult to think of a book published in the past 25 years that has had a comparable cultural and political impact. Validating many women’s dissatisfaction with their lives—a phenomenon she dubbed “the problem that has no name”—Friedan wrote, “Each suburban wife struggled with it alone. As she made the beds, shopped for groceries, matched slipcover material, ate peanut butter sandwiches with her children, chauffeured Cub Scouts and Brownies, lay beside her husband at night—she was afraid to ask even of herself the silent question—‘Is this all?’ ” 

The Feminine Mystique launched Friedan's public career. For the next decade, she was everywhere—in magazine profiles, with Johnny Carson on The Tonight Show, leading marches, speaking at civic organizations, and meeting with elected officials. But critics noted that the book spoke primarily to white, college-educated, suburban women, virtually ignoring Black and working-class women. Others questioned the originality of Friedan's ideas and deemed the book derivative, particularly of Simone de Beauvoir's The Second Sex, published more than a decade earlier. These criticisms—too white, too derivative, too middle class—followed Friedan for decades. Feminist theorists like bell hooks demeaned The Feminine Mystique as "a case study of narcissism, insensitivity, sentimentality, and self-indulgence." Friedan often cast herself in heroic terms, musing, "The reactions to my book have been most satisfying, even the violence of the attacks … Writing this book seems to have catapulted me into a movement of history."


While Shteir acknowledges the narrow scope of The Feminine Mystique, she endeavors to rescue Friedan from charges of classism and racism. By 1963, Shteir argues, Friedan had earned her left-wing bona fides. She was quick to join a picket line and had logged two decades of writing for labor publications, publishing critiques of capitalism, conspicuous consumption, and income inequality. After the book's publication, despite viewing herself as not "an organization woman" but "a writer, a loner," Friedan parlayed her celebrity into cofounding NOW to confront bread-and-butter issues of legal and workplace inequality and to lobby for the passage of the Equal Rights Amendment and the expansion of the Civil Rights Act. Shteir writes that Friedan actively recruited Black luminaries like Coretta Scott King and Fannie Lou Hamer onto the boards of her organizations. Friedan drew frequent parallels between the civil rights and women's movements, taking ideological and tactical inspiration from the former and referring to NOW as "the NAACP for women."

NOW was remarkably effective in raising awareness of structural inequalities in every sector of American life, many of which are unimaginable today: prohibitions against unaccompanied women being served liquor at a bar; United Airlines' men-only "executive flights"; and newspaper classified ads divided by sex. In 1969, Friedan built on NOW's success by cofounding NARAL (the National Association for the Repeal of Abortion Laws) and, in 1971, the National Women's Political Caucus, to elevate women's voices in the political process.

Despite organizational successes, fissures emerged around substantive ideological disagreements, separating feminists from would-be allies in the labor and civil rights movements. For example, NOW split with unions over the ERA, which some feared would undercut hard-earned protections intended to shield women workers from long hours and dangerous work. And early Black allies like Pauli Murray abandoned NOW, frustrated with its ongoing preoccupation with the ERA at the expense of issues more directly impacting poor Black women. 

By the late 1960s, a clear schism had emerged between centrist feminists like Friedan and a growing women's liberation movement, which included disparate radical feminist groups—many of them composed of younger, unmarried women—advocating female separatism and sexual freedom. This strain of the movement, shaped by the Black Power movement, the student movement, antiwar protests, and the counterculture, was represented by Steinem, Friedan's younger and more charismatic rival.

The two camps disagreed on fundamental matters, starting with the nuclear family: Friedan argued that gender equality was compatible with marriage and motherhood, and rejected radical feminists' vilification of men. She shied away from portraying women as victims or members of an oppressed class. Influenced by the counterculture's celebration of sexual freedom, some feminists drew connections between their own sexuality and feminism, celebrating the female orgasm and advocating alternatives to heterosexuality. Friedan quipped that lesbians in the movement constituted a "lavender menace" and feared that they would scare off the middle-class suburban housewives she needed to rally support for the ERA. In response to Kate Millett's Sexual Politics, which focused on sexual oppression, Friedan griped, "Young women only need a little more experience to understand that the gut issues of this revolution involve employment and education not sexual fantasy." Even as Friedan enthusiastically led "guerrilla" actions like a 1967 protest in which NOW members threw typewriters and aprons at the White House gates, she eschewed those targeting beauty culture, like the 1968 Miss America protest, where women tossed bras into a trash can and hanged the pageant host Bert Parks in effigy.

Differences came to a head in 1968, when Valerie Solanas, the author of SCUM (Society for Cutting Up Men) Manifesto, shot the artist Andy Warhol. Ti-Grace Atkinson and Flo Kennedy, leaders of the NY NOW chapter, rushed to Solanas’s defense, with Kennedy describing her as a hero of Black Power and “one of the most important spokeswomen of the feminist movement.” Friedan was appalled, and telegrammed, “Desist immediately from linking NOW in any way with Valerie Solanas. Miss Solanas’s motives in Warhol case entirely irrelevant to NOW’s goals of full equality for women in truly equal partnership with men.” When Atkinson, a former Friedan protégée, ran for reelection as president of NY NOW, Friedan rallied the opposition; Atkinson and Kennedy defected to found the Feminists, an egalitarian, radical organization. 

By the 1970s, Friedan was increasingly marginalized from the movement she had birthed. Her attempts to make common cause with other factions could be cringingly tone-deaf—she organized a truck bearing watermelon and fried chicken (a "Traveling Watermelon Feast") in support of the Black Congresswoman Shirley Chisholm's 1972 presidential campaign. But Friedan continued to work on behalf of women's equality. Shteir notes that in many ways, Friedan was ahead of the culture, writing about the "double shift," paid maternity leave, universal child care, and the pressure to choose between family and career. In later years, she wrote about menopause, women's right to love and sexual satisfaction, and aging. But on many issues, Shteir concedes, Friedan was on the wrong side of history. She viewed rape, domestic abuse, sexual liberation, pornography, and abortion as distractions from the fundamental fight for gender equality.

Could a different, more flexible leader have navigated the transition from the early women’s movement, which emphasized legal and political strategies, to women’s cultural liberation? Perhaps. Shteir blames Friedan’s centrism and incrementalism for accelerating a mass defection of young women from NOW into radical feminism. But it’s hard not to see Friedan’s limitations as those of personality. Obliquely referring to Friedan, Steinem told a reporter, “I know other women with whom I have the same ideological differences with whom I can work.” 

Friedan was, by all accounts, difficult. Shteir’s interviews with Friedan’s former colleagues and family members provide some of the most biting commentary in the biography. Her own brother described her as “a cross I had to bear.” Within the movement, she turned on former allies, maligning them behind their backs. Friedan had a fierce temper, was imperious and demanding, and insisted that she receive proper deference. In a pointed obituary, Germaine Greer noted wryly of Friedan, widely acclaimed as the mother of second-wave feminism, “She thought she was the wave.” 

Shteir shares the catty comments from Friedan's fellow feminists, and the jaw-droppingly hostile press coverage, which skewered Friedan's clothing, hairstyle, weight, and facial features. In a Philadelphia Inquirer profile, a sympathetic female reporter offered a backhanded compliment, writing, "[Friedan] is not as grotesque as the press and many photographs would have you believe." There is more than a tinge of antisemitism to many of the attacks—her "long nose" and "bulging" eyes—and in a movement filled with Jewish activists, Friedan seemed uniquely targeted.

The discussion of how a subject is perceived by colleagues and family is fair game in a biography, but there is something that feels cruel, almost—dare I say?—antifeminist about Shteir's ample attention to these personal flaws. For a movement that trumpeted that "the personal is political," Shteir's repetition of the slurs, even as a form of reporting, feels gratuitous, although it does establish the hostile environment in which feminism nonetheless flourished. And one wonders whether all of Friedan's negative attributes—her bluntness, bossy demeanor, and assertiveness—might have been viewed as virtues in a male counterpart.

Ultimately, Shteir successfully argues that Friedan’s legacy rests on the work itself, rather than on her character, an assessment Friedan herself would have found gratifying. Through her writing, her organizations, and her unrelenting prodding at social norms, Friedan transformed the way women viewed themselves, even as true equality remains unrealized. Shteir concludes, “Friedan was no saint. But she was an oracle and an iconoclast, ahead of her time … She imagined herself under the shadows of history and eternity, acting with remorseless courage.” A fairly magnificent legacy, indeed.

