January/February/March 2023 | Washington Monthly

Reviving America, One College Town at a Time
https://washingtonmonthly.com/2023/01/08/reviving-america-one-college-town-at-a-time/ | January 9, 2023

How symbiotic relationships between colleges and their communities have reaped rewards in Erie, Pennsylvania, and Waterville, Maine.



Gannon University started out in a working-class community during the boom years of the 1920s. John Mark Gannon, a native of Erie, Pennsylvania, who returned from the Vatican to become the Catholic bishop of Erie, had long been conscious of what he called the “cruel inequality” of higher education. “Those whose parents are wealthy may set out for college,” he said, according to the university’s history, “while the sons of workingmen, no matter how virtuous or talented, are forced to give up hope of a college education.”

The two-year college for which he led a building campaign, which is now a full-fledged university named for him, opened in 1925 in downtown Erie with the goal of bringing opportunity to its industrial community.

Nearly a century later, many schools like Gannon—small, private, regionally focused, not part of the crazed status competition for elite admissions slots, not in a fancy small-college town—have been struggling. 

By comparison, Gannon is thriving. Its total enrollment is modest by national standards but has been going steadily up rather than down. Ten years ago, it had just over 4,000 graduate and undergraduate students. This year it has nearly 4,800, the largest enrollment in its history. Of the current students, about 20 percent are international, and 14 percent are counted as diverse. It is known for its programs in engineering, the health sciences, infotech and cyberskills, business, and other fields.

Gannon’s recent success parallels the ambitions of its home city, which is more than a decade into a sweeping civic recovery process. Tucked into the northwest corner of the state, Erie has a spectacularly beautiful natural setting but has endured the rigors familiar to many northern industrial towns. Between the 1970 and 2020 censuses, Erie shrank from 130,000 people to 95,000, as manufacturing and other businesses fled the city amid post–World War II deindustrialization. Yet in the past decade the city has seen significant revitalization—in large part because private businesses have invested hundreds of millions of dollars in Erie, especially its downtown.

Among corporations, the locally headquartered Erie Insurance has led the way, with a new $150 million campus in what had been a decaying part of downtown. One of the biggest investors, however, is Gannon itself. The university has renovated woebegone buildings to launch the Center for Business Ingenuity in the city’s downtown; contributed to a research and conservation program for Lake Erie; and opened the Institute for Health and Cyber Knowledge, or I-HACK. But as I have heard on numerous visits to Erie in recent years, Gannon’s role has been indispensable not just in money but also in leadership, in coordination, in tying its ambitions to those of the town. For instance, it cofounded Our West Bayfront, a citywide initiative to revitalize a distressed middle-class neighborhood near the university. 

Colleges can make news for a lot of the wrong reasons: How they’re ranked. Which applicants they admit, and why. How much they cost. What they teach students about the world, and what they encourage or permit them to say.

These are real issues. But there is another underemphasized way to talk about, report on, and assess colleges and their success. It’s the one that Deb Fallows and I wrote about in the Monthly’s college issue last fall: that is, how seriously and skillfully colleges take the opportunity that many of them have to become centers of “place-based” economic and civic renewal in their home communities. 

Where colleges are located is, in most cases, now a given, like the presence of a river or a major transportation hub. But what a college decides to do with and for the community outside its gates is a choice. And, as Deb and I wrote, more and more colleges are recognizing both the responsibility and the potential rewards of choosing to make “town and gown” a serious priority rather than just a slogan. Gannon University epitomizes that trend, as does Colby College, a liberal arts institution in Waterville, Maine.

The sense of responsibility comes in different forms for different institutions. At some private colleges, it may arise from a faith-based mission or other founding ideals. For many public institutions, it may come from a land grant history or other conditions of state funding. It may reflect a personal vision or commitment by current leaders. 

The potential rewards, for institutions of all varieties, usually boil down to making their home location more attractive—to students, faculty members, donors, international applicants, graduates who are deciding whether to stay or go after they get a degree. A college can’t easily pick up and move its campus, as a company might, if things get tough where it is now. So the smart bet, as well as the socially beneficial move, for college leaders is to recognize their interests as being linked with their communities.

Keith Taylor has been president of Gannon University since 2011, after six years as provost. Though still in his late 50s, a stripling by college leadership standards, he has just announced his retirement after a successful run in which his school’s enrollment, financial assets, program range, physical plant, and national and international connections have grown. Taylor, who got his original degree in physical therapy and later a PhD in anatomy and cell biology, is a trim, bald, fit-looking man who radiates a “Let’s get it done!” nervous energy. He is not looking for another job in academia, and he told me that he has been studying the post–White House career of Jimmy Carter as a model for how a relatively young “retiree” can start new projects.

“When I first got here, some people would say, ‘As Gannon goes, so goes Erie,’ ” Taylor told me on my most recent visit to Erie, just after the 2022 midterms. But after nearly two decades at the university, Taylor has rethought that maxim—or, rather, reversed it. “Really the way to think about it is, ‘As Erie goes, so goes Gannon,’ ” he said. That is, the better life became for people who stayed in or moved to Erie, the better the university’s prospects would be as well. This is a similar formula to the one we described from Geoffrey Mearns, the president of Ball State University in Muncie, Indiana, last fall, and it’s a useful shorthand for the place-based commitments we’ve seen in many institutions.

The saga of “As Erie goes” is complicated and still incomplete. Signs of recovery are evident all over town. The downtown is dotted with the local restaurants, coffee shops, breweries, and other facilities that mark the difference between districts that are dying and those that are returning to life. The Flagship City Food Hall has half a dozen vendors under one roof. A downtown Starbucks closed during the pandemic, but the local Ember+Forge coffee shop kept going strong. The Lavery Brewing Company has grown to include a kitchen and live-events space. Federal Hill Smokehouse, about a mile from downtown, was recognized by Food & Wine magazine as having the best barbecue in the state.

And then there are the new or maturing businesses: Menaj-Erie Studio, a video and design organization run by a young husband-and-wife team. Epic Web Studios, a web design and digital marketing firm that I’ve hired for projects. Velocity Network, a regionally known IT company. BOTH Studios, which produces high-end custom craft furniture. The print-edition Erie Reader, founded nearly a dozen years ago. The century-old Erie Playhouse, and new performance spaces. A lot is going on.

What does Gannon have to do with this? “Gannon is really Erie’s university,” Ferki Ferati, head of the Jefferson Educational Society, an Erie-based think tank, told me recently. “They’ve lived the idea that universities need to be a real part of the community, and part of any local solution.”

I asked Keith Taylor what it meant, in practical terms, to be “part of any local solution.” There’s a long list of specifics, such as the university’s new Center for Business Ingenuity on the State Street downtown corridor. Gannon is part of an ambitious “water sustainability” initiative, with the clumsy acronym of “Project NePTWNE” (pronounced Project Neptune), to protect Lake Erie and study the Great Lakes region’s climate change advantage as the Earth’s freshwater repository. The university has taken a nearby middle school under its wing, with support from Gannon students and faculty; absenteeism there has fallen dramatically. 

When I’ve asked people in Erie about Gannon’s influence, the project they most often mention is Our West Bayfront. This is a community-wide effort, launched in the mid-2010s, to restore a large, distressed area to its onetime status as a thriving, middle-income, owner-occupied residential neighborhood. It’s also the neighborhood that surrounds Gannon—think of a much smaller version of the University of Chicago’s presence on the city’s South Side, or USC’s in south-central LA. The project involves bottom-up community surveys and planning discussions; public and private investments to restore homes and parks; digital mapping and planning tools; volunteer tutoring and mentoring; tree planting; road repair; art fairs and other community events; and the overall mixture of the high-concept and the nitty-gritty that goes into place-based improvement.

“The West Bayfront is the part of town really with the most potential, as a residential community,” Anna Frantz, the head of the Our West Bayfront organization, told me. It is full of small, single-family, century-old bungalows, interspersed with parks and small stores. “But we have had really significant challenges in terms of blight and crime.”

State Street, which runs toward the Bayfront and was once Erie’s main commercial boulevard, is still a kind of borderland. On one block, there are new locally owned restaurants and businesses, and old office buildings being converted into apartments. On the next block, a shuttered store or low-end liquor shop. But the trend is positive. “I think we’re just past the takeoff stage, and headed toward cruise,” Frantz told me. 

I’ve talked with Keith Taylor about it at length. I’ll summarize a few of the points he thinks other communities might learn from:

Geographic concentration. Gannon had a long history of involvement around the city, but it began to see better results after concentrating on a few neighborhoods like West Bayfront. “It’s not like we weren’t involved before,” Taylor told me. “We were doing things all over the place. Now we’re intentionally focused. We decided to confine our work more to the neighborhoods right here on the west side.”

Sustained involvement. Taylor rolled his eyes when he talked about the long saga of students or professors digging into a community project—for a year or two, and then moving on. “You’ll get students or faculty members who are so passionate. But then the student graduates, and the new students care about something else.” Instead, he said, “we realized that we needed to build these things very intentionally into all of our academic programs. You don’t want this to be a set of one-off random efforts.” One example: Students who are studying GIS digital mapping tools apply them in surveys of problems and solutions in their own vicinity. “Let’s go out in the neighborhood and see where there are dilapidated roofs, or broken windows, or siding problems,” Taylor said. “Or rates of truancy or food insecurity. The students tie it together with their mapping, and are intentional about making it a permanent part of their course work, and research, and community efforts.” 

An emphasis on the highly practical. “We realized that the physical environment was a huge piece of how people feel about their neighborhoods,” Taylor said. Sidewalk pavement was broken; streetlights were dim. Students and residents alike felt more comfortable walking at night down Sixth Street, which was well lighted, rather than Eighth Street, which was not. “So we put in $1.25 million worth of lighting from Third Street to Ninth Street, so that our students and neighbors felt safer going out after dark.”

A goal of affordable neighborhood life, versus “gentrification.” The West Bayfront area has a long way to go before worrying about Brooklyn-style residential dislocation. But the stated goal is to steer the neighborhood’s recovery toward affordable homeownership, starting with the families who are already living there.

“We should be doing things our students need, our community needs, our society needs,” Taylor told me. “That should really be the heart of what we’re about.”

As Waterville goes …

Colby College, in Waterville, Maine, has almost too perfect a backstory for what has become an enormous investment in downtown renewal. A century ago, the college had outgrown its cramped site by the railroad tracks and was considering a relocation to Augusta. (Yes, some colleges have picked up and moved, but it’s rare, and rarely very far. Augusta is 25 miles from Waterville.) In response, average citizens in the town raised $100,000, at the time a huge sum, for a new hillside campus. It has been the college’s home ever since. Waterville was there for Colby when it mattered; recently it has been the college’s turn to pitch in—which it has done, on a vastly greater scale. Colby will always remember that $100,000 commitment. Residents of Waterville may reflect on the $100 million the college is now investing in them.

“When I came through the city, it was like so many New England or midwestern towns,” David Greene, who became Colby’s president in 2014 after stints at Brown and the University of Chicago, told me recently. “The buildings all along Main Street had terrific bones, but many had not been cared for in years.” 

Like other New England mill towns, post–World War II Waterville had been a thriving blue-collar industrial community, with steady jobs at the paper mill, the textile mill, the shirt factory. “We were the real hub of the working class,” Bill Mitchell, who grew up in Waterville in the 1970s and became a successful local business owner and real estate developer, told me. “We had a vibrant, robust economy—and then, of course, the fundamentals of all those industries changed.”

By the time David Greene arrived at Colby, the city’s story had reversed. As the mills went away, “urban renewal” came in. “In Waterville that meant two things,” Greene told me. “One was razing big tracts of residential and retail space in what had been downtown.” The other was the completion of I-95 on a route that bypassed the city center. “All of a sudden the big-box stores opened up along the highway,” Greene said. “It took the whole core of the city and collapsed it.”

In Waterville’s declining days, the storefronts on downtown’s Main Street were largely vacant or low-end. Colby students didn’t have much reason to venture there from the campus, two miles away on Mayflower Hill. 


Now there’s a new look. Waterville, with its population of 16,000, now has a high-end boutique hotel, the Lockwood. Some 200 Colby students, along with faculty and staff, live not on campus but downtown in the Main Street Commons building, opened in 2018. Students who apply for spaces there must be involved in community-improvement projects. There are all the marks of a growing rather than declining downtown, from startup centers to bike stores to locally owned cafés. 

This past December, Colby formally opened what Greene has described as the capstone of “stage one” of the ongoing downtown renewal project, which represents an investment of some $200 million in public and private funds. Nearly $100 million of that came from Colby itself—part of it in new fund-raising, and part in cash and debt. The capstone building is the $18 million Paul J. Schupf Art Center—named after the late trustee and art collector who donated generously to the college—which contains performance spaces, museums, stores, and a café. I am eager for my next visit, to see what it has brought to the town.

“A local economy, and economic development in general, is often highly visible when you walk or drive down a main street,” Garvan Donegan, of the Central Maine Growth Council, told me. “You can see the lights turning on, the changing skyline take shape, new construction and operations progressing, and feel the energy and vibrancy.” Waterville used to look tired, he said. Now it looks alive. 

Many organizations and funders have been part of this process. Yet Colby, like Gannon, has not only put in serious money but by all accounts has played a central planning role.

Why did it bother? Greene and the others I spoke with said there were two main reasons.

One was simple self-interest on Colby’s part. “It was hard to recruit people to Waterville,” Greene told me. It was particularly hard to build the kind of liberal arts college town culture that emerges when faculty and staff live nearby, rather than just commuting in. “But if you’re a liberal arts college, where you believe that faculty-student interaction is really critical, and that the nature of residential education is something that’s powerful and transformative for students, then you can’t be in a place where all of your faculty and staff decide not to live.” 

Has this part of the “As Waterville goes …” strategy paid off for the college? As recently as 2014, Colby had around 5,000 applicants for its entering class. This year it had nearly 17,000, one of the largest increases among small private colleges, and it enrolled its largest-ever first-year class, of 676. “Our ability to get our first choice in faculty is almost universal now,” Greene said. “It wasn’t before. And we’re seeing people who want to move to Waterville, for the scale and quality of life, who we just couldn’t recruit before.”

I asked whether the people Greene must finally answer to, in the alumni community and on the board, share his belief in the results. “I must say the most surprising thing to me is how powerful and popular this has been with the alums,” he said. “There’s a sense of pride and purpose that connects with people.”

This sense of purpose is the other reason Greene gave—the idea that universities owe something practical to the world around them. “You have some cloistered places where the whole idea is to go away and think your deep thoughts,” he said. Other institutions have tried to wall themselves off from the complications and strife of surrounding neighborhoods. Greene said he believes in the exact opposite. 

“If there is anything we should be focusing on now, it’s how to make colleges and universities a part of their communities, not separate from them,” he said. “We should see colleges and universities as places that are helping to solve the most complex and intractable problems. And the way to do that is actually to be engaged in a deep way in the world, beginning with—and most importantly with—your own community.”

As Gannon and Colby go, so should many more.


The New Political Economy
https://washingtonmonthly.com/2023/01/08/the-new-political-economy/ | January 9, 2023




In the immortal Federalist No. 10, James Madison wrote,

The most common and durable source of factions has been the various and unequal distribution of property. Those who hold and those who are without property have ever formed distinct interests in society. Those who are creditors, and those who are debtors, fall under a like discrimination. A landed interest, a manufacturing interest, a mercantile interest, a moneyed interest, with many lesser interests, grow up of necessity in civilized nations, and divide them into different classes, actuated by different sentiments and views. The regulation of these various and interfering interests forms the principal task of modern legislation, and involves the spirit of party and faction in the necessary and ordinary operations of the government.

That was back in 1787, before there was a ratified Constitution, when the U.S. government barely existed. But Madison’s framing is instructive in a number of ways. He evidently assumed that the main source of political strife in the new nation would be clashes among economic interests, and that the main task of government would be adjudicating these disputes. Anybody who has ever spent time in a legislative body at any level will recognize the rough ongoing truth of Madison’s observation. Back in the early days of the republic, globalist slave-holding plantation owners battled over trade policy with protectionist northern manufacturers. The First and Second Banks of the United States gave rise to fights over centralized financial power. There were disputes over taxation, territorial expansion, and “internal improvements” like roads and canals. Today, economic interest groups are still fighting each other: over trade, over the role of unions, over the power of Big Tech, over the transition to clean energy, and over a zillion other issues.

There is a disconnect between this version of politics and the version we typically get in contemporary public conversations. We are in the habit of thinking of noneconomic issues (abortion, immigration, policing) as being debated on their merits, often as fundamental moral questions, but of economic issues as being properly understood in terms of the technical management of “the economy” by experts, not of power struggles between interests. What’s the unemployment rate? The inflation rate? The strength of the dollar? The trade deficit? How much is the Federal Reserve going to raise interest rates at its next meeting? How will the financial markets react? These are the kinds of economic questions one is likely to see addressed on front pages and on television news. The Madisonian version of the American political economy still goes on—not exactly behind the scenes, just insufficiently noticed—but it doesn’t command our primary attention.


How did this happen? It seems fair—especially in the light of recent historical work that understands slavery as a form of capitalism—to say that economic issues were at the center of American politics, and were understood and debated as power struggles, from the founding until World War II. These debates were particularly intense as the economy became industrial and this generated mass immigration, urbanization, and unprecedentedly large concentrations of wealth (in individual hands) and power (in the hands of trusts and corporations). The early decades of the 20th century saw the advent, in response, of federal regulatory agencies, central banking, a government-enabled mass union movement, and a modern welfare state.

The war ended the Great Depression, and postwar prosperity softened American politics’ focus on economic battles. The progress of the welfare state stalled. The postwar years were full of assertions, including by liberals, that the United States had developed a workable economic order, dominated by heavily regulated industrial corporations that provided their employees with many of the welfare state functions that governments provided in other industrial democracies. The rise of Keynesian economics was a part of this story. In 1946, economics became an academic discipline with an official permanent presence in the White House, the Council of Economic Advisers. This was a manifestation of the new faith that by monitoring and managing fiscal and monetary policy, the government could keep the economy growing, inflation and unemployment under control, and future depressions at bay. This idea had the political advantage of not automatically entailing conflict in the way that, say, labor law or antitrust actions did. And it placed the focus on macroeconomics, instead of the endless jostling for advantage among economic interests. One could see “politics” and “interest groups” as the enemies of government management of the economy, rather than as its essence.

As the 20th century wore on, a series of venerable political economy tools came to be seen, at least in elite circles, as counterproductive—almost silly. On this list would be trade restrictions; price controls; industrial policy; attempts to break up big economic concentrations; attempts to shore up specific cities, towns, and regions; and policies aimed at promoting unionization. Many of these play out in politics as contests between economic institutions (Ida Tarbell battled Standard Oil on behalf of small-scale oil producers, like her father), but both academic economics and economic policy had become uninterested in the interplay of institutions. The overall health of the economy, and the welfare of consumers, became the only proper targets of economic policy. Inequities and disruptions could be addressed after the fact, through redistributionist tax policies. These were not just conservative ideas. Liberal administrations enthusiastically participated in the deregulation of airlines, trucking, energy, telecommunications, finance, and other industries. In 1987, The New York Times published a lead editorial (which it has since renounced) calling for the abolition of the minimum wage. The establishments of both parties supported NAFTA and a long series of succeeding free trade treaties.

This economic regime produced a steady rise in inequality, of both income and wealth, beginning in the early 1980s, that has not abated. That was change on the boiling-a-frog model: gradual rather than in the form of unmissable events. It took the 2008 financial crisis and the subsequent Great Recession to produce a strong political reaction against the economic certainties of the late 20th century. Since then, the unexpected rise of populist and nationalist movements—some on the left, more on the right, sometimes with a strong cultural element, always rooted in economic discontent—has dominated politics all over the world, sometimes in ways that fundamentally threaten the ongoing health of democracy. As happened in the early 20th century, in the early 21st voters have forced policy makers to pay much closer attention to the political economy than they had been paying. We are in the early stages of that period now.

Political economy becomes visible when life isn’t going so well for you. When the factory in your town moves offshore, you can see that trade policy has adversely affected your life. But if you’re well educated, living in a prospering metropolis, and you get a good job, it’s because free markets work. A necessary first step in reawakening our long-dormant awareness of political economy is realizing that economies are made, not born. There are many capitalist countries, each with a distinctive version of capitalism, shaped by law and custom and subject to ongoing modification. Individual companies—farms, hedge funds, auto manufacturers, social media platforms, pharmaceuticals—prosper (or not) based not just on their own work and ingenuity, but also on the way government has laid out the shape of the playing field and the rules of the game for them. Phillip Longman’s essay in this issue (“Everyday High Prices”) calls attention to this aspect of political economy: the vicious, but not publicly visible, struggles for advantage between retailers and their suppliers. These always involve government as a not always impartial referee. The economic prominence of private equity, a field that didn’t exist 50 years ago, rests on a series of little-noticed changes in federal regulations. The consolidation into “Big Four” or “Big Five” firms that has swept across industry after industry would not have happened with more robust antitrust policies. The mega success of tech companies like Google and Facebook was enabled by their exemption from legal responsibility for the content they carry. All these policies are the kind that happen in courts and hearing rooms, with lobbyists but not the press or the public paying close attention.

Just as the making of the current American political economy—featuring high inequality, dramatic regional and racial disparities, and a great deal of disruption of ordinary people’s lives—was too little noticed as it was happening, so too is its remaking, which is already well under way. One of the most underreported stories in America is the Biden administration’s dramatic departure from the economic policies of the past several administrations, including the Democratic ones. This administration is the most aggressive on antitrust in decades. It has made strong regulatory moves in the financial sector. It has made major forays into industrial policy, by, for example, trying to strengthen the domestic semiconductor industry and to jump-start the green energy industry. Barry Lynn’s essay in this issue (“Manufacturing and Liberty”) tells that story, and urges the administration to do more.

One should resist the temptation to believe that Republicans’ taking back control of the House of Representatives means that the age of significant Biden administration economic policy making has come to an end. It may be that a full revival of the multitrillion-dollar Build Back Better bill is not possible, but elements of it, like enhanced programs in education, training, and apprenticeship, and an in-effect industrial policy for “care work,” may well reappear. That is partly because the Republican Party is betting its future on its ability to continue taking working-class voters away from the Democrats, and doing this will require delivering more than just relentless rhetorical assaults on wokeness. Also, because so much of economic policy is made by courts and agencies, the Democrats’ continuing control of the Senate means that the administration can keep getting appointees confirmed who can carry out its mission.


If you had to take a test on your familiarity with the Biden administration’s American Rescue Plan, the Infrastructure Investment and Jobs Act, and the Inflation Reduction Act, down to the level of spending programs of $50 million and up, would you pass? I don’t think I would. These major initiatives tend to be covered as if they were championship games that the White House wins or loses, rather than for their content. It’s vitally important right now for liberals and progressives to pay attention to the enormous changes happening in economic policy. Especially on issues like antitrust, financial regulation, labor policy, and the future of what conservatives call “the administrative state,” the people on the other side are going to be in the room where decisions are made. Will they be there alone?

Along with a closer focus on these economic issues—in general, and in detail—liberals need to develop a new economic vocabulary. If you ever took an introductory economics course, you were probably taught that government attempts to reshape economies are doomed to failure, that any economic burden placed on businesses will just be transferred to consumers, that deficit and debt are irresponsible, that industry concentration is not a problem as long as it doesn’t directly harm consumers, that trade restrictions are always a bad idea, and that creative destruction is an inevitable and healthy aspect of a market economy. Attempts to push back against these bromides are often dismissed as the tiresome and counterproductive activities of politicians trying to get pork barrel projects for their districts, as opposed to good public policy. A new set of guiding principles for economic policy would help to reframe a wide range of issues, to communicate with the many voters who feel left behind and ignored in the current economy, and to guide our assessments of specific proposals.

I’ll propose just a few of these principles now. First, great concentrations of economic power are not healthy, either for people’s well-being or for the health of our democracy. Economic power converts itself into political power, and that upsets the balances and the protections of minority rights that the Constitution aimed to establish. Princes of property (that’s Franklin D. Roosevelt’s phrase) don’t have to be ill-intentioned to do harm—only excessively influential and blind to the concerns of ordinary people. A country with large and growing gaps between people depending on their education levels, on their race, on where they live, on what kind of work they do, can become unjust and unstable unless the gaps are corrected. In economics as in politics—to quote James Madison again, from another of the Federalist Papers—ambition must be made to counteract ambition.

Second, the economy should be designed and managed not only to promote its overall health and growth, but also to minimize the harms to lives, to health, and to communities that constant economic disruption can bring. When large economic entities swallow up smaller ones, often through taking on debt that puts enormous pressure on them to lay off employees and otherwise behave in socially destructive ways, we should stop believing that as long as it was a free market transaction, it’s good for the country. Capitalism always produces dislocation along with dynamism. A guiding principle for dealing with these dislocations is that prevention—the job not lost, the benefits not cut, the neighborhood not allowed to wither—is far preferable to correction after the fact.

Third, economic politics, like all politics, fundamentally entails conflicts between interests. Madison had that right back in 1787. Political economy isn’t technical. It isn’t nonpartisan. It isn’t best left to experts. It isn’t best handled by applying broad universal concepts. Sweeping assertions about the virtues of markets have often served to shut down discussions that we should have had, and that we need to have now. The future of the American political economy is in play, and that means that we need to engage in the specifics, with our closest attention. That is what this package of stories aims to do.

The post The New Political Economy appeared first on Washington Monthly.

Everyday High Prices https://washingtonmonthly.com/2023/01/08/everyday-high-prices/ Mon, 09 Jan 2023 01:40:00 +0000

For years, the only supermarket serving the Pine Ridge Indian Reservation in southwest South Dakota was run-down and a threat to public health. Inspectors from the Indian Health Service repeatedly cited its distant corporate owners for food safety violations, such as mixing rotten hamburger with fresh meat and repackaging it for sale. So leaders of the Oglala Sioux Tribe were thrilled when, in 2018, they persuaded an experienced grocer to buy the store and commit to running it right. 

R. F. Buche, whose family business has operated independent groceries throughout South Dakota for four generations, started with months of demolition and extensive remodeling. Today, except for the signs written in Lakota, the store looks just like any supermarket you might find in any middle-class neighborhood. Floors are clean, and shelves generally well stocked, including with an abundance of fruits and vegetables that were never available before. This is particularly important in a community where poverty is so extreme that most people don’t own cars and the next-nearest grocery store is nearly 40 miles away. 

But two big problems remain. The first is affordability. To stock his store, Buche has to pay wholesale prices that are often nearly double what Walmart pays and must pass on much of that cost to his customers. The second is that when national shortages of critical items like baby formula emerge, Buche and the Oglala Sioux are often the hardest hit, either having to do without or enduring longer waits for critical supplies than people elsewhere.

Yet while these problems may be extreme on the Pine Ridge reservation and in other very poor places, Americans everywhere are also harmed in serious ways by the zombie policy idea that has created these inequities. It’s a notion that’s supposed to bring everyday low prices for everyone. But in practice it has proved to have the opposite effect, creating more markets in which those with the least power pay the most, while those with the most pay the least. 

Economists use a $20 word to describe the kind of market in which this occurs: monopsony. Monopsony is like monopoly, except that it’s big buyers, not big sellers, who dominate a market. When many sellers compete for the business of just a few big buyers, those buyers gain the power to coerce the sellers into giving them discounts and other concessions that none of their smaller competitors can get.

Bottom of the food chain: The “monopsony” power of giants like Walmart means that prices are higher and shortages more frequent at R. F. Buche’s (left) grocery store on the Pine Ridge Indian Reservation. Credit: Courtesy of the GF Buche Company

Concerned with the way the abuse of monopsony power could suppress fair competition and foster corporate concentration, President Franklin D. Roosevelt signed landmark legislation in 1936, known as the Robinson-Patman Act, that made this kind of business practice illegal. And for many decades afterward, the law was a key pillar of America’s political economy, helping to sustain the broad prosperity of the mid-20th century. But in what has turned out to be a colossal policy mistake, politicians in both parties decided to stop enforcing the act after the 1970s. 

That decision, combined with lax enforcement of other antitrust laws, has led to truly baleful consequences. Indeed, though it’s only dimly understood by most people—and outright denied by economists on the left and right who should know better—unrestrained growth of monopsony power has become a major source of the stubborn inflation, supply chain fragility, and gross inequities that define today’s economy. Fortunately, senior officials in the Biden administration are increasingly aware of the problem and willing to do something about it. And they don’t have to get a bill through a suddenly more hostile Congress to do so; they can just enforce a law that’s already on the books.

To illustrate how the neglect of Robinson-Patman affects his business, Buche starts with the example of the price he must pay for a box of Tide laundry detergent. Like many independent grocers, Buche belongs to a purchasing co-op called Associated Wholesale Grocers, which he uses to get volume discounts. In business since 1924, AWG is a big operation with huge economies of scale. It consolidates more than $10 billion in yearly wholesale purchases from 3,100 independent grocery stores in 28 states. As David Smith, the president and CEO of AWG, recently explained in testimony to Congress, because the co-op buys by the truckload and operates highly efficient billion-dollar-plus warehouse facilities, it can get volume discounts for its member stores that they could not get if they acted alone. 

Yet the best wholesale prices the co-op can consistently get for its members are still far above what Walmart and other giant grocery chains routinely pay to restock their shelves. When Buche buys a standard-size box of Tide from AWG, for example, he typically must pay around $21. By contrast, Procter & Gamble, the maker of Tide, sells the same product to Walmart for a much lower price. Just how low is a trade secret, but it is so low that the Walton family makes money reselling it, even in its most remote stores, at an everyday retail price of $14 and change.

Why do the co-op and its member grocery stores like Buche’s have to pay P&G so much more than Walmart does for the same product? It’s not because it costs P&G more to deliver a truckload of Tide to one of AWG’s warehouses than to one of Walmart’s. In fact, it has almost nothing to do with the actual cost of making and delivering products. Instead, it’s because of Walmart’s monopsony power over its suppliers. 

If Walmart ever decided, for example, not to stock P&G products in its 10,000-plus stores, or even to just give those products less prominent shelf space, P&G sales would tank and there would be no way for the company to sell that much product to other retailers. As Albert Foer, former president of the American Antitrust Institute, pointed out in a 2006 study, P&G was at that time (and still is) one of Walmart’s largest suppliers, but it accounted for only 2 percent of Walmart’s sales. By contrast, nearly a fifth of P&G’s sales depended on its sales to Walmart. Once a supplier becomes that hooked on sales to a single buyer, Foer observed, it becomes nearly impossible to resist demands for price cuts and special favors. 

And who pays for those concessions? Giving a special discount to one retailer has the same practical effect as imposing a surcharge on its competitors. The change in relative prices skews the terms of competition and, if the discount is large enough, will lead to monopoly as it drives those who can’t get the discounts out of the market. 

Beyond that, the economist Paul W. Dobson points to what he calls the “waterbed effect.” As suppliers like P&G attempt to recoup the revenue they lose through price concessions to power buyers like Walmart, they may well feel compelled to charge weaker buyers, like Buche, still more. This is especially likely if the suppliers have previously been unable to meet the margins demanded of them by investors and now have no other way to meet their profit targets or cover their fixed costs. Lower prices for players on one side of the waterbed thus can lead to even higher prices for those on the other, putting them at an even greater competitive disadvantage. 

Power buyers are also able to dictate not just prices, but also terms of service to their suppliers in ways that can hurt many innocent bystanders. For example, during the pandemic, when supply chain disruptions caused shortages of meat, baby formula, and many other items, Walmart issued a directive to its suppliers that they must either fulfill 98 percent of its orders or face steep penalties. Consequently, smaller grocers saw their orders for scarce goods only partially filled or not filled at all. Buche says that at one point his allocation of baby formula for all his 22 stores was cut back to just 10 boxes a week. Yet demand at the Pine Ridge Store alone normally comes to 50 boxes. Raw monopsony power, not underlying cost or even the textbook laws of supply and demand, determined which hungry babies got fed and at what price. 

Some people, including highly credentialed experts, say there is no problem here that can’t be fixed with still more monopsony power. Sure, it’s too bad about the poor Native Americans, they will say. And sure, it’s sad to see independent grocers like Buche often put out of business just because a few dominant corporations have more buyer power. But all the Oglala Sioux really need, according to this point of view, is a Walmart. That would bring them lower prices and more secure supplies, and in the process, so goes the argument, increase society’s total consumer welfare. 

Indeed, for a long time, that’s been a dominant frame of analysis applied not just by many conservatives and Big Business apologists, but also by many prominent Democratic policy intellectuals. As far back as the early 1950s, the towering liberal icon John Kenneth Galbraith, for example, defended the growth of the giant retailers of his day, like Sears Roebuck and the Great Atlantic & Pacific Tea Company (A&P). His argument was that these chains provided “countervailing power” to major manufacturers in ways that benefited consumers. 

He cited the price concessions that the Big Four tire makers had to make when they sold tires to Sears, or the discounts on cornflakes that the A&P forced consolidated food processors to offer. In this way, Galbraith argued, the giant chain stores played the same role as European buyer co-ops like the Swedish Kooperativa Förbundet and the British Co-operative Wholesale Societies. The co-ops, of course, were nonprofits dedicated to the welfare of small businesses and their working-class customers, while the chain stores were controlled by Wall Street banks intent on maximizing returns to shareholders, but that did not give Galbraith pause or make him consider what the long-term effects would be. Indeed, Galbraith built a whole philosophy of government around the notion that the promotion of retailer monopsony and other forms of countervailing power had, as he famously put it, “become in modern times perhaps the major domestic peacetime function of the federal government.”

Galbraith’s influence later waned, but his faith in the virtues of concentrated buyer power ossified into economic orthodoxy.  It’s what explains why so many liberal economists of the past two generations learned to love big-box stores. In 2006, Jason Furman, who would later become a top economic adviser in the Obama White House, called Walmart a “progressive success story,” citing its ability to drive down prices for poor and moderate-income consumers. In 2013, Charles Kenny, a senior fellow at the Center for Global Development, took up the torch when he published a piece in Foreign Policy under the title “Give Sam Walton the Nobel Prize.” 

In recent years, many well-placed Democratic economists, including both Furman and his mentor Larry Summers, have belatedly discovered the negative effects of monopsony in labor markets. Experience shows that when fewer employers compete for each worker’s labor, that drives down wages. But such is the force of received ideas that many elite policy makers continue to contemptuously reject the idea that monopsony in other parts of the economy can also be harmful. Referring to legislation offered by Senator Elizabeth Warren to curb the power of monopsony in setting prices, Furman tweeted last May, “If you think the baby formula shortage is a problem just wait to see what the world would look like if this became law.” 

But the world does not always work the way neoliberal orthodoxy presumes. As it has turned out, over time it’s not just small businesses and Main Street America that suffer when government tolerates, much less encourages, the continuing growth of private, unregulated monopsony power. We all pay a big and growing price, as consumers, producers, and citizens. Indeed, to the extent that unfettered monopsony chokes off avenues for entrepreneurship and upward mobility, it becomes a threat to economic dynamism and to the very fabric of our democracy. 

The damaging effects begin with the by now well-documented phenomenon of hard-pressed suppliers cutting quality, R&D, wages, health care benefits, pensions, and the like, or outsourcing production to foreign sweatshops, all in order to meet giant retailers’ continuing demands for more and more wholesale price concessions. Back in 2006, the respected journalist Charles Fishman published a book called The Wal-Mart Effect, in which he documented case after case of growing monopsony power already creating these kinds of harms. 

Since then, the continuing rise of monopsony power has revealed another, ultimately even graver consequence. Allowing prices to be determined according to who has amassed the most buyer power sets off massive waves of mergers and acquisitions that over time make the inflationary problems they are supposed to solve far worse. 

The dynamic starts when sellers fight back against the power of giant buyers with defensive consolidations of their own. For example, in response to concentrated buyer power at the retail level, the meat-packing industry has now consolidated to the point that just four vertically integrated giants control 85 percent of the beef market. In a wicked twist, these giant international corporations have not only managed to gain enough monopoly power to countervail against Walmart and other grocery chains, but they have also secured enough monopsony power to extract deep price cuts from their own captive suppliers. As a Biden White House study reveals, this perverse market structure has led to huge across-the-board increases in meat prices for consumers regardless of where they shop, combined with lower incomes for ranchers and farmers who have nowhere else to sell their animals, and record profits for the packers themselves. 

Other suppliers have also been madly combining with each other in order to resist the buyer power of Walmart and other large retailers like Amazon. P&G paid more than $50 billion for Gillette in 2005 and has since gone on a tear of mergers and acquisitions. That largely explains why, when P&G raised prices on a broad range of products in early 2022—from Gillette razors to Dawn dish soap and NyQuil cold medicine—it experienced a sharp boost in net sales: Thanks to relentless consolidation, consumers simply have fewer and fewer alternatives to paying more for P&G’s sundries. Other major Walmart suppliers, like the food processor General Mills, have also been on acquisition binges that today allow them to raise prices across the board while earning record profits.

In a vicious cycle, these mergers among suppliers in turn have led to another massive round of defensive consolidations among retailers themselves. A recent example is Kroger’s $24.6 billion acquisition of Albertsons, which, if approved by regulators, will create a mega-grocer with roughly the same buying power as Walmart. Stacy Mitchell, codirector of the Institute for Local Self-Reliance, notes that if the deal goes through, there will be 160 cities in America where more than 70 percent of grocery sales are controlled by just these two massive companies. With that degree of retail domination, the duopoly will be better able to extract deep discounts from its suppliers while having less and less reason to pass along any of the savings to mere shoppers. 

The same dynamic is at play in many sectors of the economy, but perhaps most tragically in health care. Ever since health care inflation became a major societal concern in the 1970s, health care policy experts, including highly influential figures like the late Uwe E. Reinhardt, have promoted the idea that health care costs could be best reduced by increasing the monopsony power of large, private purchasers of health care, such as HMOs and other health care insurance plans. The idea was that by subjecting hospitals and other health care providers to more concentrated buyer power, they could be coerced into accepting lower reimbursement rates and other price concessions. But while round after round of mergers and acquisitions among insurers did contribute to a pause in the growth of health care expenditures in the 1990s, it soon set off a counterwave of mergers among hospitals and other providers that is still building, with baleful results.

By now, many communities are completely dominated by a single integrated giant health care system, encompassing hospitals, doctors’ practices, and labs, that faces virtually no competition. Abundant studies show that these behemoths don’t share any savings they might achieve through increased efficiency or economies of scale. Nor do they deliver any better quality of care. Rather, they swell their revenues by jacking up prices for patients and their health care plans. 

Even in markets where some competition still exists, it largely takes the form of insurance company bureaucrats and hospital chain administrators competing to see who can impose what price discrimination on whom, rather than over who can provide the best health care to the community. At the same time, consolidated hospitals have enough monopsony power to drive down the wages they pay to nurses and other health care workers, who have nowhere else to sell their labor without moving to another city whose health care sector has not yet become so thoroughly concentrated. As with meat-packing and many other industries, the combination of monopsony and monopoly power in health care makes the rich richer and leaves most everyone else paying more for less.

Though classified as charitable “nonprofits,” many hospitals have found an extractive business model that targets services to the most lucrative patients and treatments while financing inflated CEO compensation packages and imperialistic building programs. In some major cities, like Pittsburgh, the cycle has culminated with the hospitals and health insurers simply consolidating into one giant platform in which buyers and sellers of health care are part of the same entity and as such can legally collude in charging patients and their insurers whatever they please. 

Efforts to control health care inflation through the promotion of monopsony power have also backfired when it comes to the supply chains for medical equipment and drugs. As far back as 1910, for example, hospitals began participating in so-called group purchasing organizations (GPOs), which allowed them to gain volume discounts by pooling their orders for hospital supplies in much the same way independent grocers have long relied on buyers’ co-ops like AWG. But in 1987, Congress perverted this cooperative system by granting GPOs exemption from anti-kickback laws. 

This led to a system in which the largest GPOs could use their buyer power to coerce special “rebates” and “administrative fees” from suppliers—which they often didn’t deign to share with hospitals. This increase in buyer power and self-dealing in turn incentivized defensive mergers up and down the health care supply chain that ultimately worsened the disease for which it was supposed to be the cure. Major GPOs became captured by major hospital suppliers, like Becton Dickinson. Meanwhile, med-tech companies like GE HealthCare and Medtronic engaged in frantic mergers and acquisitions activity to ensure that they acquired the market share needed to stand up to the increasingly concentrated buyer power of GPOs. In turn, GPOs madly merged with each other to maintain or augment their own countervailing power. 

Meanwhile, mega hospital chains grew so large that they could unilaterally dictate prices to their suppliers, thereby gaining an even greater competitive advantage over smaller, community-owned hospitals. In his book The Hospital, which chronicles the decline of one such facility in rural Ohio, Brian Alexander notes that the best price it could get for a stent commonly used to open up clogged arteries was around $1,400, while big hospital chains use their monopsony power to buy the same product for roughly half that price. Adding to the social and economic harms set off by the concentration of monopsony in health care have been loss of innovation and shortages of essential drugs. As giant incumbents throughout the supply chain used mergers and decriminalized kickback schemes to suppress competition, key technologies such as lifesaving retractable hypodermic needles remained unavailable for years, while the number of companies manufacturing antibiotics and oncology drugs dangerously dwindled.

Similar harms have flowed from the growth of so-called pharmacy benefits managers. Reformers hoped that health insurance plans and pharmacies could use these purchasing agents to boost their buyer power and wrest lower prices from drug companies. But as with GPOs, allowing those with the most buyer power to get the lowest prices set off a cycle of collusion, kickbacks, cost shifting, and corporate consolidation that ultimately not only drove up prices but also deeply compromised supply-chain resiliency.

You might be thinking at this point that someone should pass a law to prevent these kinds of inflationary, inequitable, inefficient business practices and channel market competition back to productive purposes. But remember, someone already has. 

When FDR signed the Robinson-Patman Act, supporters hailed it as the “Magna Carta of small business.” Detractors called it the “cracker barrel bill.”

Championed primarily by the populist Texas Democratic Congressman Wright Patman, the law was mostly intended to benefit small independent grocers, mom-and-pop pharmacies, and other locally owned enterprises. But it did so not by protecting them from competition, as some critics claimed. Instead, the law helped to prevent the abuse of concentrated corporate buyer power and create a fair and level playing field for all businesses by applying basic principles of political economy that Americans had long used to manage competition in other key sectors.

In 1887, for example, Congress passed the Interstate Commerce Act, which made it illegal for railroads to favor large powerful shippers with special rebates and discounts. When everyone paid the same rate for the same freight service, competition shifted from who had the most pull with the railroads to who had the best product. Robinson-Patman similarly made it illegal for retailers, manufacturers, and distributors operating at the wholesale level to engage in price discrimination based on buyer power. 

Under Robinson-Patman, it remained permissible to offer volume discounts or adjust prices to reflect the demonstrably different costs of serving different customers. It also remained legal to lower prices across the board to match those offered by a competitor. But it became illegal to offer different prices or terms of service to different customers based simply on their market share. This meant that a large retailer like the A&P, for example, could no longer use its monopsony power to coerce special treatment from its suppliers, such as lower prices, rebates, special advertising allowances, and the like. Under Robinson-Patman such business practices became classified as forms of commercial bribery or kickbacks. In effect, the act required all suppliers to offer the same prices and terms of service to all dealers, large and small. 

The Federal Trade Commission rigorously enforced Robinson-Patman until the mid-1970s. The results were salutary. The law encouraged competition and innovation at the retail level while also preventing ever-tightening cycles of offensive and defensive mergers leading to more and more corporate concentration. During this era, for example, American consumers benefited from the spread of modern, well-stocked supermarkets and department stores. But enforcement of Robinson-Patman ensured that most were operated by local or regional chains and not controlled, like the old A&P, by a handful of distant Wall Street banks. 

Moreover, enforcement of Robinson-Patman ensured that the growth of large retailers was based on superior efficiency, service, selection, and real economies of scale, not on using monopsony to coerce special discounts and rebates from suppliers and thereby suppress competition from other stores. America thus enjoyed a vibrant, balanced, and diverse retail sector in the postwar decades in which locally owned stores and locally owned suppliers thrived alongside national chains. In the mid-1950s, more than 70 percent of retail sales went to independent retailers with a single location. More than a fifth of all retail workers owned the store in which they worked, either as a sole proprietor or in partnership with others. Small store and restaurant ownership, from kosher groceries and Greek diners to hardware and hobby stores owned by other “hyphenated” Americans, continued to provide ladders of upward mobility for generations of immigrant families. 

Despite these clear benefits, however, Robinson-Patman came under growing attack from increasingly powerful elements in both parties. As Matt Stoller chronicles in his book Goliath, by the early 1970s a new generation of Democrats tended to view Robinson-Patman and other expressions of Depression-era populism as embarrassing relics. Concerns over building inflation also caused leading voices within the party to become increasingly persuaded by Galbraith’s argument that government should encourage the unrestrained growth of giant discount stores as a way of getting better prices for consumers. 

Meanwhile, an increasingly powerful movement of “free market” conservatives also attacked the law, arguing that any tendency toward monopoly that might follow from its repeal would be automatically corrected by market forces. Robert Bork, who once attacked Robinson-Patman as the “Typhoid Mary of antitrust,” published a highly influential book in 1978 in which he blithely rejected concerns that legalizing price discrimination based on buyer power could ever lead to monopoly. “Impossible,” he wrote. If power buyers abused their market strength, he promised, “the market” would simply replace them.

Today, we know better. Congress never dared to repeal Robinson-Patman, but enforcement slowed dramatically in the late 1970s and effectively stopped after Ronald Reagan became president, with long-term results that should have been predictable. When combined with lax enforcement of other antitrust and competition policies, the retreat from Robinson-Patman gradually restructured industry after industry in ways that are today driving up prices, suppressing wages, and contributing to the undersupply and maldistribution of more and more essential goods and services—from baby formula and affordable healthy food to prescription drugs and hospital beds. 

Fortunately, key figures in both parties are waking up to the need to enforce this vital law once again. The FTC, under the chairmanship of the Biden appointee Lina Khan, announced in June that it is studying the use of Robinson-Patman to prosecute illegal bribes and rebates among pharmacy benefit managers. Alvaro M. Bedoya, an FTC commissioner, has also become an articulate champion of expanding enforcement of Robinson-Patman across the board. This followed a letter sent to Khan in March by 43 members of Congress, more than half of them Republicans, urging her and the other FTC commissioners to use Robinson-Patman to investigate the anticompetitive effects of price discrimination that "ripple through the entire supply chain—harming consumers as well as independent producers." 

As in the 1930s, much of the support for Robinson-Patman comes from struggling small business owners in rural congressional districts—notably independent grocers like Buche as well as increasingly well-organized independent pharmacists. But today the role of monopsony in driving up prices and deepening inequities across the board gives the broader public a mounting case for insisting that Robinson-Patman be enforced. It may be that the act should be amended to make it clearer what companies and practices it covers and how it applies to today's giant e-commerce platforms like Amazon. But as it is, enforcing the law provides a ready vehicle, requiring little to no appropriation from Congress or use of tax dollars, for rebuilding the fairer and more competitive economy demanded by Americans across the political spectrum. It's high time we used it.

The post Everyday High Prices appeared first on Washington Monthly.

Photo caption: Bottom of the food chain: The "monopsony" power of giants like Walmart means that prices are higher and shortages more frequent at RF Buche's grocery store on the Pine Ridge Indian Reservation.
Manufacturing and Liberty https://washingtonmonthly.com/2023/01/08/manufacturing-and-liberty/ | January 9, 2023

As Biden strives to break China's hold on the West, it's time to relearn America's no-chokepoints strategy.


What a glorious moment it seemed, the mid-1990s. The Soviet Union had collapsed, and peoples around the world were embracing American-style liberal democracy and capitalism. Better yet, America was a hegemon with no need ever to twist another arm. Economists had begun to speak of a revolutionary new approach to managing power. Three great natural forces—globalization, digitization, the market itself—were remaking the world for us, by destroying all antidemocratic concentrations of political and economic control.

The business corporation, we were told, was melting away. America was becoming a nation of "free agents" able to work with whomever we wanted, bound by little more than a gossamer-thin net of contracts. At the social level, this meant no more need for regulation of business, or for checks and balances between the state and private enterprise. Give free rein to these forces, and they would evolve the economy all but automatically toward a world characterized by, as Robert Reich wrote in The Work of Nations, the "diffusion of ownership and control" within a "global web."

More radical yet, the nation-state itself was vanishing into the mists of history. The 1980s had been a decade of alarms about new Soviet arms systems, and how Japanese and German industry threatened U.S. factories and jobs. Then suddenly these worries vanished. Deep industrial interdependence, we were assured, was tying the people of the world into a single borderless economic community, which would ensure not just mutual prosperity but peace on Earth. As a book titled The Pentagon's New Map put it, the world had a new "operating theory" in which "connectivity" would "trump" all the old tensions and rivalries. Indeed, within "globalization's Functioning Core" of industrialized nations, armed conflict had already become impossible.

So for a quarter century we cruised, with hardly a thought about how and where the goods, foods, and drugs on which we depend were made, grown, and traded. Sure, there were a few glitches in the new global matrix—September 11th, the Lehman crash of 2008. But on we went, largely heedless of how the capitalists were using their corporations to concentrate control over factories and foundries and chemical plants, and were then shifting these capacities to the far side of the ocean in ways that destroyed far more than jobs. Even when Donald Trump howled the words "America First," most college-educated liberals dismissed the idea as silly—just angry white racists, pining for a moment that efficiency and social progress had rendered obsolete. Soil and grease under our nails? Ha! Fingers were for summoning Ubers and pushing "buy" buttons.

Well, the coronavirus pandemic, Russia's invasion of Ukraine, and China's blockade of Taiwan have slapped us awake. And what we see is terrifying. Just in the past three years we have found ourselves suddenly without face masks to protect against a pandemic, without the chemicals we need to test for viruses, without container ship and rail capacity to move basic goods, without semiconductors to build airplanes and medical devices, without formula to feed babies, without natural gas, and with roaring inflation in almost every sector of the economy. Worse, we've realized that it's not hard to imagine far more catastrophic industrial crashes, and, indeed, the White House recently warned that a conflict in Asia could cause $2.5 trillion in damages in the first year alone. Rather than harmonious interdependence among peoples, we may soon be forced to choose between dependence on autocratic regimes and wider war.

For a quarter century after the end of the Cold War we cruised, largely heedless of how the capitalists were using their corporations to concentrate control over factories and foundries and chemical plants.

Josep Borrell, the European Union’s head of foreign affairs and security policy, put the problem succinctly. “We have decoupled the sources of our prosperity from the sources of our security,” he said in a recent speech. For decades, the West as a whole relied on China for manufactured goods and on Russia for energy, he said. But that “world … is no longer there.”

It’s not surprising that Americans and Europeans are suddenly scrambling to devise “industrial policies” able to ensure that we can build what we need to be secure and to live well, or that in the United States, President Joe Biden and Congress have already pledged hundreds of billions of dollars to build new semiconductor foundries and components for electric vehicles.

That’s a very good thing. But these still modest advances are already threatened from every side—by the monopolists and the people in their pay, by the Chinese Politburo and the people in their sway, even by the temptation to load every progressive dream onto every individual project.

Done right, a new industrial strategy can help America and its allies solve most of the great crises of this moment. But left incomplete, industrial policy will make many of the biggest problems only worse. And unfortunately, today we still lack a clear hierarchy of threats to guide our decisions, an understanding of how to use competition principles and policies to achieve our ends, and a political narrative that explains why we must see this grand effort through to success.

The extreme concentration of capacity today is something largely new in the world. At the end of the Cold War, most industry was distributed widely. The United States, Europe, and Japan each manufactured their own vehicles, electronics, semiconductors, chemicals, and metals, all the way from subcomponent through finished product. Yes, many Americans drove Toyotas and many Europeans owned Fords, but those cars were usually built at factories within those regions. Today, by contrast, we see increasingly extreme chokepointing within most industrial systems, often to the point where a vital product or key component is manufactured in a single location—sometimes even a single factory—on the other side of the world.

The most well-documented such chokepoint is in semiconductors. Here, lack of supply can trigger cascading shortages across entire global industrial sectors. This is what happened over the past two years with car production, which in turn led to shortages of used cars and rental cars. Taiwan alone manufactures 92 percent of the world’s most advanced chips, and 75 percent of total semiconductor capacity is located in East Asia. But we see similar concentrations of capacity in the manufacture of pharmaceutical ingredients, antibiotics, agricultural chemicals, industrial food chemicals, industrial gases, and basic materials and minerals including polysilicon, graphite, cobalt, magnesium, and rare earths. Similarly, we see extreme chokepointing of the capacity to assemble iPhones and laptops and to manufacture basic electronics components.

The cause of this concentration is easy enough to discover. Hidden behind the utopian rhetoric used to justify U.S. abandonment of industrial policy in the 1990s was simply an alternative industrial policy—basically, “Let the monopolists rule.” 

We can trace the origins of this ideology to the old feudal systems of corporate control that Americans rejected in the Revolution. In the 1970s, the United States faced a harsh combination of inflation, recession, and rising international competition. In response, left-wing pro-monopolists led by the economist John Kenneth Galbraith and right-wing pro-monopolists at the University of Chicago led by the economist Milton Friedman began to argue that antitrust and other regulation was inefficient and that it was smarter to allow big corporations to, in essence, regulate themselves. Thus unleashed, monopolist corporations at home and mercantilist states abroad ruthlessly consolidated power, then used that power to strip out industrial redundancies in ways that left vital production chokepointed in all but a handful of places.

Japan, Taiwan, South Korea, Germany, and even the Netherlands all ended up holding concentrations of core industrial capacities. But it was China that captured the vast majority of such chokepoints, and the ones most important for maintaining day-to-day life.

The most pressing threat posed by this new structure is that entire industrial systems will simply crash due to a sudden loss of access to some keystone component. The distributed industrial structure of the postwar era ensured that there was always a backup when something went wrong. Today, by contrast, the loss of any one of many single points of failure can trigger cascading collapses of fundamentally important industrial systems. Since the late 1990s, we have seen many such events. In addition to the disruptions caused by COVID-19, there were earthquakes in Japan, floods in Thailand, a volcanic explosion in Iceland, a political spat between Seoul and Tokyo, various financial crises, and the stranding of a single container ship in the Suez Canal, each of which triggered cascading shutdowns of entire industrial systems, including automobiles, electronics, and chemicals. And thus far we’ve been lucky. As destructive as many of these disruptions have been, in every instance it is easy to see how the shock could have been far worse.

Politically, the main threat is that China will simply squeeze or cut off shipments of products we need, in order to force the United States, its allies, or individual Western corporations to cede to some specific demand. China’s control over the production of so many essential components—as well as large shares of the profits of Apple, Disney, Volkswagen, and other corporations—gives it innumerable ways to bend even the most powerful actors to its will.

The United States and its allies can retaliate by blocking shipments of certain products and materials to China—and, indeed, the Biden administration in early October sharply limited sales to China of high-end chips and manufacturing tools. Similarly, Trump-era sanctions cost the Chinese manufacturer Huawei billions of dollars in sales of mobile phones and communications equipment. But the ultimate question in the event of a true showdown between nations is which has more leverage. Or, rather, which nation has the ability to force the other to say “Uncle.”

In our present confrontation with China, we don’t appear to have gamed out how to respond should Beijing counter by cutting off shipments of iPhones, drugs, and chemicals essential to farm and food production.

If there’s any doubt how this works, we need only look to Russia’s effort to force Europe to abandon Ukraine by cutting oil and gas flows. Putin’s power play also teaches us that extreme dependence on another nation may actually tempt that nation to act aggressively. There is much evidence, for instance, that Germany’s dependence on Russian energy helped convince Vladimir Putin that Ukraine was his to take.

Monopolist corporations at home and mercantilist states abroad ruthlessly consolidated power, then used that power to strip out industrial redundancies in ways that left vital production chokepointed in all but a handful of places.

The Ukraine war also demonstrates how monopolists can work together to buttress each other’s power. Many German manufacturers, for instance, have responded to soaring prices for gas and oil by shifting yet more production to China of everything from chemicals to electrical equipment to automobiles. As one Chinese commentator gloated recently, Russia’s war has created “tremendous opportunities for China.”

The Declaration of Independence set many revolutions into motion, as white men in America declared themselves free from the power of crown, church, and corporation. But it was the independence of nation from nation that led the citizens of the new United States to understand that they needed a true industrial strategy. If Americans meant to keep their democratic republic, one skill they had to master was manufacturing the weapons and ships they would need to protect themselves. 

In 1791, Treasury Secretary Alexander Hamilton made the first effort to devise such a plan. In his “Report on Manufactures,” Hamilton proposed to subsidize construction of new factories and then to use tariffs to pay for the subsidies and to protect the new factories from foreign rivals. From the first, however, Thomas Jefferson, James Madison, and others assailed Hamilton’s vision as mainly a way to centralize wealth, power, and political control in the hands of a new gang of homegrown monopolists.

After winning the presidency in 1801, Jefferson—along with Secretary of State Madison and Treasury Secretary Albert Gallatin—began to develop an industrial policy they believed posed fewer threats to American democracy. Their approach was based on the simplest of rules: Break all potential economic chokepoints at home and act to assure America’s independence from all economic chokepoints abroad.

Under their model, the government did subsidize certain industries essential to defense, such as the Springfield Armory and the Brooklyn Navy Yard, but only if under direct government control. Their main tool was competition policy, both in the form of strict controls over banking and the governance of corporations and in the form of strong enforcement of anti-monopoly laws at the state and local levels.

By the time the French political writer Alexis de Tocqueville visited the United States in 1831, it was clear that this approach was a startling success. In Democracy in America, he wrote,

The United States of America have only been emancipated for half a century from the state of colonial dependence in which they stood to Great Britain; the number of large fortunes there is small, and capital is still scarce. Yet no people in the world has made such rapid progress in trade and manufactures.

By the early 1850s, Samuel Colt, Isaac Singer, and Cyrus McCormick had demonstrated how to manufacture great numbers of identical pistols, sewing machines, and reapers using machines arrayed in assembly lines. This new “American system” of production so impressed British industrialists that Parliament held hearings. Then, in 1853, Colt built the world’s first overseas factory, near Hyde Park in London.

The distributed industrial structure after World War II ensured that there was a backup when something went wrong. Now, the loss of any point of failure can trigger cascading collapses of important industrial systems.

After the Civil War, however, the maturation of American railroads provided would-be moguls with a way to leverage their way to great power. Although the railroads themselves were highly regulated through their corporate charters and later through the Interstate Commerce Act, industrial barons including John D. Rockefeller and Andrew Carnegie figured out how to exploit the monopoly nature of railroad networks to grow their own businesses to great scale and scope.

In the 1880s, Wall Street bankers took monopolization to the next level, through cartelization of capital, new ownership structures, and other restrictions on the industrial liberty of rival entrepreneurs. The banker J. P. Morgan is remembered today largely for monopolizing the gates to credit. But he also exercised power through interlocking directorships and via control over a vast array of patents in the electrical, telephone, steel, and other industries.

In his presidential campaign in 1912, Woodrow Wilson aimed largely at Morgan’s system of control, and charged Wall Street with threatening democracy itself. But once in the White House, Wilson—along with his intellectual partner Louis Brandeis—looked far beyond Morgan and set about a complete updating of Jefferson’s no-chokepoints rule for the industrial 20th century. Within Wilson’s first 18 months, this included helping to pass the Clayton Antitrust Act, Federal Trade Commission Act, Federal Reserve Act, and first modern tariff system, as well as forcing Morgan-owned AT&T to spin off the Western Union telegraph company.

Wilson, Brandeis, and their allies called their vision the “New Freedom,” and they centered their system on simple bright-line limits on the structure of markets and behavior of corporations. One core rule was that there always be multiple rivals in any business—at both the national and local levels. A second core rule was that corporations that control essential networks, like the railroad and telephone systems, treat every customer the same. They believed that these two rules, in combination, would in turn deliver fair market wages and prices, opportunity to compete, high-quality goods, and rapid technological innovation. Or at least, by breaking all major concentrations of private power, make it easier to use the power of public government to achieve such aims.

In the mid-1930s, Franklin D. Roosevelt used the New Deal to extend the goals and principles of Wilson’s updated anti-monopoly system into all realms of U.S. political economic regulation. In 1940, FDR’s administration threw America’s competitive industrial system into overdrive with a plan to massively subsidize construction of factories to prepare for war with Germany and Japan. Using an agency called the Defense Plant Corporation, the government paid for some 2,300 factories in 46 different states. Under this model, the government directly owned the plants, then leased them to private operators.

In the 1940s and ’50s, the administrations of Harry Truman and Dwight Eisenhower strengthened and refined the no-chokepoints system. They formalized the idea that government must act to ensure that no corporation control more than 25 percent of the market for any industrial good. And they moved to immediately achieve this end by carefully selling government-owned defense plants to whoever promised to compete with dominant manufacturers. In one famous case, the government broke Alcoa’s long-held chokehold on U.S. aluminum by selling government-built mills to rival manufacturers Kaiser and Reynolds and by using antitrust authority to all but force Alcoa to share its patents.

The Truman and Eisenhower administrations also exported the no-chokepoints system to Europe, Asia, and the larger world trading system. They did so by imposing strict antitrust regimes on Germany and Japan during the postwar occupations of those countries. And although they failed in an effort to build anti-monopoly principles into the postwar trading system, they used America’s tariff system and economic might to achieve the same basic ends by steering much new industrial capacity to France, Germany, Italy, and Japan, and later to South Korea and Taiwan. They also used the new World Bank system to expand industry in Brazil, Mexico, Argentina, India, and elsewhere. Perhaps most surprising from today’s perspective, for more than 40 years the U.S. government used anti-monopoly law to force America’s most powerful manufacturers—including General Electric, AT&T, and RCA—to share their patents with rivals, including corporations abroad.

One of the purest illustrations of how the United States used anti-monopoly principles to enforce the no-chokepoints system internationally came in the mid-1980s. Japanese manufacturers launched a coordinated effort to capture control over the production of semiconductors and the components in personal computers. Ronald Reagan’s administration responded by using tariffs, quotas, subsidies, and other forms of pressure to break Japan’s hold on these capacities. They did so in ways that boosted production not only in the United States, but also in Europe and in countries new to these businesses, including South Korea, Taiwan, Malaysia, and Singapore.

The result was a truly international system with almost no industrial chokepoints. When the Soviet Union collapsed—in no small part because its own highly chokepointed industrial system had seized up—the rest of the world was served by the most open, stable, and innovative production system in human history. It was a system that brought many nations together in constructive cooperation. In short, by the end of the Cold War America’s no-chokepoints system had delivered both a phenomenal prosperity and a wide and growing peace to many of the peoples of the world.

Tragically, at that very moment of triumph, the United States was already in the process of dismantling the system that had delivered so much success. The Reagan administration, even as it continued to enforce a no-chokepoints regime abroad, effectively abandoned the enforcement of antitrust law in the United States, under the theory that monopolistic efficiency would result in greater “welfare” for “consumers.” Beginning in 1993, Bill Clinton’s administration applied this pro-monopoly thinking to trade policy, most dramatically through the Uruguay Round of the General Agreement on Tariffs and Trade, in 1994. This one-two punch left corporate managers free to concentrate capacity at home and then to shift that capacity to whichever nation provided them with the most lucrative deal.

And so, over the next quarter century, monopolists and mercantilists concentrated sector after sector, using their power to strip much, if not all, of the redundancy and resiliency from these production and transportation systems on which we depend. 

Our challenge today, as in 1801 and 1912, is to break all potential economic chokepoints at home and assure America’s independence from all chokepoints abroad.

To succeed, our industrial strategy must fit the realities of this moment. Simply attempting to replicate what we did during World War II won’t work. Our situation today could hardly be more different. Then, almost every factory on which we depended was located in the United States. Today, many of our most important factories lie within the borders of China, a strategic rival with whom we are already in industrial conflict. Further, the extreme concentration and tight gearing of today’s production and transportation systems means that any sudden disruption can trigger a truly shattering shock to our nation and the world as a whole.

Our task, in short, is to design and implement a strategy to widely redistribute industrial capacity, within a world in which production and control are now highly chokepointed, without triggering industrial collapse, provoking a catastrophic conflict, or yielding to industrial coercion. Success demands that we view this task as, to a very large degree, a matter of engineering the U.S. and international industrial systems—using law and policy—to ensure their stability.

The lessons of American policy over our nation’s first two centuries point to a relatively simple six-point plan:

The main threat is that China will simply squeeze or cut off shipments of products we need, in order to force the U.S., its allies, or individual Western corporations to cede to some specific demand.

Classify. Our first task is to map every potentially dangerous industrial chokepoint, then determine which pose the gravest threats. In doing so, we must keep in mind that China is not the only nation that might seek to exploit such chokepoints to extort political or economic benefit. We must imagine potential acts by Russia, North Korea, Iran, even our own allies, as well as factions within all these nations. In identifying dangerous chokepoints, we must also imagine the potential effects of natural disasters, financial crises, and third-party wars. This must include every chokepoint within the borders of the United States and our allies, so we know exactly what levers we still hold.

Collaborate. Second, we must work closely with our G-7 allies and other key industrial partners both to identify and classify chokepoints, and to devise a plan to break every dangerous concentration of capacity. Trump’s America First message and some of the more aggressive statements by Biden officials may play well with many voters. But if our goal is to increase our security and reduce tensions as swiftly as possible, blunt protectionism makes for bad policy. We already have an institution purpose-built for such a project: the Organisation for Economic Co-operation and Development (OECD). Let’s put it to its original use again now.

Construct. Third, we must begin to rebuild almost every industrial capacity the monopolists and mercantilists concentrated, roughly in the sequence demanded by the relative level of threat posed by each chokepoint. The CHIPS and Science Act and the Inflation Reduction Act are an excellent start. But it’s vital also to begin immediately to rebuild outside of China the capacity to produce pharmaceutical inputs, antibiotics, industrial materials, electronics components, and agricultural and food system chemicals. This will require a level of investment and coordination far larger than we have yet seen. It will also require the United States to strongly encourage Europeans and other allies to do the same within their own nations, rather than complaining about U.S. investments designed to make all nations safer. 

Thomas Jefferson’s industrial policy was based on the simplest of rules—break all potential economic chokepoints at home and act to ensure America’s independence from all economic chokepoints abroad.

Compete. Fourth, we must restore true competition among manufacturers in all essential industrial systems. Our goal in rebuilding factories in America is not to provide “national champion” manufacturers with a quiet and wildly profitable life. It is to assure a safe physical distribution of industrial capacity around the world, and of ownership and control over that capacity. There are many ways to structure competition to ensure that capacity within the international industrial system remains distributed, once we have rebuilt our factories. Perhaps simplest and least intrusive is to impose a set of negative quotas designed to guarantee that no one nation controls more than a specific fraction of our consumption of any good, component, or material. (I wrote about this in the July/August 2021 issue of Foreign Affairs.)

Converse. Fifth, we must engage China in an unfettered discussion about how to work together to avoid a catastrophic disruption of industrial production, and how to cooperate in redistributing capacity to assure the stability and resiliency of systems. The obvious model is the negotiation the United States undertook with the Soviet Union after the Cuban Missile Crisis to avoid a mutually catastrophic nuclear war. Like any monopolist, China has an interest in maintaining its hold over the chokepoints it controls. It has reaped enormous political and economic benefit from such power. But no matter how confident the country’s leaders may be of prevailing in a winner-take-all industrial showdown with the West, they also know that such a conflict could devastate the victor almost as much as the vanquished. Practically, this will require the United States to demonstrate the will to swiftly escalate any industrial conflict, in order to ensure a constructive balance of fear. This will also require the United States and its allies to demonstrate that their goal is not a full industrial decoupling or permanent hobbling of China’s industrial arts and sciences, but rather a careful reduction of flashpoints and tensions.

Cooperate. Finally, we must relearn how to treat nations beyond the G-7 and China as equals. The original promise of the financial and trading systems established after World War II was to develop the industry and skills of all peoples. For half a century, albeit imperfectly, this worked. Since the mid-1990s, however, the general practice of the richest nations has been to subject production and finance in the global South to ever more rapacious and authoritarian control by Western corporations, banks, and “multilateral” institutions, many of which in turn delivered these national markets to governance by Chinese industry and the Chinese state. Over the past decade, the problem has been made only worse by the political and social effects on these nations of unregulated communications and commercial platforms like Facebook and Twitter, and the more clearly imperial projects of China-controlled platforms such as Alibaba.

The moral reason for reversing this despoilment and disenfranchisement of half the world is obvious. There’s a more selfish reason as well. The swiftest way to fully engage the people of India, Brazil, South Africa, Indonesia, Mexico, and other nations in the project of building a secure and resilient industrial system is to give each a full seat at the table. And the best way to enable these peoples to build their own democracies and societies in the fashion that fits them is to impose traditional regulatory controls on Google, Facebook, Twitter, and other platform monopolists.

In two years, the Biden administration has overseen the greatest change in industrial strategy in America in more than half a century. This is most clear in practical policy, where the White House and Congress have directed more than $200 billion to rebuilding industrial and scientific capacities in the United States. 

It’s also true in terms of the philosophy the government relies on to understand and regulate power in the political economy, at home and abroad. In July 2021, for instance, Biden personally condemned the ideology of Robert Bork, who in the early 1980s led efforts to reorient domestic competition policy around the goal of promoting monopoly in the name of efficiency. Internationally, the White House has renounced the extreme free trade thinking President Clinton embraced in the 1990s. Not only did the Biden administration keep Trump-era tariffs on China, it all but abandoned the World Trade Organization, ignored calls for new multilateral trade agreements, and introduced wide-ranging “Buy American” rules for electric vehicles.

In October, National Economic Council director Brian Deese summed up the efforts in a speech in Cleveland. The United States, he said, was back in the business of using government to shape industrial outcomes. “There’s a strong animating vision that unifies” these efforts, he said, “a modern American industrial strategy.” 

Unfortunately, huge holes both in the conception and the practice of this strategy have left much work undone, and threaten the achievements of the past two years.

If we view the government’s overall effort in relation to the magnitude of the threat, and against the guidance provided by the “Six Cs” plan for rebuilding America’s traditional no-chokepoints system, we can identify many fundamental flaws. These include, foremost:

No overarching plan. The government has yet to devise a plan that sets priorities for what to invest in first, or even a plan to address any sudden break in the international production system. It doesn’t even have a team tasked with refining and overseeing such a plan. This is why the U.S.-China Economic and Security Review Commission recently called for the creation of an “office within the executive branch to oversee, coordinate, and set priorities … to ensure resilient U.S. supply chains and robust domestic capabilities, in the context of the ongoing geopolitical rivalry and possible conflict with China.”

Too narrow a focus. The CHIPS and Inflation Reduction Acts will, in theory at least, jump-start the rebuilding of large parts of America’s semiconductor and electric vehicle production systems. Yet despite many warnings, the United States has yet to devise policies to address the extremely dangerous chokepointing—mainly in China—of chemicals, drugs, electronics, materials, and electronic components. Nor has the government fully developed a plan to address the chokepointing of the supply systems for chips, despite extreme concentrations of capacity not only in China and Taiwan, but also in Japan, South Korea, Europe, and the United States.

No means to compel action. The key political lesson of America’s traditional no-chokepoints strategy is that the state must use anti-monopoly and trade power to force corporations to serve the public and to invest in new capacities and skills. Although Jefferson, Wilson, FDR, and Eisenhower sometimes used subsidies to boost production and research, they understood that without a judicious use of the stick, not even the biggest pile of carrots would keep a horse from bolting once he’d eaten his fill. Unfortunately, as Brian Deese made clear in Cleveland, the Biden administration’s industrial policy thus far consists of one tool only—the use of public investment to subsidize private investment, and profit.

Confusion of policy for strategy. The White House has at various times said its new industrial strategy aims at more jobs, more rapid decarbonization, more opportunity for small business, and less dependence on China. Such microtargeting of the message helps to sell legislation. But if taken literally, it can swiftly distract us from the core structural changes necessary to achieve even one of these goals. The key insight of America’s traditional no-chokepoints system is that structure is strategy, and that we should always focus on shaping markets and the behaviors of corporations to break all concentrations of power. The resulting distribution of capacity and control all but automatically assures the redundancy and stability of industrial systems, while empowering us to respond to any specific shock or political threat. Such distribution of capacity and control also makes it far easier for citizens to use government investment to achieve other political, social, and economic goals, by lessening the political power of those few who favor the status quo.

In the 1940s and ’50s, the Truman and Eisenhower administrations formalized the idea that government must act to ensure that no corporation control more than 25 percent of the market for any industrial good.

No game theory. Over the decades, the U.S. military has developed numerous detailed plans to respond to potential Chinese military actions. But the U.S. government appears to have spent little time thinking through how to respond to a Chinese industrial blockade or embargo, or how its own sanctions on China might affect Beijing’s decisions and actions. As the Financial Times columnist Rana Foroohar wrote recently, the Biden administration is “pushing geopolitical hot buttons at a time when the US has yet to develop a detailed action plan for the economic fallout from such a conflict, or even the continued decoupling of the US and Chinese economies.” Absent such scenario planning, many U.S. actions today border on recklessness.

No master narrative. The White House has put together everything necessary for a story fit to guide the American people through a turbulent period of industrial rebuilding. It has identified villains, in the form of intellectuals whose ideas empowered today’s monopolists. It has set in motion a plot, in which the government both breaks the power of the old guard and subsidizes the rise of the new. It has pointed to the promise of a better world, in the form of stronger democracy, faster decarbonization, a lasting peace. But the White House has yet to weave these elements into a master narrative that explains in simple language how we got here, where we are going, how we’ll get there, and why. Which in turn has left the White House and Democrats in general subject to the misinformation and propaganda of others, be it Republicans yapping about inflation or grand capitalists and their academic lackeys warning of the economic costs of any radical change in policy toward the monopolists and China.

This last failure is especially dangerous. For 20 years now, I and others have raised many alarms about the threats posed by the extreme chokepointing of industrial capacity. This includes writing the first article on the issue in June 2002, publishing the first book on the threat in July 2005, and raising the first warnings about China’s direct control over U.S. corporate leaders in October 2015. 

Over that time, many officials within the U.S. government responded. I myself engaged directly with top leaders in the Treasury and Commerce Departments under President George W. Bush; with top leaders in the Department of Defense, Central Intelligence Agency, and White House under President Barack Obama; and with dozens of senators and members of Congress from both parties, including through testimony on these issues.

We should have mastered all these threats long ago. Yet every time policy makers focused on the need for fundamental change, we allowed those who profit from the chokepoints to distract us from our goals. So too today. Thus far, Biden’s industrial policies enjoy strong support in both parties and from key allies. But the pressure to abandon real reform is building fast.

Germany’s new government has been warning businesses for months about the dangers of dependence on China, and the new chancellor, Olaf Scholz, recently said he was “surprised at how dependent some companies have made themselves on individual markets and have completely ignored the risks.” Yet when members of the government challenged the CEO of the chemical giant BASF for deciding to build an immense new chemical plant in Guangdong, the CEO calmly dismissed the public interest and said, simply, “We have an extremely profitable China business.”

Here in the United States, we have an even more brazen example in Apple CEO Tim Cook. Late last year, The Information reported that Cook in 2016 signed a secret agreement with Beijing. In exchange for a promise by the Chinese Communist Party to relax political and economic pressures on Apple, Cook promised to invest $275 billion in Chinese manufacturing and research. Yet despite being completely dependent for production, research, and profit on a strategic rival of the United States, Cook has personally taken a leading role in fighting new antitrust legislation in Congress designed to lessen the economic and political power of his and other corporations. Which is hardly less audacious than if the CEO of Krupp Steel had lobbied Congress against establishing the Defense Plant Corporation in 1940. 

Or consider the lineup at a recent all-day event at the Cato Institute in Washington, D.C., where participants gathered to discuss the “ascendant political threats” to “the free economy” posed by “antitrust populism, protectionism, business politicization” and “the regulatory state”—in other words, the policies of the Biden administration. What was odd was not that Cato convened the event. It was some of the people who showed up: top economic advisers to President Obama and to John McCain’s 2008 campaign; extreme protectionists and extreme defenders of laissez-faire trade theory; a former FTC commissioner who is one of the main advocates of pro-monopoly competition policy; and Google’s chief economist. All joined their voices to sing the virtues and huzzah the power of monopoly and of China, while wielding the word freedom in ways that would have astonished Orwell himself.

It’s important to understand the game here. The goal of those who oppose this most commonsense distribution of power and risk is not to win any intellectual argument; most now fully understand that they can never restore the magical thinking of the 1990s. Their goal is simply to mislead, bewilder, confound, and delay and delay and delay until once again we lose our way, and fail to throw off the leash the monopolists have fastened on our neck.

Winston Churchill, in a speech in Parliament in October 1938 during debate over the Munich Agreement with Nazi Germany, described the ultimate threat posed by allowing a rival nation to capture an ability to choke off essential supplies. The gravest danger, Churchill said, comes from “our existence becoming dependent upon their good will or pleasure.” Should that ever happen, he warned, “in a very few years, perhaps in a very few months, we shall be confronted with demands [that] affect the surrender of territory or the surrender of liberty.”

That’s why Biden must now devise a true industrial strategy—one designed to break every chokehold on industry—and see that strategy through.

The post Manufacturing and Liberty appeared first on Washington Monthly.

Need a New Economic Vision? Gotcha Covered https://washingtonmonthly.com/2023/01/08/need-a-new-economic-vision-gotcha-covered/ Mon, 09 Jan 2023 01:32:00 +0000 https://washingtonmonthly.com/?p=145102


The American Promise isn’t looking too promising to a lot of Americans. 

Many young people, especially those with a college education, think the country is basically irredeemable. In their view, it was founded on racism, sexism, and genocidal colonialism, and all the Framers’ fine words about equality of opportunity are sick jokes played on marginalized communities. The only possibility of real change, these Millennial and Gen Z Americans believe, is to fundamentally reprogram how average people think and to replace capitalism with Scandinavian-style socialism—and they rate the chance of those happening before they reach an impoverished old age and/or the Earth becomes uninhabitable at about zero.

Meanwhile, at the other end of the age-educational-ideological spectrum, a sizable chunk of the electorate thinks America is decadent. To them, it is obvious that the Christian values they believe undergird constitutional liberties have eroded to the point of near collapse; that illegal immigrants are being invited in to dilute their political power; that wokeness and pedophilia have infested all the country’s institutions; and that if these threats can’t be beaten back through electoral politics, then authoritarian means will have to do.

In between are the majority of Americans. They don’t buy into these apocalyptic worldviews. But they aren’t feeling too cheery about America’s future, either, with the country’s politics so poisonous and their own economic situations so perilous.

It could be that the darkest fears prove accurate, and that America has crossed a point of no return toward inevitable decline. Perhaps the next election will be the last truly democratic one. It’s possible that real or perceived antidemocratic moves by one party will lead to a violent insurgency by followers of the other. At the very least, it’s easy to see the political trench warfare of a country split roughly 50-50, with one party representing the majority of voters and GDP and the other a majority of acreage, continuing to hinder the actions necessary to deal with mass generational downward mobility and the growing oligarchic control of the economy that are the primary fuel of today’s fiery political dissension.

But there is another possibility: that we’ve not crossed a point of no return but have stepped back from it. The defeat of so many antidemocratic MAGA candidates in swing states this past November is one sign, but I think it’s bigger than that. A decade and a half of economic and political turmoil—the Great Recession, followed by the Occupy Wall Street protests, followed by Tea Party–induced near debt default, followed by the election of Donald Trump, followed by the Women’s March and the Black Lives Matter protests, followed by the January 6th coup attempt, followed by rampant inflation and shortages of everything from cars to baby formula, followed by a bloody proxy war against Russia, followed by a midterm in which democracy barely dodged a bullet—might finally have convinced enough Americans, especially elites, that the current economic order is not working and that we need a new one.

The central tenet of the current order, sometimes called “neoliberalism” (not to be confused with the very different meaning that word had for Monthly founder Charles Peters when he used it to describe the philosophy of this magazine), is that markets are best left to run themselves, and that government restrictions on corporate behavior generally impede efficiency and reduce prosperity. Both political parties, to one extent or another, embraced this neoliberal thinking several decades ago. Both, to different degrees, are now backing away from it. The Democrats’ Inflation Reduction Act, with its hundreds of billions of dollars for green energy, is the most obvious example. But even establishment Republicans, responding to their angry base, are trying to put some daylight between themselves and corporate interests. (“I didn’t even know the Chamber [of Commerce] was around anymore,” then House Minority Leader Kevin McCarthy quipped last fall.)

While the old economic order is failing, we don’t yet know what paradigm will replace it—though what it should achieve is clear enough. To be judged successful, a new economic order would need to reverse the income and wealth declines that the bottom 60 percent of the country has suffered for decades. It would have to deliver those gains across the board—to Blacks, Latinos, and working-class whites, in metro and rural areas alike. It would have to protect America from the economic predations of China and Russia and, relatedly, strengthen the fragile supply chains that nearly brought the U.S. and world economies to their knees. Because of this international dimension, a sound economic strategy would need to be attractive to other countries, especially our closest allies in Europe and Asia. Most of all, if there is any hope of repairing America’s fractured democracy, it would have to appeal politically to a broad majority of American voters.   

In think tanks and advocacy groups around Washington, the race is on to provide this new economic vision, but so far, the most talked-about candidates don’t inspire much confidence. Many on the socialist left are promoting hitherto unthinkably large social interventions, like a guaranteed basic income, that are unlikely to gain wide political support—they didn’t even fly with Democratic primary voters in 2020—and in any event are intended to alleviate the failures of a capitalist economy, not fix capitalism. In more establishment center-left circles, where corporate lobbyists carry some weight, the big idea is for government to subsidize “strategic” industries, as the new CHIPS Act does microchip makers, without asking very much of these industries in return. On the right, the excitement is around “postliberal” intellectuals who admire foreign strongmen like Hungary’s Viktor Orbán. These thinkers call (vaguely) for an expanded welfare state to rebuild families and communities, but one guided by a theocratic central government. 

There is another emerging economic paradigm, however, that is more promising. The Biden administration’s crackdown on monopolies reflects this thinking, but it encompasses a wider range of policy tools than just antitrust enforcement. And it is rooted in a theory of politics and economics that derives not from European socialist or reactionary traditions but from a uniquely American one articulated by the Framers, especially Thomas Jefferson and James Madison.

The basic idea is that preserving democracy and liberty requires checks and balances not just on government but on the economy as well; that power, whether political or economic, should not be granted to any one entity, but distributed broadly; and that a stable republic requires that government guarantee individuals (white men at the time, all adult citizens today) not just an equal right to vote but also an equal chance, through possession of assets (a farm, a business, professional or trade skills, and so on), to make a respectable living. 

This tradition, sometimes called civic republicanism, animated many of the country’s most successful economic reforms. These include the Northwest Ordinance of 1787, which provided small, reasonably priced land holdings and free public schools to western settlers, including Black people (though at the expense of Native Americans); the Civil War–era Morrill Act, which created land grant colleges to “promote the liberal and practical education of the industrial classes”; the Sherman and Clayton Antitrust Acts, which gave federal agencies authority to break up monopolies and open markets for entrepreneurs; the 1963 Higher Education Facilities Act, which funded the building of community colleges; and the 1965 Higher Education Act, which, for the first time, provided federal financial aid to any aspiring student, regardless of class, race, or gender, who lacked the means to go to college. 

Beginning in the 1970s, however, America was hit by a brutal combination of recession, inflation, and rising international competition that seemed to defy conventional solutions. Perplexed policy makers turned to the idea that unshackling markets from government restraint was the answer. For a while, that seemed to work. But over time, this libertarian approach led to (among other foreseeable evils) monopolization, deindustrialization, plummeting entrepreneurship, Gilded Age levels of income and wealth inequality, billionaires corrupting the political process, supply chains stripped of resiliency, and, once again, inflation. This free market ideology was even applied to non-profit sectors of the economy, like hospitals and colleges. The result was greater privilege for the wealthy, soaring costs for the many, and scandalous treatment of the poor.

If this history sounds familiar, it may be because versions of it have appeared in hundreds of separate articles in the Washington Monthly over the years. But we’ve never really pulled the different strands of our work and thinking together. In 2023, we are going to try to do that, with the aim of offering an integrated theory that can compete in the contest now under way to replace the neoliberal regime. 

That effort starts with this issue, which features a package of articles that unearth civic republican approaches to past economic dilemmas that can be applied to current ones. Nicholas Lemann explains how the concept of “political economy,” a central focus of American politics from the founding to World War II, disappeared from the national conversation but is making a much-needed comeback. Phillip Longman reports on a largely forgotten statute, the Robinson-Patman Act, which barred price discrimination based on market power in retailing, and how that law can be used to combat today’s inflation, shortages, and inequality. And Barry Lynn offers a treatise on how to eliminate dangerous chokepoints in international supply chains by rebuilding a version of industrial policy that Washington foolishly abandoned decades ago. 

Elsewhere in the issue, James Fallows provides the latest in a series of dispatches he and his wife, Deborah, are writing on colleges that are using their power and resources to improve the economic and civic life of their local communities.

I can appreciate the deep pessimism many Americans feel about the country and its future. It is often born of personal experience and a keen appreciation of America’s (often monstrous) failings. But while I understand the feeling, I don’t share it. Quite the opposite. Not to get all state-of-the-union on you, but I’m convinced that America’s best days are ahead of us, and that this country’s uniquely rich endowment of institutions, resources, and culture provides it with a greater potential to achieve broad-based prosperity than any other country on Earth, by a mile. But to tap that potential will require a plan, a vision. Watch this space.

The post Need a New Economic Vision? Gotcha Covered appeared first on Washington Monthly.

Emancipation Relocation https://washingtonmonthly.com/2023/01/08/emancipation-relocation/ Mon, 09 Jan 2023 01:30:00 +0000 https://washingtonmonthly.com/?p=144997


In 1848, at a time when Black families were routinely ripped apart and Black bodies were bought and sold in public squares as chattel, a Black man and his mixed-race wife pulled off one of the most daring and ingenious feats of self-emancipation imaginable. Defying conventions of race, class, and gender, William and Ellen Craft of Macon, Georgia, transformed their appearances and engineered an extraordinary flight to freedom. Their daring escape energized the abolitionist movement and helped change the course of history.

Master Slave Husband Wife: An Epic Journey From Slavery to Freedom by Ilyon Woo, Simon & Schuster, 416 pp.

Ilyon Woo’s new book, Master Slave Husband Wife, lifts the curtain on a largely unknown chapter in America’s complicated racial history. In her first book, The Great Divorce: A Nineteenth-Century Mother’s Extraordinary Fight Against Her Husband, the Shakers, and Her Times, Woo wove history and narrative together in a compelling look at 19th-century women’s rights. In this book, she tells the true story of a courageous married couple who challenged slavery, America’s original sin. She also sheds light on America’s original blessing—the efforts of free Blacks and people of all races to answer the clarion call of the Declaration of Independence by risking their own “lives, fortunes, and sacred honor” in the fight to end slavery and racial discrimination. 

William Craft, a skilled cabinetmaker, was the property of Ira Hamilton Taylor, a banker who rented him out to a local shop owner. Ellen Craft was the daughter and property of James Smith, who had enslaved and impregnated her mother. When James’s wife could no longer tolerate Ellen’s presence in the house, Ellen was given to her white half sister, Eliza, as a wedding present. She then became the legal property of Eliza’s new husband, Dr. Robert Collins. 

Both Ellen and William were trusted “favorites” of their owners. But favored slave status was no match for their dream of a free life together. William and Ellen first met in 1841, when he was 18 and she was 15. As their relationship deepened, Woo writes, Ellen made clear that she would not marry or bear children until they escaped bondage, “not until her own body—and therefore her children—belonged to her.” Ultimately, however, Ellen relented. Though denied a sanctified Christian wedding, with the permission of their owners they “jumped the broomstick”—a marriage tradition among enslaved people—in 1846. 

Two years later, with Ellen disguised as a wealthy, white, disabled slave-owning man, and William playing the role of her attentive slave, they began a harrowing flight from bondage that took them from Macon through Philadelphia and Boston, on their way to Canada. Traveling by train, boat, steamship, and the Underground Railroad—and relying on Ellen’s ability to pass for white, on her seamstress skills, and on William’s abundant creative instincts—they escaped in plain sight, always one step ahead of capture. After sneaking through the streets of Macon as master and slave, they arrived at the train station. There, they were able to elude their first pursuer, the cabinetmaker from the shop where William worked. The man had tracked them to the station and was looking into train windows for the fugitives when the train pulled out. Ellen also hoodwinked a station porter who had previously known and been interested in her. “This man now called her ‘Young Master’ and thanked her for the tip she gave him,” Woo recounts. From there, it was a four-day race to the Mason-Dixon Line and free soil. But four days was hardly the end of it. In fact, the chase had just begun.

Woo reminds us that the Fugitive Slave Act—first signed into law by President George Washington in 1793, and later toughened in 1850 by President Millard Fillmore—authorized local and federal government officials to hunt down, capture, and return escaped slaves to their owners and imposed penalties on anyone who aided their flight. This meant that no matter how far north William and Ellen ran, as long as they were on American soil they were in constant jeopardy of being kidnapped and returned to bondage in Georgia. 

Following the passage of the 1850 Fugitive Slave Act, private citizens in northern states formed so-called Vigilance Committees to protect escaped slaves and thwart the efforts of professional bounty hunters. When Robert Collins sent two men, Willis Hughes and John Knight, to Boston to bring back the Crafts, the Boston Vigilance Committee came to the rescue. With the help of sympathetic local attorneys, commissioners, and judges, the committee orchestrated a series of judicial delays. Hughes and Knight were arrested for conspiring to kidnap the Crafts. Upper-class Bostonians mocked the uneducated slave catchers, and street boys threw spoiled eggs and garbage at them. As Woo writes, “It was too much for the Georgians to bear. They had left Macon as heroes and expected to return triumphant, captives in hand. Instead, they had been the ones chased, ridiculed, spat upon, hunted down by law, man, woman, and child.”

After that incident, the Crafts were advised to keep running to Canada. But they were tired of running. At the urging of the charismatic ex-slave William Wells Brown, they took a detour to join him as powerful speakers on the antislavery lecture circuit. They toured Europe, and then settled in England. This was another pivotal act of courage that elevated them on the international stage. With William regaling audiences with the dramatic retelling of their bondage and escape and Ellen appearing as “the white slave,” their lectures raised much-needed money and support for the abolitionist movement.

In chronicling this expansive saga, Woo does not shy away from recounting some of the most heinous horrors of slavery. But she simultaneously introduces us to the compassion and commitment of a number of white abolitionists, including Quaker families and people like William Lloyd Garrison, Lucy Stone, Robert Purvis, and the Reverend Theodore Parker. These and other men and women like them formed allegiances with Frederick Douglass, William Wells Brown, and other free Blacks on the frontlines of the 19th-century antislavery movement. A number of these activists played key roles in the remarkable self-emancipation journey of William and Ellen Craft. In fact, it was Parker, a Unitarian minister and avid abolitionist, who ultimately persuaded the bounty hunters Hughes and Knight to abandon their mission.

From their new home in England, the Crafts continued their activism; wrote a book, Running a Thousand Miles to Freedom; founded a school in Africa, and another in Georgia; and mobilized support for the Union side during the American Civil War. They also fulfilled their dream and raised six freeborn children. 

Parts of Woo’s story unfurl with dramatic cinematic sweep, but ultimately this is a meticulously researched work of narrative nonfiction, documented with a bibliography and extensive notes. Woo writes in the opening overture, “Though propelled by narrative, this work is not fictionalized. Every description and line of dialogue originates in historic sources.” 

Woo’s account provides the backdrop for the larger story of the roots of America’s continuing racial divide and the sacrifices many have made to create a more perfect union. We have made great strides since the fight over slavery led to a bloody civil war, but fissures remain, and lingering vestiges of the past still afflict many communities. These include the post–Civil War “Black Codes,” which sanctioned racial discrimination; Jim Crow; redlining; and mass incarceration. 

Is it coincidence that the current voting rights debate is center stage in Georgia, the state that denied dignity and voting rights to William and Ellen Craft? Can we deny that there is a link between the reality that as an enslaved couple, the Crafts were forbidden to learn to read or write, and the separate and unequal education that continues across the United States today? Clearly, the journey to freedom isn’t over. Master Slave Husband Wife is a welcome addition to a growing effort to fill historical gaps and tell the unvarnished truth about the past so we can build a better future. It is a riveting American saga and a teachable moment for these times.

The post Emancipation Relocation appeared first on Washington Monthly.

Master Slave Husband Wife: An Epic Journey From Slavery to Freedom by Ilyon Woo, Simon & Schuster, 416 pp.
Medicine at the Mercy of Wall Street   https://washingtonmonthly.com/2023/01/08/medicine-at-the-mercy-of-wall-street/ Mon, 09 Jan 2023 01:25:00 +0000 https://washingtonmonthly.com/?p=145007

The post Medicine at the Mercy of Wall Street   appeared first on Washington Monthly.

The pharmaceutical industry is the great white whale of American medicine. No matter how many harpoons activists, progressive politicians, journalists, and scholars hurl at its bloated body, it not only survives, it grows fatter by feasting on the patients and payers that are the krill of the U.S. health care system.

Capitalizing a Cure: How Finance Controls the Price and Value of Medicines by Victor Roy University of California Press, 245 pp.

The drug price controls in the recently enacted Inflation Reduction Act (IRA)—touted as the first-ever defeat for the drug industry lobby in Washington—offer the latest example of how the industry manages to outrun its harpooners. While the new law finally gives the federal government the power to negotiate drug prices for seniors (who account for just a third of the nation’s drug spending), intense industry lobbying limited its reach to just 10 drugs starting in 2026, growing to only 20 drugs in 2029.

The law does not apply to drugs purchased by private payers, who cover more than half the population. It does nothing to rein in launch prices for new drugs, which have increased from $1,376 in 2008 to $159,042 in 2021. (The median price for drugs launched in 2022 has reached a staggering $257,000 per year!) And its hard-to-enforce provision giving the government the right to claw back price increases above the inflation rate will undoubtedly be subjected to extensive industry opposition during the rule-making process and eventually in the courts.

The industry’s public posture during the debate leading up to passage of the IRA was little changed from its historic justification for high drug prices. The industry’s argument, reduced to its essence, is a form of blackmail targeting patients with chronic and incurable diseases. PhRMA, the industry’s lobbying group, repeatedly says that without high prices, industry investment in research and development will decline and medical innovation will wither. It is the same argument the industry made in the late 1950s when Senator Estes Kefauver held hearings on the antibiotic cartel; in the early 1990s when the first biotechnology drugs came to market at exorbitant prices; in the mid-1990s when AIDS activists protested the high price of the new medications that turned their death sentence into a manageable disease; and in the early 2000s when President George W. Bush, anxious to eliminate any potential roadblock to his reelection, pushed through a Medicare prescription drug benefit with no constraints on industry’s pricing power.

But while the industry’s public posture hasn’t changed, its behind-the-scenes argument has shifted subtly in the past decade. Without abandoning its false claim to be the fount of innovation, its top executives and their enablers in think tanks, academia, and patient advocacy groups (mostly funded by the industry) have added the assertion that the high prices charged for the latest FDA-approved drugs are justified by the value they bring to patients and the economy.

To back that claim, the industry applies cost-benefit analysis to pharmaceuticals. Using patient outcomes data gleaned from the clinical trials submitted for Food and Drug Administration approval of a new drug, industry economists measure the number of quality-adjusted life years (QALYs) gained by its use, calculate a net present value for all the personal and economic benefits accrued by averting downstream disease, and set a price that is slightly below that total. Voilà. Price justified.

It is that argument, and the industry’s claim that its central role in the innovation process justifies its capturing the lion’s share of that value, that Dr. Victor Roy, a postdoctoral fellow at Yale University, effectively demolishes in his new book, Capitalizing a Cure. The book, which grew out of Roy’s doctoral thesis at the University of Cambridge, conducts a deep dive into the development and marketing of Gilead Sciences’ Sovaldi, the hepatitis C drug whose $84,000 price tag for a 12-week course sent shock waves through patients, payers, the press, and the public after it was approved by the FDA in late 2013. Roy convincingly shows through this example how venture capital, Wall Street, and the industry’s top executives have turned small biotechnology firms and Big Pharma corporations into vehicles for extracting wealth from the health care system, even as these ostensibly health-promoting companies deny access to millions of needy people at home and abroad and undermine the financial well-being of patients and payers.

Roy begins his story with a familiar tale: how government-funded academic researchers were largely responsible for the development of the drug sofosbuvir, which Gilead later named Sovaldi. (I say familiar because I published a book on this subject in 2004 that covered medical innovation in the last quarter of the 20th century, which, full disclosure, Roy generously credits.) This government-to-industry development path is, if anything, even more central to the drug development process today than it was two decades ago. Government-funded research lies behind the development of the COVID-19 vaccines; the latest cancer therapeutics, like CAR-T; and new drugs for treating many rare diseases.

Roy also reminds readers that at the dawn of the neoliberal era, it was deliberate government policy to turn the fruits of its research over to private industry without any strings attached. The Bayh-Dole Act of 1980 allowed the National Institutes of Health and universities housing government-funded scientists to patent and transfer (for royalties, of course) their scientific discoveries, research tools, and drug candidates to private developers. The 1982 Small Business Innovation Development Act accelerated the process by creating small business innovation research (SBIR) grants, which primarily went to biotech start-ups to develop these new tools and drugs. The new laws weren’t limited to biomedicine, of course, but surveys of university technology managers show that four out of every five transferred patents and SBIR grants involve medical technologies. That’s not surprising, given that the NIH’s budget—$45 billion in 2022—consistently weighs in at about five times that of the National Science Foundation, which funds all the other sciences.

Hepatitis C is a bloodborne pathogen that causes liver disease. It is primarily found in current or former intravenous drug users and people at risk of sexually transmitted diseases. In the mid-1990s, it became a prime target for academic researchers who had been involved in the hunt for an AIDS cure because the genetic makeup of the two viruses is similar. These researchers included Emory University’s Ray Schinazi, who in 1996 created a biotech company called Triangle Pharmaceuticals to develop an AIDS drug discovered in his university lab called emtricitabine. By 2004, with emtricitabine showing great promise in clinical trials, Schinazi and his partners sold Triangle to Gilead Sciences for $464 million, laying the foundation for that company to become the leading purveyor of AIDS antivirals. Schinazi cleared a third of the $200 million lavished on emtricitabine’s developers through the sale of their start-up’s stock. 

He used that capital to launch another company, Pharmasset, to develop drugs for other viral diseases, including a candidate for treating hepatitis C, which had also been developed with government grants. As Roy points out, the company’s name embodied its business strategy. The idea was to develop intangible financial assets—patents on promising drug candidates—that could then be sold to Big Pharma. Less than a decade later, Schinazi became a repeat winner in the biotech sweepstakes when he sold Pharmasset to Gilead for $11 billion, from which he cleared an estimated $440 million. 

How could a small biotech company with only one promising drug for hepatitis C—a disease that infected only 4 million Americans and 15 million people worldwide, only 30 to 40 percent of whom would develop liver disease—sell for that staggering sum? The only existing treatment, interferon, cost over $30,000 per course, helped only about half of patients, and had severe side effects. In Pharmasset’s early efficacy trials, sofosbuvir had shown that it could clear the virus in well over 90 percent of patients. It was all but an assured bet for the Big Pharma company that bought it; and, given its greater efficacy and sharply reduced side effects, sofosbuvir could command a price that was more than twice that of interferon.

The drug’s eventual price had nothing to do with the cost of development (Roy estimates that the government, Pharmasset, and Gilead spent less than $1 billion over the decade it took to develop the drug); the risks Gilead took; or the value the drug delivered to patients and the broader economy. Roy writes,

Gilead’s senior leadership saw their company as a late-stage acquisition specialist, buying compounds in their final steps of development and thereby taking control of potential future earnings streams just as the compounds neared and then crossed the regulatory finish line … 

Gilead’s approach had by then become common across the industry. [Emphasis in original.]

Though from a science and regulatory perspective sofosbuvir was Secretariat, Gilead’s bet paid off like a long shot. Drug purchasers coughed up more than $46 billion in the first three years sofosbuvir-containing products were on the market—four times Pharmasset’s purchase price and 50 times the amount invested in R&D by all parties. “Gilead’s power to project this future drew on two sources: its anticipation of acquiring Pharmasset’s intellectual property and gaining monopoly power over prices; and its confidence that health systems could be compelled to pay more for a better drug,” Roy writes.

Only after Gilead set its price did it turn to the new argument that it reflected good value for payers and patients. For that, the company relied on high-powered health economists whom it funded in academia. Looking at the savings from reduced liver transplants and hospitalizations, one study, funded by Gilead and published in Health Affairs, estimated that giving sofosbuvir-based treatments for hepatitis C could generate $610 billion to $1.2 trillion in value to the U.S. economy and $139 billion in health care cost savings—even though people with advanced liver disease from hepatitis C rarely get liver transplants. Amitabh Chandra of the Kennedy School of Government at Harvard made a similar argument in the Harvard Business Review, where he also disclosed funding from Gilead.

Even as these academics were defending Gilead’s extraordinarily high price, the company was using the largest portion of its windfall to buy back stock, lavishly reward its top executives, and renew its hunt for new drug candidates on Wall Street. Meanwhile, public payers like the Veterans Administration, Medicaid, and the nation’s prisons had to ration access to the drug. The denials of care “disproportionately fell on those populations at the most risk for worsening hepatitis C as well as transmission of the infection: low-income patients and those with a history of injection drug use,” Roy writes.

Is there any evidence to suggest that the arrival of Sovaldi created significant value from a health care perspective? After all, it is a miracle drug. It wipes out the infection in almost all patients with only a three-month course of treatment. Yet, according to the Centers for Disease Control and Prevention, there are still anywhere from 2.7 million to 3.9 million people in the U.S. living with hepatitis C, only slightly below where we were a decade ago. Why? There are more than 100,000 new infections every year, in part because access is limited by the drug’s high price. Moreover, there were 9,236 liver transplants in 2021, the highest number ever, according to the United Network for Organ Sharing. The total has gone up in every year since the FDA approved sofosbuvir. 

In other words, by allowing publicly funded research to be turned into a privately held financial asset; by allowing venture capitalists and Wall Street to drive up the price of that asset; by allowing a private corporation to set a maximum price point for that asset; and by watching hired economists justify that price point using questionable value metrics, the U.S. health care system has created the ultimate unvirtuous circle. Pricing for value as Wall Street defined it made rationing inevitable and turned a significant breakthrough by medical science into a setback for both public health and fiscal sustainability.

Roy’s book concludes, as all would-be harpooners’ tales must, with an alternative vision for developing innovative medicines. First, reformers must break the cycle that allows academic scientists and their venture capitalist backers to turn publicly financed cumulative knowledge into monetizable assets through the patent system. Once patent control is turned over to biotech start-ups and big drug companies operating as acquisition specialists, the inevitable outcome is a system that maximizes returns to venture capitalists and the big firms’ stockholders and executives even as it ignores the needs of most patients, payers, and public health.

It also debases the scientific process by emphasizing the development of drugs with the greatest revenue potential, which, Roy notes, “reduces companies’ appetite for making the long-run and risk-laden investments needed to create breakthrough medicines.” Instead, too many companies invest their own R&D dollars into me-too drugs that replicate products already on the market. And, even when a breakthrough drug like sofosbuvir comes along, the patent system as presently constructed incentivizes firms to postpone development of improvements until existing patents expire, which in turn leads to the high prices, rationing, and patent gaming that maximize the revenue stream over the drug’s patent life.

Instead, Roy resuscitates a vision for developing innovative technologies that was first articulated by New Deal–era Senator Harley Kilgore of West Virginia. In contrast to the FDR science adviser Vannevar Bush, who thought the government should stick to basic science, Kilgore called for public financing of the entire development process and a patent system that protected government-financed inventions from private-sector profiteering. Roy calls for the creation of a publicly financed Health Innovation Institute that would take responsibility for developing government-funded inventions, all the way from perfecting the molecules to financing final clinical trials. The goal would be to price drugs closer to their manufacturing costs so that access and affordability would no longer be problems.

The idea is not unique to him, nor is it far-fetched. Indeed, there are many examples where government has performed nearly every task involved in a drug’s development. These range from developing the process for mass production of penicillin during World War II to running trials for the earliest AIDS drugs to doing everything from start to finish for the first hormone replacement treatments for rare diseases caused by genetic mutations. Since the 1970s launch of the war on cancer, government has financed an extensive academic network for conducting cancer clinical trials. It remains to be seen if President Joe Biden’s newly created Advanced Research Projects Agency for Health at the NIH will include technology development as part of its mission.

The problem is not skill; it is political will. The one good thing you can say about the financialization of drug development is that it provides a huge incentive for private investors to invest over many years in biotech start-ups. R&D for new drugs takes a long time and, in most cases, does not pan out. To hedge against failure, venture capitalists take a portfolio approach. The gargantuan payoff from the one drug in 10 that succeeds not only pays for the failures but also provides a more than generous return for investors.

A government-run public option alternative would have to take a similar long-term approach—without the promise of huge returns other than improved public health and cheaper medicines. That requires permanent funding (perhaps a surcharge on all drug expenditures, something like the gas tax that funds roadbuilding) and insulation from political manipulation.

It also does not deal with the legacy problem that the public already pays far too much for many drugs. Here, I think, Roy is too dismissive of the nascent price controls in the IRA. The camel’s nose is inside the tent. The political capital needed to create an effective drug development agency is even greater than what it would take to expand the government’s drug price negotiating authority and eliminate patent gaming, two reforms that would provide a more immediate counter to the problem of drug prices that are just too damn high.

The Plumbers Who Couldn’t Fix a Leak https://washingtonmonthly.com/2023/01/08/the-plumbers-who-couldnt-fix-a-leak/ Mon, 09 Jan 2023 01:20:00 +0000 https://washingtonmonthly.com/?p=144898

The post The Plumbers Who Couldn’t Fix a Leak appeared first on Washington Monthly.

For me, the puzzle of Watergate is why Richard Nixon, who wasn’t responsible for the Democratic National Committee break-in, decided to lead the cover-up just days after the burglary undertaken by a band of White House–led dirty tricksters known as “the Plumbers.” 

The White House Plumbers: The Seven Weeks That Led to Watergate and Doomed Nixon’s Presidency by Egil “Bud” Krogh and Matthew Krogh St. Martin’s Griffin, 208 pp.

What made the president order the CIA to shut down the FBI probe of the scandal—a fateful decision documented on the June 23, 1972, White House recording whose release in 1974 finally ended Nixon’s presidency?

The answer, it turns out, lay in earlier White House crimes that even Nixon’s tough-guy friend, attorney general, and fellow criminal John Mitchell would christen the “horrors.”

Chief among those horrors was the 1971 break-in at the office of Lewis Fielding, the Beverly Hills psychiatrist whose patient was Daniel Ellsberg, the man who leaked the famed Pentagon Papers to The New York Times. Nixon believed that Ellsberg wasn’t merely a misguided or malevolent liberal. The president had no doubt that the Santa Monica, California–based RAND Corporation employee was part of a conspiracy directed by Moscow to sabotage the American war effort in Indochina and the nation’s defenses. (There’s no evidence to that effect whatsoever.) Nixon feared that Ellsberg’s geyser of leaks wouldn’t stop with the lies and catastrophic decisions in Southeast Asia made by his predecessors, which is what the Pentagon Papers were primarily about; next, he feared, would come the lies and catastrophic decisions of his own Vietnam policy.

Anger over Ellsberg’s disclosures, and fear of more to come, led to the creation of the White House Plumbers—a mysterious, and at times comic, cabal of hardball political operators, intelligence and law enforcement veterans, and other oddballs. The name came from one of the members telling his grandmother that his job at the White House was to stop leaks. “Oh, you’re a plumber,” she said naively. The name stuck. The group’s members included a former intelligence operative and author of espionage novels (E. Howard Hunt); a right-wing mustachioed former FBI man (G. Gordon Liddy) who’d go on to be a staple of right-wing talk radio; and a cynical, seasoned political operative (Charles Colson), who later became a widely admired born-again founder of a nationwide prison ministry.

The White House Plumbers tells the story of one plumber—Egil “Bud” Krogh, a most unlikely crook. A 31-year-old attorney, track star, and Navy veteran at the time of the Ellsberg break-in, Krogh came to the White House through John Ehrlichman, a family friend from Seattle and Nixon’s infamous right hand. (Krogh occasionally babysat the Ehrlichman kids.) Ehrlichman was assistant to the president for domestic policy, and Krogh his deputy. While that’s usually a nerd’s job, it was political and criminal under Nixon, and even occasionally fun. Krogh wound up handling Elvis Presley’s famous letter to the president, in which the King requested to become a “federal agent at large” helping prosecute Nixon’s war on drugs. “The drug culture, the hippie elements, the SDS, Black Panthers, etc. do NOT consider me as their enemy or as they call it The Establishment,” Elvis wrote Nixon. “I call it America and I love it. Sir, I can and will be of any service that I can to help The Country out.” Krogh recommended the meeting that Nixon took and would later write a book about it.

But the life-altering moment for Krogh came when he was assigned to direct the Plumbers’ first project, breaking into Fielding’s office. (Others came up with the original idea of going after the psychiatrist to dig up dirt that could discredit Ellsberg, the Pentagon Papers, and the antiwar movement.) A good part of this slim volume—written by Egil Krogh with a preface by his son Matthew and published two years after Bud’s death in 2020—is devoted to why such a straight arrow became an integral member of one of America’s most famous crime gangs. (A version of this memoir was published in 2007 by Public Affairs under the title Integrity: Good People, Bad Choices, and Life Lessons From the White House.)

“Dad had all kinds of metaphors for integrity,” Matthew writes in the book’s preface, “but he spent years, decades, trying to understand why his own integrity was breached.” 

For Bud Krogh, the turn toward darkness was partly about blind loyalty to higher-ups, but he was ultimately swayed by Nixon’s notion of “national security.” Breaking into Ellsberg’s psychiatrist’s office wasn’t a politically self-serving crime but a battle in the Cold War. As Bud Krogh writes, “We believed then that these leaks constituted a national security crisis and needed to be plugged at all costs. But we were wrong, and the price paid by the country was too high.” 

“I really need a son of a bitch,” Nixon told Ehrlichman when he was looking for someone to run the operation, saying, “I’ll direct him myself. I know how to play this game, and we’re going to start playing it. I want somebody as tough as I am for a change.”

Bud Krogh proved a surprising but fitting choice: “Even though I don’t think I fit the dark profile the president wanted for the job,” he writes in White House Plumbers, “perhaps a simpler reason I got it was that it was understood that I would do it to the best of my ability and not ask questions.”

Nixon had Krogh read the chapter in his 1962 memoir, Six Crises, on the Alger Hiss case, a Cold War cause célèbre. Hiss was a State Department official who had been accused of aiding the Soviet Union and was convicted on perjury charges after he squared off at a riveting congressional hearing with Whittaker Chambers, the “red” turned right-winger and zealous anticommunist. Nixon championed Chambers and his accusations, while liberals and even many Republicans took the side of the polished, Ivy League–educated Hiss. 

Nixon tried to convey to Krogh that the leaker of the Pentagon Papers was as guilty of Soviet espionage as was Hiss, who was convicted of perjury over his denial that he had passed State Department information to the Soviets. Krogh writes,

Nixon wanted me to understand unequivocally that he viewed the problems with Ellsberg’s release of the Pentagon Papers as a full national security crisis, one comparable to the career-defining—for him—conviction of a traitor in the full glare of publicity in 1948. Nixon was offering me the chance to succeed as he had succeeded and to draw the obvious inference about what such a success might portend for my own future career in government.

Of course, it didn’t turn out that way. None of it did. Unlike the glory that Nixon’s exposure of Hiss brought the 35-year-old California Republican congressman in 1948—a successful Senate bid in 1950, the vice presidential nomination and election in 1952—Krogh was stuck cleaning up Tricky Dick’s malevolent buffoonery without the Cold War intrigue. Nixon had also held out the Ellsberg break-in as a peace mission. More leaks from Ellsberg, the president insisted, might scuttle his embryonic peace overtures to the North Vietnamese. 

What’s delightful about White House Plumbers is that Krogh’s story carries the dread of one of the Halloween movies where the hero feels the danger but can’t resist courting it. 

“Extreme illegal acts were undertaken to prevent this discovery, including perjury, obstruction of justice, and the payment of hush money to the perpetrators of the 1971 crime to keep them from revealing it during the Watergate investigations,” Krogh writes. “But the burglary of Dr. Fielding’s office constituted the most extreme and unconstitutional covert action taken to that date, setting the stage for the downfall of the Nixon presidency. Once taken, it was an action that could not be undone or explained away.”

Without giving it all away, it is Krogh’s honesty in his account—especially his guilt—that makes the story work. This confessional tone makes comparing Krogh to former White House Counsel John Dean almost irresistible. Like Dean but without the White House counsel’s mesmerizing testimony before the Senate Watergate Committee, Krogh came clean and took his lumps. (Dean’s famed memoir, Blind Ambition, was written with the assistance of Washington Monthly contributing editor Taylor Branch.) Krogh’s story is coming out posthumously and will soon be a five-part HBO series with Mad Men’s Rich Sommer as the naïf Krogh, Woody Harrelson as Hunt, and Justin Theroux as Liddy. The story is cinematic, as ripe for comedy as it is drama. 

Liddy has been portrayed in films before and became a popular right-wing radio talk show host. His memoir, Will, includes accounts of his holding an open palm over a flame to test his capacity to withstand pain. At one point, he seemed to hint that lives might be lost in the Ellsberg operation. As Krogh writes,

Liddy was the kind of guy you’d want next to you in a foxhole, where he’d cover your back and take a bullet to save your life. He projected a warrior-type charisma and seemed to possess a great deal of physical courage. He was tough, smart, disciplined, and loyal. During the Watergate investigations Liddy never “squealed” or “snitched” on anyone.

For all of the cloak-and-dagger drama, the Fielding break-in was botched, just like the Watergate break-in. The anti-Castro Cubans whom Hunt recruited to perform the actual break-in screwed up. A door that was to be left open was not. The culprits chose to cover their tracks by trashing the office, making it appear that a burglar—perhaps an addict looking for drugs—had ransacked the place. The super sleuths couldn’t find any notes on Ellsberg’s therapy sessions but kept a photograph of Liddy posing in Fielding’s parking place during a “reconnaissance mission,” a snapshot later discovered by the CIA, which had assisted the operation.

The Plumbers trampled Fielding’s constitutional rights, and that’s about all they got.

Egil “Bud” Krogh pled guilty to his role in the Fielding break-in and served six months in prison. “You come in here as a white man, a lawyer, a Nixon dude,” his cell mate admonishes him. “Don’t you never hold yourself out better than anyone else in here.” A contrite Krogh holds that truth closely. As he writes,

While the idea for the Fielding break-in originated with Hunt and Liddy, I fully endorsed their recommendation. In fact, I had pushed them hard for aggressive action without fully understanding what that might entail. Because I could have stopped the operation and didn’t, I was fully responsible.

As for Krogh’s reckoning with his sins, there’s a scene in the memoir I love. It’s on the campus of the University of California, Berkeley, where the Watergate special prosecutor, Leon Jaworski, interrupts a speech he is giving on “Morality in Government.” 

“One of the men who was involved in this case is in our audience tonight,” said the courtly Texan who replaced Archibald Cox, the prosecutor Nixon had had fired. “What is more, he asked for no favors or special privileges, from the prosecutor or the court. He said he found his own conduct indefensible, and he was willing to take the punishment for what he had done.”

Slowly and with some reluctance, a man in the audience took to his feet.

“This is Egil (Bud) Krogh,” Jaworski said. 

“I do not know how many political rallies I have attended,” Jaworski wrote later and is quoted by Krogh. “But I have never seen or heard anything quite as genuine as the emotion that crowd gave to Bud Krogh, an ex-lawyer who had just been introduced by the man who sent him to prison.” 

For Egil Krogh, “atonement for his impact on America, on the executive branch, on the rest of us,” his son writes in the preface, became “a core theme of his life, since he got out of prison in 1974.” Indeed, Krogh’s penitence helped him get his law license back, and he ran a private practice in Seattle.

 A witness to that personal resolution was Jaworski, who added at the lecture: “The enduring question of Watergate is whether we, as a people, will learn from it. Some have.”

The post The Plumbers Who Couldn’t Fix a Leak appeared first on Washington Monthly.

The White House Plumbers: The Seven Weeks That Led to Watergate and Doomed Nixon’s Presidency by Egil “Bud” Krogh and Matthew Krogh St. Martin’s Griffin, 208 pp.
The Courage and Compromises of George P. Shultz  https://washingtonmonthly.com/2023/01/08/the-courage-and-compromises-of-george-p-shultz/ Mon, 09 Jan 2023 01:15:00 +0000 https://washingtonmonthly.com/?p=145013

When he died, at the age of 100, in 2021, George P. Shultz was widely hailed as a consummate pragmatist who represented a type of sober conservatism that has been almost entirely eradicated in the modern Republican Party. A charter member of the foreign policy establishment, he occupied four cabinet positions, including secretary of labor and secretary of state, in the administrations of Richard Nixon and Ronald Reagan. In those posts, he earned his reputation for statesmanship by espousing enlightened policies on issues like civil rights and playing a key role in ending the Cold War. His death was taken as the symbolic passing of a bygone style of judicious and sane conservative governance that predominated in Republican administrations before Donald Trump took office. 

In the Nation’s Service: The Life and Times of George P. Shultz by Philip Taubman Stanford University Press, 449 pp.

But did Republican administrations ever follow Shultz’s principled brand of politics—or, in fact, did Shultz himself? Philip Taubman’s new biography of Shultz, In the Nation’s Service, offers a more complicated assessment of the well-known government official and of the modern history of the GOP. Taubman, a former reporter at The New York Times, first encountered Shultz when he was Reagan’s secretary of state. Decades later, Shultz asked Taubman if he would like to write his biography and promised exclusive access to his papers, which were housed in a sealed archive at Stanford’s Hoover Institution. Taubman has extensively drawn on them to show that Shultz had real accomplishments but failed to stand up for his principles against unscrupulous conservative operators at key moments during the Nixon and Reagan presidencies. His story hints at the GOP’s long tradition of struggling—and often failing—to check the callous self-interest and viciousness that came to define the Trump White House. 

Taubman traces Shultz’s innate conservatism back to his father, a lifelong Republican who worked on Wall Street and emphasized the importance of social and professional status. As a teenager, Shultz viewed FDR’s administration with misgivings, recalling that he felt he was “seeing this big intrusion and hoping it would work and realizing it didn’t work very well.” After serving in the Marines in the Pacific theater during World War II, Shultz entered graduate studies at MIT, studying economics and industrial relations. Never an impassioned ideologue, Shultz was less interested in economic theory than in public policy issues. He landed his first Washington job in 1955, when Arthur F. Burns, who became head of the Federal Reserve under Nixon, hired him as a senior staff economist on the Council of Economic Advisers. Shultz was impressed by Burns’s professional and apolitical stance as chairman of the advisory group. “It was this approach to government service, more than anything else, that became Shultz’s takeaway from his first episode in Washington,” Taubman writes.

George Shultz’s selective memory throughout the Iran-Contra affair echoed Richard Nixon’s warning in 1982 that Shultz had “a wonderful ability to, when things look iffy or are going wrong, contend he never heard about the issue and was never briefed and was not a part.”

Shultz became a national figure when he was appointed dean of the University of Chicago Graduate School of Business at the age of 41. His encounters at Chicago with the likes of Milton Friedman and George Stigler fortified his faith in free markets and opposition to government regulation, but he was also careful to tend to his ties to Washington, serving as chair of a task force of the U.S. Employment Service during the JFK administration and directing a government task force on African American unemployment during Lyndon Johnson’s administration. Johnson told Shultz, “George, if you have a good idea, and it’s your idea, it’s not going to go very far. But if it becomes my idea, it just might go somewhere. Do I make myself clear?” He did. According to Taubman, Shultz learned the “benefit of subsuming his own ego in service to a higher goal and higher-ranking officials, a self-effacing attitude that would reappear repeatedly as he attained higher office.” As he worked on job creation in the Johnson administration, Shultz also was active at Chicago, where he established one of the first minority scholarship programs at an American business school. 

Shultz’s work caught the eye of Nixon, who appointed him secretary of labor in 1968. In his new post, Shultz continued to champion civil rights, but always from behind the scenes. “The progressive minority employment and school integration agendas he supported,” Taubman writes, “seem driven more by his sense of fair play and adherence to the law than a burning desire to protect and enhance the rights of minority groups.” One of his most notable stands was his promotion of Nixon’s Philadelphia Plan, which created a federal affirmative action program. Shultz was critical to ensuring that the plan was approved by Congress, including defending it at a White House news conference in 1969. Following the vote, Republican Senator Hugh Scott of Pennsylvania, the minority leader, handed Shultz the tally, which he proudly displayed all his life. 

But Nixon and his advisers were often vexed by what they viewed as Shultz’s pusillanimity about wading into the political fray to target Democrats. As treasury secretary, Shultz tried to fend off the demands of Nixon, H. R. Haldeman, and John Ehrlichman to wield the IRS as a weapon against their adversaries. Shultz successfully defied White House Counsel John W. Dean III’s demand that he order the IRS to “pursue a list of several hundred George McGovern staff members and campaign contributors.” Shultz had the list buried in an IRS safe. He also defended the IRS when Nixon’s tax return came up for a random audit. 

But he acquiesced to Nixon’s vindictive score-settling when the White House pressured the IRS to investigate Lawrence O’Brien, the chairman of the Democratic National Committee. Watergate itself began when burglars broke into the DNC headquarters in 1972, hoping to replace wiretapping devices planted in an earlier break-in. Two months after the bungled burglary, Nixon leaned on Shultz to target O’Brien for tax evasion and fraud. On White House tapes, Nixon fulminated to Haldeman and Ehrlichman, “If you could dirty up O’Brien now I think that might be a lot better than waiting until later.” Shultz updated the White House in August on the investigation, noting that O’Brien appeared to be clean. Taubman recounts that when he confronted Shultz in 2017 about his role in the O’Brien investigation, he “seemed stricken … It was clearly not a topic he relished discussing.” In the end, Shultz remained loyal to Nixon, resigning only in May 1974, three months before the end of his sordid administration. 

In 1982, two years into Reagan’s first term, the president tapped Shultz for secretary of state. Shultz’s reputation as a Republican moderate preceded him, prompting a cabal on the right to seek to undermine him from the outset. They saw Shultz as a stalking horse for a perfidious Republican establishment that supported easing Cold War tensions with Moscow. For the right, any talks with the Kremlin, however innocuous, were tantamount to appeasement. According to Taubman, 

For a man who had witnessed at close hand the scheming of Nixon aides and the disintegration of a presidency, Shultz seemed surprisingly flummoxed by the chaotic foreign political operations of the Reagan administration. He looked outmatched by his opponents and unable to count on decisive support from the president in policy debates.

When Reagan embraced the Strategic Defense Initiative in 1983—the infamous “Star Wars” nuclear warhead interception system—Shultz only issued the mildest of demurs. He had what Taubman calls an “ingrained instinct” to fall into line and to remain loyal to his superiors and colleagues despite any deep policy disagreements. 

Shultz’s lowest moment, however, was the Iran-Contra affair. Taubman writes that Shultz is blameworthy for his “failure to stop the arms-for-hostage dealing at several critical moments when he heard about pieces of it, objected to it but stopped short of forcefully intervening.” Shultz managed to cover up his culpability—“he barely escaped indictment by [Independent Counsel] Lawrence Walsh for obstruction of justice,” writes Taubman—despite his deputy Elliott Abrams’s ouster as a principal player in the illegal scheme. Taubman notes that Shultz’s public denials about knowledge of the scheme do not conform with the detailed notes by his executive assistant Charles Hill. Shultz’s selective memory throughout the episode, Taubman writes, echoed Nixon’s warning in 1982 that Shultz had “a wonderful ability to, when things look iffy or are going wrong, he’ll contend he never heard about the issue and was never briefed and was not a part.” Nixon, a canny judge of character, and himself an expert at trying to dodge responsibility for illegal actions, had Shultz’s number.

It was U.S.-Soviet relations that allowed Shultz to make his mark and Reagan to rescue his presidency after the Iran-Contra affair came to light. Taubman’s most intriguing discovery in Shultz’s archive was a voluminous diary kept by his shrewd executive assistant, Raymond G. H. Seitz, who chronicled the vicious power struggle that took place between his boss and Reagan’s coterie of conservative foreign policy advisers, including Caspar Weinberger, William Clark, and William Casey. These anticommunist ogres, who headed the Defense Department, National Security Council, and CIA, respectively, were terrified that Reagan might show even a hint of flexibility toward the Kremlin. By patiently forging close ties with Ronald and Nancy Reagan, Shultz was able to outlast the other men and leave a historic imprint on U.S.-Soviet relations.

For all his anticommunist rhetoric, Reagan himself was alarmed about the possibility of nuclear war and repeatedly sought to contact Soviet leaders, starting with Leonid Brezhnev. It was not until Mikhail Gorbachev became the new leader of the Soviet Union that real change became possible. With the backing of Nancy Reagan, who detested the ideologues surrounding her husband, Shultz successfully encouraged the president to negotiate with Gorbachev and conclude sweeping arms control agreements in 1988. 

It was the high-water mark of Shultz’s career. He went on to champion George W. Bush and to support the Iraq War, which he viewed as spreading American values in the Middle East. He told Daniel Henninger of The Wall Street Journal in 2006, “I don’t know how you define ‘neoconservatism,’ but I think it’s associated with trying to spread open political systems and democracy.” In fact, it represented the triumph of the hawks whom Shultz had battled during the Reagan administration. 

The Theranos scandal was Shultz’s final ignoble episode. He publicly backed CEO Elizabeth Holmes over his own grandson, Tyler, who worked at the company and blew the whistle on its bogus procedures. Taubman suggests that for Shultz, “personal financial gain was likely a motive,” in addition to his deference to authority. It was his loyalty to Nixon during Watergate and capitulation to Reagan during Iran-Contra all over again. 

Reagan’s coterie of conservative foreign policy advisers were terrified that Reagan might show even a hint of flexibility toward the Kremlin. By patiently forging close ties with Ronald and Nancy Reagan, Shultz was able to outlast the other men and leave a historic imprint on U.S.-Soviet relations. 

In the Republican wars over domestic and foreign policy, Shultz was the good soldier. A sensible, responsible, and prudent official, he compiled a record of worthy accomplishments. But the contortions he had to perform during his government service to appease movement conservatives illuminate the abiding nature of public service in Republican administrations. They indicate that even reasonably balanced, rational, and public-spirited appointees often find themselves, after a period of resistance, caving to their more retrograde colleagues and superiors out of some combination of self-interest and belief that living to fight another day will allow them to continue shaping policy. Shultz’s saga of triumph and turmoil offers a reminder that the brutal moral conditions Republican administrations impose on those who work in them were not just confined to Trump, but have been manifest all along.

The post The Courage and Compromises of George P. Shultz  appeared first on Washington Monthly.

Donald Trump Is No Grover Cleveland https://washingtonmonthly.com/2023/01/08/donald-trump-is-no-grover-cleveland/ Mon, 09 Jan 2023 01:10:00 +0000 https://washingtonmonthly.com/?p=145020

We are cursed to live in a time when Donald Trump’s grotesque shadow hangs over almost any topic of political or cultural conversation. So if you read Troy Senik’s new biography, A Man of Iron, about the life of Grover Cleveland, you can’t help but try to detect any relevant parallels between the one person who won, then lost, then won the presidency, and the person trying to be the second.  

A Man of Iron: The Turbulent Life and Improbable Presidency of Grover Cleveland by Troy Senik Threshold, 383 pp.

Trump, thankfully, is only mentioned in passing, as the president who proposed including Cleveland in a new sculpture garden honoring American heroes, a project scotched by Joe Biden. Senik—formerly a speechwriter for President George W. Bush and a vice president at the conservative Manhattan Institute for Public Policy think tank—is wise to leave the distracting former president out of his taut and punchy narrative. How Cleveland—the only Democratic president elected in the post–Civil War 19th century—pulled off the greatest comeback in the history of American presidential politics is a fascinating subject. But it’s not one that offers a model of success for Trump to emulate. The Cleveland story is a morality play, exemplifying how unwavering principle can sustain a politician through difficult times. “Virtually everything worth saying about Grover Cleveland boils down to that one elemental fact: he possessed moral courage at almost superhuman levels,” writes Senik, in a sentence one would never write about Donald Trump. 

Granted, after learning about Cleveland’s first-term record—228 penny-pinching vetoes of pension awards to individual Civil War veterans; a tariff reform push that hit a brick wall in the Republican Senate; and an obsessive personal involvement in minor civil service appointments—some readers might conclude that Senik’s praise is excessively effusive. Voters in 1888 were certainly underwhelmed. After his first term, Cleveland irritated wide swaths of the electorate: the powerful Grand Army of the Republic veterans lobby with his vetoes; Democratic job seekers with his commitment to civil service reform; and northerners who detected a bias toward industries from the Democratic South in his party’s kludgy tariff reform bill. Senik acknowledges that Cleveland led with his chin too often to win reelection, though that only contributes to his argument that Cleveland put principle before politics.

Cleveland’s reputation for rectitude did aid his 1892 comeback. After keeping a low profile in his first two years out of office, Cleveland faced a nomination challenge from a fellow New Yorker, the patronage-friendly Tammany Hall–backed Governor-then-Senator David Hill. In late 1891, Hill controversially remained governor for several weeks even after the state legislature elected him to the Senate, raising concerns about his accumulation of power. Then, in full command of the state party, he tried to stack the Democratic National Convention with his delegates by holding the New York state convention in the snowy winter, making it harder for rural backers of Cleveland to attend. However, the scheme was so transparently Machiavellian that it made Cleveland look good in comparison, especially since, as Senik writes, “the intended victim of the plot was a man known for his sense of propriety and fair play.” Cleveland beat Hill in the convention on the first ballot. 

Then in the general election—a rematch against Republican Benjamin Harrison—the political tables turned. Under the Harrison administration, Republicans busted the budget with veteran pensions and raised the price of goods with protectionist tariffs. Cleveland’s stubbornness looked more virtuous, even populist. Senik quotes Harrison’s woeful postelection summation: “The workingman declined to walk under the protective umbrella because it sheltered his employer also. He has smashed it for the fun of seeing the silk stockings take rain.”

Harrison’s analysis captured the rising populism of the Gilded Age that had begun to transform the Democratic Party, a populism that, ironically, the party’s standard-bearer was determined to resist. 

Populist Democrats wanted to move America’s currency off the “gold standard,” in which the value of paper money was tied to the value of gold. As gold is a finite resource, the gold standard limited the availability of money and credit. A modest amount of less precious silver was included in the money supply, thanks to legislation enacted in 1878. But that wasn’t enough for the populists, who wanted unlimited coinage of silver. 

Cleveland opposed the nascent “Free Silver” movement in his first term. After Cleveland’s failed reelection bid, his Republican successor accepted an 1890 compromise between the gold and silver advocates that increased the limit on silver. Once Cleveland began to reemerge publicly after the 1890 midterms, in which Democrats took control of the House and the silver forces gained steam, he issued a statement recommitting to the gold standard, which even his closest aides feared was political suicide. That it wasn’t, to Senik, was proof of his ability to “defy the laws of political gravity” because whatever controversy he sparked was “overtaken by the respect his candor inspired.”

Senik does not let his praise for Cleveland’s character and philosophical bent blind him to Cleveland’s shortcomings as president. His second term began in 1893 with what was deemed at the time to be the “Great Depression.” Cleveland’s response in his first year back in office was not to radically loosen the currency system, but to repeal the 1890 silver bill in hopes of replenishing the nation’s gold reserves. 

The conservative Senik is sympathetic toward Cleveland’s attempted application of classical liberalism. (He describes Cleveland as “the final Democratic president to embrace the classical liberal principles of the party’s founder, Thomas Jefferson … a narrow interpretation of the Constitution, a limited role for the federal government, and a light touch on economic affairs.”) Yet Senik also acknowledges that “the policy didn’t work” to stabilize the currency, let alone end the panic. Not until 1895, when Cleveland reluctantly turned to J. P. Morgan to bail out the financial system, did the economy start to heal. But such a solution further fueled the populist faction, and put Cleveland further out of step with his own party. By 1896, Cleveland was a spent political force and the party fell in behind the charismatic young populist William Jennings Bryan, who would be nominated for president in three of the next four presidential elections.

Where Senik does betray a blind spot is in his assessment of Cleveland’s personal character. In the 1884 campaign, Cleveland was dogged by allegations that he fathered a child out of wedlock with a Buffalo woman named Maria Halpin, broke a promise to marry her, then used his political connections to have her institutionalized and her baby adopted by another family. In response to the attacks, Cleveland famously instructed his allies, “Whatever you do, tell the truth.” He never denied paternity, though he did not cop to any of the other allegations. 

Near the end of the 1884 campaign, two affidavits from Halpin, signed on October 28 and 29, added the accusation of rape, the latter being the most specific: “While in my rooms he accomplished my ruin by the use of force and violence and without my consent.” The 2011 book A Secret Life: The Lies and Scandals of President Grover Cleveland, by Charles Lachman, argued that the affidavits should be believed. Senik rejects the rape charge completely, however, and notes that Lachman ignored an interview of Halpin published in the November 3, 1884, edition of the Detroit Free Press, which reads, “I have no fault whatever to find with Mr. Cleveland” and “Words have been put into my mouth which I never uttered.” The paper further claimed that the Republican campaign hired people who duped Halpin into signing the affidavits. 

It’s true that Lachman leaves out this important news clip. However, Senik ignores a Chicago Tribune interview of Halpin, published on October 31, 1884, along with the latter affidavit, in which she said “the statement I made last night is true,” Cleveland “had attempted to pile up mud upon me,” and that “no one” pressured her to make her statements. She added, “Allow me to tell you the meanness of the man. When I sent for him, and informed him of my condition, he said: ‘What the devil are you blubbering about? You act like a baby without teeth. What do you want me to do?’”

Lachman should have included the Detroit Free Press dispatch, but Senik should have included the Chicago Tribune dispatch. The reality is that one of the two articles is false, and we don’t know which one. But Senik does not want to entertain the possibility that the character of Cleveland’s private life did not match the character in his public life. Any uncertainty would interfere with the book’s premise, that Cleveland had “superhuman moral courage.”

Whether or not Cleveland deserves scorn for sexual misconduct, Senik still deserves credit for a brisk and substantive assessment of Cleveland’s presidency that helps readers understand the confusing politics of the Gilded Age. And while it is doubtful that Senik had Trump on his mind, his portrayal shows how Trump cannot model himself after Cleveland. Trump may outlast his Republican rivals in the 2024 primary, and maybe even reclaim the presidency. But it won’t be because he is a man known for his sense of propriety and fair play.

The post Donald Trump Is No Grover Cleveland appeared first on Washington Monthly.
