June/July/August 2016 | Washington Monthly

Populism With a Brain

Ten old/new ideas to give power back to the people.


National Review recently described Bernie Sanders and Donald Trump as “two populist peas in a pod.” This was not a compliment. Across the political spectrum, people stick the “populist” label on politicians they see as exploiting the worst resentments and envies of some tribe or another. The segregationist George Wallace, by this reckoning, was a populist. So, too, Jean-Marie Le Pen.

Yet there is a richer tradition of populism in the United States that has new relevancy today. The term itself dates to the early 1890s, when, as the historian Michael Kazin notes, journalists used it to describe members of the newly formed People’s Party. These Populists with a capital P were men and women who, like us, faced an America in which monopolists were fast tightening their grip on all realms of the economy and concentrating immense wealth and political power.

These first Populists drew upon a political philosophy with roots reaching back to the American Revolution. Part of this tradition is familiar—a belief that government must be run by the people. Populists called for direct election of senators and led the push for referendums and initiatives to bypass corrupt legislatures. But another part is largely forgotten—that the people are sovereign over the economy and have a responsibility to structure markets to promote the common good.

This was the “democratic republicanism” of Thomas Jefferson and James Madison. It holds that, just like political power, economic power must be distributed as widely as possible. Thus, the Populists focused much of their energy on combating efforts to monopolize commerce and natural resources, especially land. They also closely studied how to govern large corporations, and strongly supported unionization of workers and farmers to counter the power of concentrated capital.

In almost every key respect, the Populists succeeded in their revolution. In 1896 they captured the Democratic Party and ran William Jennings Bryan for president. He lost the election, but over the next sixteen years, even as the plutocrats tightened their grip, Bryan helped keep the fires of rebellion burning.

The turning point came in 1913 when Democratic President Woodrow Wilson, guided closely by future Supreme Court Justice Louis Brandeis, enacted key parts of the Populist agenda, modernizing anti-monopoly law, creating the Federal Reserve System, and reforming trade policy. It was these years that teed up the achievements of the 1930s, when the administration of Franklin Roosevelt and Populist members of Congress like Wright Patman carried anti-monopoly principles back into almost every realm of American life.

By the 1960s, the anti-monopoly movement championed by the Populists existed no more, but largely because it had become thoroughly institutionalized. In 1962, the Supreme Court upheld an antitrust case, brought by the Eisenhower administration, that prevented the merger of two shoe companies because it would have given a single distributor a 2 percent share of the national market.

It is important to understand what the Populists were not. In 1892 they did call for public ownership of railroads, but the movement generally looked askance at “socialistic” actions that might undermine wide-scale ownership of private property. Populists also parted company with those who, like Teddy Roosevelt, argued that the way to control Big Business was to regulate it with Big Government. Populists did favor using the federal government, but mainly to break up monopolies so as to make them small enough to be regulated by competition in open markets.

Some historians have judged all Populists by the actions of a racist minority. It is true that one early Populist leader, Tom Watson, later became one of the most virulent segregationists and anti-Semites in American history. And Woodrow Wilson himself segregated the U.S. government and U.S. military. But most Populists kept their focus on restructuring the economy to better promote equality and democracy. In many regions, Populists pioneered inter-racial organizing, often drawing the ire of established political groups. In one county in Texas, Democrats formed a “White Man’s Union” specifically to combat the alliance of whites and blacks in the local People’s Party.

In the late 1970s, policymakers in both parties, under the banner of “free markets” and “deregulation,” began dismantling the economic regime put in place by the Populists and their followers. The concentration of wealth that resulted is now so extreme that even the Economist has chided Americans for failing to stand up to monopoly.

The Populists would certainly have supported Sanders’s call for public financing of campaigns and his embrace of “Break Up the Banks.” But the men and women who rose against America’s oligarchs of a century ago would have been deeply dismayed that no candidate in 2016 has taken a strong stand against monopoly.

And so we ask: If a candidate for office in 2016 were truly guided by traditional American Populist principles, what would he or she aim to do?

1. PROTECT DEMOCRACY BY RESTORING MARKET COMPETITION

In the campaign of 1912, Woodrow Wilson made one of the most powerful attacks on monopoly in American history. Wilson condemned monopolists for harming not only the economy but also democracy. “If monopoly persists, monopoly will always sit at the helm of the government,” Wilson said. “[W]hat we have to determine now is whether we are big enough, whether we are men enough, whether we are free enough, to take possession again of the government which is our own.”

Here is the Populist philosophy of competition at its purest. To ensure the liberty of the individual and protect democracy, citizens must force the powerful to compete. Adam Smith’s economics plays a role. But it is the stark realism of Jefferson and Madison, with their intense distrust of concentrated power, that governs. The fundamental aim of government is to extend the system of checks and balances to the political economy, and to break or neutralize all big concentrations of private power.

As Senator John Sherman, author of the Sherman Antitrust Act of 1890, put it, “It is the right of every man to work, labor, and produce in any lawful vocation and to transport his production on equal terms and conditions and under like circumstances. This is industrial liberty, and lies at the foundation of the equality of all rights and privileges.”

In this tradition, breaking up monopoly has little to do with promoting efficiency or better deals for consumers, and everything to do with protecting political equality, self-government, and democratic institutions. As Brandeis explained, “The doctrine of the separation of powers was adopted . . . not to promote efficiency but to preclude the exercise of arbitrary power.” The way to “save the people from autocracy,” he said, is precisely by building “friction” into the system.

Over the following decades, these principles guided how Americans distributed economic power and protected industrial liberty. Despite wave after wave of technological change, concentration declined in almost every realm of the economy.

But in the 1970s, a small group of intellectuals—some, like Alfred Kahn, with roots in mainstream liberalism; and others, like Robert Bork, with roots in conservative Chicago school economics—systematically targeted the achievements of the Populist tradition. While anti-monopoly laws remained on the books, they were reinterpreted in ways that defeated their historical purpose. No longer would the aim be to promote economic and political liberty. Instead, according to guidelines enacted in 1982 by Ronald Reagan’s Justice Department, big corporations would be allowed to get bigger so long as they did not immediately hike prices to the “consumer.”

In retrospect, the evidence is close to irrefutable that adoption of this philosophy of “efficiency” unleashed a process of concentration that over the last generation has remade almost the entire U.S. economy, and is now disrupting our democracy.

What would a True Populist do today? Immediately restore America’s traditional anti-monopoly philosophy, most importantly in the guidelines that determine how enforcers view the purpose of these laws, and start breaking up today’s monopolies.

2. USE TRADE POWER TO RESTORE AMERICAN INDEPENDENCE

The Boston Tea Party, every schoolchild knows, struck a blow for America’s independence from Britain. Less well known is that the Tea Partiers were specifically protesting domination by corporations.

As Samuel Adams and John Hancock put it in a 1773 letter, the Tea Act was dangerous precisely because, in giving a huge tax break to the British East India Company, it was “introductive of Monopolies which, besides the trains of evil that attend them in a commercial view, are forever dangerous to public liberty.”

For the next 200 years, these twin goals—independent citizens in an independent nation—shaped America’s trade policies. The basic aim was to protect the freedom of action of the United States and of individual Americans. The method was to break or deflect the power of all foreign state-directed, or “mercantilist,” trading systems. The principles were those of anti-monopoly, extended to the international sphere.

Populists opposed any permanent tariff wall that would reinforce the power of domestic monopolists. At the same time, they were aware, as were Americans of all stripes, that monopoly abroad was just as big a threat as monopoly at home. That’s why both Democratic and Republican presidents straight through Ronald Reagan aggressively used everything from temporary embargoes to temporary tariffs and quotas to disrupt attempts by trading partners to monopolize control over any vital manufactured good.

But starting in the 1990s, the U.S. largely abandoned any effort to use trade policy as a weapon against monopoly. Instead, in joining the World Trade Organization (WTO), the Clinton administration and Congress effectively outsourced the making of trade policy to a body dominated by immense international trading companies. Even when headquartered in the United States, these giants have little inherent concern for the independence of America as a nation or of Americans as individuals.

As a direct result, America today has been made to rely on single offshore sources for many of the basic goods we use every day. In the case of democratic allies like Japan and Germany, the risks of such dependence are limited. In the case of China—which now controls production of most or all of U.S. supplies of many vital drugs and electronics—the threats to U.S. political sovereignty are potentially immense.

Populists were aware that monopoly abroad was just as big a threat as monopoly at home. They applied the principles of anti-monopoly to international trade.

Almost as shocking, recent U.S. administrations have allowed foreign state enterprises to essentially rule large swaths of the U.S. economy, and many of the people who work in those industries. Control of America’s big beer, beef, and processed food sectors, for example, is increasingly in the hands of Brazilian state corporations like Anheuser-Busch InBev and JBS. Chinese state corporations control much of America’s pork supply, and have gained leverage over Hollywood’s cultural production through control of AMC theaters.

What would a True Populist do today? Not erect protectionist tariffs like those proposed by Donald Trump. Nor simply bad-mouth the Trans-Pacific Partnership trade agreement. A True Populist would abandon the WTO and apply the principles of anti-monopoly to international trade, to protect America and Americans from all foreign monopoly.

3. BAN PRICE AND DATA DISCRIMINATION

By the time the Populists took their stand, Americans had been fighting railroad tycoons for a generation. The problem, in their eyes, was not merely that the railroad monopolists charged too much. It was that these absentee corporations had the power to control, through the common railway practice of charging different people different prices for the same service, whether they and their communities succeeded or failed.

Charles Francis Adams Jr., the grandson of John Quincy Adams and son of one of the founders of Abraham Lincoln’s Republican Party, helped lead the railroad reformers. Adams described the practice of price discrimination as “favoritism of the grossest character,” which turned the railway into “a law unto itself.” The fortunes of even large cities including Philadelphia, Baltimore, St. Louis, and Cincinnati rose and fell according to how various railroad financiers, or “robber barons,” set their rates.

Some observers called for public ownership of railroads, pointing to the model that since ancient times has applied to most roads and highways. Most reformers, however, settled on an all-American approach that combined private ownership of the railroad with strict bans on most price discrimination. Drawing on the legal tradition of “common carriage,” under which American and English governments had for centuries prohibited private carters, cabbies, and innkeepers from charging different prices to different customers, railroads would be banned from favoring some shippers and passengers over others.

The reformers generally recognized that railways should be allowed to offer different terms to different classes of goods—charging less, say, for timber than for manufactured goods. But they condemned all discrimination within any given class, which the railway scholar Arthur Hadley described as the “most serious evil.”

Many states formed railroad commissions to enforce this principle of nondiscrimination, which, as Adams put it, should aim to permit “an unchecked flow of travel and commerce, the continuation of which may with safety be calculated upon.” With passage of the Interstate Commerce Act of 1887, the principle became federal law.

By 1913, this approach to railway regulation was so successful that, once in office, Wilson and Brandeis began to extend common carrier rules to many other realms of commerce—from electricity to retailing. The very first lines of the Clayton Antitrust Act set out to eliminate “discrimination in price, services, or facilities.”

Populists passed laws that prevented railroads and other monopolistic corporations from favoring some customers over others. But today, price discrimination by monopolists is increasingly the rule.

Yet in today’s America, price discrimination by monopolists is increasingly the rule. In hospital care, insurance, seeds, cable, and even books and news stories, we see monopolists discriminating among producers and increasingly among consumers. And with the rise of Big Data, the ability of today’s biggest companies to discriminate profitably grows almost by the day. The one exception is telecommunications, where the FCC’s “net neutrality” rules have restored traditional order to large parts of the industry.

What would a True Populist do today? Immediately apply common carrier principles to every realm where we see monopoly combining with Big Data, and ban individualized price and data discrimination—against both the producer and the consumer—everywhere.

4. BREAK UP AMAZON, FACEBOOK, GOOGLE, AND COMCAST

By the time Woodrow Wilson took office, Americans had long since learned that simply outlawing price discrimination was not always enough to ensure that a monopolist in control of a vital service would treat all customers the same.

The lesson was driven home by the rise of Carnegie Steel and Standard Oil. In both instances, their bosses—Andrew Carnegie and John D. Rockefeller, respectively—had captured de facto control over the railways that carried their goods to market. In both cases this control enabled them to exclude their rivals from the market and thereby to concentrate power and control.

The result was a new focus on banning certain forms of “vertical integration,” as when a railroad buys a coal mine and then favors that mine in ways that harm other coal mines that depend on the railroad to get their product to market. The goal was to prevent the actions of the monopolist from being warped by any “conflict of interest.”

Into the 1970s, America had hundreds of populist laws to ensure that farming, banking, services, and light manufacturing would remain open to the smallholder and controlled by the community.

At the federal level, the first effort to draw clear lines between particular types of business dates to the National Bank Act of 1863, which limited bankers to “the business of banking.” But it wasn’t until around 1912 that Americans made a concerted effort to ban vertical integration by other private providers of vital services.

Now courts blocked big manufacturers from buying retailers and using them to exclude rivals, as in the American Tobacco case of 1911. And Congress used the Clayton Antitrust Act of 1914 (and later the Robinson-Patman Act of 1936) to prevent giant trading companies and retailers from leveraging each other’s power. And Congress and the Roosevelt administration in 1935 banned electrical utilities from investing in unrelated businesses like street trolleys.

When it came to production goods like cars or chemicals, enforcers largely allowed managers of a corporation to decide what manufacturing activities to bring in-house. But for corporations that provided vital services to other firms, enforcers all but banned direct ties. Their goal, as with anti-discrimination laws, was to ensure that such vital middlemen treated every customer the same.

This practice remained in effect even after the overturning of most other traditional anti-monopoly law in the early 1980s. The Reagan administration pushed for de-integration of AT&T in 1982. The FTC in the 1990s continued to enforce clear lines of separation between drugmakers, drug managers, and drugstores. In 1998, the Clinton administration demanded the complete separation of Microsoft's operating system business from its browser business.

But about fifteen years ago, the Bush administration dropped the guard against vertical integration. Since then Comcast, which distributes television shows, has been allowed to merge with NBC, which produces shows. Amazon, the dominant retail marketplace for books, has been allowed to go big time into publishing books. And Google, which dominates search, has been allowed to compete directly with companies like Yelp, which rely on Google’s search engine.

What would a True Populist do today? Break up Amazon, Facebook, Google, Comcast, and any other essential network monopoly by banning them from owning companies that depend on their services.

5. LOCALIZE BANKING, RETAIL, AND FARMING

The spirit of America, Woodrow Wilson wrote in 1913, lies in “the enterprise of the people throughout the land. . . . [I]f America discourages the locality, the community, the self-contained town,” he said, “she will kill the nation.”

Wilson’s localism was not simply nostalgia for an age of general stores and town wheelwrights. Instead he was articulating a core principle of “democratic republicanism.” This holds that to be truly free, the citizen must be independent and self-governing, and must share equally in ruling a political community that controls its own fate.

A century ago, both the citizen and the community were under grave threat. Giant corporations increasingly determined how the farmer and skilled laborer worked and lived, and distant lords wielded growing sway over the local community. Just as Wilson took office in 1913, the Woolworth’s chain store corporation opened an immense office tower in Lower Manhattan. Here was a physical manifestation of concentrated power, the wealth of hundreds of American communities piled into the tallest structure on earth, on Wall Street.

True Populists would be appalled that most of the work of the federal government is today outsourced to contractors, whose employees outnumber the ranks of actual government workers by nearly fourfold.

Louis Brandeis, in 1933, described the threat posed by such giants. Speaking of the authors of an anti–chain store law, he said, “They may have believed that the chain store, by furthering the concentration of wealth and of power and by promoting absentee ownership, is thwarting American ideals; that it is making impossible equality of opportunity; that it is converting independent tradesmen into clerks; and that it is sapping the resources, the vigor and the hope of the smaller cities and towns.”

Beginning under Wilson and continuing into the 1970s, Americans passed hundreds of laws to ensure that farming, banking, services, and light manufacturing would remain open to the smallholder and controlled by members of the community. These included the Packers and Stockyards Act of 1921, to preserve competitive markets for the farmer; the McFadden Banking Act of 1927, to prevent banks from expanding across state borders; and the Robinson-Patman Act of 1936, to stem the rise of giant retailers like the A&P.

The laws worked. Between 1920 and 1980 the percentage of the market controlled by the top four meatpackers fell from more than 80 percent to about 25 percent. In 1966, the Supreme Court blocked a merger that would have combined 7.5 percent of the Los Angeles grocery market under one roof. And the resulting competition drove down prices in almost every sector of the U.S. economy.

But since the 1970s, both Democrats and Republicans have undone almost all these laws. The result has been a concentration of power and wealth that would have horrified True Populists. In groceries, pharmacies, hardware, and office supply, control has been consolidated in as few as one or two giants. So, too, wealth—the Walton family alone is now as rich as 140 million other Americans combined. And with the rise of online goliaths like Amazon, which aims to be the “Everything Store,” control will only be yet further concentrated.

What would a True Populist do today? Besides neutralizing large online retailers, a True Populist would revive the laws Americans used to localize banking, farming, and retail through the heart of the twentieth century.

6. MAKE ALL GOVERNMENT PUBLIC

In his 1904 book, The Shame of the Cities, the muckraking journalist Lincoln Steffens described how the “privatization” of government services had corrupted communities across America. Describing St. Louis, he wrote, “Along about 1890, public franchises and privileges were sought, not only for legitimate profit and common convenience, but for loot. . . . The riff-raff . . . drove out the remaining respectable men, and sold the city—its streets, its wharves, its markets, and all that it had—to the now greedy business men and bribers.”

Populists and reformers in all parties used various means to push such forms of corporate self-dealing out of government, sometimes under the banner of “No use of public powers or public property for private profit.” One early example was the Pendleton Civil Service Act of 1883, a bipartisan move to ensure that public officials were fully independent of private corporations.

In the case of companies that provide vital services like gas, water, and electricity, the Populists took a similar but more hybrid approach. As with railways, the Populists often chose to leave these businesses in private hands, then focused closely on preventing all discrimination in pricing and service. One goal was to ensure that executives received some sort of reward for good management. Another was to limit the ability of the masters of these public monopolies to exploit the power inherent in the monopoly to serve themselves or their friends. In the early years of the twentieth century, many communities established independent public service commissions to formalize this work.

But beginning in the 1970s, this legacy of Populism came under assault across America. In the name of “efficiency,” policymakers in both parties “privatized” more and more government functions, and more and more control over public utilities.

Today, most of the work of the federal government is outsourced to contractors, whose seven million employees outnumber the ranks of actual government workers by nearly fourfold. Similarly, hundreds of localities have cut regulation of monopoly utilities, like electricity.

One result is precisely the self-dealing that Populists worked so hard to abolish. Many of today's federal contractors pay themselves lavish incomes, even though all or most of their work comes from the government. The CEO of Lockheed Martin, the federal government's largest contractor, earned $20 million last year—100 times what taxpayers gave Defense Secretary Ash Carter, whose department oversees most of Lockheed's work. Similarly, the managers of many utilities pay themselves kingly salaries for overseeing public businesses. The result can be a grotesque warping of incentives. In the nation's capital, the CEO of the Washington Metro earns close to $400,000 annually; the CEO of the local electrical utility Pepco made $15.4 million in 2014.

What would a True Populist do today? Insist that the managers of any corporation receiving more than a quarter of its revenues from taxpayers—including defense contractors, universities, and hospitals—work at government wages. And require that the bosses of local public utilities earn no more than the public servants who regulate them.

7. PROTECT THE INDUSTRIAL ARTS

In 1892, the banker J. P. Morgan, taking advantage of an economic bust, bought up the patents of that most prolific of American inventors, Thomas Edison. He later combined Edison’s patents with those of George Westinghouse and Nikola Tesla and launched a new company, General Electric. The banker was now boss, and the industrial corporation was his tool.

By 1912, what to do with such industrial monopolies had become a major topic of debate. Most Americans welcomed such technological marvels as the mass-produced light bulb, telephone, automobile, and steel girder. But many also fretted about how to govern these immense corporations, which exerted so much power over people and places.

Teddy Roosevelt believed that the answer was to use the federal government to directly regulate the industrial corporation. But the idea of combining the power of the federal government with that of the giant corporation horrified the Populists. It is far safer, Brandeis said, to “regulate competition” than to “regulate monopoly.”

Over the next generation, the Populists came up with two answers. One was introduced by Thurman Arnold, the head of antitrust under Franklin Roosevelt. Arnold’s idea was to have at least three or four corporations manufacturing any particular item. This achieved Brandeis’s goal of external regulation through competition, but it also left the industrial corporations big enough—both horizontally and vertically—to provide scientists and engineers enough room and resources to work their wonders, at scale.

The second answer was to check the power that financiers and speculators exercised over the internal operations of corporations so that scientists and engineers, as well as their managers and frontline workers, could do their jobs. Rigorous antitrust enforcement suppressed the flow of mergers, and with it the need for merger finance. But the Populists reinforced that measure with other actions. This included the Glass-Steagall Act of 1933, which stated that banks couldn't use depositors' money to make deals, and the Securities Exchange Act of 1934, which buttressed the power of small shareholders. It also included support for unionization by skilled workers, to empower them to better resist the demands of the financier.

Since the 1980s, however, a general reversal of these policies has once again shifted power over the industrial corporation into the hands of the financier. One result is what’s happened to Pfizer. In 1950, it was this company’s scientists who discovered the antibiotic oxytetracycline. But in recent years, Pfizer’s managers have devoted themselves mainly to putting cash into the hands of the bankers who now effectively control the company. One way they have done so is by engineering a long line of giant mergers and then firing thousands of scientists. As one of Pfizer’s own top executives wrote, the impact of these firings “on the R&D of the organizations involved has been devastating.”

What would a True Populist do today? Besides forcing all industrial corporations to compete (see #1), a True Populist would use labor law and securities law to shift power away from the predatory financier to the scientist, engineer, and skilled worker.

8. TAKE BACK LEISURE

On July 4, 1915, Louis Brandeis delivered a holiday oration at Faneuil Hall in Boston that gave voice to an ideal deeply shared by his fellow Populists. “A short workday is as essential as adequate food and proper conditions of working and of living,” Brandeis declared. “The worker must, in other words, have leisure.”

By leisure, Brandeis did not mean idleness. As his biographer Jeffrey Rosen explains, he meant the free time necessary to pursue one’s deeper needs and to play a meaningful role in the life of one’s family and community. “Leisure,” Brandeis said, “means ability to work, not less, but more, ability to work at something besides breadwinning. . . . Leisure, so defined, is an essential of successful democracy.”

In America, this vision dates back to the Declaration of Independence, and Thomas Jefferson’s idea that Americans had a right to “the pursuit of happiness.” As the historian Benjamin Kline Hunnicutt details in Free Time: The Forgotten American Dream, from the founding until about forty years ago Americans defined progress mainly as the attainment of more time to pursue what really matters in life—like one’s family, one’s community, one’s arts, one’s hobbies, or one’s god.

In the nineteenth and early twentieth centuries, this meant fighting not merely for higher wages but also for an eight-hour workday and a forty-hour workweek. It meant fighting for the end of child labor and the ability to retire. It meant supporting unions. In many communities, it even meant “blue laws,” which closed stores on certain days and after certain hours, so people could have that time free.

And the policies worked. Between 1830 and 1930, the average American’s working hours fell nearly in half. Indeed, by the 1960s, futurists were predicting that, thanks to ever-increasing automation, the principal challenge facing the next generation of Americans would be what to do with an abundance of free time.

Populists found ways to check the power of financiers so that scientists, engineers, frontline workers, and professional managers could do their jobs. Productivity never grew faster.

Yet today, even as American workers as a whole produce almost three times more in an hour than did their counterparts in 1960, they are typically working at least as long and often much longer. Among adults working full-time in the U.S., the average workweek is now forty-seven hours, with fully a quarter working at least sixty hours. Worse, many workers now have virtually no control over their schedule, as they are required to remain on call for shifts that could come at any time of day or night. The resulting conflicts between work and family life are now widely seen as a spreading crisis that is particularly hard on women and children and on social institutions that depend on volunteers to donate time.

What would a True Populist do today? Besides restoring competition for labor (see #1 and #5), a True Populist would immediately push to cut the workweek back to forty hours. He or she would do so by promoting a living wage and stronger unions, and by cracking down on companies like Uber that deny their workers labor protection by misclassifying them as contractors.

9. KEEP PLANES, TRAINS, AND ROBOTIC CARS OUT OF THE HANDS OF PLUTOCRATS

As Americans glimpsed the possibilities for mass air travel in the 1930s, they faced the same challenge their parents had confronted with railroads. If airlines were allowed to monopolize, they would discriminate in ways that determined which cities, which businesses, even which individuals, rose or fell.

Populists defined progress mainly as the attainment not of more stuff, but of more time to pursue what really matters, like devoting more attention to one's family.

In many other countries, the result was public ownership. But Americans wanted to preserve a role for private initiative, so they applied the same model of competition they had mastered with railroads. As expressed in the Civil Aeronautics Act of 1938, the goal was to promote “adequate, economical, and efficient service by air carriers at reasonable charges, without unjust discrimination, undue preferences or advantages, or unfair or destructive competitive practices.” In effect, this meant that private airlines operated flights, while a government board ensured the service was of high quality and fairly distributed.

The airline industry soared, and America did as well. Airfares fell dramatically, and by 1977, 63 percent of Americans over the age of eighteen had taken a trip on an airplane, up from 33 percent in 1962. Regulators succeeded in keeping all cities more or less equally connected to the world.

But in the late 1970s, the Carter administration repealed this body of law, in the name of "deregulation." In the years since, airlines have been allowed to consolidate to a degree unknown even to the railroad barons. Today four super-carriers control 80 percent of traffic, and enjoy outright monopolies on many routes.

These monopolists do just what monopolists of the past did. They charge different passengers different prices for the same flight. They cut the quality of air service even as they pocket record profits. And they discriminate among people who live in different cities, cutting service and hiking fares to places like St. Louis, Memphis, and Minneapolis, in ways that make it harder to attract and keep business.

Bad as all this is, the future looks even bleaker. On-demand transport and self-driving cars are fast ushering in the next major transportation revolution. These services have the potential to fundamentally alter how people and goods travel. Yet the destruction of the principles used to run the U.S. airline network has created an intellectual and legal void. As a result, a few private corporations, such as Google and Uber, are left free to write the rules of twenty-first-century transportation, with little coherent consideration for the interests of the American public or the individual American citizen.

The basic problem is that these private for-profit monopolists are increasingly able to combine their control of mapping software, the operating systems of vehicles, and the vast reams of information they collect on each rider in ways that give them alone the power to regulate who goes how fast, by what route, and at what cost.

What would a True Populist do today? Restore public regulation of all transportation to ensure fair service for all Americans.

10. POWER (AND IDEAS) FROM THE PEOPLE

In September 1932, campaigning in Oregon, Franklin Roosevelt described how he wanted to develop the nation’s electrical infrastructure. After railing against “public utility barons,” and dubbing one magnate’s empire a “monstrosity,” he detailed the principles that guided his thinking.

First, the supply of electricity should be “satisfactory and cheap” as well as “fair,” with no discrimination by region, business, or household. Second, although electricity should generally remain “a function for private initiative,” when a community is not happy with the service or rates offered by a private company “it has the undeniable right” to set up a “governmentally owned” system. Third, when power is generated from a public resource like water, that power is, Roosevelt said, “our power.”

Over the coming decades, these principles shaped thinking at both the federal and state levels. To this end, Congress and the White House in 1935 broke up electrical utilities along state lines, to keep them small enough to regulate effectively. And in 1936, they passed the Rural Electrification Act to help citizens organize and pay for locally owned electric cooperatives.

The result was something new in the world—universal, world-class, affordable, fairly distributed power. This system was a main factor behind the flourishing of the twentieth-century U.S. economy, and of communities and families from coast to coast.

Then, in the 1980s, two new visions of America’s power future began to compete.

One looked to update Roosevelt’s original vision for America’s electrical system to take advantage of new technologies. An early result was a 1992 law designed to separate the business of distributing electricity from the business of generating electricity. The goal was to create markets to promote the building of cleaner and cheaper power plants. More recently, with the advent of affordable solar and wind power, it meant a more radical vision, in which every home, every farm, even every car, could become a generator of electricity, with the electrical network serving as a commonly owned balancer of supply and demand—or, more simply, a two-way open-access highway for electricity.

Uber and self-driving cars are ushering in a transportation revolution, yet without the guidance of the Populist principles Americans long used to keep monopolists from controlling our mobility.

The other vision of America’s electricity future was to restore multistate centralized giants run largely by financiers. This vision was brought into effect by a 2005 law, the Energy Policy Act, strongly supported by then Vice President Dick Cheney, which undid most of the 1930s-era prohibitions on size and ownership. Since then a growing number of corporations, such as Exelon and Duke Energy, have focused on buying up smaller, neutral utilities wherever they can. They then use these local monopolies to block the introduction of new technologies and to force entire communities to consume power generated by antiquated coal and nuclear plants.

What would a True Populist do today? In addition to prohibiting discrimination (see #3) and banning vertical integration (see #4), a True Populist would follow Roosevelt’s lead. This means doing whatever it takes to provide world-class power and broadband service to all Americans. And it means freeing Americans to generate and share power (and ideas) in whatever way they will.

Free Trade Is Dead

How did Washington get trade policy so wrong? And what comes next?


Regardless of who wins the presidential election in November, the 2016 campaign has already dramatically undermined a major pillar of post–World War II American economic and foreign policy—free trade.

Hillary Clinton’s current rejection of the same Trans-Pacific Partnership (TPP) free trade agreement that she had earlier called “the gold standard” of free trade deals is a far cry from her husband’s 1990s embrace of globalization as essentially the same thing as Americanization. Of course, her shift of position is a dramatic indication of how much she is feeling “the Bern,” since he rejects “all the crazy trade deals” of the past forty-odd years.

Even more surprising is Donald Trump’s effective capture of the Republican presidential nomination on the basis of trashing the “terrible trade deals” and the free trade doctrine that have long been tenets of the conservative Republican faith. For all his bullying, narcissism, policy ignorance, and shameless self-contradictions, Trump is resonating with voters in significant part because of his willingness to break with the establishment elite on trade. Yes, his talk of slapping 45 percent tariffs on imports, forcing Apple to move iPhone assembly from China to America, and telling our allies to pay us for providing defense is uninformed and unrealistic. (Presidents don’t have the authority to set tariffs. iPhone assembly is low-pay work that won’t raise U.S. wages; we need to make the high-value-added parts. And allies might—and should—increase their own defense spending, but we can’t make them pay us directly.) The public, however, sees in Trump’s and also Sanders’s comments the articulation of a possibly larger truth and the revelation of a possible giant confidence job.

Of course, it’s possible that all this anti–free trade deal talk is just campaign hype and that orthodoxy will return to rule in Washington after the election. However, the fact that the top presidential candidates in both parties—conservative Senator Ted Cruz also opposes the TPP—have turned against policies upheld in bipartisan fashion since the end of World War II suggests otherwise. The public has spoken: polls show that opposition to current free trade arrangements is one of the few positions Democratic and Republican voters share.

How did we get to this point? The answer is twofold. For seventy years, leaders of both parties have pursued trade deals less to strengthen the American economy than to achieve geostrategic aims, from rewarding political-military allies to fostering development of emerging markets. And they’ve been encouraged in this pursuit by generations of economists who have argued that trade deals, no matter how one-sidedly generous to other nations, are also good for the American economy—which raises the second point. Globalization has changed conditions so dramatically that this orthodoxy is no longer true, if it ever was. With the public now in full rebellion and presidential candidates leading, or at least adjusting to, that revolt, change to our trade stance is coming. What we really need, however, and haven’t seen from any candidate, is a comprehensive strategy that can both strengthen the American economy and meet our geopolitical needs.

It is important to understand that from the early 1800s until about 1932, America specifically rejected free trade. Washington and Hamilton were protectionists, as was Lincoln. Teddy Roosevelt famously wrote, “Thank God I am not a free trader,” and Wilson squeezed the last penny out of Great Britain during World War I. So the United States got rich as a disciple of mercantilism.

The Great Depression, victory in World War II, and the outbreak of the Cold War led to a complete change of the American tune. The very high Smoot-Hawley tariffs introduced in 1930 during the Republican Hoover administration had been blamed for the outbreak of the Depression by free traders and the Democratic Party during the 1932 presidential election campaign. In fact, as the University of California, Berkeley, professor Barry Eichengreen has demonstrated, the tariff probably had a mildly expansionary impact on the U.S. economy. But the argument was about power, not facts, and after 1932 it became a given in any economic discussion that Smoot-Hawley had triggered the Depression and that “protectionism” was a wrongheaded policy.

Moreover, after the war American industry no longer needed protection; U.S. producers were the leaders in virtually every industry. Rather than protection, industrial leaders now asked for access to foreign markets. The great task was no longer for America to catch up, it was for America to help the rest of the world get up. The urgency of this was, of course, greatly heightened by the outbreak of the Cold War. The United States made the rebuilding of its ruined allies and their defense its top foreign policy objective. International economic policy was no longer so much about economics. It had become a major tool of geopolitics, and opening its markets for trade had become a major part of U.S. grand strategy.

It was particularly attractive that this free trade approach was said by economists to be always and everywhere a win-win proposition. This idea was based primarily on the insights of the British banker David Ricardo and the Swedish economists Eli Heckscher and Bertil Gotthard Ohlin. In 1817, Ricardo developed the notion of “comparative advantage” by using the example of the production of cloth and wine in Britain and Portugal. He posited that in Britain, 100 hours of labor were needed to produce a unit of cloth and 120 to produce a unit of wine, while in Portugal it was only ninety hours for cloth and eighty for wine. So Portugal was the low-cost producer of both products, but it had a bigger cost advantage in wine than in cloth. If both countries specialized in what they did best and traded for the rest, the total amount produced of both products would be greater and each country would have more of both. Even if Britain refused to specialize, Portugal would still have more by doing so, and vice versa.
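To make Ricardo’s arithmetic concrete, here is a minimal sketch in Python using the hours quoted above; the assumption that each country consumes one unit of each good is ours, added purely for illustration, and is not part of Ricardo’s text.

```python
# Ricardo's 1817 numbers: labor hours needed to make one unit of each good.
hours = {
    "Britain":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

# Opportunity cost of a unit of wine, measured in units of cloth forgone.
for country, h in hours.items():
    print(f"{country}: one wine costs {h['wine'] / h['cloth']:.2f} cloth")
# Britain: 1.20 cloth per wine; Portugal: 0.89.
# Portugal is cheaper at both goods, but its edge is biggest in wine.

# Illustrative demand: each country consumes one unit of cloth and one of wine.
no_trade_hours = sum(h["cloth"] + h["wine"] for h in hours.values())        # 390

# Specialization: Britain weaves both units of cloth, Portugal makes both
# units of wine, and they swap.
trade_hours = 2 * hours["Britain"]["cloth"] + 2 * hours["Portugal"]["wine"]  # 360

print(f"Hours needed without trade: {no_trade_hours}")
print(f"Hours needed with specialization and trade: {trade_hours}")
# The 30 hours freed up can be spent making extra cloth or wine, which is
# why total output of both goods can rise once each country concentrates
# on what it does comparatively best.
```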

In the 1930s, Heckscher and Ohlin elaborated on this by adding capital and natural resources as factors of production to Ricardo’s labor. Whereas Ricardo had assumed that differences of technology between countries determine what each does best, Heckscher and Ohlin assumed that technology would spread rapidly and evenly to all players, and trade flows would be determined by endowments of land, labor, and capital.

A country rich in capital and with a high quotient of skilled versus unskilled labor would be expected to produce capital-intensive, high-technology products (computer chips, for example) and to export them in exchange for low-skill, labor-intensive, or naturally occurring products (apparel, toys, oil) from countries with a lot of unskilled labor or significant holdings of key natural resources.

It worked beautifully—at first. A succession of international economic deals slashed tariffs and provided capital and technology transfers that brought not only prosperity but also democracy to Europe and Japan while Communist expansion was checked in Europe and Asia. Exchange rates were fixed (per theory), international financial flows were relatively small, U.S. companies were happy with their expansion into Europe, and America ran a trade surplus and maintained full employment. It looked like all America had to do was keep the global peace and negotiate more free trade deals, which did seem always to be a win-win proposition, both economically and geopolitically.

Then things got more complicated. By the end of the 1960s, Europe and Japan had fully recovered. In 1971, the United States ran a trade deficit ($2 billion)—its first since 1888. That deficit rose in the 1970s. To stem it, President Richard Nixon traded in the fixed exchange rate system for a floating rate regime, but that didn’t stop “trade frictions” from starting to complicate important geopolitical relationships, especially with Japan. By 1980, the U.S. trade deficit was about $20 billion annually, and about half of it was with Japan.

In its drive for postwar recovery, Japan had ignored American advice to concentrate on production of labor-intensive goods. As Naohiro Amaya, an architect of the Japanese economic “miracle,” once explained, “We did the opposite of what the Americans told us.” The key elements of the miracle model included the protection of domestic markets, virtual prohibition of foreign investment, compulsory savings deposited into government-guided banks that directed lending to target industries, export-led growth, industrial policies aimed at developing domestic production in “strategic” industries (steel, shipbuilding, automaking, semiconductors, and so on), a currency that was undervalued against the dollar, technology transfer as a condition for foreign entrance into the market, and emphasis on building advanced infrastructure.

This was neither Adam Smith’s famous “invisible hand” nor the comparative advantage of Ricardo. It was “catch-up” industrial policy, and it worked so well that it was quickly imitated by Korea, Taiwan, Singapore, Malaysia, and others. It also gave rise to two conflicts: an external one between America and many of its trading partners, and an internal one pitting America’s trade agencies, some of its corporations, and many of its labor unions against its geopolitical agencies—the White House, the Departments of Defense and State, the National Security Council—and most professional economists.

The problem was that free trade increasingly seemed to be less than win-win. The model had begun to diverge from the reality experienced by a growing number of Americans. Trade agreements slashing tariffs didn’t produce equally open markets. This was partly because U.S. negotiators wanting, say, a good deal from Japan or Germany for the use of military bases or support of a UN proposal would not insist on full reciprocity. But it was also because the mercantilism of important trading partners called into question the verity of free trade doctrine.

An increasing loss of U.S. jobs and even whole companies in the textile, steel, consumer electronics, machine tool, auto, semiconductor, and other industries sparked a variety of reactions. Labor and many corporate executives called on Washington to do something about “unfair trade.” The U.S. Department of Commerce and the Office of the U.S. Trade Representative often sympathized with this request, but their proposals were usually overridden in the interagency process by the Departments of State, Defense, and Treasury and other foreign policy and economic agencies that were prepared to accept nonreciprocal trade arrangements to obtain geopolitical objectives.

For all his bullying, narcissism, policy ignorance, and shameless self-contradictions, Trump is resonating with voters in significant part because of his willingness to break with the establishment elite on trade.

They felt justified in this because the majority of professional economists, despite having little or no experience in trade, continued religiously to preach unilateral free trade. While they did admit that there might be some “costs of adjustment” in some industries and communities, they held these to be tiny in comparison to the benefits of low import prices for the vast body of consumers. The large number of winners, it was said, would compensate the small group of losers from free trade. The costs were never considered to be very high because full employment was assumed as part of the model. Thus the notion that trade could put overall downward pressure on wages was not considered. In any case, economists were confident that domestic economic stimulus would easily zero out unemployment.

It is important to emphasize here the role of economists, which had become quite prominent in the wake of the Great Depression. By “free trade,” they generally meant not the reciprocal market opening that the public understood by the term but the unilateral opening of the U.S. market regardless of the actions of trading partners. Thus a country might close its market to U.S. imports while engaging in illegal dumping (selling below the cost of production and/or below the home market price) into the U.S. market, and most economists would call that a gift to U.S. consumers. That the gift might come at the cost of otherwise competitive domestic producers and workers, or that it might result in the loss of substantial technological skills and capacity, was never a significant consideration.

Amid these conflicts and debates, the fundamental founding myth of U.S. global economic policy—that America’s major trading partners were all dedicated in principle to free trade, and were all playing essentially the same game—was maintained. As a policy matter, no difference was recognized between the mercantilist catch-up strategy of a Japan and the largely laissez-faire strategy of a Great Britain. If there were problems, the cognoscenti were convinced that it had to be because all markets had not yet been sufficiently opened. Thus the recipe was to negotiate yet more free trade deals to tear down those hidden barriers. More deals were also seen as a way to strengthen U.S. alliances. This caused complications, because trade deals meant primarily to cement alliances had to be sold to Congress as aimed at opening foreign markets and creating good jobs for Americans. Consequently, the proposed new deals were always presented as the means to reduce trade deficits and create millions of jobs while checking the Soviet Union and China.

While orthodoxy reigned in the negotiating chambers, questions began to arise from some halls of academe around 1980. Young economists like Paul Krugman, James Brander, and Joseph Stiglitz noted that much of world trade was operating outside the theory. Krugman emphasized that this was because the theory rested on a host of now completely unrealistic assumptions—perfectly competitive markets (like commodities such as coffee or wheat), full employment, fixed exchange rates, no economies of scale, no cross-border flows of capital and technology, no costs for closing factories or switching to new industries, no government subsidies or industrial policies, and constantly balanced trade. He particularly focused on the fact that in reality economies of scale not only exist but are a driver of trade flows at least as important as land, labor, and capital in major industries such as aircraft, steel, and autos. For incorporating economies of scale into the standard trade model, Krugman was eventually awarded the Nobel Prize in economics.

Some expected that this “new trade theory” would change trade policies and negotiating strategies. For a while it appeared to rescue orthodox free trade doctrine from the inanity of false assumptions, but the incorporation of economies of scale also changed the theory from one of mathematical certainty to one of mere probability or possibility. Once falling unit costs are allowed among a few large producers, competition is imperfect—when an industry has only a few large producers, any one of them can affect the amount of total supply and the average market price—and comparative advantage depends not only on the cost and productivity of the factors (labor, land, capital) but also on the amount of production.

For seventy years leaders of both parties have pursued trade deals less to strengthen the American economy than to achieve geostrategic aims. And they’ve been encouraged by economists who have argued that trade deals, no matter how generous to other nations, are also good for the American economy.

Under these conditions, comparative advantage can be created by mercantilist industrial and trade policies that will determine who wins and who loses, both between producers and between countries—which is what Japan’s Naohiro Amaya meant when he spoke about rejecting American advice.

It was on this basis that South Korean President Park Chung-hee directed the creation of the Korean steel, auto, petrochemical, and electronics industries. As a member in good standing of the General Agreement on Tariffs and Trade (GATT) and later of its successor, the World Trade Organization (WTO), and without opposition from the United States, Korea, a country largely without the supposedly requisite capital and skilled labor, protected and subsidized these industries until they became world-class competitors. Korea seemed to prove that mercantilism works quite well, at least as long as your main trading partner embraces unilateral free trade and is more worried about having military bases on your soil and assuring your geopolitical allegiance than about trade results.

In the early 1990s, the former IBM chief scientist and Sloan Foundation president Ralph Gomory and the former American Economic Association president William Baumol developed a series of models that, in contrast to most, incorporated not only economies of scale but also rapid technological changes, sudden shifts in productivity, and large numbers of products and countries. What they found was that, contrary to orthodoxy, there is not one uniquely optimal pattern of win-win trade for all countries. Instead, there are many possible trading patterns, and none are optimal for all parties at any one time. In other words, rather than always being a win-win proposition, free trade is more frequently zero sum—I win, you lose. Consider, for example, Boeing and Airbus: a Boeing sale tends to be more of a gain for job and wealth creation in the United States than an Airbus sale, and vice versa.

Another key Gomory and Baumol conclusion is that whether a country achieves large-scale production, innovation, or major increases in productivity is largely unrelated to climate, geography, and national endowments such as capital, land, and labor. Sometimes the leap will be made by strategic choice (a la Korea) and sometimes by serendipity (a Portuguese entrepreneur may innovate the latest cloth fashion). The key point is that, once achieved, economies of scale and technology innovations become barriers to entry for newcomers, particularly so because normal market dynamics will reinforce the existing market structure. Because these positions yield extra profits and jobs, they are desirable to governments who try to shape policy to achieve them.

Despite these powerful critiques of orthodoxy, no one in either Republican or Democratic administrations or in the powerful foreign policy elite of Washington paid much attention. Partly this was because Krugman abjured the practical significance of what he had wrought, arguing that while a strategic trade policy might sound good in theory, it could not effectively be executed in practice, because of inevitable political tampering. Better the flaws of orthodoxy, he and his more orthodox cohorts said, than the risk of politically inspired management of trade. As for Gomory and Baumol, they were perceived as completely heretical and were quarantined by the economics establishment. So, too, were similar analyses by academics such as Chalmers Johnson and journalists like James Fallows and Robert Neff, who were reporting on the ground in Asia.

The fiction that trade is always and everywhere a win-win was necessary to the smooth functioning of U.S. geopolitical strategy. Thus, trade deals continued along the accustomed track and were sold and defended with the accustomed arguments that they would create economic growth and good jobs. The Tokyo Round of GATT negotiations was concluded in 1979 and presented as having finally opened the Japanese market. In its wake, however, the U.S. trade deficit with Japan quadrupled, and major U.S. industries such as semiconductors began to crumble in the face of protected and subsidized Japanese competitors benefiting from an undervalued currency. Between 1984 and 1986, Silicon Valley companies lost about $4 billion and 50,000 jobs.

In reaction to this, one leading Reagan administration economist was reported as having quipped, “Potato chips, computer chips. What’s the difference? They’re all chips.” Perhaps apocryphal, this comment nevertheless perfectly captures the attitude of the high priesthood of trade economists at the time. They thought almost exclusively in terms of today’s comparative advantage rather than of the significance of technology and economies of scale for what might become the comparative advantage of tomorrow.

To be sure, there were concessions to political and strategic reality. In response to job losses and corporate and labor lobbying, the Reagan administration pressured Japan into agreeing voluntarily to limit auto exports, to halt dumping of semiconductors, and to facilitate sales of U.S.-made semiconductors in Japan. There was also the so-called Plaza Agreement (named for New York’s Plaza Hotel, where the accord was signed) of 1985, which revalued the yen. But these measures were reactive and specific and did not change the flow of overall policy and negotiation as the trade deficit continued to rise to about $150 billion by the late 1980s.

The Uruguay Round of GATT negotiations, which established the WTO, was concluded at the end of 1993. Most of the leading think tanks forecast that it would create a boom in U.S. exports and new jobs, and it was presented to the U.S. Congress and public as a final solution that would completely open world markets to U.S. exports.

Before that could be proven, the North American Free Trade Agreement (NAFTA) took effect in 1994. Negotiations had begun under the Reagan administration, which was chiefly interested in solving a geostrategic and political problem: stopping the flow of illegal immigrants by developing a Mexican economy that would provide sufficient domestic employment. But because it was a trade deal, it had to be sold to Congress and the public by both Republican and Democratic administrations as an agreement that would increase U.S. exports and create lots of good American jobs. This proved a difficult forecast to defend as illegal immigration continued throughout the 1990s and the 2000s and U.S. trade with Mexico switched from a surplus of about $2.5 billion in 1994 to a deficit of about $50 billion today. Illegal immigration did eventually level off after 2008, and it is probably true that NAFTA actually kept a lot of U.S. jobs from going to Asia because Mexican factories bought more parts and components from the United States than did the Asians. But that was no comfort to the working-class Americans who lost their jobs.

Then came China’s entry into the WTO in 2001. The aim of the Clinton and Bush administration officials who negotiated and passed the agreement was, again, partly geostrategic. The pressures of entering a rule-bound international trading system would, the thinking went, encourage liberalization in China, while limiting challenges to existing security arrangements. For a while this bet seemed to be well founded, but more recently things seem to be moving in the opposite direction.

China’s WTO entry also had to be sold on economic grounds. The U.S. trade representative and other top government officials swore to Congress and the public that bringing China into the WTO would cut the then $80 billion U.S. bilateral deficit and create lots and lots of good American jobs. Superficially, this was a reasonable argument. China’s trade barriers were much higher then than America’s. It seemed logical that China would be opening to an America that was already open and that U.S. exports to China would therefore grow more rapidly than Chinese exports to the U.S. But instead of falling, the U.S. trade deficit with China rose, to about $360 billion today, and the net loss of U.S. manufacturing jobs topped two million.

How did the experts get the forecasts so wrong? The main error was to view these as trade agreements when they were actually investment arrangements. Of course, they all aimed at opening trade, but the real, if unintended, effect was to make Mexico and China safe for direct investment. This was mostly driven by shifting views among business executives. In the past, they had tended to fight to keep their U.S.-based operations and labor forces working. Now, they began to see the offshoring of their U.S. activities and the importation into America of their own offshore production from low-wage countries as the new road to high profits and became enthusiastic backers of the deals.

Prior to China’s entrance into the WTO, U.S. CEOs had often fought for maintaining U.S.-based production and for national competitiveness. Because Japan and Korea had never welcomed foreign investment and European costs were as high or higher than U.S. costs, the opportunity to move a lot of production offshore had not arisen. That changed when China became a WTO member in 2001. It had cheap labor and courted investment if the investors promised to export a lot of their production. Suddenly offshoring became the preferred American business model—consulting companies like McKinsey made a fortune advising U.S. producers how to move their operations to China while closing them at home. U.S. economists and trade negotiators alike had never imagined anything like this happening.

Just one example will tell the whole story. In 2011, GE CEO Jeff Immelt was chairman of President Obama’s Council on Jobs and Competitiveness. At the same time, GE announced that it was entering into a joint venture with China’s state-owned Aviation Industry Corporation (AVIC) to transfer much of its avionics production and development to China. “What?” Obama must have said. Avionics is what all the economic theories say America should be involved in—it’s high tech, and it’s not labor intensive. For trade negotiators, however, the decision was not surprising. China has made aviation a target industry for the future. It also has a large market for aircraft. GE wants to sell avionics to that market. So China is telling GE that if it wants the sale it will have to produce in China and transfer jobs and technology there. Of course, no one says it that directly. But that’s the game. It would be interesting to know if Immelt called Obama before making the announcement, or if Obama called him afterward.

The problem was and is that nothing in the orthodox trade model ever anticipated cross-border investment and offshoring of production. Indeed, Ricardo specifically said that his comparative advantage analysis would not apply under such circumstances.

None of this is meant to argue that international trade and globalization have been nothing but bad for Americans. No doubt the variety of goods available to consumers is greater, the quality better, and the prices lower as a result of increased trade and globalization. On the other hand, it may be the case that average wages and incomes are also lower. Whether free trade has resulted in a net loss of jobs and incomes is a matter of long and continuing debate. However, there is a growing consensus that it has contributed to the widening gap between the incomes of the rich and everyone else.

Beyond this debate are two incontrovertible facts: the United States has a chronic trade deficit of about $500 billion, or 2.7 percent of GDP, and it has bilateral trade deficits with most major countries. This means Americans are consuming $500 billion a year more than they produce. The United States has become the global buyer of last resort and also the biggest global debtor.

As the buyer of last resort, America plays a valuable role in the international system by providing crucial demand when other countries cannot. This staves off global recessions, or lessens their impact, just as fiscal and monetary stimuli can do with domestic recessions. And, fortunately, the fact that the dollar is the world’s major reserve currency means that America can borrow in its own currency by simply printing T-bills that it sells at little immediate cost to China, Japan, and others. As long as they are willing to buy, the U.S. can continue consuming more than it produces, having a party for itself but also playing the crucial role of buyer of last resort. If, however, another currency, like the Chinese RMB, should emerge as an alternative, the party would be over; America would face a very steep bill and would have to dramatically cut consumption while increasing production.

Even if the dollar remains the world’s reserve currency, persistent trade deficits driven by the offshoring of American productive capacity are increasingly unsustainable economically, politically, and geopolitically. The push to offshore production and technology and the growing challenge from Beijing may well have changed the longtime calculus on trade and geopolitics.

At this point, it is clear that some of the most high-profile pro–free trade thought leaders have changed their minds. The über-globalist and New York Times columnist Tom Friedman has recently said that “free trade with China has hurt more people than originally thought.” Krugman has also acknowledged that he didn’t anticipate the extent of the impact of trade with China on the American workforce. And the longtime orthodox trade champion and former Treasury Secretary Larry Summers is now calling for “harmonization” rather than more “globalization.”

More important, the public has caught on. Trump and Bernie are resonating with the American people because many of them feel in their bones that their future is being stolen by a combination of inadequate trade deals and excessive care for allies at their expense. That feeling won’t pass with the election. The new president will have to redo the old trade-off. The days of unilateral American free trade for unilateral U.S. security guarantees may at long last be ending, but a new strategy is needed.

The first step toward a winning trade policy is to recognize that the game in our time is not trade. It’s globalization, a vastly more complex phenomenon that Harvard’s Robert Lawrence has aptly called “deep integration” of national economies rather than exchanges of goods and services between them. The game is about global power as well as economic welfare. Not all countries play it the same way, and existing rules and institutions are dated and inadequate.

The next president needs to recognize that globalization is deeply geopolitical and should pursue a new set of agreements over its rules. These agreements need to have as their central aim the replacement of the U.S.’s unilateral role as buyer of last resort with new arrangements that accomplish the same goal of providing demand, especially at moments when global recessions loom, but in a more equitable and sustainable way.

For its own long-term economic health, it would be nice if the United States could simply retire the dollar from its role as the key reserve currency. But it cannot and, even if it could, should not do so hastily. A better approach would be for countries such as Japan, China, Germany, and South Korea to keep promises they have repeatedly made to switch from investment- and export-led growth to internal consumption–led growth.

A weakness of the post–World War II trading system has always been that there was discipline on countries that ran chronic trade deficits but none on countries that ran chronic surpluses. What is needed now is for countries running chronic, large trade surpluses to be subject to the discipline of surcharges on their exports and limits on the accumulation of foreign assets.

There should also be surcharges for other beggar-thy-neighbor behavior that threatens the global economy. One of those is “dumping”—flooding world markets with products at prices below home-market prices and/or the cost of production, often as a result of overcapacity driven by export-led growth policies. At the moment, for example, China has more than enough steel production capacity to supply total global demand all by itself. This is ruining the steel industries of Europe, Japan, and the United States, which otherwise are perfectly competitive. Another behavior deserving sanctions is the practice of countries offering subsidies—tax holidays, free land, training of workers, capital grants, utilities at half price—to lure away companies that are already competitive where they are. Government-supported hacking operations that steal corporate intellectual property also need to be reined in.

The question is how to bring such an agreement about. Countries will understandably be resistant to changing behaviors that have worked for them. The answer is that the United States should give them an incentive by using all available legal means to halt harmful trade. The White House has the authority to self-initiate anti-dumping actions rather than waiting for complaints from industry. It should do so. A “war chest” similar to that used in the 1980s and ’90s to stop direct export subsidies should be established to counter the investment incentives being proffered abroad. Currency manipulation should be met with counter-intervention in foreign exchange markets by Uncle Sam. These actions would not be ends in themselves, but means to the end of achieving a sustainable world financial order.

The next president will have a new stick to wave to bring other countries together around a new plan. That president can say, in all sincerity, that the American people have had enough. They are no longer willing to support globalization policies that strip the United States of its wealth-creating capacities, and they don’t mind throwing out of office leaders who do. Cut a deal with me, the next president can say, or rest assured the one after me will be worse.

The post Free Trade Is Dead appeared first on Washington Monthly.

The Commercial World of Our Fathers https://washingtonmonthly.com/2016/06/12/the-commercial-world-of-our-fathers/ Sun, 12 Jun 2016 16:56:09 +0000 https://washingtonmonthly.com/?p=57745 St. Louis

Working for my dad in St. Louis gave me a glimpse of independent businessmen living out their dreams and earning, if not fortunes, proud livings.

The post The Commercial World of Our Fathers appeared first on Washington Monthly.

St. Louis

Sometime in the mid-1970s my mother told my father that he should start bringing me to the office with him. I don’t recall being consulted on the matter. Nevertheless, I spent my high school and college summers working—happily, for the most part—for the Glastris-Manning/Courtesy Checks Advertising and Public Relations Group. My two younger brothers were similarly dragooned.

Despite the complex title, Glastris-Manning/Courtesy Checks was not a big operation—just a few people working out of a 600-square-foot windowless office above a bowling alley, which Dad dubbed his “World Headquarters.” Nor was it a big moneymaker. But it afforded my father the opportunity to be his own boss, to be with his sons, whom he adored, and to do work that fit his talents and temperament.

Dad was a very creative guy—among other things, he pioneered the field of “barter advertising,” the trading of products and services for media time and space. He was also gregarious and ebullient, with a caustic wit and a generous heart, and much beloved around town. This I learned accompanying him on sales calls and to what he referred to as “executive lunches” with his various business cronies and Greek American buddies, nearly all of them, like him, small business owners. Libations were typically consumed.

One of my jobs was writing copy for radio spots for the firm’s clients—hotels, car dealers, and the like. Dad could whip one of these out in about half an hour. It would take me all day. I’d bring him a draft, he’d mark it up, I’d go back and redo it. Over and over again.

Eventually I got the hang of it. I even started taking liberties with the genre. Once, borrowing a technique from a Donald Barthelme short story, I wrote a spot all in questions: “Is Schmeezings St. Louis’s newest bar and grill? Is Schmeezings conveniently located two blocks from the Arena, home of the St. Louis Blues? If Schmeezings is located two blocks from the Arena, does it serve twenty great varieties of burgers?” My dad especially liked that one, not because of the Barthelme connection (he’d never heard of the guy), but because it repeated the name of the establishment a dozen times in thirty seconds. Commercials that were “clever” but didn’t actually sell the client’s product or service drove him nuts.

Another lesson I learned was the importance of a bold sound bite. For one client, my father came up with the tag line “One of St. Louis’s ten best restaurants.” When the client asked him, “Bill, who said we’re one of St. Louis’s ten best restaurants?,” Dad replied, “We did!” Now, it happened to be quite a good restaurant, though unfashionably located in a hotel by the airport. Still, for a couple of decades, if you asked any St. Louisan about the place, there was a 50/50 chance they’d say, “Oh yeah, isn’t that one of St. Louis’s ten best restaurants?”

Yet another lesson I learned was that local ownership was a definite selling point in a tightly knit place like St. Louis. To be able to say that such-and-such family firm had been “serving the bi-state area for three generations” implied trustworthiness and stirred the abundant civic pride that resides in every St. Louisan. That’s still the case today. Though I haven’t lived in my hometown in decades, I watch Cardinals baseball games on the MLB network and can’t help noticing the ads—“Helitech Waterproofing & Foundation Repair, healing homes for thirty years!”

Unfortunately, the old, locally owned firm is a dying breed. Ever since the Reagan administration rewrote the rules on antitrust, such companies, large and small, have been bought up by out-of-town behemoths, or edged out by national chains. This has slowed St. Louis’s economy and devastated its once-thriving advertising sector (see Brian S. Feldman, “The Real Reason Middle Americans Should Be Angry,” Washington Monthly, March/April/May 2016). It has also led to an alarming nationwide decline in new business startups (see Barry Lynn and Lina Khan, “The Slow-Motion Collapse of American Entrepreneurship,” July/August 2012).

The good news, as Dane Stangler and Colin Tomkins-Bergh report in this issue, is that St. Louis, of all places, has the fastest-growing startup scene in the country, thanks to some smart local economic development policies other cities could copy (see “St. Louis, Entrepreneurial Boomtown”). The bad news is that without changes in federal antitrust policies, those startups are likely to be acquired and moved to places like San Francisco, New York, or Beijing, leaving the Gateway City high and dry, again.

What’s needed, as we argue in this issue, is a new set of “smart populist” policies on globalization (Clyde Prestowitz, “Free Trade Is Dead”) and monopolization (Barry C. Lynn and Phillip Longman, “Populism with a Brain”) to empower local economies that none of the presidential candidates has yet discussed. Such policies, updates of the traditional rules that built the American Century, could create the conditions for another.

Working for my father gave me a glimpse of a slowly vanishing commercial world, of independent businessmen living out their dreams and earning, if not fortunes, proud livings. It also allowed my father an opportunity to teach his son all he knew about writing, editing, selling, and managing a small enterprise. I use what he taught me every day. And I think about him, every day.

The post The Commercial World of Our Fathers appeared first on Washington Monthly.

St. Louis, Entrepreneurial Boomtown https://washingtonmonthly.com/2016/06/12/st-louis-entrepreneurial-boomtown/ Sun, 12 Jun 2016 16:47:26 +0000 https://washingtonmonthly.com/?p=57730

Metro areas all over the country are trying to nurture startups, without much luck. The Gateway City is succeeding. What’s its secret?

The post St. Louis, Entrepreneurial Boomtown appeared first on Washington Monthly.

Back in the mid-aughts, Jarret Glasscock was happily ensconced at the Genome Sequencing Center at Washington University in St. Louis, working on the federal government’s Human Genome Project. He and his fellow scientists were exceptionally good at what they did: they could sequence the DNA of a cell for about 1/100,000th the cost of traditional methods. Pharmaceutical companies soon came knocking on their door with contract offers to do DNA sequencing for drug development purposes. But the center was busy fulfilling government grants and kept turning the offers down. After a few years of this, says Glasscock, “we finally got it through our thick skulls that maybe there’s a need for some kind of company to exist.”

So, in 2008 Glasscock and a few colleagues pulled together whatever money they could, rented a former photographer’s studio downtown off Craigslist for $700 a month, and filled the office with half a million dollars’ worth of used genetic sequencing machinery—which they cleverly managed to convince the company that sold it to them to finance. Pretty soon, their new company, Cofactor Genomics, was making money. They hired more staff, bought better equipment, and briefly made international news when they helped map the genome of heavy-metal icon Ozzy Osbourne.

Eventually, Glasscock became the company’s CEO and the firm moved to its current digs, a squat brick industrial building in Midtown St. Louis that backs up to Interstate 64. A metal coatings factory sits on one side; on the other, across a vast empty parking lot, a giant grain elevator looms.

Inside, the décor is contemporary high tech with a slight midwestern twist. It’s mostly one big open space, with a couple of glassed-in conference rooms with math equations scrawled on the panes. Two ski lift gondolas, bought for $800 each from a resort in Wisconsin, are available for more intimate meetings. On a wall is a vintage sign that reads “Gun Shop” next to a picture of a revolver. The barrel of the gun points toward the back, where the gene sequencing lab is located behind closed doors secured with push-button combination locks—a precaution demanded by Cofactor’s pharmaceutical industry clients, who are serious about protecting their intellectual property. In a corner by the back door is a drum set and some old bikes the company’s eighteen employees can use to go to lunch. There haven’t been many places to eat in this aging industrial part of Midtown. But more and more restaurants and coffee shops are popping up to cater to the 150 startups now occupying several rehabbed buildings in the neighborhood, which has been renamed the Cortex Innovation Community.

Cofactor is now being feted by the kind of West Coast venture capital investors who, a decade ago, would never have thought to put money into a St. Louis startup. The firm’s leaders were invited to spend last summer networking at Y Combinator, the famed Mountain View seed fund that nurtured Airbnb and Dropbox. In February, Cofactor acquired a San Francisco–based competitor, Narus Biotechnologies.

To many Americans, St. Louis is known for urban dysfunction and slow economic decline. It was in the nearby suburb of Ferguson in 2014 that riots broke out after a white police officer shot an African American teenager. St. Louis is one of those places where every year or two another big homegrown company seems to get bought by outside owners: aircraft maker McDonnell Douglas in 1997, food processor Ralston Purina in 2001, May Department Stores in 2005, brewer Anheuser-Busch in 2008. As this magazine recently reported (see Brian S. Feldman, “The Real Reason Middle America Should Be Angry,” March/April/May 2016), as a result of changes in federal antitrust and other competition policies, the number of Fortune 500 companies located in St. Louis has shrunk from twenty-three in 1980 to nine today.

Growth industry: Kristine Menn, greenhouse coordinator for the St. Louis-based renewable fuel startup Arvgenix, works on a field pennycress flower in the Danforth Plant Science Center.

But as the story of Cofactor Genomics illustrates, a city that has lost so many big old companies is becoming home to a lot of small new ones. Last year, Popular Mechanics deemed St. Louis to be one of the fourteen best startup cities in America, and in January of this year, Business Insider said it had the “fastest-growing startup scene” in the country.

St. Louis is a long way from becoming another Silicon Valley. But its sudden emergence as a hotbed of entrepreneurship holds lessons for a country struggling to make a growing economy benefit Americans who don’t happen to live in a handful of booming coastal megalopolises. For decades, St. Louis followed the familiar economic development playbook: try to attract big out-of-town companies, or keep local ones from leaving, by showering them with tax breaks and other subsidies. While it hasn’t exactly abandoned that old strategy, St. Louis has increasingly shifted to a new one of attempting to grow its own small firms. Metro areas across the country are trying to do the same, in many cases with little to show for their efforts. St. Louis seems to have hit on the right formula, though actions in Washington could determine whether, over the long term, it succeeds or fails.

While it may be hard for people who think of St. Louis as part of “flyover” country to wrap their heads around the idea of it as a startup haven, the numbers don’t lie. Business creation in St. Louis has risen every year since 2009, jumping 18 percent from 2012 to 2013, a year business creation actually fell nationally. In 2006, St. Louis was 11 percent below the national average in the number of new firms per 100,000 people. By 2012, St. Louis had narrowed this “startup density” gap to only 3 percent, according to the Kauffman Index of Entrepreneurship. St. Louis now ranks twenty-sixth among the top forty metro areas by startup density, ahead of some cities that garner lots of attention for their entrepreneurship scenes, like Pittsburgh, Columbus, and Philadelphia.

St. Louis’s startup scene is most noticeable in the information technology sector, led by such firms as the social media company LockerDome and app developers RoverTown and Aisle411. Venture capital companies poured $176 million into area IT startups in 2015, up from $66 million in 2013, according to the St. Louis Tech Startup Report. That is roughly double the figure of Kansas City, a region of a similar population on the other side of the state. St. Louis tech startups employed more than 1,400 people in 2015, up from less than half that in 2011. In 2015, three new accelerators started in St. Louis, and three new venture capital funds entered the fray.

But it’s not just IT and biotech startups that are flourishing. New St. Louis firms are popping up in sectors like education, food, and—in the longtime home of Budweiser—craft beer.

The roots of St. Louis’s startup boom go back to the late 1990s, when a group of political and business leaders, frustrated with the metro area’s slow economic growth, began studying how other old industrial cities, such as Boston, were turning themselves around. They concluded that St. Louis had the necessary resources—highly ranked universities, med schools, and hospitals, plus major research-focused agricultural firms like Ralston Purina and Monsanto—to become a hub of life science startups. This precipitated the creation of a number of economic organizations that today form a kind of infrastructure for entrepreneurial activity: new investment funds, incubators for companies, new workforce programs.

Two organizations were especially critical in these early years. One was the Donald Danforth Plant Science Center, which opened in 2001 with funding from Monsanto and the Danforth Foundation (William H. Danforth was the founder of Ralston Purina). It has quickly grown into a world-renowned research facility, attracting tens of millions of dollars in federal grants annually to support hundreds of plant scientists. These scientists work directly with ag-tech startups developing new strains of crops that have, say, higher nutrition value, or better capacity to withstand the droughts that come with climate change. The center also cofounded the Ag Innovation Showcase, which has become the premier event in the nation around agricultural technology and innovation.

The other key institution was the Skandalaris Entrepreneurship Program at Washington University, begun in 2001 and expanded in 2003 with a grant from the Ewing Marion Kauffman Foundation. Renamed the Skandalaris Center for Interdisciplinary Innovation and Entrepreneurship, it became a hub of entrepreneurial training and networking, with a mentoring program for budding area entrepreneurs that has proved invaluable.

But by 2008, all this activity still wasn’t generating much in the way of actual economic outcomes. This was partly a result of the inherent cycles of research and development and commercialization in the sectors St. Louis chose to stress. It takes longer to build a plant science company than a new mobile app startup. Then the Great Recession hit, shrinking St. Louis’s per capita personal income by 5 percent that year and eventually driving the metro unemployment rate to over 10 percent. Worst of all, 2008 was also the year that multinational brewing giant InBev acquired Anheuser-Busch, the corporate crown jewel of St. Louis, and laid off hundreds at its South St. Louis headquarters.

That same year, a former telecommunications manager and entrepreneur named Jim Brasunas had an idea. Brasunas had lived in St. Louis since 1994 and been involved with several technology companies, including running an incubator in downtown St. Louis. In his dealings with tech entrepreneurs around the region, he discovered that they all felt alone in their attempts to build successful companies there. They didn’t know other up-and-coming entrepreneurs in the same sector, or have successful older ones they could turn to for advice. Brasunas also learned that area investors looking to put money into IT startups had no idea that there were any such firms in St. Louis. Sometimes an entrepreneur and an investor would be on the same flight to San Francisco to talk to venture firms, utterly unaware of each other’s presence. “After hearing the same story twenty-five or thirty times, I just started getting these people together casually,” says Brasunas. In 2008, he formalized this network by starting a nonprofit organization, the Information Technology Entrepreneurs Network (ITEN).

To get his operation off the ground, Brasunas applied for funding from the Missouri Technology Corporation (MTC), a public-private investment fund controlled by the state government and championed by Governor Jay Nixon. The MTC’s very existence was another example of the change in thinking that was occurring about how to jump-start economic growth. As in just about every other state and municipality in the country, Missouri’s strategy had long been to throw tax dollars at big, footloose corporations. It’s a popular practice with voters and politicians, affording the latter the opportunity to hold press conferences to announce that they’ve lured a new corporate headquarters to the area or kept an existing one from leaving. But in terms of actually boosting net economic activity, it’s a mug’s game. A study by the researcher Nathan Jensen found that St. Louis–area companies receiving Missouri state tax incentives have actually performed worse in terms of creating jobs than companies elsewhere in Missouri that didn’t receive incentives.

The MTC’s modest initial grant to ITEN in 2008, plus later expansion grants, turned out to be wise (the Kauffman Foundation also supported ITEN). Over the next eight years, ITEN helped catalyze the entire entrepreneurial ecosystem in St. Louis. Today, many entrepreneurs and others give a huge amount of credit to Brasunas and ITEN for building connections and thus creating a new network of entrepreneurial support. Indeed, Yasuyuki Motoyama, in a paper coauthored with Karren Watkins at Washington University, points out that the “connections between novice and experienced entrepreneurs” that ITEN facilitated are probably the single-biggest factor in St. Louis’s entrepreneurial emergence.

Three years later, the MTC made another shrewd bet: it put seed money into a new program, called Arch Grants, with a model for economic development unlike any in the country. The organization runs a global competition to identify potential entrepreneurs from virtually any industry sector. It then provides those with the most promising business plans $50,000 equity-free grants and pro bono support services if they agree to build their businesses in St. Louis. Even more important than the money, Arch Grants helps these emerging entrepreneurs connect with each other—building bonds not unlike those among a class of college freshmen—and plugs them into St. Louis’s increasingly thick and energized network of entrepreneurs, funders, work spaces, and social events.

By 2013, all of this activity was bearing fruit. That year, the St. Louis Regional Entrepreneurship Initiative Report released an analysis of entrepreneurial activity and deal flow. According to the report, in 2007 only eleven companies reached “validated startup status.” (This was judged to be when a company reached milestones such as going through an accelerator, receiving equity investment, completing an intellectual property license, and so on.) By 2012, forty-two companies had become validated startups.

Importantly, the composition of startups receiving equity investments had changed significantly. In 2006, two-thirds of investments went to bioscience companies; only a quarter went to tech startups. This reflected the orientation of official economic development efforts in St. Louis. Yet by 2012, the situation was precisely reversed: tech firms received two-thirds of equity investments, and bioscience companies accounted for one-quarter.

Entrepreneurship has also found its way into the old corporate heart of the region: beer. Schlafly, a craft brewery that has been around for twenty-five years, has experienced strong growth and, earlier this year, announced it was doubling annual production to meet demand. Just within the last few years, more craft breweries have followed: O’Fallon Brewery, Urban Chestnut Brewing, and Senn Bierwerks, which will open in 2017. Oh, and don’t forget about the William K. Busch Brewing Co., founded in 2011 by the great-grandson of Adolphus Busch, cofounder of Anheuser-Busch.

In some ways, St. Louis is not unique. What we call “startup fever” has swept the country over the last several years, with cities of all sizes creating accelerators, incubators, pitch competitions, and all sorts of other programs intended to boost entrepreneurship. Pittsburgh, for example, has been celebrated for its entrepreneurial renaissance, with recognition from various “best places to start” rankings, and an increase in venture funding, new incubators, state programs, and more. Similar to Washington University’s role in St. Louis, Carnegie Mellon has played a catalytic role in Pittsburgh entrepreneurship. And yet, by other measures, this activity has yet to translate into entrepreneurial outcomes. In 2014 and 2015, Pittsburgh ranked last among large metro areas on the Kauffman Index, with the lowest rate of new entrepreneurial entry, the second-lowest startup density, and even one of the lowest rates of “opportunity” entrepreneurship. A similar story might be told of cities such as Milwaukee, Cincinnati, and Philadelphia.

What, then, is St. Louis doing differently that might explain its relative success? In a recent article in the journal Innovations, Ken Harrington, who led the Skandalaris Center for over a decade and has been closely involved in the St. Louis startup scene, argues that the secret sauce was “connectivity.” In the past, St. Louis was not without entrepreneurial energy, according to Harrington, but it existed in disconnected pockets. This stymied the formation of an “entrepreneurial genealogy” that occurs when successful entrepreneurs from one generation become the next generation’s mentors and investors. This genealogy is a distinguishing characteristic of places like Silicon Valley.

Home is where the lab is: Cofactor moved from a former photography studio to its current headquarters, which sits across from a grain elevator.

A metro area without this genealogy needs to create it by bringing together the disconnected pockets. It’s faddish in economic development circles today to talk about “collisions”: the notion that if you create lots of bars, coffee shops, and parks, serendipitous connections will occur as people literally run into each other.

But that’s not what happened in St. Louis. Instead, organizations such as ITEN, the Skandalaris Center, and Arch Grants came into being and intentionally and deliberately built connections and networks.

A good example of how this connectivity can work is RoverTown, a startup that provides student discounts through a mobile app. After receiving an Arch Grant in 2013, the company moved to St. Louis from Carbondale, Illinois. It relocated to the T-REX, a nonprofit co-working space downtown, went through an accelerator program called Capital Innovators, and received a follow-on Arch Grant. It then took part in another program run by ITEN called Mock Angel, which prepares entrepreneurs to make their pitches to equity investors, and secured nearly a million dollars in funding last year. In 2014 it was named the fastest-growing tech startup in St. Louis. RoverTown is not small-time—it has a satellite office in Chicago, and two VPs of GrubHub are on the board. It is in St. Louis by choice.

RoverTown’s experience also illustrates another side of St. Louis connectivity. One of its investors is Tom Hillman, a serial entrepreneur who formerly owned Answers.com, a leading IT firm based in St. Louis. Hillman has become a central figure in St. Louis entrepreneurship and philanthropy, and embodies the new entrepreneurial genealogy that has developed in the region.

St. Louis’s recent success at fostering startups shouldn’t be all that surprising. The metro area has had most of the ingredients for a long time: a central location, great universities, large sophisticated corporations, an educated workforce, world-class cultural institutions (symphony, art museum, zoo, botanical gardens), suburbs with good schools, and city neighborhoods full of beautiful old homes available at affordable prices. (Okay, yes, St. Louis is very humid in the summer.)

Several key ingredients, however, were only added relatively recently. The first was a change in mentality of the leadership class at the city, metro, and state levels. These folks—elected officials, university presidents, old money muckety-mucks—needed to be persuaded that the traditional ways of trying to foster growth (bribing big companies with subsidies, pouring endless tax dollars into downtown redevelopment projects) weren’t working, and that giving some attention and public and philanthropic resources to fostering startups might. Crucially, the new mentality needed to be shared by executives at the big locally based corporations who have an interest and the autonomy to participate in local civic life. The Danforth Center couldn’t have been built without the help of Monsanto, for instance, and its renewable fuels program is funded by the family that owns St. Louis–based Enterprise Rent-A-Car. In 2015, ITEN launched a corporate engagement program in which startups and large corporations will interact in a variety of ways, including “reverse pitches,” in which the corporations pitch their ideas and needs to startups. Initial corporate partners included Monsanto, the Reinsurance Group of America, and Enterprise.

The second missing ingredient was connectivity. Fortunately, St. Louis had civic entrepreneurs like Jim Brasunas, Ken Harrington, and the founders of Arch Grants, who figured out how to increase the connections among people and institutions in ways that make an entrepreneurial scene gel and grow.

Challenges remain, of course. Despite all the energy and excitement, there’s no point at which the startup scene in St. Louis is “finished” or “complete.” Nearly everyone we have talked to agrees that St. Louis faces a challenge in sustaining today’s entrepreneurial momentum.

Policymakers in Washington could make the city’s job easier—or harder. Many of St. Louis’s most promising startups, especially in the biotech and ag-tech sectors, would never have been started and could not continue absent federal investments in medical and agricultural research. Cofactor Genomics is a perfect example. CEO Jarret Glasscock got his training while working on the federally funded Human Genome Project. Contracts from the NIH and the USDA make up a significant portion of the firm’s revenue. Federal tax credits covered some of the cost of its new building and equipment. In general, more federal funding for these programs would help St. Louis, and cuts would hurt.

The federal government can also help places like St. Louis by remaining economically open and engaged with the rest of the world. Because of the Danforth Center and organizations like BioSTL, a university-led seed investment fund, St. Louis has become a global center of plant sciences, attracting new entrepreneurial companies to the area. Earlier this year, an Israeli company, NRGene, announced that it was establishing its U.S. headquarters in St. Louis—impressively, it is the fourth Israeli plant science company to set up its base in the city in the last two years. If our next president insists on alienating the rest of the world, you can bet that the flow of foreign entrepreneurs into St. Louis and other cities will slow, if not cease altogether.

Better parental leave policies from the federal government could also help by making it possible for more women to follow their entrepreneurial dreams. St. Louis, like other cities, faces a gender imbalance in terms of participation in the startup scene. Motoyama and Watkins found that men make up 70 to 80 percent of the people who participate in many of the city’s entrepreneurship support organizations. In the United States as a whole, business ownership has fallen among men and remained stable among women, as women make up a large and growing share of the workforce, especially the educated workforce.

More could also be done to open up opportunities for minorities. The talent is certainly there: World Wide Technologies—the nation’s largest black-owned business, according to Black Enterprise magazine—is based in St. Louis. Yet minorities are underrepresented among the new crop of St. Louis startups.

Financing is another area where federal policymakers could extend help to entrepreneurs. While St. Louis has seen an increase in equity investments into startups, most new businesses (including tech firms) do not receive equity financing. Instead, they seek out various forms of credit from banks: loans, lines of credit, credit cards, leasing, trade credits, and so on. Young and small firms, moreover, seek credit from small banks and credit unions at higher rates than large companies do. Yet small banks—including black-owned banks—were hit hard by the financial crisis. Some went under; others were acquired by larger banks that took advantage of decades-old federal deregulatory decisions, a phenomenon that continues to shrink the ranks of small lenders. New federal regulations put in place since the financial crisis have reinforced the advantages of bigness.

Indeed, arguably the single-biggest favor the federal government can do for St. Louis’s startup scene is to change its policies on antitrust enforcement. For more than thirty years Washington has allowed corporate mergers and acquisitions to take place largely unimpeded, to the point where a handful of huge companies dominate market after market. Most of those behemoths are located in places like New York, San Francisco, Seattle, and LA. And most startups today, especially in the technology fields, never expect to grow their companies, but instead hope to get bought out by one of the big boys. This makes for a useful “exit strategy” for investors. But it also leads to less competition: startups seldom get big enough to challenge the incumbents. And it means that smaller cities like St. Louis that have painstakingly nurtured startups are likely to see most of them skip town after getting gobbled up by the Googles, Facebooks, and Pfizers of the world. In that scenario, instead of competing in the big leagues, St. Louis would become an uncompensated farm team for San Francisco and Boston.

This is the world today as we find it. But it is not necessarily the one that entrepreneurs would prefer. Asked if he expects Cofactor Genomics to be acquired, Glasscock replies, “Interesting question. That’s been the thinking of our board from the beginning.” On the other hand, he says, he loves living in St. Louis (he’s originally from Arizona), loves the corporate culture he and his colleagues have created, loves the challenge of steering the ship. “I can definitely imagine us having 500 employees ten years from now,” he says, with a noticeable twinkle in his eye.

Dane Stangler is vice president of research and policy at the Ewing Marion Kauffman Foundation. Colin Tomkins-Bergh is a research analyst for the foundation.

The post St. Louis, Entrepreneurial Boomtown appeared first on Washington Monthly.

Payday for the Public https://washingtonmonthly.com/2016/06/12/payday-for-the-public/ Sun, 12 Jun 2016 16:46:12 +0000 https://washingtonmonthly.com/?p=57717

How the CFPB broke the back of the payday lending industry.

The post Payday for the Public appeared first on Washington Monthly.

Daniel Patrick Moynihan famously wrote, “Everyone is entitled to his own opinion, but not to his own facts.” In Washington, however, powerful interests frequently are entitled to their own facts—for the simple reason that they, and they alone, have access to them. When lawmakers and regulators sit down to write statutes or the rules to ensure that industries don’t cheat their customers, sully the environment, or (in the case of banks) put the whole economy at risk, the principal facts they have to work with are all too often chosen by the companies and industries themselves, from proprietary data sets they control.

Scientists at Exxon, for example, knew as far back as 1977 that man-made carbon emissions—such as those resulting from burning petroleum products—contribute to climate change. But instead of sharing this information with Congress or regulators, the company poured millions of dollars into phony climate-denial research and front groups. Likewise, tobacco companies knew as early as the 1950s about the dangers of smoking but chose to mount a multi-decade campaign of disinformation and obfuscation. And then there was the 2007 financial crisis, brought about by the activities of the “shadow-banking” system. Before the crisis, federal regulators had little information about the risky behaviors and positions of this sector and couldn’t have regulated it even if they had wanted to.

Sometimes the key information that public officials and citizens need to make informed decisions does emerge, but years after it would have been most helpful. It was a Pulitzer Prize–winning team of investigative reporters at the website Inside Climate News that exposed what Exxon knew and when it knew it. In the case of tobacco, a lawsuit brought by the U.S. Department of Justice under President Bill Clinton revealed the companies’ conspiracy.

But rather than wait for these occasional or serendipitous moments of revelation, the better course is for governments to have access to the same data industry has—at least to the extent that it bears on public policy and welfare. That was the lesson lawmakers took from the financial crisis when they drafted the Dodd-Frank financial reform legislation in 2010.

Dodd-Frank granted federal regulators broad access to what was once proprietary information held by financial institutions. For example, as part of the now-mandatory “stress tests” administered by the Federal Reserve, banks must disclose detailed data about their capital positions and risk management practices so regulators can assess their stability.

The law also created the Consumer Financial Protection Bureau (CFPB)—the first-ever federal agency aimed at regulating consumer financial products—and gave it expansive supervisory authority over not just banks but also non-bank financial institutions like mortgage brokers and payday lending outfits that had previously escaped serious scrutiny. Importantly, Congress gave the CFPB “examination” authority—that is, the power to demand the data necessary to carry out its mission of consumer protection—as well as the capacity to conduct independent research on any developments in the marketplace that could negatively affect consumers.

As the first exercise of its investigative and supervisory authority, the bureau in 2012 turned its sights on payday lending, an industry that had grown more or less unfettered for decades while preying on the nation’s most vulnerable customers. Its victims included people like Lisa Engelkins, who told the Center for Responsible Lending that she paid $1,254 in interest and fees on one $300 loan. As a single mom earning less than $8 an hour, she had the money to pay the fees but never enough to pay the principal, which she rolled over into new loans a total of thirty-five times. Another borrower, Sandra Harris, took out new payday loans to pay off old ones. She eventually held as many as six payday loans at the same time, paying $600 a month in fees alone.

To end these kinds of abuses, the CFPB announced in March 2015 its intent to regulate payday lending, along with a framework for what it might do. It is expected to propose and finalize new rules over the next year. Payday lenders are bracing themselves—the industry has all but stopped growing in anticipation—and a few state governments, emboldened by data dug up by the CFPB and several nonprofits, have started cracking down.

The battle over payday lending is far from over. The industry, with more than 20,600 storefronts nationwide and roughly $38 billion in annual revenue, has many powerful friends, on both sides of the aisle. Still, the story of how the CFPB broke the payday lending industry’s stranglehold on data is a valuable example of how government can regain leverage over industry actors who would impose their facts over the truth.

Governments have had laws against lending at high interest rates, or “usury,” since the Code of Hammurabi. At America’s founding, all thirteen original states had them. Over the nineteenth century, usury laws waxed and waned; many states repealed them, only to bring them back. By the turn of the twentieth century, the public had become concerned about the rise of illegally operated “salary lenders” offering short-term loans at high interest rates to desperate urban workers, often using brutal tactics and even violence to collect on their loans. Reformers argued that creating a carve-out for “small-dollar loans” would allow legitimate mainstream lenders, like banks and credit unions, to compete against illegal loan sharks while meeting the clear demand for short-term credit. In 1916, the American Bar Association helped develop model legislation for the regulation of these small loans—the Uniform Small Loan Law—which two-thirds of states ultimately adopted in some form. In those states, the annualized interest rates for small loans varied between 18 and 42 percent.

Things began to change in the 1980s, when banks, spurred by competitive pressures partly created by deregulation, stopped offering free checking accounts to customers who didn’t maintain minimum balances. To meet the needs of these largely low-income and suddenly unbanked individuals, check-cashing stores sprung up. Soon these stores started offering payday loans, too. In a typical payday loan transaction, borrowers write a postdated check that includes finance charges of $15 to $30 for every $100 borrowed, in exchange for immediate cash. The term of the loan is usually two weeks, which means these fees translate into an annualized interest rate of 400 percent or more.
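As a rough check on that figure, here is a minimal sketch of the annualization arithmetic (the fee range comes from the paragraph above; the simple, non-compounding formula and the helper name are my own assumptions about how to express it):

def simple_apr(fee, principal, term_days):
    # Annualize a flat fee on a short-term loan, without compounding.
    return (fee / principal) * (365 / term_days)

for fee in (15, 30):
    apr = simple_apr(fee, principal=100, term_days=14)
    print(f"${fee} fee per $100 for 14 days -> {apr:.0%} APR")

This prints roughly 391 percent for the $15 fee and 782 percent for the $30 fee, consistent with the “400 percent or more” figure.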

Opponents argued that these practices violated state usury laws. Lenders maintained that they were providing a vital service banks had abandoned. State legislators responded by giving payday lenders exemptions from usury laws and allowing them to charge triple-digit interest rates. (Substantial campaign contributions from payday lenders also greased the way for these decisions.) By 1999, according to the Consumer Federation of America (CFA), twenty-three states and the District of Columbia had carved out safe harbors for payday lending, while another seven states had no caps on interest rates.

Now operating on friendly terrain, the payday industry exploded, growing from just a few hundred stores in the 1990s to more than 24,000 outlets by 2007. By 2008, according to one study, payday lending outlets outnumbered all Starbucks and McDonald’s stores combined.

As the payday lending industry mushroomed, consumer advocates began to collect a growing number of horror stories from borrowers—people like Lisa Engelkins and Sandra Harris—who were caught in an endless cycle of debt. Yet the lack of real data about the industry’s practices made it difficult for reform advocates to make headway, either federally or in the states. For example, a 1998 report from the CFA noted that despite the industry’s exponential growth, “[p]ublic data on the profitability of payday lending is sketchy.” And crucially, advocates didn’t have the data to counter one of the industry’s central claims: that payday loans served as an occasional source of emergency funding to tide people over in a pinch. According to the industry, repeat borrowers like Engelkins and Harris were very much the exception, not the rule.

“[The industry would say,] ‘Look, these are very poor people, they have no access to credit, no access to emergency loans, and you’re going to leave them even worse off if you undercut us,’ ” says New Mexico state representative Javier Martínez, who also serves as the executive director of the Partnership for Community Action. “And oftentimes proponents of reform were unable to respond in a direct substantial way,” Martínez says. By the mid-2000s, the states that regulated payday lending were beginning to collect more data, which made more and better research possible. In 2005, for example, the Center for Responsible Lending (CRL) published a study showing the abnormally high concentration of payday lenders in military towns and in African American communities in North Carolina.

The industry fought back with a clever strategy: pay “independent” researchers to do counter studies using the industry’s own (uncheckable) data. The first of these reports appeared in 2005, when a researcher funded by the Consumer Credit Research Foundation (CCRF), an industry-affiliated nonprofit, published a study concluding that taking out a payday loan was less costly than bouncing a check. In 2006, the same researcher, Thomas Lehman, published a critique of the CRL’s study of North Carolina.

Other studies published during the latter half of the 2000s touted the economic benefits payday lending brought to communities and the satisfaction of its customers while at the same time reinforcing perceptions of the product as a short-term solution for Americans facing an unexpected financial crisis. One CCRF-funded study, for example, authored by Jonathan Zinman of Dartmouth College in 2008, argued that when the availability of payday loans in Oregon diminished after its regulation, consumers turned to “plausibly inferior substitutes” such as running overdrafts.

“[These claims] were pretty successful where policymakers were not on board with payday lending as a good product but thought it provided a service banks weren’t providing,” says David Rothstein, the director of resource development and public affairs at Neighborhood Housing Services of Greater Cleveland. Moreover, he says, state legislators who prided themselves on being “pro-growth” were reluctant to shut down an industry arguably circulating dollars in the economy.

Not surprisingly, the industry was not inclined to share its data with other researchers. Professor James Barth, the Lowder Eminent Scholar in Finance at Auburn University, says he hit a “dead end” when he asked payday lending companies for data several years ago. “I tried a couple of times,” Barth says. “I never got a ‘no,’ but after a couple attempts, I never heard anything.”

In fact, the industry has gone to great lengths—such as through litigation—to keep its data secret. When Alabama legalized payday lending in 2003, it set a $500 maximum on what customers could borrow at any given time. When the state’s banking department tried to create a payday loan database to keep track of these loans, the industry sued to block it. “They wanted to selectively deploy information as it suits their needs as opposed to having state-sanctioned data about their business,” says Stephen Stetson, a policy analyst at the nonprofit advocacy group Alabama Arise.

As the industry grew exponentially throughout the 1990s and into the 2000s and consumer complaints began to mount, reformers made some headway. In 2007, Congress banned payday lending to military personnel and their families. In 2008, Ohio passed legislation capping payday loan interest rates at 28 percent, and in 2009, New Hampshire capped interest rates at 36 percent. Montana and Colorado followed suit in 2010.

These were scattered victories, though; the real reform action was still to come. In 2011, the CFPB began building out its Office of Research. In January 2012, it conducted its first field hearing on payday lending, in Birmingham, Alabama, where CFPB director Richard Cordray announced the publication of a field guide for agency examiners who would be deployed across the country to study payday lending practices firsthand. In June 2012, the agency established a portal for consumers to file complaints and created a searchable database for the complaints it received.

During these initial years, the CFPB’s future was far from assured. Congressional Republicans made no secret of their hatred of the agency and repeatedly tried to weaken its independence. Senator Ted Cruz introduced legislation to kill it off. In this uncertain environment, major philanthropically supported nonprofits stepped forward to conduct their own research. In 2012, the Pew Charitable Trusts released the first of a series of reports based on extensive surveys of payday loan borrowers—surveys that, in effect, mirrored the proprietary data independent researchers couldn’t get the industry to share. The report mounted a frontal assault against the industry’s claims that payday loans are an emergency product: “Payday loans are often characterized as short-term solutions for unexpected expenses, like a car repair or emergency medical need. However, an average borrower uses eight loans lasting 18 days each, and thus has a payday loan out for five months of the year.”

In 2013, Pew followed up its initial study with another report that further undermined the industry’s argument of payday lending as a short-term fix and undercut the industry’s arguments of customer “satisfaction” by finding that many borrowers opted for payday loans out of “unrealistic expectations and by desperation.” Similar findings also emerged from a 2012 survey of payday borrowers by the Center for Financial Services Innovation, published with the support of the Ford Foundation.

In April 2013, the CFPB’s Office of Research issued its first white paper on payday lending, based on its analysis of twelve million loans from thirty states over a twelve-month period from 2011 to 2012. This report was followed by a second study in March 2014. The research conclusively shattered some of the industry’s key claims—using industry data. In particular, it demolished the industry’s denial of the “debt trap” created by its products. “[T]he core payday loan product was designed and justified as being expressly intended for short-term emergency use,” said Cordray in 2014 at the release of the CFPB’s report. “But our study today again confirms that payday loans are leading many consumers into longer-term, expensive debt burdens.” For example, the CFPB said, as many as 48 percent of payday loan borrowers had taken out ten or more loans over a twelve-month period, and more than four in five loans were being renewed within fourteen days of origination.

For proponents of reform, the results of the CFPB’s research represented both the vindication they had sought for years and the weapon they needed for victory. “It’s the fuel for how we’re going to be able to drive reform at the federal and state levels,” says advocate David Rothstein.

In one telling sign of the shift that’s taken place, the industry appears to have stopped arguing that payday loans are an emergency, lifeline product. In 2012, Dennis Shaul, CEO of the industry’s largest trade association, the Community Financial Services Association of America, had testified before the U.S. Senate that “[p]ayday loans are one option for those who need help to make it to the next paycheck.” But in February 2016, this time testifying before a House committee, Shaul changed his tune: “Payday loans, perhaps once used more often to meet emergency expenses, are now used to offset income disruptions as well,” he testified.

“The reality of the debt trap is indisputable,” says Tom Feltner, the Consumer Federation of America’s director of financial services. “The debate has changed from whether or not intervention is needed to what type of intervention is the most helpful.”

Among the benefits of the CFPB’s research-driven approach to regulation is that it provides a credible evidentiary foundation for its eventual rule making—a must if the rules are to survive the challenges the payday lending industry will undoubtedly bring, both in the courts and by their allies in Congress. The proposed regulatory framework the agency announced in March 2015 includes “debt trap prevention” requirements such as a limit on the number of loans a borrower can take out in one year and consideration of a borrower’s ability to repay before lenders can make or renew a loan. Both proposals are squarely aimed at ending the cycle of repeated borrowings that the CFPB found in its data.

More so than even campaign contributions, information imbalances give powerful interests the leverage to turn policymaking in Washington to their advantage. But in the case of payday lending, regulators were able to crack the vault—and consumers will see the benefits.

The CFPB’s research has also had important spillover effects on reform efforts in the states. “The joint federal and state jurisdiction over payday lending is going to be the tipping point for states,” Feltner says. In Alabama, for example, state legislators are currently contemplating two separate payday lending reform bills, one of which passed the state senate 28–1 in April. “We have a much different environment this year than in years past,” says Arthur Orr, the Republican state senator who was the bill’s sponsor.

Republican State Representative Danny Garrett, the sponsor of the other payday loan reform bill pending in the state, says the availability of both state and national data has been a major factor in the momentum for reform. Last year, the Alabama Supreme Court ruled against the payday lenders who sued to keep their data private, and in August 2015 the state’s payday loan database became operational.

“People are realizing that this product is preying upon a small proportion of the population and generating huge revenue at a huge profit,” Garrett says. “In years past, the industry could always talk about this wonderful service they provided to all these people. They can’t say that anymore. We have data that shows what it is.”

There are three lessons to be drawn from this story. First, lawmakers should empower more agencies of government with the authority and budgets to do what the CFPB has done. From regulating telecommunications to contracting out services, government agencies are perpetually at a disadvantage because of asymmetries of information between themselves and the private entities they are charged with overseeing. It is these information imbalances, more so than things like campaign contributions, that give powerful interests the leverage to turn policymaking in Washington to their advantage.

Second, Congress should empower itself to gather more of its own information and expand its capacity to interpret the data it receives from external sources. As this magazine has argued (Paul Glastris and Haley Sweetland Edwards, “The Big Lobotomy,” June/July/August 2014), Congress twenty-five years ago gutted, through staff and budget cuts, much of the internal research ability it once possessed. Today, members have little in the way of in-house expertise to referee competing and polarizing claims on such contentious issues as climate change, the impact of trade agreements, gun control and gun violence, and the likely effects of raising the minimum wage. This lack of research capacity is especially alarming when it comes to emerging issues where disruptions in technology and business have opened up a regulatory vacuum, such as the oversight of the “on-demand” economy. Despite the angst among many policymakers about the “Uberization” of the modern workforce, Congress currently lacks even the most basic information about this trend—how many on-demand workers are there, for example, and what are their economic circumstances?

Third, to the extent Congress can’t or won’t execute these two tasks itself, there is a potentially vital role that the non-profit and philanthropic community can play in investing in independent research and helping policymakers challenge the one-sided narrative they might hear from industry.

A foundation of the American legal system is that each side in a lawsuit must have access to the same evidence base—indeed, if one side is caught withholding such evidence, it can be grounds for the other side to have the case thrown out or overturned. Yet for some reason, we do not apply this obviously sensible rule to policymaking. Simply giving government the authority to see the same data that industry lobbyists possess would go a long way toward making Moynihan’s memorable maxim about the role of facts actually true in Washington.

The post Payday for the Public appeared first on Washington Monthly.

]]>
57717
The Most Important Agency You’ve Never Heard Of https://washingtonmonthly.com/2016/06/12/the-most-important-agency-youve-never-heard-of/ Sun, 12 Jun 2016 16:45:15 +0000 https://washingtonmonthly.com/?p=57731

The Office of Financial Research is meant to be the early-warning system for the next financial crisis. Is it doing its job?

The post The Most Important Agency You’ve Never Heard Of appeared first on Washington Monthly.

]]>

Among the many lessons learned from the 2008 financial crisis, one thing stands out: ignorance—willful or otherwise—drove the system to the brink of collapse. While banks were busily writing mortgages destined to default, there was a blithe, system-wide failure to recognize what those toxic mortgages could do to the economy. Not only were regulators asleep at the wheel, they didn’t even know the car was moving.

The Dodd-Frank Wall Street Reform and Consumer Protection Act, passed in 2010, took a number of steps aimed at righting the wrongs of the financial crisis. One was the creation of a new agency, the Office of Financial Research (OFR), tasked with ensuring that Washington would never get caught so flat-footed again. Headquartered in a nondescript office building in downtown D.C., the 225-person bureau collects data and produces reports aimed at identifying potential threats to the financial system. Although technically part of the Treasury Department, the body is by law independent. Its budget, $99 million in fiscal year 2016, is funded through fees paid by the country’s largest banks. To help the OFR carry out its mission, Congress granted it sweeping powers, including the right to collect data from banking regulators and to demand data from financial institutions, either voluntarily or with a subpoena.

Yet given its vital role, the OFR is a body that is surprisingly—perhaps worryingly—low-key. Its director, Richard Berner, hasn’t testified before Congress since 2014. Its numerous reports are highly technical, specialized, and rarely grab headlines. And given its vast powers, it’s unclear why it hasn’t collected more data from financial institutions or exercised its subpoena power to do so. Nearly six years after its creation, the OFR has yet to establish the credibility and influence its champions had hoped for, leading some to worry whether it will be able to fulfill its mission of averting the next financial crisis. “OFR was created to be the beating heart of the financial circulatory system for the U.S. government,” says Dennis Kelleher, president and chief executive of Better Markets, an advocacy group. “I don’t think anybody would claim that it’s lived up to that goal.”

The idea for the OFR took root in early 2009 during the fallout from the financial crisis. Regulators struggled to make decisions during the height of the meltdown in part because they lacked real-time information about which banks were connected through financial relationships, and how. The crisis also exposed how little federal regulators knew about the world of “shadow banking,” such as the vast market for credit default swaps, collateralized debt obligations, and other complex securities that played a role in the meltdown. Regulators had no idea how much money was at risk in these activities, let alone who was involved or what the results would be if there were massive defaults.

To resolve this problem, reformers called for the creation of a new agency, initially called the National Institute of Finance, that would help analyze system-wide information about financial transactions and positions. The agency would build two reference databases so the regulatory community would be able to collect and assess data from financial firms and their subsidiaries, at the level of individual financial instruments or contracts. By creating a data standard for this, regulators would be able to drill down on the markets—to look, for example, at individual loans and credit default swaps and assess the risks. This would give regulators better insight about the system’s health in a way that much of the current, backwards-looking accounting data is unable to do. The agency’s rule-making authority would be confined to writing data standards so it could focus on rooting out problems in the financial system rather than having to act on those problems, thereby avoiding at least some of the lobbying and political pressures that regulators face. Its independence would allow it to honestly assess new threats. “Our view was that it would be like a biblical prophet—speaking truth to power,” says Allan Mendelowitz, a former housing finance regulator who, along with a group of like-minded wonks calling themselves the Committee to Establish the National Institute of Finance, lobbied for the agency from the early days.

Although critics on all sides disliked the idea of adding another standalone banking agency at a time when many wanted to consolidate oversight, the strong support of Senator Jack Reed, a top Democrat on the Banking Committee, helped the concept survive the legislative process. In its final incarnation as the OFR, the agency was placed inside Treasury and given as one of its tasks the job of providing research to the also newly created Financial Stability Oversight Council (FSOC)—a “Jedi council” of sorts, chaired by the Treasury secretary, made up of the heads of the other financial regulators, and charged with keeping tabs on the health of the financial system.

In January 2013, Richard Berner, the agency’s first official director, was confirmed by a Senate voice vote with little fanfare, after being hired in April 2011 to help get the agency up and running. A macroeconomist who previously served on the research staff at the Federal Reserve, Berner was also a top economist at Morgan Stanley and Mellon Bank.

The agency has been mostly invisible since.

What’s baffling is why. While the creation of the agency garnered some pushback, it never came under the same sort of sustained attack as the Consumer Financial Protection Bureau, the highly controversial conservative punching bag. Lobbyists and critics have occasionally warned that the OFR could become the “CIA of financial regulators,” but opposition overall has been muted. House Republicans have threatened to put the agency under congressional appropriations and increase its transparency by making its reports subject to public comment, but these aims are low on the GOP’s wish list.

The agency’s low profile could also stem from its subordinate position within Treasury. Although the OFR is technically independent, an agency spokesman confirmed that the director of the office reports to Treasury’s undersecretary for domestic finance. That seat is currently vacant, so Berner often consults with the special counselor to the Treasury, Antonio Weiss, who is filling the slot. Given the hybrid setup, the decision to assert independence rests in some ways with the leadership. Critics point to the Office of the Comptroller of the Currency, one of the bank regulators, as a model for the agency to follow: it too is housed in Treasury, but there’s no question that it’s autonomous when it comes to supervision and rule making.

The OFR also took an early hit to its credibility when it released a report reviewing the systemic risks of the asset management industry. The study was widely panned by lawmakers, regulators, industry officials, and advocacy groups, with some critics arguing that it mischaracterized the way asset managers worked. “It just seems to me a listing of possible horror stories with no indications that there was any significant likelihood of any of it happening,” former Representative Barney Frank said about the report in 2013. “The office has to raise its game,” said Reed, after the study was released.

Berner, for his part, has repeatedly said that the agency stands by the report, and the OFR has noted that other governments across the world are adopting an oversight approach along the lines of what the agency suggested. The director also defends what the OFR has achieved. “I do think we’ve made contributions on both the data and analysis fronts, and the evidence for that is that others cite our work and Congress looks to our work to inform their decisions,” he says.

Behind the scenes, OFR officials have led the effort to create a global reference database with more than 435,000 bar codes, called “legal entity identifiers,” to track financial firms and their subsidiaries. At the height of the financial crisis, banks and regulators were unable to fully assess counterparty risk to a Lehman Brothers failure, because no database of exposures existed. The hope is that having this system in place and identifying exactly how the entities connect to one another will better clarify the structure of markets, particularly if use of the bar codes is required by regulators.

At the same time, the OFR has started two pilot projects to collect data, on a voluntary basis, from financial firms about two largely hidden markets: short-term loans between financial institutions, and securities lending transactions. It has also published a series of annual reports assessing a range of threats to the system and issued several significant studies examining the risks of the country’s largest banks, one of which was held up by the Republican chairman of the Senate Banking Committee in support of a bill to raise a Dodd-Frank threshold for big banks.

But for critics, this activity, nearly six years after the passage of Dodd-Frank, is far too little, too late. While the OFR has had piecemeal responses to some of the high-profile debates in banking—over whether there’s enough liquidity in the markets or the role of high-frequency trading—critics charge that it has yet to influence those discussions through the collection of fresh data or deep analysis. It’s not yet a go-to authority on the matters of the day. “Many people hoped OFR would operate like a sentinel providing urgent warnings of potential crises, but it has not followed a practice of saying ‘danger’ or ‘warning,’ ” says Art Wilmarth, a law professor at George Washington University. “Instead, the tone of many OFR reports has been relatively neutral, bland and low-key.”

In addition, the agency is only in the early stages of its work to create a reference database of financial instruments that would give regulators more data about the workings of the market. (Mendelowitz, one of the bureau’s early advocates, launched his own project, a nonprofit called ACTUS, that’s creating an open-source data dictionary and set of algorithms to help fill the void.) People close to the agency also suggest that it’s still having trouble collaborating with some of the banking regulators, though it’s required by law to check with the other agencies about filling data gaps before going directly to financial institutions. “Getting to a place where there’s good information sharing across agencies is critical,” says Michael Barr, a professor at the University of Michigan and a former Treasury official who helped write Dodd-Frank. “In Washington, there are always turf fights among agencies because access to information is a kind of power, and I think that’s an area where you’re going to need to see the OFR assert itself more strongly going forward.”


A GAO report from February, for example, found that the OFR and the Fed aren’t working closely enough together and could be duplicating efforts when it comes to monitoring risk. Berner says that the agency is working to “find the right balance between access and confidentiality” in terms of sharing intel.

One of the agency’s toughest challenges is juggling the many demands that have been placed on it. It must collect and standardize massive amounts of new and complex data distributed across the system and help support research for the financial stability council—a tall order in itself, and one that’s bound to take years to complete, particularly for a relatively new agency.

Moreover, some of the OFR’s toughest critics have lofty aims for the agency. In addition to supporting the FSOC, it’s supposed to serve as a counterweight to the very regulators that head that council. Reformers have likened this role to a “storm-warning system” that can sound an early alarm in case of financial danger—and, in this case, it’s armed to do so even when there might be political pressure to stay quiet. But the very nature of financial crises is that most people don’t see them coming. Can one small agency really be charged with ensuring that the next one doesn’t hit? As with the country’s other intelligence agencies, there’s often little credit awarded for crises averted.

“It’s not like we can say, well, we did this, and can immediately see the result. Financial resilience doesn’t quite work that way,” says Berner. “It’s not easy to measure, it’s not easy to calibrate. Unfortunately, the answer’s going to come over time when we see how the system responds to shocks.” Nevertheless, says advocate Dennis Kelleher, the OFR has “a very long way to go living up to its mission and the requirements in the law.” Congress gave the OFR the teeth of a watchdog. It should use them.

The post The Most Important Agency You’ve Never Heard Of appeared first on Washington Monthly.

]]>
57731
Tilting at Windmills https://washingtonmonthly.com/2016/06/12/a-trump-by-any-other-name-mitt-romney-an-overdue-appreciation-confessions-of-a-hopeless-book-hoarder/ Sun, 12 Jun 2016 16:32:08 +0000 https://washingtonmonthly.com/?p=57747 A Trump by any other name ... Mitt Romney: An overdue appreciation ... Confessions of a hopeless book hoarder

The post Tilting at Windmills appeared first on Washington Monthly.

]]>
No conventional wisdom

Charlie Peters (the founder of the Monthly and the greatest living windmill tilter) portrayed it as the most important and dramatic political convention of the past century. His 2005 book, Five Days in Philadelphia, vividly recreated the 1940 Republican convention, which nominated Wendell Willkie on the sixth ballot over the isolationist Robert Taft.

Just two weeks before the Philadelphia convention was gaveled to order, the victorious Nazis had paraded through the Arc de Triomphe on what was the saddest day of World War II. With the British Empire standing alone, Willkie was the only GOP candidate willing to buck the tides of isolationism. As Charlie Peters put it, “If Taft had been nominated he would have vigorously opposed the efforts FDR made to aid Britain and to enact a military draft in this country.”

Willkie, a former utilities executive mocked as “the barefoot boy from Wall Street,” didn’t run in any primaries. Less than two months before the convention, Willkie was running at 3 percent in the polls. But in those days, voters were willing to defer to the judgment of party leaders as to who would be the strongest candidate in November.

Willkie and soon Donald Trump are the only presidential nominees in a century who have neither won World War II in Europe (Dwight Eisenhower) nor held political office (everyone else). Although no media conspiracy rivals the free ride that Trump has been granted by ratings-obsessed TV networks, Willkie’s candidacy was boosted by gushy over-coverage from Henry Luce’s empire. The May 13, 1940, issue of Life gave Willkie an unprecedented eleven-page spread that ended with this bit of puffery: “In the opinion of most of the nation’s political cognoscenti Wendell Lewis Willkie is by far the ablest man the Republicans could nominate for President.”


I recalled the Willkie convention in early May as the remnants of the Republican establishment collapsed faster than the French army in 1940. Instead of carrying the battle against Donald Trump to the convention floor, the GOP hoisted the white flag nearly eleven weeks before the delegates will assemble in Cleveland.

Even if Trump had swept the last primaries, rational Republicans still would have had weapons at their disposal in Cleveland. Since the GOP delegates are free agents on all votes except the presidential balloting, the anti-Trump forces could have mounted credential challenges and even tried to pass a rule to allow the delegates to exercise their best judgment on a nominee.

That was the theory, anyway.

In reality, the exit polls consistently demonstrated that Republican voters refuse to accept the legitimacy of a political convention as a decisionmaking body. In Indiana, the final contested primary before the deluge, 67 percent of Republican voters stated that the delegates should nominate the winner of the most primaries if no candidate has a majority. Even in the Wisconsin primary, which represented the high-water mark of the Ted Cruz campaign, 55 percent of GOP voters believed that Republican delegates in Cleveland should rubber-stamp the frontrunner in the primaries as the nominee rather than selecting who they feel is the best candidate.

Four years ago, there was a stirring Broadway revival of The Best Man, a 1960 Gore Vidal play about an Adlai Stevenson figure wrestling with his conscience at a political convention. If it were updated now, it would abruptly end during the primaries and be titled The Worst Man.

The modern primary system was created from the tear-gassed wreckage of the 1968 Democratic convention, which was so undemocratic that 25 percent of the delegates were selected in 1967, long before Eugene McCarthy challenged LBJ. But few party reformers ever envisioned that the primaries would someday prove to be a vehicle for a hostile takeover of a political party by a demagogue bristling with contempt for democratic norms and any coherent ideology.

In 1940, Republican delegates chanted, “We Want Willkie!” and rejected America First isolationism. In 2016, the Republican voters picked a nominee who proudly embraces the America First label—and cowed party leaders like RNC Chairman Reince Priebus and Senate Majority Leader Mitch McConnell refused to object.

A Trump by any other name

As I wrote the prior item, I realized that I have succumbed to that dread malady known as Anti-Trump Adjective Fatigue.

It is still spring and I have already run out of new descriptions for the ignorant hatemonger who will be bathed in confetti at the Cleveland convention. In my guise as a Roll Call columnist, I have called Trump “the bilious billionaire” in every article. At times, I have gone further, depicting him, for example, as “a potty-mouthed, pathological liar whose ignorance is only exceeded by his arrogance.”

Journalistically, I stand by all these descriptions. I do worry that piling on the adjectives eventually will seem labored and false. But treating Trump as a normal presidential candidate (“Hopscotching across the Midwest, the Republican nominee lashed out at his Democratic rival . . .”) masks the horrifying reality of the . . . err . . . bilious billionaire.

What particularly galls me is the mock familiarity that accompanies references to Trump as “the Donald.” That was how Ivana Trump, the now-discarded trophy wife of the bankruptcy-prone real estate investor, referred to her husband in a 1989 interview with Spy magazine. It was one thing for the New York tabloids to overdose on “the Donald” jokes when he was a buffoonish self-promoter. But now that Trump is poised to be the nominee of a once-but-no-longer grand old political party, the joke’s on us with each clichéd “the Donald” reference.

Tortured history

Trump, like many Republicans from the Dick Cheney School of Incompetent Toughness, has a fondness for interrogation techniques that come out of the proud tradition of the rack and auto-da-fé. Asked about waterboarding before the South Carolina primary, Trump promised not only to restore it to a place of honor but also to promote techniques that are “so much worse.” As Trump put it, “Don’t tell me it doesn’t work—torture works.”

Or does it?

In mid-April, the New York Times ran an inspiring obituary of one of those half-forgotten, amazing World War II heroes. Frederick Mayer, who died at ninety-four, was a German Jewish refugee who enlisted in the U.S. Army in 1941. Dropped behind enemy lines in Austria in February 1945, Mayer posed as a German soldier for two months in order to radio Nazi troop movements to the Allies.

But, finally, Mayer was captured, in the waning weeks of the war. As Eric Lichtblau wrote in the Times obit, “His German captors tortured him for days, waterboarding and pistol-whipping him repeatedly to try to get him to reveal the locations of his American colleagues. He would not talk.”

If Trump read the stories of World War II heroes—or read anything else, for that matter—he might learn that the Nazis enthusiastically employed waterboarding, and that torture didn’t work.

Mitt Romney: An overdue appreciation

Liberals, like all political partisans, fall into the trap of automatically turning their electoral adversaries into one-dimensional cartoons.

Against the backdrop of the 2012 campaign, Democrats portrayed Mitt Romney as an out-of-touch plutocrat who only cared about the “1 percent” while demonizing the “47 percent.” Having flip-flopped on abortion and abandoned his signature Massachusetts health care law, the GOP nominee was also mocked as a political changeling with few core values.

But what liberals missed in 2012—and many still fail to recognize—is that Romney is an honorable public figure who happens to hold differing views on the economy and foreign policy. And just as it was commendable that his father, Michigan Governor George Romney, displayed political moxie in standing up to Barry Goldwater in 1964, so too does Mitt deserve plaudits for leading the reasonable Republican resistance to a Trump takeover.


In an early March speech, Romney said bluntly, “Donald Trump is a phony, a fraud. His promises are as worthless as a degree from Trump University.” After Cruz and John Kasich abandoned the primary fight against Trump, Romney was among the first Republicans to announce that he could not ever support the nominee.

Of course, sometimes partisan caricatures are dead-on. Dick Cheney, the architect of victory in Iraq, should be the kind of Republican appalled at Trump’s know-nothing isolationism—especially since Trump constantly ridicules the Cheney-orchestrated folly of the invasion of Iraq. But principled conviction is not a weapon in Cheney’s arsenal. In the latest example of his Wrong-Way Corrigan judgment, the former vice president went out of his way to tell CNN that he would be supporting Trump as the GOP nominee.

What are we so afraid of?

Ever since I stumbled on a March 1992 Time cover story on the death of the American Dream and “The Angry Voter,” I have been skeptical about glib linkages between economic anxiety and the trumpery of the primaries.

The loss of blue-collar jobs fails to explain why, according to exit polls, Trump won 60 percent of the votes of Indiana Republicans earning more than $100,000 a year. This was not an aberration. In Pennsylvania, Trump won 53 percent of the votes of upper-income Republicans. He even beat Cruz and Kasich among GOP voters with post-graduate educations.

When Obama took office after the greatest economic collapse since the Great Depression, no one would have guessed that unemployment would fall to as low as 5 percent in 2016 and that roughly two-thirds of the voters would still believe that the nation is “on the wrong track.” What no one has satisfactorily explained about the current glum national mood is the Fear Factor in the non-economic realm.

An early March Gallup poll, taken as Trump was galloping through the primaries, found that “Concern About Crime Climbs to a 15-Year High.” Even though the crime rate has been flat in recent years after a dramatic decline, 53 percent of Americans worry “a great deal . . . about crime and violence.” Just two years ago, only 39 percent of a national sample expressed similar worries. Seventy percent of those with a high school education or less are deeply fearful of crime, compared to 50 percent in 2014.

Since the days of Richard Nixon peddling “law and order,” fear of crime has often been a proxy for racial issues. So it seems likely that the rioting in Ferguson and Baltimore over police violence provides part of the explanation. But only part. The change in the polls is too dramatic (crime fears haven’t been this widespread since 2001) to be caused by memories of television footage from a year ago.

It also strains credulity to believe that these crime fears represent a racist backlash against an African American president. Four years ago, when Barack Obama was running for reelection, only 42 percent of adults worried a “great deal” about crime.

Equally baffling to me is why fears of a terrorist attack (48 percent of adults told Gallup they worry a “great deal”) are now as high as they were in 2002, a year after 9/11. The relevant Gallup poll was taken just before the latest wave of terrorist violence in Brussels, but clearly the Paris attacks have left Americans shaken, as have the killings in San Bernardino.

Okay, I confess to being an eastern elitist who lives in Manhattan and never thinks about the terrorist threat. So maybe I’m ill-equipped to understand why residents of, say, Enid, Oklahoma, are displaying a level of panic not seen since the days when everyone assumed that Osama bin Laden was about to launch a new assault on America.

While I cannot fully grasp its causes, the Great Fear of 2016 has to be a major reason why the Republican Party has given way with Trump to the authoritarian temptation that always lurks beneath the surface in a democracy.

Confessions of a hopeless book hoarder

Like many writers, I have an out-of-control book collection that legendary hoarders like the Collyer brothers might envy. Why have I saved from my days as a Monthly editor a Brookings volume titled Setting National Priorities: The 1973 Budget? Or held on to The Good Food Guide 1989, a memento from a trip to London when Margaret Thatcher was still prime minister?

The only book I ever deliberately threw out was a first edition of The Art of the Deal. A few weeks after I gleefully disposed of it in 1999, I found myself writing a newspaper column laughing at the wild rumor that Trump might run for president. In quest of an embarrassing Trump quote, I went to the bookshelf and reached for . . . gulp . . . the empty space between a biography of Tocqueville and a literary study of Mark Twain.

So if I ever find myself buried under a mountain of earnest policy tomes from the 1970s and clumsily written police procedurals, I will squander my last breaths cursing Donald Trump.

Every veep pick defies prediction

Every time I have the temerity to believe that I can guess Hillary Clinton’s VP pick, I recall the 1988 GOP convention in New Orleans, which I covered for Time. That was the moment when I absorbed the enduring journalistic truth: If you get back to your hotel room tipsy at 4 a.m., you probably will sleep through an 8 a.m. breakfast.

What did it matter, I figured, since the Time breakfast was with a backbench senator who never would be George Bush’s running mate. Later that week, when I was assigned the Dan Quayle cover story, I had to grovel and beg to hear a tape recording of the breakfast that I had so smartly skipped.


Beginning with Quayle, most nominees for the heartbeat-away job seemed like outlandish or unlikely choices just a week before their unveiling: Al Gore (1992) was too similar to Bill Clinton and from an adjoining state to Arkansas. Dick Cheney (2000) headed George W. Bush’s veep search committee. Joe Lieberman (2000) was a hawkish, Jewish critic of Bill Clinton’s personal conduct. Joe Biden (2008) got exactly 1 percent support in the Iowa caucuses. And Sarah Palin (2008)—do I really have to explain?

But we live in an era when there are just three words that you can’t say on cable TV news shows: “I don’t know.” So brace yourself for weeks of pseudo certainty before Hillary Clinton announces that her running mate will be . . . I haven’t the faintest idea.

The post Tilting at Windmills appeared first on Washington Monthly.

]]>
57747
Mental Illness and Addiction Don’t Respect Party Boundaries https://washingtonmonthly.com/2016/06/12/introduction-mental-illness-and-addiction-dont-respect-party-boundaries/ Sun, 12 Jun 2016 15:50:02 +0000 https://washingtonmonthly.com/?p=57733 Mental health policy cover

An introduction to our special report, The Politics of Mental Health and Addiction.

The post Mental Illness and Addiction Don’t Respect Party Boundaries appeared first on Washington Monthly.

]]>
Mental health policy cover

On January 3, 2015, my brilliant, funny, sweet, and immensely talented thirty-four-year-old son Matthew died in Delaware, an accident caused by inadvertent carbon monoxide poisoning. Sixteen months later, the pain is even greater than it was when we found out. While an accident, Matthew’s death was shaped by a lack of judgment itself driven by a ten-year struggle with serious mental illness. In the midst of a successful career in Hollywood, he had a psychotic episode at twenty-four that brought his vibrant life to a grinding halt. Most likely, Matthew suffered from bipolar disorder. There was never a definitive diagnosis, which is not uncommon, but in his case it did not matter. Different diagnoses can lead to different drug combinations or therapies, but a core part of Matthew’s illness was anosognosia—an inability to recognize that he suffered from a mental illness, and an unwillingness to accept any treatment. For ten years, we struggled with him and a system that made it impossible to intervene or help; of course, our frustration and pain paled next to the pain he felt and the stigma he suffered despite the fact that he was never a danger to anyone.

In the United States, if an individual is over eighteen, both federal and state laws in most cases give the individual enormous autonomy. Parents and other loved ones, not to mention most medical professionals, are unable to learn about their conditions or to influence treatment in any way. The autonomy flows mostly from an understandable concern about civil liberties, but for those with deep-seated psychoses and/or with anosognosia, the result is not freedom but more often tragedy, from homelessness to bullying to arrest and worse.


For loved ones of those with serious mental illnesses, sometimes the only realistic hope of getting treatment for their conditions is to have them arrested—and have a judge who has both the sensitivity and power to provide an alternative to prison or jail, including assisted outpatient treatment (AOT). That is what happened to the journalist Pete Earley, who recounted, in his 2006 book Crazy, the happier ending to the journey he had with his own bipolar son.

One judge who is making a dramatic difference is Miami-Dade’s Steve Leifman, who has transformed the way the county deals with mentally ill patients who come through the criminal justice system by developing partnerships with police and 911 responders to get them crisis intervention training (CIT). The judges in his court have separate mental health court hearings and provide an alternative to jail, while mental health, social work, and county officials provide wraparound services, including housing, therapy, medications, and counseling, along with job training, for people with mental illnesses. He has had remarkable success, transforming lives and reducing imprisonment and recidivism, even enabling the county to close a jail and save taxpayers $12 million.

But Leifman’s heroic efforts remain far more the exception than the rule. Many with mental illnesses who come in contact with police—most of whom do not have CIT—end up tased or shot because they do not respond to commands the way others do. And for those in jails, often for petty theft, loitering, or small-time drug offenses (a large number of those with mental illnesses have dual diagnoses, including substance abuse problems), the outcomes can be simply horrific. Last April, as the Washington Post reported, Jamycheal Mitchell, who suffered from schizophrenia and bipolar disorder, was arrested for stealing $5 worth of snacks. A judge ordered him sent for treatment to a state hospital until he was well enough to stand trial. Instead, because of bureaucratic malfeasance and incompetence, he languished for months in a jail, where he got no treatment and no attention and died of heart problems related to extreme weight loss.

Abuses in prison go beyond neglect. Eyal Press’s stunning exposé in the New Yorker showed mentally ill prisoners beaten, tortured, starved, and killed in Florida and New York, with abuses covered up. Even when officials vow to fix things, underfunding and privatization interfere. Fortunately, the policy dilemmas and the major problems associated with mental illness and substance abuse are now on the radar screen of local, state, and federal officials. A few weeks ago in Washington, D.C., the American Psychiatric Association Foundation, the National Association of Counties, and the Council of State Governments sponsored a conference inspired by Judge Leifman, called “The Stepping Up Initiative,” bringing together representatives from fifty counties around the country to share best practices to deal with the burgeoning cost and pain of the mentally ill caught in county jails.


In Congress, we are seeing bipartisan activity on several fronts. In the Senate, the Comprehensive Justice and Mental Health Act, cosponsored by Minnesota Democrat Al Franken and Texas Republican John Cornyn, passed on a voice vote last December after being subject to holds by Republican senators for an extended period. It has bipartisan support in the House, and should make it across the finish line this year. The act would expand mental health courts and veterans’ courts, and vastly increase CIT for police, school officials, and others who come into contact with those suffering from mental illnesses who have a crisis or confrontation, in order to avoid violence and tragedy.

Michigan Democrat Debbie Stabenow and Missouri Republican Roy Blunt were able to pass a bill in the Senate to expand funding for community mental health centers, the first step in a more comprehensive approach. And in both the House and the Senate, bipartisan mental health policy reform bills are inching forward: Tim Murphy, a conservative Republican representative from Pennsylvania as well as a psychologist, is joining with the former psychiatric nurse and Texas Democrat Eddie Bernice Johnson as chief sponsors in the House; Connecticut Democrat Chris Murphy and Louisiana Republican Bill Cassidy are sponsors in the Senate.

The House bill would provide incentives for AOT, more beds for patients with mental illnesses, and more flexibility in the HIPAA law to enable loved ones to get information about their mentally ill relatives, and would reform a dysfunctional federal government system by reorganizing the Substance Abuse and Mental Health Services Administration to make it more effective and responsive to the problems of serious mental illness. The Senate bill is significantly weaker, but still moves in the same direction.

Some months after my son died, I wrote an op-ed in the New York Times about our family’s journey, making a strong plea for the Murphy-Johnson legislation. I was flooded with responses, including from many who themselves suffer from mental illness, many more parents and siblings whose journey was similar to ours (although not always with its horrible ending), and many more yet who had experienced the suicide of fathers, mothers, brothers, and sisters after their struggles with mental illness. For a large number, it was the first time they had spoken or written to anyone about their experiences; some wrote that they had felt alone in their trauma. It has become clear to me that there is scarcely a family in America that has not been touched by these problems and issues. But it is also clear that there has been limited discourse on their experiences, on the policy and medical dilemmas we face, on what paths we need to follow, and on what works and what doesn’t.

There are essays in this special section on the unique problems in rural America, on stigma and interactions with police, and on the treatments available for addiction and mental illness and what seems to work and what doesn’t. And there are articles on the approach of the presidential candidates who have directly addressed these issues in their campaigns, especially John Kasich and Hillary Clinton. Kasich’s case is interesting in part because it reflects a reality of political life—lawmakers with passion about these difficult problems are often those who have been profoundly affected in their own families. That was true of Senators Pete Domenici and Paul Wellstone when they championed mental health parity in health insurance coverage (its spotty enforcement is the subject of another essay here).

At a time when few things in Congress are bipartisan, it is encouraging that this area is different. But that sunny reality has some dark clouds on the horizon. The Murphy-Johnson bill faces opposition from both Democrats and Republicans—with many Democrats resistant to anything that impinges on the civil liberties of the mentally ill, even if they are deeply psychotic or don’t recognize their illnesses, and many Republicans resistant to spending any money through the federal government, despite evidence that the money spent on effective treatment, including wraparound services and providing beds, along with alternative treatments to imprisonment for those caught up in the criminal justice system, can actually save money as it saves lives and heartache.

While the Franken-Cornyn bill has already moved through the Senate and has strong support in the House (in part because it is an authorization, not yet an appropriation), this Congress is en route to becoming the most unproductive in modern times and cannot be relied upon to act expeditiously on even consensus bills in an area that has deep needs at all levels. But given the broad support, you can expect similar legislation in the next Congress with a new president at the helm. And perhaps this special section of the Washington Monthly can raise enough consciousness and provide enough grist to create more public demand for action and move us at least a baby step closer to progress.

The post Mental Illness and Addiction Don’t Respect Party Boundaries appeared first on Washington Monthly.

]]>
57733
Hillary Clinton’s Work for Mental Health Parity https://washingtonmonthly.com/2016/06/12/hillary-clintons-work/ Sun, 12 Jun 2016 15:48:07 +0000 https://washingtonmonthly.com/?p=57734 Hillary Clinton

It's a fight she's fought for decades — and may be able to finish from the Oval Office.

The post Hillary Clinton’s Work for Mental Health Parity appeared first on Washington Monthly.

]]>
Hillary Clinton

As Hillary Clinton campaigns for the presidency, she frequently invokes her well-known role in crafting her husband’s ill-fated 1993 health care plan, to demonstrate to progressives who remain uncertain about her ideological instincts that she has faithfully advocated for universal health care for more than two decades.

But it should also be remembered that as part of that effort, Clinton also pushed for broad reforms to how our nation treats—or mistreats—people with mental illness. It’s a cause she has championed for just as long.

Early on, as the leader of the Clinton administration’s health care task force, the first lady enlisted Tipper Gore, wife of Vice President Al Gore, to serve as its mental health adviser. Tipper, who had a master’s degree in psychology and had long been involved in mental health advocacy, due in part to her own bouts of depression, recommended a policy of “parity”—that is, that the government should require insurance plans to offer coverage on an equal basis to both physical and mental illness. Gore articulated the problem in simple terms: “Why should a woman with diabetes who needs insulin have it covered by insurance, whereas a woman with manic-depressive illness who needs lithium not be covered in the same way, when both diseases can be managed and controlled?”

At the time, Hillary Clinton agreed. “It’s a problem that permeates the whole system,” she said. “We have to do something. I don’t think there is a choice anymore.” Mental health practitioners and advocates were ecstatic. As Congressional Quarterly put it, “For the first time in history, they see a chance that mental illness will get the same insurance coverage as physical illnesses.”

As we all know, this entire effort came crashing down, due to industry and congressional opposition, as well as Hillary Clinton’s own miscalculations. The chastened first lady retreated from her push for health care reform—mental health reform included.

Or, at least, so it seemed. The full story is that, starting in the mid-1990s and continuing for the next two decades, Clinton kept up the fight, both in public and behind the scenes, for parity. And it paid off with a series of small-bore advances that—while there’s still a long way to go—have added up. That narrative sheds light on the continuing challenges presented by the parity issue, and usefully illustrates Clinton’s broader public philosophy—that is, that incremental reform is worth fighting for, and can produce real change over time.

Only two years after the defeat of the health care bill, President Bill Clinton signed, with Hillary's backing and advocacy, the Mental Health Parity Act of 1996. That law was extremely limited in scope: it required insurers to apply the same lifetime and annual dollar limits to mental health coverage as to medical and surgical benefits, but they found ways around it by restricting the number of hospital days and outpatient visits for mental health services. As liberal Senator Paul Wellstone, the bill's cosponsor, put it at the time, "We didn't even get half a loaf. We just got crumbs. But it's a start." And indeed it was; the law raised the profile of the parity issue and prompted states to experiment with parity laws of their own.

Three years later, Hillary and Tipper helped organize the first-ever White House Conference on Mental Health, which brought national attention to the nation’s neglect of the mentally ill—and to the cause of parity. “We must do whatever it takes not only to remove the stigma from mental illness, but to begin treating mental illness as the illness it is on a parity with other illnesses,” Hillary declared. At the same conference, the president announced an executive order providing mental health parity for 8.5 million federal employees, retirees, and their dependents covered by the federal government’s employee health benefits program. That coverage continues to this day.

By 2000, Hillary Clinton was running for the Senate from New York, and again making the case for parity to voters. “The mind is an organ just like the heart or the liver,” she told one woman on the campaign trail, “and I would like to advocate and work towards parity for coverage for mental illness.” Throughout her first term as senator, Clinton pushed for various mental health care reforms that would have impacted the treatment of mentally ill juveniles in the justice system.

But it wasn't until 2008 that she played a role in another, more significant advance on parity. That year, she cosponsored the Mental Health Parity and Addiction Equity Act, which went much further than its 1996 predecessor: employers with more than fifty employees whose insurance plans covered mental health treatment now had to provide those benefits on equal terms with medical and surgical benefits. While the law—which was signed by George W. Bush as part of the big bank bailout package—did not require mental health coverage and did not apply to the individual insurance market, it did lead most insurance companies to eliminate separate co-pays and reduce unequal limits on outpatient visits and inpatient stays.

That same year, Clinton ran for president behind a health care plan that featured mental health parity, along with coverage for substance abuse treatment. Though she lost the primary fight to Barack Obama, health reform finally became a reality in 2010, with the passage of the Affordable Care Act. That law brought still more parity reform: it mandated mental health coverage as part of required “essential benefits” packages for some small group plans and on the individual market.

Despite that progress, however, there's still a long way to go. Insurers have found new ways to get around the parity mandates in the 2008 law and the ACA—by denying claims, for instance, on the grounds that they are not "medically necessary." Federal and state enforcement has been lax. And, as the National Alliance on Mental Illness (NAMI) points out, Medicare and some Medicaid plans are not subject to the 2008 parity law, with the result that many Medicare and Medicaid beneficiaries find their mental health coverage lacking compared to their physical health coverage.

“Basically what we have now, thanks to all this incremental progress, is that the vast majority of Americans are now covered by federal parity law,” says Timothy Clement, the policy director of the advocacy organization Parity Track. “But we still have huge problems. The law is not adequately enforced, and very few people are aware that there even is a law. The result is that prevalence of mental illness is high and treatment seeking is low.”

Clinton has continued to talk about these and other problems on the stump during the current election cycle. And while she has not yet rolled out a comprehensive plan for overhauling the way our nation treats those with mental illness, campaign aides say she will soon. Her plan will include expanded access to mental health care, a greater emphasis on early intervention (treating mental illness in its early stages to head off more serious episodes later) and suicide prevention, and more investment in treating low-level offenders with mental illness rather than throwing them in jail.

But at the core of her plan, her aides say, will be a broad effort—once again—to tackle that same problem she discussed so long ago: the need for parity in our treatment of people with mental illness. Her proposals, one aide tells me, will be anchored by the “basic belief” that “mental health is a part of a person’s general health, and mental illness should be treated no differently from other medical conditions.”

It’s unclear whether Clinton’s proposals will end up being as ambitious as her campaign indicates. But advocates are cautiously optimistic that the Oval Office could soon be inhabited by someone who has demonstrated—for decades—an understanding of the need for a fundamental change in the way our society views mental health, one that treats it as fully equal to physical health. As Angela Kimball, NAMI’s director of advocacy and public policy, puts it, “We need a paradigm shift.”

If Clinton does become president, she may be in a position to finally get parity done and complete that paradigm shift. Or, at least, to get us a whole lot closer to completing it than ever before. The current Democratic primaries have been framed as a choice between a candidate with a bold vision (Bernie Sanders) and one promising only incremental reforms (Hillary Clinton). But in the case of mental health, at least, Clinton has revealed that she harbors a vision that is quite bold indeed. Tempered by experience, she has also demonstrated the value of advancing incrementally toward it, one hard-fought step at a time.

John Kasich’s Work https://washingtonmonthly.com/2016/06/12/john-kasichs-work/ Sun, 12 Jun 2016 15:46:33 +0000 https://washingtonmonthly.com/?p=57735 For the governor, it’s personal.

At a John Kasich town hall meeting in Watertown, New York, in April, a questioner from the audience was having a hard time formulating his question. He was able to get out that he has autism and is college educated, with two master's degrees, but said he has a difficult time finding employment. Employers, it seems, are often put off by his poor social skills and worry about how well he would fit in with other employees. Because of problems like this, the young man told Kasich, adults with autism have very high unemployment rates.

Kasich moved closer and leaned in, clearly both agitated and consumed by the man's problems. The 600 people in the room could see that Kasich knew what this guy was talking about—not in terms of the specific symptoms of autism, maybe, but in a personal and emotional way. Because if there is one thing John Kasich understands, it is not being understood. Everyone agrees he is bright and likeable, but not everyone agrees that he projects what most of us would consider normalcy.

He has been described as flaky, mean, cheerful, ornery, sullen, distant, and enthusiastic; as someone who is honest but not forceful about his spirituality, who often spouts crazy-uncle idioms, who sometimes lectures instead of discussing, and who is always impressed with his own jokes. In a 1995 Washington Post article, the then Ohio congressman was described by those he worked with as someone who would need both Ritalin and Valium to keep his own balance while balancing the federal budget. The writer also observed that Kasich's eyes blinked thirty-six times in a minute, compared to just two blinks in the same period by "the more inert" Texas Congressman Dick Armey. "[Kasich] radiates so much energy that colleagues in the Ohio delegation, weary and looking for sleep, dread the thought of getting seated near him on flights back to the Midwest," the article said. The Post story also declared Kasich to be a "wiry and fidgety politician."

But on this day in Watertown, his eyes seemed to blink slower as he settled the young man down and addressed disability and mental health issues. “When it comes to developmentally disabled—and we have to come up with a better term—we just need to integrate people into our system to the level they are able to perform,” he said. “We just need to let people know about these issues. It’s not hard to bring people in. What the heck, I’d hire you.”

A version of this exchange took place at almost every Kasich town hall meeting before he suspended his presidential campaign in May. It was where the presidential candidate told the crowd that we must all—government, private businesses, neighbors—take care of the "people in the shadows," which, in Kasich code, means the developmentally disabled, the mentally ill, the drug addicted, and the impoverished: the ones we may have a hard time understanding because they are not like us.

Republicans have hammered Kasich's 2013 decision to expand Medicaid in Ohio, accusing him of being friendly to President Barack Obama's health care reform initiatives and warning that the expansion "would be a disincentive to work." In other words, to these critics, any health program not provided by the private market or a charitable entity—even a government-led mental health initiative—is nothing more than another welfare entitlement for the poor.

But buried a bit deeper in all the political wrangling is a simpler reason why Kasich expanded Medicaid: more mental health treatment and accessibility for everyone had long been at the top of his policy wish list, and he saw expanding Medicaid as the best way to get it. Three main factors in the John Kasich belief book drove that judgment: he thinks physical and mental health programs must be conjoined for either to be effective, his Christian faith is grounded in helping the less fortunate, and he is familiar with mental illness thanks to family experience. Even with these three driving factors, it has been no easy task.

“When I started out as a judge more than forty years ago, the first thing I noticed was that my docket was full of people with mental health issues, and they kept coming through again and again,” says Evelyn Lundberg Stratton, a Republican who served as a justice on the Ohio Supreme Court from 1996 to 2012. “They were recycled inmates. And the sheriffs who ran the jails were saying that incarcerating the mentally ill was consuming most of their resources.” John Kasich, Stratton continues, “has been a courageous governor in bucking his own party on this issue, and as a result he has saved many lives.”

It was the new allocation of resources that initially attracted Kasich to the Medicaid expansion. Even before he was elected Ohio governor in 2010, states had to reformulate their Medicaid programs to bring them more in line with federal mandates as part of the initial Obamacare changes. So eighteen months before he was elected, Kasich assembled a team—some of whom had worked with him in the 1990s in Washington on the federal budget—to look at ways to integrate mental health programs into the revised Ohio Medicaid system. "He joked that he was getting the band back together again," says Greg Moody, who worked on the U.S. House Budget Committee in the 1990s and is now the director of the Ohio Governor's Office of Health Transformation.

The Medicaid expansion brought about 600,000 Ohioans into the program; more than half of them worked (or had spouses who did) and another 30 percent didn’t work because of chronic disabilities. About half of those with chronic conditions had associated mental health issues. “We have found through numerous studies that most of the mentally ill lose their jobs because of health problems, mostly because getting physical health treatment becomes difficult for them to take care of with the other issues plaguing them,” says Terry Russell, director of the Ohio chapter of the National Alliance on Mental Illness.

The Kasich administration has instituted numerous programs to make Medicaid spending more efficient by integrating mental health into the equation—from increasing childhood access to mental health treatment, to changes in opioid addiction treatment policies, to spending $316 million in fiscal years 2016 and 2017 to help Ohioans with developmental disabilities, including programs to find better housing and jobs. "We seem to ignore these people," Kasich said during a campaign speech in Georgia last year. "Now, I don't know how many of you know people who struggle with these illnesses, but if you've got problems with schizophrenia and you find yourself in prison? It's a disgrace in this country."

Selling that line of thinking has been difficult, especially in the evangelical Christian community. According to a 2013 study by Lifeway Research, a Tennessee-based Christian church research group, 48 percent of evangelical, fundamentalist, or born-again Christians believe that prayer and scripture study alone can overcome mental illness. (Interestingly, eighteen- to twenty-nine-year-olds in those groups are more likely to believe in Bible and prayer treatment than those between the ages of fifty-five and sixty-four.) So while Kasich has been adamant about trying to bring better mental health treatment options to the poor, he is fighting a strange battle within his own party's base: not only do many Republican conservatives believe that mental health care provided under the Medicaid umbrella is repackaged welfare, but many also believe that prayer is all that's needed.

Some observers think that may be changing. “There may be some pastors and Christian counselors with platforms who remain very skeptical of mental health professionals and the modern concept of mental illness, but I can’t imagine my evangelical friends rejecting a candidate who strongly supports better funding of mental health research or access to mental health services on that basis, especially if they were demonstrably committed to other issues of interest to evangelicals . . . sanctity of life, religious liberty, school choice, the persecuted church,” says Stephen Grcevich, a psychiatrist in Chagrin Falls, Ohio, who works with church groups on religious and children’s disability issues. Evelyn Stratton thinks that what Kasich has been doing politically—trying to hold down costs while providing services to the mentally ill—will work “because everyone has a mother, father, sister, or brother who has a mental illness problem.”

In Kasich’s case, this is true. His younger brother Rick, fifty-nine and a former postal worker like their father, told the Columbus Dispatch in a 1999 interview that “he takes medication and sees a counselor for emotional problems.” The problems were, he said, “‘chemical in nature. . . . I could give you a diagnosis, but I’d rather not.’”

The family went through horrific tragedy in 1987, when the brothers’ parents were killed by a drunk driver. There was some disagreement about how the estate would be divided, according to the Dispatch story, and bad blood developed between them. Rick Kasich said his brother’s “life can’t be reconciled to me in any degree. . . . I have a low view of him. I really don’t want to get involved with him.”

While the two have since reconciled, according to an April story in the New York Times, Rick’s opinion hasn’t changed much. “He doesn’t have much to do with me, and I don’t have much to say about him,” he told the Times. His older brother’s campaign told the paper, “We love Rick deeply and have shared the struggles that his disease brings with it,” and called for his privacy to be respected.

Though he has made the decision not to make his brother a part of his political campaigns throughout the years—and not to use Rick as an example of why policy changes are needed when lobbying the state legislature in Columbus—John Kasich has not shied away from the main reason he expanded Medicaid in Ohio. During one of the early debates last summer, he was asked why he’d done it: “I had an opportunity to bring resources back to Ohio to do what? To treat the mentally ill. Ten thousand of them sit in [Ohio] prisons. It costs $22,500 a year [for each one].”

Very simple: Save government money. Practice what you preach. And, more importantly, learn from personal experiences, however painful they may be.
