March/April/May 2016 | Washington Monthly

The Real Reason Middle America Should Be Angry

Like that of many “flyover” cities, St. Louis’s decline is not mainly a story of deindustrialization, but of decisions in Washington that opened the door to predatory monopoly.


The people of St. Louis weren’t really surprised when, on January 12, the National Football League announced its decision to let E. Stanley Kroenke, owner of the St. Louis Rams, move his team to Los Angeles. Kroenke had long signaled his intentions, and LA’s media market beckoned as the nation’s second largest. Moving the team meant more revenue for the league, and therefore more money in owners’ pockets. And that, as serious football fans know, is the point.

The NFL isn’t a charity. It’s a legally sanctioned cartel that strictly limits the number of franchises in order to maximize the value of each. St. Louis simply was losing another round in the NFL’s long-running game of profit-maximizing musical chairs. Indeed, the city originally had lured the Rams from Los Angeles in 1995 after losing a bidding war for the hometown St. Louis Cardinals football team to Phoenix in 1988.

What was surprising, though, and hurtful, was the way Kroenke badmouthed the city. In a twenty-six-page statement in support of the team’s relocation to Los Angeles he noted, “Compared to all other cities, St. Louis is struggling,” adding that the city “lags, and will continue to lag, far behind in the economic drivers that are necessary for sustained success of an N.F.L. franchise.”

St. Louisans, with their formidable civic pride, were outraged. The well-known lawyer Terry Crouppen aired a thirty-second ad that ran in St. Louis during Super Bowl 50 saying of Kroenke’s decision, “We cheered [the Rams] year after losing year. In return, they trashed, then left, us.” The St. Louis Post-Dispatch columnist Benjamin Hochman struck a more optimistic chord when he declared, “They can strip away our NFL team . . . but they can’t snatch our confidence because, right here, right now, we will harness it, we will cradle it, and we will carry it into the next year and years, because we are St. Louis.”

The Gateway City does have much to boast about. It’s home to nine Fortune 500 companies; world-class centers of learning (Washington University in St. Louis); a robust medical system (BJC Healthcare); cultural institutions that rival those of cities twice its size (the St. Louis Zoo); and one of the most storied baseball franchises in history, the Cardinals, a team that has won two World Series in the last ten years alone.

Yet while Kroenke’s argument that St. Louis can’t support an NFL team is self-serving, he’s not altogether mistaken about the city’s economic plight. Relative to big metro areas on the coasts, St. Louis has lost ground in recent years. Job growth since the recession has slowed. The city’s population growth has stagnated. Downtown St. Louis sits eerily quiet on most days, despite millions of taxpayer dollars spent on upgrades—including on the Edward Jones Dome, the Rams’ now-vacant home. The city has a nascent tech start-up scene, but struggles to keep its most successful companies from leaving town (the payment firm Square was conceived in St. Louis by two native sons who relocated to San Francisco in 2009). The per capita income of the St. Louis metro area today has fallen to 77 percent that of metro New York, down from 89 percent in 1979. And while St. Louis’s nine Fortune 500 corporate headquarters are a lot for a metro area of 2.8 million people, that’s down from twelve in 2000 and (correcting for the way Fortune changed its methodology in 1994) twenty-three in 1980.

Experts often point to manufacturing decline, off-shoring, and racial strife to explain the relative economic weakness of St. Louis and other Rust Belt cities. But these ills hardly have afflicted St. Louis more than they have Chicago, New York, Boston, and Los Angeles—which all have mounted much stronger comebacks in recent decades. Yes, those other cities made the transition from manufacturing to services and technology. But a quarter century ago, St. Louis was already (and, to some extent, it still is) a hub of many of the post-industrial industries that have gone on to experience the fastest growth, from pharmaceuticals to finance to food processing.

Moreover, St. Louis had an abundance of what regional economic growth theorists such as Richard Florida, Edward Glaeser, and Enrico Moretti argue is the most important ingredient of success for post-industrial America: a large population of educated, professional, creative types who dream up the innovations that drive growth and profits (think software in Seattle and Silicon Valley, biotech in Boston, finance in New York and Charlotte). In 1980, 23 percent of adults living in the St. Louis area had completed four years of college or higher—double the national average and greater than that of economically “hot” cities like Dallas, Charlotte, and San Diego. Even more important, one out of every five residents worked in fields like finance, insurance, real estate, business, health, law, or medicine.

Indeed, St. Louis contained enough human capital to sustain one of the defining “creative class” industries: advertising. Though perhaps not quite as high-flying as their Mad Men counterparts, St. Louis firms rivaled the biggest New York, LA, and Chicago ad agencies in terms of revenue and creativity during the industry’s heyday from the 1970s to the 1980s.

The relative decline of St. Louis—along with that of other similarly endowed heartland cities—is therefore not simply, or even primarily, a story of deindustrialization. The larger explanation involves how presidents and lawmakers in both parties, influenced by a handful of economists and legal scholars, quietly altered federal competition policies, antitrust laws, and enforcement measures over a period of thirty years. These changes, which enabled the same kind of predatory corporate behavior that took the Rams away from St. Louis, also robbed the metro area of a vibrant economy, and of hundreds of locally based companies. This economic uprooting, still all but unaddressed by today’s politicians or presidential candidates, accounts for much of the relative stagnation of other Middle American communities, and for much of the anger roiling voters this election cycle. The rise and fall of St. Louis’s advertising industry stands as a cautionary tale for what ails so many of the once vigorous and innovative cities of “flyover” America.

If there is a living embodiment of the St. Louis advertising industry, it’s Charles Claggett Jr. The former creative director at D’Arcy, long one of the city’s largest agencies, he retired in 2000, two years before the French firm Publicis acquired the agency. One of his many claims to fame is that in 1979, he and his team penned “This Bud’s for You”—the slogan widely credited for helping St. Louis-based brewing staple Anheuser-Busch eclipse Miller during the 1980s beer wars.

On a blustery December afternoon, Claggett and I met at First Watch, a breakfast-all-day chain restaurant in Clayton, the tony old suburb west of the city that acts as the St. Louis region’s de facto financial center. A boyish-looking man dressed in a neatly pressed blue sweater and checked shirt, Claggett is like just about every other St. Louisan you meet: wildly upbeat about the city’s prospects. “St. Louis is an undiscovered gem,” Claggett said, as he gushed about the Municipal Opera, the city’s famed open-air theater, and about the young professionals moving into the “loft district” downtown. (This tradition of boosterism traces to 1869, when a local named L. U. Reavis convened a seventeen-state delegation to lobby Congress to move all federal buildings to what he touted as “the future imperial city of the world.”)

Another claim to Claggett’s fame is his father, Charles Claggett Sr., who led the city’s oldest and largest agency, Gardner, in the late 1950s and the 1960s. During his tenure, the elder Claggett oversaw accounts such as John Deere, Ralston Purina, and Jack Daniel’s.

Claggett recalled his childhood days, sitting in his father’s office, as piquing his interest in advertising. “Ever since I saw John Deere tractor toys neatly lined up on my father’s desk,” he laughed, “it sealed the deal.” Four years after his father retired as Gardner CEO in 1968, the younger Claggett started at rival agency D’Arcy as a copywriter. “I wanted to be my own self, not just Claggett’s son.”

For nearly a century, Gardner and D’Arcy stood as the twin pillars of the St. Louis ad community, and it was no surprise that they blossomed in St. Louis. By 1900, the city’s population of 575,000 was the fourth largest in the nation. Thanks to its central location near the confluence of the Missouri and Mississippi rivers, St. Louis had built up a vibrant industrial and mercantile presence—from grain milling to shoe manufacturing to insurance underwriting. And in 1904, the city proudly hosted the World’s Fair, an event that drew twenty million people, including Teddy Roosevelt, the Chinese prince Pu Lun, and Mark Twain, from just upriver in Hannibal, Missouri.

Gardner, founded in 1902, got its break with St. Louis-based Ralston Purina, which sold animal feed and cereal nationally, and counted Oldsmobile, B. F. Goodrich, and St. Louis’s Brown Shoe Co. as early clients. In 1906, D’Arcy opened with one account: Coca-Cola. In its first year, the beverage maker devoted only $3,000 to an ad budget, but upped the amount to $25,000 twelve months later. With the national success of its work for Coca-Cola, D’Arcy soon won other clients like Cascade Whiskey and Nature’s Remedy, before adding Anheuser-Busch in 1914.

Over the coming years, the beer maker expanded into a fifty-square-block headquarters just south of downtown, the world’s largest brewery at the time, from which the tang of yeast perpetually wafted. In the 1930s, by placing ads in Life and Forbes, D’Arcy helped the brewer distribute its product nationwide, stay ahead of rivals like Pabst and Lemp, and sell the one million barrels it produced annually.

As the ad industry flowered, so did the city’s Ad Club. In 1901, Captain Robert E. Lee, a peppery publisher of trade journals, invited a half-dozen leading advertisers to lunch at the Lindell Hotel. They enjoyed their get-together so much they soon founded what would become the nation’s first advertising club. Before long, advertisers from New York and Chicago were contacting the organization to learn how they could start chapters of their own.

The club’s accomplishments even attracted international acclaim. In 1924, a keynote speaker at the International Advertising Convention in London honored the St. Louis branch for having done the “best and most constructive work for advertising of any ad club in the world.” “St. Louis,” he said, “is to be congratulated because it is so far ahead of other cities in advertising manpower.” That keynote speaker? Winston Churchill.

During the Depression, the D’Arcy executive Archie Lee worked with Coca-Cola to enliven its brand, and commissioned the illustrator Haddon Sundblom to pair Santa Claus with the soft drink. What resulted was the invention of a rosy-cheeked, portly, approachable Santa, a departure from earlier German depictions of St. Nicholas as a thin, aloof fellow. The modern image of Santa, and the association between warm holiday cheer and the refreshment and friendliness of Coke, is still among the international beverage maker’s most iconic hallmarks.

The nation’s advertising industry swelled in the prosperous decades after World War II. As middle-class families acquired automobiles and kitchen appliances, living room furniture, and cleaning products for their new suburban homes, ad agencies took advantage of a new medium, television, to pitch their clients’ products.

Firms like Gardner and D’Arcy benefited not only because their clients largely sold consumer products, but also because these agencies were located in the figurative and geographic center of the country. The key to making money during this era was to capture, then package, the desires of the middle class, largely represented by midwestern cities like Cincinnati, Milwaukee, Cleveland, and St. Louis. As the Gardner president Elmer Marshutz put it in a 1950s company newsletter, “St. Louis sells goods to people, and as a result, advertisers throughout the Midwest, South, Southwest, and even New York, have brought their business to St. Louis.”

By the 1960s, Gardner added to its roster of clients, signing established local companies such as chemical giant Monsanto and Pet Inc., the first company to commercially produce evaporated milk; local start-ups like National Car Rentals; regionally based firms like Eli Lilly of Indianapolis; and international companies like Alitalia Airlines.

Gardner was also a magnet for talent. One of its stars was Bea Adams, a pioneering female ad executive who worked her way up from the steno pool to become the agency’s creative director. Adams wrote the jingle for a St. Louis Independent Packing Co. campaign that every St. Louisan over the age of fifty will remember: “I’m a meat man, and a meat man knows, the finest meats, ma’am, are Mayrose.” In 1956, the Advertising Federation of America chose her as National Advertising Woman of the Year, and Fortune listed her as one of the top thirty-six American businesswomen.

And it wasn’t just Gardner and D’Arcy—whose twelve offices now fanned out across North America, as far as Havana—that flourished in mid-century St. Louis. With its ample supply of locally owned businesses as potential clients, the city supported a vibrant start-up ad agency scene. These new firms trained up-and-coming talent, developed cutting-edge campaigns, and often grew to become regional or national in scope, enriching the metro area by bringing in revenue from outside of it.

There was Batz-Hodgson-Neuwoehner, which started out in 1950 with snack brand Old Vienna Potato Chips as an anchor client and by the late 1970s had offices in eleven midwestern cities and $43 million in annual billings. There was Courtesy Checks, which pioneered the field of barter advertising. There was the Savan Company, which worked with the Community Federal Savings and Loan Company throughout Missouri, before handling accounts for S&Ls from New York to Honolulu.

In addition to advertising, St. Louis became a player in the related field of public relations. The PR firm Fleishman-Hillard started in 1946 above a Woolworth’s store downtown. It quickly earned a reputation for savvy by helping Union Electric, the local utility, steer through a scandal involving company executives. Soon, Fleishman-Hillard attracted a roster of other big-name local clients, including Anheuser-Busch, First National Bank, May Department Stores, and Emerson Electric. The company would eventually become the world’s third largest PR firm, with annual billings of $580 million and eighty-five offices in thirty countries.

By the 1960s, St. Louis’s advertising industry had effectively developed into what economists call an “industry cluster.” Though the city’s agencies competed with each other, their sheer number created citywide competitive advantages: a deep bench of talent that moved in and out of agencies, spreading ideas and transferring know-how; a network of experienced, low-cost suppliers (printers, recording studios); and a reputation for quality that attracted national and international clients. All of it was built on the foundation of locally owned companies. These firms provided a steady supply of commissions facilitated by personal connections: account executives at the agencies and the senior executives at the corporations knew each other—from charitable events, from rounds of golf, or from attending the same high school.

Elites from the advertising cluster also interacted in myriad ways with those from the city’s other industry clusters. General American Life Insurance, Boatmen’s and Mercantile banks, AG Edwards, Edward Jones, and Stifel Nicolaus supported a thriving retail financial services sector. Pet Inc., Ralston Purina, Anheuser-Busch, and a food wholesaler named Wetterau constituted the city’s food-processing hub. Mallinckrodt and Sigma-Aldrich anchored its pharmaceutical cluster; the civil-engineering company Sverdrup, Emerson Electric, and McDonnell Douglas anchored its engineering cluster.

Commercial success: St. Louis advertising agencies were the brains behind generations of national ad campaigns for Coke, Budweiser, and other major brands.

While these diverse companies were homegrown and locally based, they often owed their existence as independent entities to government policy, especially in Washington. As all students of high school history will recall, in the late nineteenth and early twentieth centuries powerful “trusts” run by financiers like J. P. Morgan and Jay Gould grabbed monopoly control of railroads, steel production, meatpacking, electrical utilities, and other industries. Their actions often thwarted local economies—St. Louis a prime example. In 1881, for instance, Gould won control of St. Louis’s famous Eads Bridge, a major crossing point for rail over the Mississippi. The high tariffs Gould charged led rail companies to re-route through Chicago, leading the Windy City to emerge as the Midwest’s dominant industrial center.

The behavior of the trusts ignited the Populist and Progressive movements, which in turn led to a series of laws that safeguarded independent businesses in cities like St. Louis from the predations of monopolists, and encouraged regional equity.

The Interstate Commerce Act of 1887 applied common carriage rules to railways, and sapped their industrialist owners of the power to arbitrarily pick winners and losers. The Sherman Antitrust Act of 1890 addressed the anticompetitive practices of monopolists. Years later, the Mercantile National Bank of St. Louis president Festus J. Wade celebrated these laws, especially a 1912 decision based on them to break up a railroad monopoly that was choking off commerce from entering the city. “Railroad managers,” he said, “can no longer combine against an industry and crush it out of existence because of a disagreement with the head of a manufacturing establishment.”

The Federal Reserve Act of 1913 created a central banking system in which decisions over national monetary policy were made by twelve regional Federal Reserve banks, one of which was built (and still exists) in St. Louis. The McFadden Act of 1927 likewise dispersed lending activity by confining national banks to the states where they were headquartered. This rule preserved the flow of capital within local communities, made bankers attuned to their community’s needs, and prevented New York financiers from gobbling up St. Louis banks. It also addressed the public’s concern that if large banking organizations operated in multiple regions, they would evade adequate supervision.

The Packers and Stockyards Act of 1921 broke up the “Big Five” meatpacking cartel that previously had manipulated prices across the nation, giving undue preference to certain businesses and localities, and controlling non-meat production in the warehousing, wholesale, and retail industries. That move gave smaller companies like the St. Louis Independent Packing Company, of the famed “Mayrose” jingle, the opportunity to compete fairly.

The Wheeler-Rayburn Act of 1935 prohibited electricity, gas, and water utilities from speculating in unregulated businesses with ratepayers’ money and ensured that companies like Union Electric would remain locally headquartered and focused. The Robinson-Patman Act of 1936 protected small retailers by prohibiting manufacturers from giving larger discounts to chain stores, and the Miller-Tydings Act of 1937 did the same by permitting manufacturers to set a minimum price at which their goods could be sold. These laws safeguarded local-area retailers like Central Hardware and Bettendorf-Rapp supermarkets, as well as neighborhood pharmacies, bakeries, restaurants, clothing stores, and grocers—including those serving the city’s predominantly minority and African-American communities (see “Redlining From Afar”).

After World War II, Congress continued strengthening these anti-monopoly laws. The 1950 Celler-Kefauver Act, for instance, closed a loophole that allowed companies to thwart competition by gobbling up competitors’ regional suppliers. At the Wholesale Grocers Association convention held in St. Louis, the law’s cosponsor, Tennessee Senator Estes Kefauver, declared that the 1950 act would “blast out those pillboxes of monopoly . . . that threaten our free enterprise.”

Throughout the mid-twentieth century, these and other antitrust statutes were vigorously enforced by the Justice Department. “Today, anybody who knows anything about the conduct of American business,” observed the historian and public intellectual Richard Hofstadter in 1964, “knows that the managers of the large corporations do their business with one eye constantly cast over their shoulders at the antitrust division.” Even some St. Louis companies, such as Mercantile Bank and Monsanto, grew to the point where they too were hit with antitrust actions by the federal government in the 1960s—an indication, ironically, of just how much the city’s economy was thriving.

St. Louis’s advertising industry crested in the late 1970s and the 1980s, and Claggett recalls the era as “the peak of creative output.” Gardner and D’Arcy were headquartered in stylish mirrored and dark-glass downtown office buildings whose executive conference rooms framed the Arch. On the ground floor of the Gateway Building, which housed KMOX, D’Arcy, and Fleishman-Hillard, sat Anthony’s Bar, a rowdy after-work haunt for ad professionals, who joked that the minimum drink order was a beer and a martini.

During this era, Claggett garnered Cannes and Clio awards for his work on Budweiser’s frogs and Clydesdales campaigns, and secured more industry accolades for D’Arcy than at any other time in its history. Other shops also sustained St. Louis’s national influence and contributed original and widely distributed work. In 1977, the Stolz Advertising Agency played a key role in helping McDonald’s create the Happy Meal, much to the consternation of parents nationwide. Two years later, D’Arcy placed a crucial $10 million media buy for Budweiser on the fledgling television station ESPN, the network’s only advertiser at the time.

Indeed, well into the 1980s, St. Louis held its own in terms of advertising manpower. A Post-Dispatch article proclaimed that “D’Arcy [is] one of the top 10 agencies in the country, with the founding office here considered a major training facility for young artists, writers, and account management personnel.” In 1982 the Gateway City’s top ten agencies collectively held 256 local accounts, netting $326 million in St. Louis billings alone. A snippet from a 1980s Ad Club newsletter notes, “St. Louis swept the categories at the industry’s regional ADDY awards,” an event that recognizes creative excellence in local, regional, and national markets.

By its nature, the advertising field attracts individuals who straddle the worlds of commerce and art. Many of those who worked in the St. Louis advertising industry became active supporters of both business-based organizations like the Rotary Club and the city’s cultural institutions, from its art museum to its symphony orchestra. Occasionally they even made artistic contributions of their own. In 1987, for instance, Glenn Savan, son of Savan Company founder Sidney Savan, penned a best-selling novel set in St. Louis, White Palace, later made into a movie starring Susan Sarandon.

Like virtually every other city in the country, St. Louis had serious problems in this era. Jobs, wealth, and residents had long been migrating to the suburbs, leaving the central city increasingly poor and crime ridden. Still, its overall economy was diverse and vibrant. Per capita income in the St. Louis metro area was 82 percent as high as in the New York metro area in 1969; by 1979 it was 89 percent.

One of the things St. Louis had going for it, as always, was its central location. The city sought to capitalize on that advantage, and attract more commerce downtown, by building a convention center, which opened in 1977. While such structures often proved white elephants for other cities, St. Louis’s was an immediate hit. The convention center, booked in advance for ten years straight, played host to 461,450 visitors from across the globe in its first year alone.

St. Louis also profited from some of the best airline connectivity in the nation. That too was due to its central location, as well as its rich aviation history. The city famously had sponsored Charles Lindbergh’s transatlantic flight, and the military and civilian aircraft maker McDonnell Douglas was headquartered at Lambert Field, northwest of the city.

St. Louis also benefited from healthy competition between two local air carriers. One was TWA. The globe-spanning giant had long presided over Lambert, and in 1982, as a result of intrigues by state-level politicians, the company moved its headquarters to St. Louis from Kansas City. The other was the homegrown Ozark Airlines. Ozark started out as a “local service” airline licensed by the federal government’s Civil Aeronautics Board (CAB) to provide air service to small communities in the Midwest. But by the mid-1970s, having won permission from CAB to compete with major carriers on more profitable routes between major cities, the upstart airline boomed. Flights extended deep into the Southwest, Mountain West, South, and East.

Claggett experienced the ascent of Ozark firsthand when he handled the company as a young account executive. In 1977, he stood on the tarmac at Lambert Airport and stared down a McDonnell Douglas DC-9 jet. At one of his first major shoots, Claggett worked with “hundreds of extras,” a Dutch director, and a crew from the local Technisonic Studios production company to increase the brand’s visibility. Ozark adopted a new slogan, “We’re up there with the biggest,” an assertion of the company’s growth and a playful jab at rival TWA. And as the St. Louis-based company sought to refresh its heartland brand for younger and more urbane audiences, the agency used the comedian George Carlin to proclaim, “Go-getters go Ozark.”


The rich connectivity was great, of course, for the city’s booming convention business. But it also was valuable to St. Louis’s corporate community. In 1985, Lambert’s 1,170 daily takeoffs and landings made doing business nationally or even internationally easy. It kept St. Louis competitive and at the center of the action, figuratively and geographically. That was equally true of its advertising sector. For instance, in 1980, a nineteen-member delegation of advertising executives from fifteen foreign countries held seminars with D’Arcy staff to gain insight into the company’s creative process. “People came to St. Louis not as a stepping stone,” says the former D’Arcy and Gardner copywriter Gerry Mandel, but as their destination. They wanted to work, Mandel says, “on the Southwestern Bell account, or even, if they were lucky, for Budweiser.”

Flying high: St. Louis’s Ad Club, founded in 1901, was the first such organization in the country. (Photo courtesy of St. Louis Public Library)

As St. Louis’s advertising renaissance peaked, and admen enjoyed a martini—or two—at Anthony’s, changes were afoot in Washington. In the late 1970s and early 1980s, politicians quietly overturned many of the anti-monopoly laws that had for so long protected the citizens of the Gateway City from distant economic predators. These legislative changes—inspired by an unlikely alliance of both conservative and liberal legal scholars and economists, including Robert Bork and Lester Thurow—spoiled the very ecosystem that had birthed St. Louis’s diversified economy and powerful industrial presence.

In 1978, Jimmy Carter signed the Airline Deregulation Act, which swept away the Civil Aeronautics Board and paved the way for massive industry restructuring. In the mid-1980s, Northwest purchased Republic Airlines. US Airways acquired PSA Airlines and Piedmont Airlines. Continental swallowed Texas International, New York Air, and People Express.

Invariably, this activity soon reached St. Louis. In 1986, TWA bought Ozark. This didn’t help price competition, but when TWA made Lambert its hub, the city’s air connectivity increased. That is, until American purchased TWA in 2001 and later moved much of its operations to Chicago O’Hare. In 2014, only five hundred aircraft took off and landed daily at Lambert, a fraction of the all-time high of 1,400 in 1997. Moreover, the airport serviced only 1,176 international flights, down from 3,826 in 2002. While some airlines like Southwest partially have filled the void, an entire terminal still sits empty.

Meanwhile, under the Reagan administration, the federal government fundamentally changed course on antitrust enforcement. The Reagan Justice Department wrote new guidelines that rejected regional equity or local control as considerations in deciding whether to block mergers or prosecute monopolies. Enforcers were instructed to wave through mergers and tolerate consolidation, as long as there was no active collusion and consumers didn’t immediately suffer higher prices. Even more, Reagan’s administration cut the budgets of the Federal Trade Commission and the Department of Justice, leaving both agencies with limited resources for enforcement.

Between 1980 and 1985, sixty-two Fortune 500 companies were subject to corporate takeovers, and the single greatest increase in corporate acquisitions in U.S. history took place between 1984 and 1985. This relaxed enforcement philosophy, compounded by other legislative action, quickened the consolidation of specific industries.

Throughout the 1980s, state politicians chiseled away at restrictions on interstate banking, and in 1994 the Clinton administration followed suit with the passage of the Riegle-Neal Interstate Banking and Branching Efficiency Act. Since 1984, the number of independent banks has fallen by more than half, from 15,663 to 6,799 in 2011. Of those now-defunct banks, more than 8,352 either merged or were consolidated.

In St. Louis, Boatmen’s, the oldest bank west of the Mississippi, merged with Kansas City-based Chartercorp in 1985, and in 1997 its ownership shifted to Charlotte-based NationsBank, which was later purchased by then San Francisco-based Bank of America. Mercantile, St. Louis’s biggest locally owned bank, was gobbled up in 1999 by Milwaukee-based Firstar, which later changed its name to U.S. Bancorp. The number of community banks in Missouri dropped from 637 in 1980 to 262 in 2014.

These changes in anti-monopoly policy also affected local retailers. The Consumer Goods Pricing Act of 1975 overturned the Miller-Tydings Act of 1937, and led to the end of most “fair trade” laws. In the early 1980s, the Reagan administration’s Justice Department and FTC stopped enforcing the Robinson-Patman Act. Together, these changes led to consolidation among retailers and gave the new mega-retailers tremendous power over their suppliers.


In St. Louis, this played out in Dillard’s purchase of St. Louis flagship department store Stix, Baer, and Fuller in 1984, and Chicago-based Handy Andy Home Improvement Centers’ acquisition of Central Hardware in 1993. At the level of suppliers, these policy changes cleared the way for potato chip makers Frito-Lay and Borden Inc. to capture 50 percent of the local market, elbowing past local brands So Good Potato Chip Co. and Old Vienna, whose market share now stood at a paltry 4 percent.

In the 1990s, St. Louisans continued to witness the flight of corporate headquarters that either were acquired by outside companies or moved out of town completely. In 1993, company executives moved Southwestern Bell, then the twenty-ninth largest U.S. company, to Texas. And in 1997, Boeing absorbed McDonnell Douglas and moved its headquarters first to Seattle and then to Chicago.

That same year, Omnicom purchased Fleishman-Hillard. In 2001, Swiss food giant Nestlé bought Ralston Purina. In 2005, Federated Department Stores, whose chains include Macy’s and Bloomingdale’s, acquired St. Louis-based May Department Stores, whose banners included Lord & Taylor and Filene’s. That year, too, Lee Enterprises bought Pulitzer Inc., owner of the 127-year-old Post-Dispatch. In 2007, Wachovia (later acquired by Wells Fargo) snatched up 120-year-old A. G. Edwards. And then came the most devastating of deals for St. Louisans—the unthinkable one.

In 2008, Anheuser-Busch—the 156-year-old company that helped make D’Arcy, that had owned the St. Louis Cardinals, and that in many ways had defined St. Louis’s very identity—was bought by Belgian-based InBev, which itself had been acquired by a team of Brazilian bankers and investors in 2004.

While some new out-of-town owners kept large operations in St. Louis, the city lost entire layers of expertise. Business and account managers were shed in Pillsbury’s 1995 acquisition of Pet Inc. NationsBank’s purchase of Boatmen’s and Firstar’s acquisition of Mercantile resulted in the exodus of financial analysts and bankers. MetLife’s purchase of General American Life led to the jettisoning of insurance agents. Ralston Purina’s merger with Nestlé prompted the hemorrhaging of food scientists and in-house lawyers. Tyco’s 2000 purchase of Mallinckrodt and Merck’s recent acquisition of Sigma-Aldrich resulted in the departure of pharmaceutical scientists.

The change in antitrust policies in Washington and the subsequent wave of industry consolidation affected the city’s advertising sector in a number of ways. First, when a local company was bought out, there was one fewer account for advertising agencies to serve. The loss of local retailers was especially tough on smaller ad firms that relied on them as clients. It also devastated local media outlets, which not only counted on that revenue but, thanks to other federal rule changes, were themselves now targets for acquisition by national conglomerates (see “Communication Breakdown”).

Second, when local firms were taken over by out-of-town corporations, the personal connections St. Louis ad execs had with executives at the local firms they serviced became less important. Decisions were now being made hundreds or thousands of miles away by the executives of the acquiring company. Over time, that meant losing clients. “As major corporations moved their headquarters out of St. Louis,” Claggett told me, “they took their advertising needs with them too.”

Third, the consolidation trend made ad agencies themselves merger targets. The 1980s were the “decade of the deal,” when major New York ad agencies such as J. Walter Thompson and the Ogilvy Group were acquired by the London-based WPP Group. The merger trend also hit St. Louis. Gardner had already been bought in 1972, by New York-based Wells Rich Greene (WRG), but it was largely left alone to manage its own accounts, the biggest one being Ralston Purina, which at the time was still based in St. Louis. But in 1980, WRG placed its agency operations in different cities under the control of a NYC-based parent company. When Ralston Purina sold its animal feed business Purina Mills to BP six years later, Gardner lost its largest account. That move, and additional restructuring by WRG, crippled the agency and led to the eighty-seven-year-old firm’s closure in 1989.

D’Arcy followed a similar trajectory. In 1985, it merged with NYC-based Benton & Bowles to become DMB&B, a deal that saw the headquarters and executive decision-making shift to New York. The St. Louis office still handled long-standing accounts like Mars/M&M and Anheuser-Busch, but NYC now made “above-the-rim” decisions. As Claggett put it, “The agency slowly became just a branch office competing for accounts.”

The turning point came one day in 1994, when, unbeknown to the St. Louis office, the agency’s NYC-based media-buying unit signed a $25 million deal with Anheuser-Busch’s archrival, Miller, then lied about it. Anheuser-Busch’s volatile owner, August Busch III, immediately cut ties with D’Arcy, costing the agency $422 million in billings. One D’Arcy copywriter quipped, “When you lose Bud, you’ve lost it all.” Two years later, the office lost its $140 million Blockbuster account to New York. The agency closed its St. Louis doors in 2002.

In the years since the St. Louis advertising cluster disintegrated, the entire industry has taken a major hit as the Internet has disrupted its traditional business model. U.S. ad agencies today have fewer employees than they did in 2000. The advertising talent that still remains in St. Louis—veterans from the old firms, ambitious young people with new skills—is learning to adapt. Claggett waxed enthusiastic about the “cutting-edge” creative work being done in the social media advertising space by upstart St. Louis firms like Moosylvania.

St. Louisans, like Americans generally, take pride in their self-reliance. When things turn sour, they don’t blame others, but instead channel their energy to make their city better. While the convention business has declined, hosting only 350,000 visitors last year, local leaders are looking to renovate and upgrade the city’s key event space. A consortium of the city’s hospitals and universities has created a 200-acre innovation district in midtown St. Louis to nurture bioscience and pharmaceutical start-up companies. The Donald Danforth Plant Science Center, funded by Monsanto’s philanthropic arm and the family that founded Ralston Purina, is doing the same in the agricultural tech sector. Square announced last year that it would open a new office in St. Louis and hire 200 people.

A few St. Louis businessmen and politicians even tried assembling a package of taxpayer-funded incentives to convince Stan Kroenke not to move the Rams to LA. But Kroenke, who knows the monopoly game better than most, didn’t take the bait. His wife is a Walmart heiress, and he amassed his $7.4 billion real estate empire by building shopping malls anchored by Walmart stores.

Interestingly, most St. Louisans reacted with contempt to the plan to try to bribe Kroenke with more public money. St. Louis Mayor Francis Slay announced that he and his city were done negotiating with the NFL, proclaiming, “[C]ities and hometown fans are commodities to be abandoned once they no longer suit the league’s purposes.” Having been through several of these cycles, St. Louisans now understand, more than they used to, that the game is rigged—that they could have a football team in a heartbeat if the NFL expanded the number of franchises to match the number of cities that can clearly support one. That’s how fair markets are supposed to work.

Applying that lesson more broadly, when the citizens of St. Louis—and of other small- and medium-sized cities across the country—look at the decline of their local economies, they may consider a different explanation than the one Kroenke offered. The economic fates of their communities may not be the result of their own failings, or of an inability to lure educated “creative class” types, or of off-shoring or deindustrialization, or of the workings of a mysterious and immutable free market. Rather, their fates may be the result of decisions in Washington, influenced by a small group of legal scholars and economists, to overturn antitrust laws passed by elected officials of both parties over the course of the twentieth century. These decisions quietly changed the rules of America’s economy to be more like the NFL’s, in which monopoly power isn’t fought but catered to, in which economic opportunity isn’t dispersed but consolidated, in which fewer cities—and fewer Americans—get a fair chance to compete.

Message to Millennials: Bernie Sanders Is Intellectually Consistent, Not Intellectually Honest

Millennials, we get why you like Bernie, but...


I recently got an email from a young man I dearly love, a college freshman who’s a close friend of my son’s. This being the first election of his lifetime in which he can vote, he was researching the issues and candidates in the Democratic primary and asked for my views. I sent him an email back that was probably way longer than he was looking for. But the exercise forced me to articulate my own thinking about the stakes in this primary. So I thought I’d share an edited version of my email to that young man.

A key fact of this race is that Democratic voters my age lean toward Hillary Clinton, while voters your age overwhelmingly support Bernie Sanders. One of the reasons this is such a big deal is that there are a lot of voters your age right now—the Millennials, your generation, are the biggest birth cohort since the Baby Boom, my generation.

It’s not surprising that younger voters are, by and large, with Bernie, and passionately so. Young people almost always support the candidate who most forcefully expresses their ideals. In 1972, when many Baby Boomers were the age Millennials are now, they went overwhelmingly for George McGovern—who campaigned on ending Vietnam, cutting the defense budget, and other liberal causes—over more moderate candidates like Edmund Muskie and Hubert Humphrey. In fact, Bill and Hillary Clinton worked for the McGovern campaign in Texas.

McGovern lost forty-nine states to Richard Nixon.

A big part of Bernie’s appeal is his “authenticity.” He wears rumpled suits, lets his hair run wild, and scowls rather than plaster a fake politician’s smile on his face. That’s something I love about him, too—though as one of my favorite Millennial writers, the Washington Post columnist Catherine Rampell, notes, no woman in politics could get away with that.

Another appealing factor about Bernie is that he’s intellectually consistent. He has a big, overarching, simple-to-grasp vision of what’s wrong with the country and how to fix it: the billionaire class is screwing it up for the rest of us, so let’s trim their political power by reforming campaign finance laws and tax them more to finance government programs that give average people a better life. It’s a vision he’s maintained for decades, there’s always been a lot of truth in it, and that truth is more apparent today than ever.

But intellectual consistency isn’t the same as intellectual honesty. He’s surely got way more of the latter than the buffoons running for president in the GOP (Ohio Governor John Kasich being an exception). And there’s a basic decency and candor about Bernie that I really admire. He says what he thinks and he doesn’t play word games or tailor his approach to different audiences. These are not qualities people associate with Hillary Clinton, I’m afraid.

It’s in the realm of policy, however, where I find Bernie intellectually quite dishonest, and Hillary pretty damned honest. When you scrutinize his policy ideas, as wonky liberals have begun doing (finally) in the last couple of months, those ideas don’t stand up, on a bunch of different levels.

One of those levels is political—as in there’s no way, in the foreseeable future, there will be sixty votes in the Senate, much less support in a likely GOP-controlled House, to pass single-payer health care, or break up the big banks, or reform the political campaign system, or provide free college tuition for every student. You can excuse that by saying, Well, that’s his vision, his end goal, maybe not achievable in his first term but possible over time, especially if we get the “political revolution” he calls for.

But there’s a deeper level at which these policy ideas are intellectually dishonest. Even if you could somehow get them passed, practically they either wouldn’t work or would be recklessly disruptive or both.

On health care, it’s not just that corporate interests would resist single-payer, as Bernie rightly says. It’s that given the fact that, for historical reasons, we’ve built out an employer-based system, any legislation that attempted to rip up that entire system from the bottom up would lead to logistical and economic chaos and to political backlash that would dwarf what the Democrats got hit with after they passed Obamacare. Why in the world would you do that, especially now that, with Obamacare, we’ve taken a huge step toward making the existing system work for those who’ve been excluded, and made at least some steps toward cost containment (though we need more, as Hillary is calling for)?

But even if you think it’d be worth all the turmoil and stress to at least try to get rid of our cumbersome and expensive health care system and replace it with one on a Canadian or European model—those systems are, on the whole, more cost-effective and of the same or better quality than ours—it’s hard to take seriously Bernie’s specific plan to get there. In fact, it’s pretty clear that Bernie himself doesn’t take his plan seriously. When he first rolled it out—hours before the Iowa caucuses—knowledgeable (and sympathetically liberal) health policy experts were shocked at how sloppy it was. For instance, he promised to save Americans more per year on prescription drugs than we currently spend in total per year on prescription drugs. (By the way, Donald Trump has made the same impossible promise regarding prescription drugs.)

On reforming the financial sector, it’s not just that the big banks will resist Bernie’s plan to break them up and to restore Glass-Steagall, the Depression-era law (since overturned) that kept federally insured banks from gambling with their depositors’ money by trading securities and engaging in other forms of high-risk behavior, as uninsured institutions like investment banks are free to do. It’s that breaking up the big banks and restoring Glass-Steagall wouldn’t accomplish the practical task of making the financial sector more stable and less likely to tank the economy.

Remember, it wasn’t the big banks we bailed out that caused the financial collapse of 2008. Rather, it was the bankruptcy of Lehman Brothers, the mismanagement of AIG, and the subprime mortgages being peddled by firms like Countrywide. None of these companies were all that big. None had deposits insured by the government, and hence their behavior wouldn’t have been stopped even if Glass-Steagall still had been in force. They were part of the “shadow banking” system that Dodd-Frank has gone a long way to regulate and that Hillary wants, rightly and sensibly, to tighten the screws on. Big banks remain a problem, but mostly because we’ve allowed the financial sector, as with most industries, to consolidate in a handful of coastal cities (more on this in our next issue, out next week).

On the billionaire class, it’s not just that America has never, since its founding, been able to keep money out of political campaigns (though the problem has clearly gotten much worse in recent years). It’s that campaign donations are not remotely the most important way billionaires and corporations rig the system to their benefit. Rather, it’s through the money that flows into the Washington lobbying machine, and Bernie’s campaign finance reforms won’t do squat about that.

On our bloated prisons and abusive policing in poor and minority communities, it’s not just that progressives blame these problems on Bill Clinton’s 1994 crime bill, and hence on Hillary, even though then Congressman Bernie Sanders voted for the bill. Nor is it that the crime bill, with its stress on community policing, helped lower the crime rate in poor communities while improving police behavior (though to get GOP votes it did increase sentences and incarceration). It’s that Bernie’s pledges to fix these problems are unserious and outright dishonest. At a candidate debate in February, Sanders pointed out, rightly, that America has more of its citizens behind bars than any other country, including China, then promised that “at the end of my first term as president we will not have more people in jail than any other country.” Even some of his strongest boosters, like the MSNBC host Chris Hayes, pointed out that Sanders was making promises he wouldn’t, as president, have the power to keep. States house 87 percent of the nation’s prisoners; even if a President Sanders were to pardon every inmate of every federal prison—the only ones he would control—the United States would still have more people behind bars than China.

And then there’s the issue—which few Democrats have even begun to grapple with, and Bernie certainly hasn’t—that his many big ideas (single-payer, free college) would be insanely costly and be coming at a time when federal deficits will start climbing because of the retirement of the Baby Boomers. Hillary’s many proposals will also hit that rising deficit wall, but hers aren’t anywhere near as costly as Bernie’s.

It drives me crazy that so many people buy into the idea that Bernie’s policies are the principled ones and that other people’s more “pragmatic” policies are compromised, watered down, and, ultimately, something to be ashamed of. I don’t see it that way at all. To me, selling policies that you know or should know won’t work is pretty much the definition of unprincipled.

And I haven’t even brought up the issue of foreign policy, because once I start there I won’t quit. But suffice it to say Bernie has had twenty-five years in Congress to get involved in and bone up on issues of national security and foreign policy. He’s chosen not to, while Hillary spent four years as secretary of state.

The one area where Bernie’s knowledge and ideas are truly impressive is veterans’ health care. That, I think, is because the VA actually practices (quite successfully) a form of socialized medicine, and hence is in Bernie’s ideological comfort zone (and if you think the “scandal” regarding wait times at the Phoenix VA disproved that, well, watch this space).

When I was your age and first old enough to vote in an election, in 1980, the country was beset by a host of problems (inflation, rising crime, Islamic fundamentalism in Iran) for which traditional liberalism seemed to have no good answers. Over the next twelve years I watched Democrats get beaten by Republicans with conservative policy ideas I thought were, for the most part, crackpot. So I devoted myself to the larger effort then under way (at, among other institutions, the Washington Monthly) to subject policy ideas, liberal as well as conservative, to tough, evidence-based scrutiny, and to try to develop a new policy agenda that could actually work, both politically and substantively. That more hardheaded, less ideological way of looking at policy served the country well in the Clinton and Obama presidencies, and its abandonment by George W. Bush’s administration led to multiple epic disasters. My greatest fear is that the Democrats will follow the Republicans in turning their backs on “reality-based” policymaking.

Of course, most people don’t have the time or inclination to learn the nuances of complicated policy questions. If you’re a young person who leans left, you’re probably engaged in a simpler thought process: establishment politics has left me with high student debts and diminished job prospects; Hillary Clinton is the ultimate example of establishment politics; Bernie Sanders has fought establishment politics for years on behalf of progressive goals I believe in; why the hell shouldn’t I vote for Bernie?

I get it. The simplest response I can offer is this: on policy, Bernie doesn’t know what he’s talking about; the policies that most damaged your life came overwhelmingly from the Republicans, not the Democrats; and the Democrat most likely to beat the Republican in November is not Bernie Sanders.

Tilting at Windmills

Not a yooge difference

As a Bronx native I’ve spent the campaign quietly weighing Donald Trump’s New York accent against that of Bernie Sanders. I can declare a split decision. Trump has the better vowels: His yooge obliterates Sanders’s yooge, the perfect measure of dismissiveness without dwelling on itself. But Sanders has the better consonants: when he says speculation, each syllable is a saliva receptacle. What is especially great about both of these accents is that no New Yorkers speak like that anymore, not even in deepest Canarsie. The city is too diverse; its population changes too constantly. Accents so extreme could only be preserved in environments where their bearers did not regularly interact with other New Yorkers: Burlington, Vermont, in one case, and a quartz penthouse in the other.

Trump’s favorite bureaucrats

A few days before the New Hampshire primary, I happened into a Donald Trump rally in Exeter, at the same town hall where Abraham Lincoln once delivered a speech when he was running for president, in 1860. Trump’s rallies have become notorious, for their rowdiness and nativist anger and general atmosphere of menace, but having been to several I can report that the scene is not always quite this bad. A lot seems to depend upon Trump’s mood. Sometimes you get Benito Mussolini, and sometimes it’s just an amiable drunk who keeps forgetting that he’s in the quiet car.

This day in particular Trump was in a ruminative mood; possibly his recent loss in the Iowa caucuses had chastened him. He spent a great deal of time talking about the hotel project that his real estate company is developing in the Old Post Office Pavilion, the gorgeous and ornate building, capped by an actual clock tower, a few blocks east of the White House. The renovation was running ahead of schedule and under budget, Trump said, and he made a case for its social impact: hotels, he said, employ more people than offices do. He suggested that he had been surprised to win the project in the first place (he intimated that the Hilton hotel corporation, a rival bidder, had outsized political influence with the Obama crowd) and then mentioned that he’d been extremely impressed with the bureaucrats from the General Services Administration (GSA) whom he’d worked with. “Extremely professional. They’re unbelievable. They’re very talented people.”

When it comes to hotels in Washington, I’m basically pro glitz. The bars and lobbies in the capital are almost uniformly stultifying, as if they’ve been designed for the uncle of the Portuguese ambassador. So Trump’s progress report, which promised something different, was good news to me. But Trump’s enthusiasm for the men and women of the GSA triggered for me a very specific memory from about a decade ago when I was an editor at the Washington Monthly. One late night when I was going over Charlie Peters’s Tilting at Windmills column with him, we discussed an item in which he had reported that the Department of Justice contained the most prestigious of all bureaucratic jobs in Washington. Idly, without considering what I was getting myself into, I wondered what was the least prestigious. With the exactitude of an eye surgeon considering a cataract, Charlie ranked every agency in Washington according to its bureaucratic prestige. “And finally,” he concluded, having listed every existing agency, and possibly some expired ones, “the General Services Administration.” If Donald Trump can find it in himself not only to praise bureaucrats but to praise bureaucrats from the least prestigious wing of the entire federal government, perhaps all will not be lost in the Trump administration.

Like asking an enemy for the map of the minefield

There are these hints scattered through the archives of the New York tabloids that in his own mind Trump has been a political figure, of a sort, for a very long time. In 1984 he told the Washington Post that if his country needed a volunteer to take care of nuclear negotiations with the Soviets he was ready and willing. “It would take an hour and a half to learn everything there is to know about missiles,” Trump said. “I think I know most of it anyway.” This summer many reporters suspected that Trump had nowhere near as much money as he claimed, as has sometimes been the case in the past. When he turned out to be a legitimate billionaire, the word was that he had lucked into it, by the sheer geographic accident of having been born into a trove of buildings in a city, New York, that has during the past quarter century been the site of a historic boom.

But David Segal of the New York Times, in a magnificent study of Trump’s deal to acquire the Plaza Hotel, made a compelling case that the tycoon really was a remarkably adept deal maker. At a key moment, Trump sized up the attorney charged with negotiating on behalf of the seller, recognized a young man eager to make a splashy deal, forewent all of the torturous negotiations over contingencies, and simply asked the lawyer to tell him everything that was wrong with the hotel. “It was like asking an enemy for the map of the minefield,” Segal writes. Trump got the hotel. He also got the lawyer, a man named Tom Barrack, to deal with a troublesome tenant at the Plaza, an older woman named Fannie Lowenstein, who stood in the way of Trump’s condo conversion of the iconic hotel, by promising her free rent for life and throwing in free furniture and a Steinway.

The oppo researcher’s trash can

One irony of the present campaign is that what might be the most buttoned-up political operation in history, Hillary Clinton’s, must now be busily preparing to oppose the most xenophobic, vulgar, and freewheeling candidate in modern memory. Whole teams of ex-Rhodes Scholars and promising Yale 2Ls are carefully highlighting old New York Post articles and assembling dossiers. The material they discard must be amazing. What does an operation like the Clinton campaign do, for instance, with a candidate who has praised his daughter’s figure to television hosts and said, “If I weren’t married and, ya know, her father . . .”? It’s fun to imagine Joel Benenson and Jim Margolis thinking that one through.

But the sheer volume of Trump’s public record means that many heinous episodes (his whole New York experience, more or less) would be, in a general election campaign, outlined in only the broadest strokes. We would hear about the four bankruptcies, but probably not about the separate lawsuits New York City and New York State brought against Trump in the mid-eighties, alleging that Trump had resorted to extreme tactics to try to harass rent-stabilized tenants out of his buildings—not just normal slumlord stuff like “drastic decrease in essential services” and “threats of imminent demolition” but “instructing employees to obtain information about the private lives and sex habits of tenants.”

Trump’s response to the Central Park jogger case, in which he took out a full-page ad in the New York Post to denounce as “crazed misfits” the accused, a group of black and Latino teenagers who ultimately turned out to be entirely innocent, would likely fly under the radar. More striking still was an anecdote supplied to the New Yorker’s Nick Paumgarten by a former busboy at Trump’s Showboat casino in the 1980s: “When Donald and Ivana came to the casino, the bosses would order all the black people off the floor.” This has not made news in the campaign. Neither has the lurid Marla Maples affair, in which he took his chauffeur’s girlfriend, a Georgia ingénue, as his own, and which ended with a spectacular divorce and his twelve-year-old son quoted in Vanity Fair as saying, “You don’t love us! You don’t even love yourself. You just love money.” That son, Donald Trump Jr., is now a partner in the family real estate organization.

At a Jeb Bush event this winter, I ran into an older couple from Brooklyn who told me that though they were too moderate to vote for Trump they had a warm feeling about him. When he first bought the Plaza, Trump staged a promotion in which any couple who had ever been married at the hotel could stay there again, for the rate they’d paid on their wedding night. It so happened that this couple fit the bill, and they had saved their receipt in a wedding scrapbook. They got the room, and the cheap rate, and, together with the other couples, a reception in the grand ballroom. We mostly use the term “Jacksonian” to describe angry, populist political impulses. But Trump has spent much time dwelling in a particularly Jacksonian emotion: the vulgar and liberating thrill that comes when the outsiders finally occupy the cultural citadel.

Vermont gun owners shoot blanks

One black mark on Bernie Sanders’s record, for most progressives, is his general support for the right to bear arms—the five votes against the Brady Bill that Hillary Clinton is fond of pointing out, the 2005 vote to immunize gun manufacturers from lawsuits (a shield later invoked against claims brought by Sandy Hook families), and others. Sanders has suggested that, having headed out of Brooklyn, he deferred to the broader culture of Vermont, where gun ownership is so ingrained that for many years the Green Mountain State was the only one in the nation where you were allowed to carry a concealed firearm without licensing or registering it. The gun owners’ lobby was often said to be especially powerful.

But how powerful, and why would it be more powerful in Vermont than in any other rural state? I got curious. It turns out that the gun owners as a group are not especially organized in Vermont: they do not even keep formal track of which legislators vote for and against them. I called up Jim Douglas, who was Vermont’s Republican governor from 2003 to 2011, and he told me that among the hundreds of groups who lobbied in the capital the gun groups did not stand out. The key, it seems, is that until 2012 there was literally no organized gun control group in the state at all. The idea that state legislators know their own constituents better than congressmen from distant corners of the country sounds good in theory, but in practice the local bodies, often composed of part-timers, are less resourced and less ambitious. The gun question in Vermont seems to have been trapped in a feedback loop: No one—including, to his discredit, Bernie Sanders—ever really challenged the gun lobby, and so everyone assumed it was unassailable.

The soft bigotry of Ivy League football

This fall, I went with some neighbors to the Harvard-Dartmouth football game. To go from watching the NFL and big-time college football to seeing the Ivy League version involves some shocks. The game has a different geometry. Punts flutter quickly to the ground. Long passes basically don’t exist at all. It is a slower, muddier struggle. It makes you wonder why the colleges bother.

But no, really, why do they bother? Think about this system as if it were invented from scratch. The richest and most powerful people in American society come from a small group of private northeastern colleges that are extremely competitive and growing only more so. Admission to these places is extraordinarily difficult. And yet eighty spots are reserved for young men (no women) willing to play a game that exposes them to a likelihood of long-term brain damage whose exact dimensions are unknown but almost surely greater than zero. Most of these men would not earn admission to these colleges if they were not willing to play this game. But the men admitted to the elite colleges to play this game are not the very most talented young men who play this game—those young men tend to go to less academically rigorous colleges. They are, instead, young men who are pretty good students (good enough to meet the lowered bar for entry) and pretty good at the game. Disproportionately they come from wealthy northeastern places. Simply playing the game gives them an extra boost after graduation: from the Northwestern sociologist Lauren Rivera we know that the two most important factors the leading investment banks, consultancies, and law firms consider are the prestige of a candidate’s university and whether he played sports there.

Heroin’s strange road trip

As the candidates moved through New Hampshire this fall, treatment centers for the heroin epidemic suddenly became a common stop, as if they were diners or county fairs. Chris Christie and Jeb Bush have been particularly attentive, showing up and listening to middle-class parents talk about watching their children die, and to treatment staffers telling stories of middle-of-the-night drives across state lines to find an addict an open treatment bed.

Beneath the misery is a mystery. No one really knows why deaths from heroin and prescription opiates have spiked so precipitously, but the scale of the change is remarkable. Four times as many Americans died of heroin overdoses in 2014 as had in 2010. It seems to be part of a larger pattern. The married Princeton economists Anne Case and Angus Deaton (a Nobel laureate) published a pair of studies this year finding that death rates for white middle-aged Americans were escalating while rates for every other demographic group were declining, both here and abroad. The anomaly, Case and Deaton found, was largely due to opiate addiction and suicide. These factors are so amorphous, so essentially literary in nature, that the temptation has been to see at their root a literary experience: the collapse of white privilege or working-class certainty, for instance, the inward turn that mirrors for depressives the anger that is manifest in the Trump movement. But it’s hard to say whether that’s true. In January there was a remarkable bipartisan Senate hearing on the issue that covered the bases but produced no decisive insight. Americans have been taking many more opiate pain pills, perhaps because doctors have been overprescribing or perhaps because we were underprescribing that class of drugs when they were more rudimentary and less helpful; people who become addicted to heroin tend to have long histories using prescription opiates first, and there has been more heroin moving into the country.

Complicating things further are the geographic particulars of the epidemic. The CDC, long worried about overprescription, has traced the patterns of painkiller scripts and found that they are most concentrated in the South. The deaths from prescription overdoses, meanwhile, are concentrated in Appalachia and the Mountain West. Heroin seems to have started flooding into the United States right around 2010 (seizures at the border began spiking then), and almost immediately the deaths started rising too. But heroin deaths are concentrated most heavily in New England, and second most in the Midwest. The epidemic has, so far, bypassed demographically similar spots: Upstate New York, the Plains states. In individual stories you often see the line through the cases—a pain prescription, an addiction, a switch to heroin, a death—but the picture of the epidemic generally is not so clear.

One theory, advanced by the author and former Los Angeles Times reporter Sam Quinones, is that the shape of the epidemic may largely be the result of choices made by drug traffickers. Quinones focuses on a network of distributors from Xalisco, a small town in the Mexican state of Nayarit, who, at least in his account, are largely responsible for distributing a cheaper and less refined version of the drug, called black tar heroin, around the country. Quinones suggests that this network is low-profile and nonviolent (its members decline to carry guns). For this reason, he argues, it has targeted those regions where there are fewer drug dealers already operating—places like rural New Hampshire, and Ohio. The Xalisco network has largely avoided African American neighborhoods. The story of the Xalisco boys, in which a whole region is said to be co-opted into a drug operation, echoes the case of Marianna, Arkansas, during the crack epidemic, in which much of the population of this Delta town moved to Detroit in the employ of a drug supply chain led by four local brothers (Larry, Billy Joe, Otis, and Willie Chambers).

Quinones’s account is compelling. It could explain, for instance, the uneven spread of heroin through the country. But the Xalisco boys are probably only part of the story. The Case/Deaton data suggests that middle-aged white people are engaging much more frequently in a whole array of risky behaviors for reasons that we don’t totally understand. Deaths in car accidents, for instance, have been spiking in this group and not in others. So have suicides. Another theory is that the people using heroin now are naive drug users—they tend not to be from places or families that have much experience with addiction, and therefore they are less scared. Surely there are other factors too, yet to be uncovered. So many of the policy fights during the Obama years, especially after the financial crisis abated, have been essentially political: the experts mostly agree, and the drama is whether the politics can support the scholarly consensus. It is bracing to see a bipartisan group of senators who want only to turn to the experts, and yet the experts don’t really know what is happening at all.

Has America’s luck run out?

On the Sunday before the Iowa caucuses, the New York Times Book Review devoted its cover to a warm review by Paul Krugman of a book by Robert Gordon, an economist at Northwestern who in his eighth decade has abruptly become the leading declinist of our time. Gordon’s thesis is that the great boom of the past century in the United States was a unique event, due largely to what is generally called the Second Industrial Revolution: certain structural changes that can happen only once. You cannot air-condition the South a second time, he is fond of saying. I spent some time with Gordon when his ideas first began circulating, and though I thought his ideas fascinating I also secretly suspected that their bleakness owed something to the man himself, a married but childless academic past seventy who spent his time in a vast house looking out forlornly at the fog of Lake Michigan.

But lately I’ve been revising that opinion. One prevailing mystery in this campaign is why the country is so glum. The economic statistics, with unemployment below 5 percent and wages finally rising a bit, seem, on the whole, relatively bright. The emotion in the country is much darker. Hillary Clinton, in particular, seems unsure about how to navigate this. One possible answer is that the trauma of the financial crisis left most people skeptical that the recovery is anything but fleeting. Another is that they sense what Gordon does, that the long arc of the country’s financial well-being is not certain to bend toward prosperity, that our history depends upon vast good luck that our future may not get.

God Save the Scene https://washingtonmonthly.com/2016/03/13/god-save-the-scene/ Sun, 13 Mar 2016 20:59:10 +0000 https://washingtonmonthly.com/?p=929

D.C. punk has thrived for decades with the help of churches, activists, and even the library. Can it survive the city’s rapid redevelopment?


Eight hours before my first-ever show in Washington, D.C., I was standing in the cold while a bearded man from AAA shook his head and explained why I’d never make it. My band, Jet Jaguar, had just played in New York and we were living that rock-and-roll tour life—except our cheap motel room was my parents’ house in suburban New Jersey and our rust-bucket tour van was a decade-old Volvo sedan. We had awoken that morning to find that the usually reliable Volvo was refusing to start.

For our work-shirt-clad doomsayer, though, the mechanical failure wasn’t the problem. The battery was shot, but it was a simple (if pricey) fix, since he had a replacement in the truck. He was just certain that even with a new battery we wouldn’t be able to make the drive in time. He assured me I could trust him on that, since he “had some family down there.” When I told him we didn’t have much of a choice because our band had to play, all sense of urgency disappeared. Did we sound anything like Rush? And did we want to hear a bunch of stories about the dozens of Rush shows he’d been to? The answer to the first question was no. The second question was rhetorical.

Despite a traffic jam outside Baltimore, we made it to the Adams Morgan bar in time for the show; I can neither confirm nor deny that a number of speed limits were broken along the way, though I will say that I’m a better guitarist than driver. We’d played in New York and Chicago before, but I was the most excited to play D.C. I had recently moved back to the city after some time away and, like many others who grew up listening to a certain strain of punk music, had an idyllic picture of the music scene. If your image of Washington stops where the monuments and government buildings end, you may not know that the city is hiding a hugely influential and uniquely accessible punk scene that stretches back more than three decades. I was ready and eager to join in.

Although that show turned out to be kind of a mess.

To understand the attraction of D.C. to a wannabe like me, you have to go back to 1985. That was the year of “Revolution Summer,” a season of protest and performance that put the District’s scene on the map. While the shouts were aimed at targets like apartheid, militarization, and Ronald Reagan, the revolution had turned inward as well. Washington was a hotbed for punk and hardcore music in the late 1970s and early ’80s, but many found the scene violent, sexist, and inaccessible, both in the mosh pits and out. A new activist group called Positive Force and a wave of musicians who rejected those values set out to change that. “Positive Force together with a new generation of bands had a critique of that kind of destructive version of punk,” says Mark Andersen, a Positive Force member who literally wrote the book on the D.C. punk scene. “It was an insurrection.”

Hardcore history: The Wilson Center in Columbia Heights was one of many all-ages venues in the city.

Bands like Rites of Spring (who were hugely influential despite existing for all of two years), Fire Party (whose singer, Amy Pickering, actually coined the term “Revolution Summer”), and Mission Impossible (whose young drummer, Dave Grohl, would go on to be exponentially more famous than everyone else in this article) took part in this pivotal era. But the movement’s biggest band, Fugazi, emerged just afterward. Fronted by the D.C. icon Ian MacKaye, Fugazi was known for their staunch independence and dedication to low-priced, nonviolent, all-ages shows. They toured nationally and internationally in the late 1980s and ’90s, cementing the city’s status as a hotbed for ethical do-it-yourself punk rock. Andersen, who interviewed Grohl right after his new band, Nirvana, had changed the landscape of rock music with Nevermind in 1991, remembers, “He was talking about how at the outset his ambition with Nirvana was that he just hoped they could be as popular as Fugazi.” To put things in perspective, Nevermind’s original run was about 50,000 records. Fugazi’s Steady Diet of Nothing, which came out that same year, had an original run of 160,000. (Nevermind has gone on to sell thirty million copies worldwide.)

For punk rock in the District, this wasn’t just a period of increasing popularity. It was also a time of institution building, which would help the ethics espoused by bands like Fugazi take root in the scene and last well into the future. Later generations, like D.C.’s slice of the feminist punk Riot Grrrl movement in the early ’90s, brought their own values but still needed accessible spaces to work and perform. Forced to the margins by bars that were wary of booking punk acts and wouldn’t let in underage musicians or fans, bands and promoters built up relationships with community spaces like the Wilson Center in Columbia Heights (now the site of a public charter school) and churches like St. Stephen’s a few blocks away, which still hosts shows today. “A lot of these churches were more genuinely open to community participation and less open to the almighty dollar,” Andersen says. “These churches had this sense of a broader mission and they could tuck the punk stuff under that big tent—as long as the punks weren’t wrecking the joint.” Many of those shows were and still are booked by Positive Force, which celebrated its thirtieth anniversary last year. Larger “real” venues like the 9:30 Club and the Black Cat agreed to keep shows open to all ages, giving bands who got too big for houses and church basements places to play without compromising their values. The Fort Reno concert series, held every summer in the titular park, has been going for more than forty years now.

Why are lasting all-ages spaces important? Without them, the rock scene revolves around bars. The money isn’t in the music itself; much like how newspapers are really advertising bulletins with news articles, live music in this case is just liquor sales with guitars. Kids younger than twenty-one, often the lifeblood of a thriving music scene, can’t attend shows or play them. And if you are old enough to perform, you might not get asked back to venues if your fans don’t drink enough.

This dynamic makes it difficult for bands to find their footing. A common way to compensate performers is for a bar employee to sit at the door and ask each fan who they’ve come to see. Every answer is tallied, and the bands get a percentage of the payout based on how many people they brought in. This puts bands that are playing a show together in competition for their share of the money at the end of the night, and puts anyone who’s a fan of multiple acts in a tough spot, since they have to pick one. (Ideally they pick whoever they think needs gas money the most.) And that’s if the bar is being totally aboveboard—if you’re certain your band got thirty fans but the manager’s magic sheet only has twenty tallies, there’s not much you can do.

As you can imagine, this isn’t exactly a path to profitability. The first time Jet Jaguar ever made money on a show, we played a bar in Chicago that offered $5 for every fan we brought in. We left with $95, which, after the $85 on equipment and car rental, was exactly enough for the fifty chicken nuggets we split at Burger King that night. And those tallies are far from the worst system out there—I remember going to see my friends’ band at a dingy club in Chicago that would book as many as eight acts for one show and then give each group a few weeks to sell their own tickets, with the length and placement of each band’s set contingent on how many tickets got sold. This led to a conveyor belt of acts whose fans would show up right before their set and leave as soon as it was done. When my friends’ group had finished playing, a metal band came on to replace them. The lead singer, whose long hair covered up most of his Scottie Pippen Bulls jersey, audibly sighed into the microphone as he saw how few people were in the crowd before limping into a dispirited rendition of Rage Against the Machine’s “Killing in the Name.”

On that drive to my first D.C. show I was excited at the prospect of never having to play or attend a show like that ever again. But I soon discovered that it takes more than an all-ages policy to make a good show. That night the bar’s microphones kept cutting out, and to make matters worse the sound guy would yell questions and inexplicably try to make conversation with us literally in the middle of our songs. Undeterred, I ended up taking a volunteer gig booking and hosting a monthly show there. The bar only paid artists in crowd donations, though, and most audience members were either barflies who weren’t expecting music or artists’ friends who didn’t think they had to pay. I often found myself sneaking my own money into the tip jar so performers would get a measly double-digit payout, or inviting my own friends who I knew I could guilt into giving (sorry!). After a year I gave it up—I had found local music, but had yet to discover the institutions that made it thrive.

For musicians outside the punk scene, it can be hard to find such institutions. I booked the rapper Stef Luva, né Stephan Nolan, at one of my shows two years ago. He says D.C. hip-hop has yet to embrace the all-ages mentality the way punk shows did back in the ’80s, since most of the scene revolves around showcases at bars and clubs. “There’s usually no place for underage artists, and there are a lot of artists eighteen and younger who are actually good,” he says. “It’d be nice to get them early so they can hone their skills. Otherwise by the time you get everything, you’re almost pushing thirty.” That’s not to say there are no local institutions; Nolan says it’s “all love” at Hip-Hop Yoga, a weekly all-ages showcase at the University of Maryland that is exactly what it sounds like. But with two singles planned for this year and a mixtape on the way, he thinks leaving the District to play shows in cities like Atlanta and New York can lead to better local exposure: “Everybody from your hometown starts hearing word of mouth. People give you respect and props when you’ve gained fans elsewhere.”

And when you want to branch out and play shows that aren’t just hip-hop focused, you don’t always feel welcome. There were some bad moments in my short-lived hosting career. The drummer of the first band I ever booked quit ten minutes before they were supposed to go onstage, and a few months later a band cut the power to half the bar when they plugged all their equipment into a power strip plugged into another power strip plugged into a single outlet. But the worst came right before Stef’s set: I was fiddling with the sound as he got ready to go onstage, and when I looked up a group of young-looking, white-looking patrons were lined up talking to him and taking their wallets out. I naively thought we were going to get donations before the show even started—but it turned out they had assumed Stef was a bouncer and were trying to show him their IDs. They later left without leaving a tip.

While Stef says the hip-hop scene is on the rise, the punk scene that still carries so much from thirty years ago is entering a new phase, with new challenges. David Combs, whose band the Max Levine Ensemble released a new album in November (he has also performed as Spoonboy and recently started a new band called Bad Moves), is from the area and has been playing music in and around D.C. for more than fifteen years. Despite the scene’s roots in protest, he says, things have gotten less political since he started playing. “We had a lot of the same people who were organizing protests and playing in bands, putting on shows and distributing literature,” he says. “That political movement kinda waned off in a crushing of the spirit during the Bush administration.”

Julie Yoder, drummer for the feminist punk band Hemlines, says the climate has shifted for bands like hers over the past decade. Her first band, Mess Up the Mess, had to work a lot harder to find shows—especially when leaving the District to go on tour. “I just remember playing shows ten or eleven years ago where the sound guy would just be shitty to us—we were picking up on the fact that he was dismissing us,” she says. “With Hemlines it seemed like people responded almost immediately to what we put out there. And we’re not just the one band on the bill that’s mostly women.” Yoder helped bring to D.C. a Girls Rock! camp, which gives girls ages eight to seventeen a week to learn an instrument, join a band, and write an original song to be performed in a big showcase at the 9:30 Club. (There’s also We Rock!, a version for adults.) The camp, now in its eighth year, has become its own musical institution, engaging and reengaging volunteers. “What often happens is a lot more adult women get inspired to start playing music or go back to playing music,” Yoder says. Shira Mario, who plays in the pop punk band the Maneuvers and grew up going to and organizing shows in D.C., volunteers with Girls Rock! and carried some of that spirit over to a project called Hat Band, in which new bands are formed by drawing participants’ names out of a hat. New musicians and learning new instruments are encouraged, and the whole thing ends with a show at St. Stephen’s. When lasting bands come out of Hat Band or Girls Rock!, it refreshes a music scene that might otherwise devolve into the same group of people rotating between bands. “It was so exciting that people were doing this and breaking out of their comfort zone,” Shira says. “I want to see some new faces. I want to see some of these people that are in the audience get up there and play.”

The most acutely felt change, though, is a challenge facing artists in cities across the country: rapid redevelopment. “You just can’t understate gentrification and the continual growth of the price of living in this city when it comes to the impact that has on a music scene,” Combs says. “You see a lot of people moving away if their focus is on art and trying to live somewhere that’s more affordable.” While any artist you talk to is quick to point out that they’re far from the biggest victims of gentrification, it’s a big blow to the scene when houses that once hosted shows disappear overnight to be flipped by the landlord and musicians pack up and move to Philadelphia or Baltimore, where the rent’s cheaper. Shockwaves went through the community when the warehouse that houses the arts collective Union Arts—which has provided practice and performance space for many District musicians—along with a number of other artists, was sold to a developer with plans to turn it into a “boutique hotel.” To add insult to eviction, the developer tried to allay fears by making room for eight art studios—a small fraction of the original space—and no room for musicians.

A February zoning hearing on the issue was literally overflowing with punks and other artists who wanted to voice their complaints: a second meeting had to be scheduled just so everyone who had signed up to comment could do so. The unease between the scene and local government may not come as a surprise—you’re saying the punks have a problem with authority?—but the city’s aggressive pro-redevelopment policies have a way of making things testier. And it’s not just punk musicians who are affected. January saw a “Muriel’s Gotta Go-Go” protest, in which activists rallied against Mayor Muriel Bowser with a go-go soundtrack. The protest was held outside the Reeves Center, a government office building that housed the go-go venue Club U until the city shut it down in 2005 following a stabbing. (Even the most dedicated local hardcore fan will acknowledge that go-go, the homegrown genre that blends funk and R&B into nonstop live shows, is the city’s musical heritage, and a protester told Washington City Paper that the city and its police force have largely pushed go-go performances out of D.C. proper.) Similarly, a shooting in 2007 at a bar a few blocks from Club U led to a proposal in the city council to bar minors from businesses that serve liquor, essentially banning all-ages shows at clubs. Ian MacKaye showed up to give an impassioned speech against the measure, which didn’t pass.

What may be surprising is that local government resources are going toward preserving and honoring the city’s punk culture. The D.C. Public Library is collecting and digitizing materials as part of its new D.C. Punk Archive, set to house everything from music recordings and show videos to flyers and handmade zines. “I grew up around here and went to shows and stuff a lot as a young person,” says librarian Bobbie Dougherty. “When I started working in the library world I started thinking about this differently—I started to realize it should be something that is being saved and preserved and talked about.” Dougherty and others solicit donations from scene veterans—Andersen donated multiple huge filing cabinets full of materials—and count some younger punks as volunteers, entering metadata and obtaining bands’ permission to make their songs available to play online. (The library is hoping to build on the Punk Archive’s success by using many of the same tactics to strengthen its go-go collection.)

The library has even gone a step further to institutionalize itself in the scene by hosting free punk shows in the main branch’s basement. “Historically it was just this dark scary room,” Dougherty says. “The first time I went in there I thought, ‘This would be great for shows.’ Because it’s a terrible room in a basement.” Hemlines was part of the first show, and Yoder credits it with kick-starting the band’s success. As a music venue in general it’s an amazing addition—free, for all ages, and extremely accessible, especially for younger fans. Mom and Dad in Silver Spring might be wary of their little punk going to a stranger’s house to see a show, but how can they argue with a trip to the library?

Despite anger over Union Arts and similar issues, no one I’ve talked to sees any cognitive dissonance about criticizing the current administration while praising the Punk Archive—after all, the library is far and away the most punk-rock branch of local government. (Garbage collection is probably a distant second.) And given the organization around the challenges they’re facing, there’s an opportunity for musicians working to preserve affordable places to live, practice, and play to fight for more disadvantaged communities who are being displaced. “I was definitely impressed by the showing at the [Union Arts] planning meeting, but equally, if not worse, stuff has been happening to people in the city for years,” Dougherty says. “Hopefully that kind of energy and that kind of support can continue, not just for art spaces.”

New institutions like the Punk Archive and decades-old ones like St. Stephen’s, Fort Reno, and Positive Force mean the music scene is uniquely situated to potentially weather the current storm. In terms of pure space to perform, churches and libraries are much harder to gentrify away. While it remains to be seen what happens to Union Arts, the spirit of protest and participation is at least alive on a local level. And thanks to how ingrained all-ages shows are in the fabric of the city, all the way up to some of the city’s biggest rock venues, a band in D.C. can go from just starting out to pretty famous without compromising on those values from 1985.

Veterans like MacKaye and Andersen are part of those institutions too. “People who grew up in D.C. and were engaged in that culture kept a part of it—it wasn’t just some party they went to,” Combs says. “They show more investment in what you’re doing with music and whether the music scene is maintaining values like that.” That history also attracts people (like me) who grew up listening to D.C. punk and want to get involved. “What happened here was particularly powerful and long-lasting,” Andersen says. “There’s a mythology, but there’s an actual reality behind it, that D.C. stands for something that inspires and challenges people still.”

Eight hours before my most recent show in the District, I was doing some serious interior redecorating. My new band, Hooky, was having its first show that night, and we decided to do it ourselves at our singer’s house rather than try and get booked somewhere. That meant carrying couches and clearing out space for the forty or so sweaty people who ended up crammed into the living room on an unseasonably warm December night.

It was my first time onstage since I had quit my hosting gig a year earlier, and during that span I had started going to Positive Force meetings and occasionally volunteering with the D.C. Punk Archive project. I had also spent most of that time desperately searching for a drummer, eventually resorting to taking a Sharpie to a blank T-shirt and wearing my plea out to shows. Even in a storied scene like this one, a drummer with free time remains the rarest of breeds.

The band came together over the summer, though, and six months later our first show went off without a hitch (depending on whether you count screaming so loud that you come close to passing out mid-song as a hitch). We had friends in attendance who were in bands themselves, and afterward made a few vague plans for shows together soon. It was just a six-song set in a living room, but it made me feel like a part of this thing, this scene, that could be traced back thirty years to some of my favorite bands. If I were to stick around another thirty years, I bet a lot of it would still feel the same—though I’m a better guitarist than gambler.

Why Is Marriage Thriving Among (and Only Among) the Affluent? https://washingtonmonthly.com/2016/03/13/why-is-marriage-thriving-among-and-only-among-the-affluent/ Sun, 13 Mar 2016 20:58:02 +0000 https://washingtonmonthly.com/?p=928

Experts struggle to explain one of the biggest drivers of inequality.


In 1950, the typical man married for the first time at about age twenty-three, while women married at a tender median of twenty. In this heyday of America’s Leave It to Beaver period, few children were born outside marriage—just 7.9 percent of births were “premarital” in 1950, according to census data—and divorce was equally rare.

Today, however, marriage in America seems to be dying.

In 2010, according to the Pew Research Center, only about half of all Americans over age eighteen were married, compared to nearly three out of four in 1960. Americans today are marrying later, if at all, and the share of Americans who’ve never married has climbed to record highs. As one result, the share of children growing up with single moms is also skyrocketing; in 2013, 41 percent of all births were to unmarried women.

But the seeming decline of marriage includes one major caveat: educated elites. When it comes to marriage, divorce, and single motherhood, the 1950s never ended for college-educated Americans, and for college-educated women in particular. According to the researchers Shelly Lundberg, of the University of California, Santa Barbara, and Robert Pollak, of Washington University in St. Louis, the share of young college-graduate white women who were married in 2010 was a little over 70 percent—almost exactly the same as it was in 1950. College-educated white women are, moreover, half as likely as other women to be divorced, according to Steven Martin of the University of Maryland, and they are also refusing single motherhood. Fewer than 9 percent of women with a bachelor’s degree or more had an unwed birth in 2011—a level barely higher than what it was for all women in 1950.

It’s also only Americans with four-year degrees or better who appear immune to the broader cultural and social forces eroding marriage. In 1950, white women with “some college,” such as an associate’s degree, were actually more likely to be married than their better-educated sisters. Today, it’s the opposite. Though women with a high school diploma or less have seen the sharpest drop in marriage rates, the decline has been almost as severe—and ongoing—for women just one short rung down the education ladder from a bachelor’s degree, regardless of race.

The endurance of marriage among elites—and, it seems, elites alone—is important not just as a cultural anomaly. As the class divide in marriage grows, elites are compounding the advantages of their status, especially for their kids. Since the release of Daniel Patrick Moynihan’s now-famed report on the breakdown of black families in 1965, researchers have amassed a growing mountain of evidence that family structure and marriage matter. Compared to children living with single parents, or even with parents who are cohabiting, kids raised in married-parent households are much less likely to grow up in poverty, more likely to do better in school, and more likely to move up the economic ladder even if they start out poor. “There’s no argument about what’s best for kids,” says the economic and social policy expert Ron Haskins, of the Brookings Institution. “It’s to be reared in a stable household by married parents.”

But if there’s growing consensus that class differences in marriage rates are contributing to inequality, there’s far less clarity about the solution. “We don’t know why [these class differences] exist,” says the Brookings Institution’s Isabel Sawhill, one of the nation’s preeminent experts on marriage. But “we certainly know they do exist.”

Without a proven answer about what’s happening to marriage in America, it’s tough for policymakers to figure out exactly how to bring it back. In the early 2000s, the George W. Bush administration embarked on a series of experimental efforts to “promote marriage,” particularly among low-income households. From 2001 to 2010, the federal government invested more than $600 million in “marriage promotion” programs, including a series of demonstration efforts to try out such strategies as marriage and relationship education as well as programs to expand job and training opportunities for low-income men. By and large, the results were disappointing.

A few scholars are advancing new ideas that could open fresh territory in the debate on marriage. But the question of how to save marriage—one of the biggest cultural, social, and economic conundrums facing the country—remains a largely unsolved challenge. As a consequence, a potentially vital lever for reducing inequality remains untapped.

One reason the marriage behavior of elites has been so puzzling is that it defies the popular explanation for marriage’s decline.

During the 1950s, marriage was the best—and perhaps only—route to economic security for women. Just 34 percent of women were in the workforce in 1950, and the absence of reliable contraception meant that a woman’s every sexual encounter was fraught with the risk of unwed motherhood and its accompanying stigma. Single motherhood was economically infeasible as well as socially taboo.

In the ensuing decades, so the story goes, these traditional rationales for marriage—financial security, sex, and respectable motherhood—began to vanish. Women got the Pill and paychecks of their own. Premarital sex lost its stigma, as did single motherhood. By 1990, 58 percent of women were in the workforce, and by 2010, women in the workforce with college degrees outnumbered men. In the meantime, the relative economic status of men declined sharply, leading some to theorize a shortage of “marriageable” men, especially among African Americans. When marriage rates began declining, scholars attributed this to the “independence effect” of the choices now available to women.

Yet the women best positioned to exercise this hard-won independence—college educated, career minded, and with the highest income and resources—were still choosing to marry. In fact, while marriage rates were declining for every other group of women in the 1960s and ’70s, marriage rates for college-educated white women even rose slightly.

Figure 1. Share of Americans Currently Married, 1960 and 2010. Source: Pew Research Center.

Some scholars argue that while traditional rationales for marriage fell away, elite Americans discovered a new reason to marry: for the advantages it confers on their children in an increasingly competitive economy. “College graduates—men and women—are using marriage as a ‘commitment device’ to jointly invest a lot in children,” says the scholar Robert Pollak, who recently championed this point of view in the academic journal Future of Children, with coauthor Shelly Lundberg.

Proponents say that these child-centered unions—what the Brookings scholar Richard Reeves calls “HIP,” or “high investment parenting,” marriages—ensure an equal commitment to the project of raising children, including the necessary sacrifices that may be demanded of one parent or another to balance career and family. College graduates, wrote Reeves in the Atlantic, “are reinventing marriage as a child-rearing machine for a post-feminist society and a knowledge economy.” It’s the ultimate in helicopter parenting.

Under this thesis, marriage delivers enormous logistical, emotional, and economic advantages in what’s now the high-stakes enterprise of middle-class and upper-middle-class parenthood. A second income helps pay for the football uniform or for a babysitter to shuttle Junior to his piano lesson if a parent is unavailable to do the driving. A second parent means one more set of eyes on the homework and one more voice to enforce discipline. And the returns to investing in children—such as by ensuring them a college education—have certainly grown over the decades.

But other experts question whether marriage is truly a deliberate parenting strategy, even if higher-achieving kids are a by-product of it. And, as Pollak acknowledges, there’s no empirical evidence to date that aspirations for children directly influence people’s decisions to get married. “I don’t think marriage is just a child-rearing device,” says Sawhill. “A lot of people who are getting married don’t have children and may not even put a high priority on having children. The romantic reason for marriage is not totally obsolete.” The child-centered marriage thesis also raises some discomforting implications about the expectations parents might have for their children, depending on their class. “Aspirations for children may be key to the class divide in marriage,” says Lundberg.

Yet a logical corollary to this theory is that it is only those Americans with the highest expectations for their children who decide to invest in marriage—a notion that is difficult to believe. In fact, according to a 2015 survey by the Pew Research Center, lower-income parents put a higher value on a college degree than do more affluent ones. While 50 percent of parents earning less than $30,000 a year say that it’s “extremely important” to them that their children graduate from college, only 39 percent of parents earning $75,000 or more say the same.

The simplest explanation for why elites continue to marry at the same rates as they did in 1950, and why they have such high expectations for their children (if that is indeed a factor), is that the 1950s did not, in fact, ever end for them economically. It’s perhaps no coincidence that the only group for whom marriage rates have not declined is the one group for whom incomes have been rising. In 1965, according to Pew, the earnings gap for young adults with a high school diploma versus those with a college degree was only about $7,000, measured in 2012 dollars. But by 2013, that gap had more than doubled. While a young adult with a high school diploma earned a median of $28,000 in 2013, down from $31,384 in 1965, college graduates earned a median of $45,500, up from $38,833. Real median earnings for young adults with two-year degrees also fell, from $33,655 in 1965 (in 2012 dollars) to $30,000 in 2013.

Figure 2. Currently Married White Women, Ages 30-44, by Education. Source: Shelly Lundberg and Robert A. Pollak, “The Evolving Role of Marriage: 1950-2010,” 2015.

Another wrinkle is that while marriage rates have been declining, people’s aspirations to marry have not fallen as fast. Even as 39 percent of Americans in 2010 told researchers at Pew that “marriage is an institution that is becoming obsolete,” 61 percent of unmarried people said they hoped to get married someday. And even among unmarried adults who said they thought marriage was obsolete, nearly half still planned on marriage for themselves.

As the sociologist Andrew Cherlin described this aspirational view, marriage is now the “capstone,” not the cornerstone, of people’s lives. “Marriage has become a status symbol—a highly regarded marker of a successful personal life,” Cherlin wrote in the New York Times. It’s no wonder, then, that college graduates are the only ones who feel successful enough to marry, and who are also more likely to find partners of equal status with whom to tie the knot.

Figure 3. Share of Currently Married Women, Ages 40-55, by Race and Education, 2012. Source: R. Kelly Raley, Megan M. Sweeney, and Danielle Wondra, “The Growing Racial and Ethnic Divide in U.S. Marriage Patterns,” 2015.

But what can public policy do? If marriage is, in fact, a signal of aspiration and a symbol of achievement, it stands to reason that the best way to “promote marriage” might be to improve the declining economic lot of most Americans. However, these improvements will likely need to be substantial. In an extensive review of the impacts of various job training and other programs aimed at raising the economic prospects of men, Daniel Schneider of the University of California, Berkeley, found that these programs had little to no impact on whether a man was more likely to get married. The sole significant exception was a training program called “Career Academies,” which also produced dramatic gains in income compared to the other programs Schneider reviewed. Over an eight-year follow-up period, Career Academies participants saw their incomes rise by a total of $17,000, compared to a control group, and 36 percent of the men who participated in the program were either married or living with a partner, compared to 27 percent of men in the control group.

“I think what we see here is evidence that well-designed—and consequential—interventions can matter for marriage,” says Schneider. “The notion that small economic increases might have big increases on marriage doesn’t seem to bear out.” Rather, the impacts have to be “large enough to affect big life decisions like marriage or childbearing,” Schneider says, not “nudges.”

But the time for major intervention may have come. As the returns to education rise, children handicapped by access to just one parent’s time, attention, and income are at a serious disadvantage. By getting married and staying married, educated parents are compounding the ever-widening gaps in both achievement and opportunity between the haves and have-nots. Without a fix, this growing class divide in marriage will only calcify the social and economic inequality crippling the odds for increasing numbers of children.

How Obama Got Schooled https://washingtonmonthly.com/2016/03/13/how-obama-got-schooled/ Sun, 13 Mar 2016 20:57:31 +0000 https://washingtonmonthly.com/?p=927

Under pressure from right and left, the president signed away hard-won federal power over K-12 education and gutted his own reforms, even as they were working.


Barack Obama has not been shy about exercising federal power over the states, in areas ranging from health care to the environment. That’s been especially true in elementary and secondary education, where Washington spent $42 billion last year. Obama has leveraged federal school aid to promote higher standards, school choice, better tests, and more meaningful measures of teacher performance. When a paralyzed Congress couldn’t make needed fixes to the No Child Left Behind Act (NCLB), he used a regulatory strategy to let states out from under the law’s most troublesome provisions in exchange for commitments to the same reforms.

But just before the December holidays, in a White House ceremony, there he was, like a captive in a hostage video, talking about the importance of “empowering states and school districts to develop their own strategies for improvement.” With that, flanked by a bipartisan group of lawmakers, he signed a new federal education law, the Every Student Succeeds Act (ESSA), which replaced the NCLB and put the direction of the nation’s 100,000 public schools and the welfare of fifty million students squarely in the hands of the states and the nation’s 13,500 local school systems—effectively allowing them to do as little as they please to improve educational quality.

What brought the president to that moment was an unholy alliance of powerful political forces on the left and the right. One is the Tea Party, the right-wing coalition that has subjected the Common Core State Standards, the latest in three decades of attempts to ratchet up academic rigor, to all manner of conspiracy theories as part of its anti-Washington crusade. Ironically, a primary author of the new federal education law, Republican Senator Lamar Alexander, was George H. W. Bush’s education secretary and a leading proponent of using federal influence to demand accountability from states and school districts. But, capitulating to the rightward drift of his party, when he took over as chairman of the Senate’s Health, Education, Labor and Pensions Committee last year, Alexander set about drafting the new federal education law with conservative colleagues including Indiana Republican Todd Rokita, a Tea Party favorite and chairman of the House Subcommittee on Early Childhood, Elementary and Secondary Education. The federal government was “overreaching” in education, they argued, usurping authority over the direction of the nation’s $640 billion public education system that rightly belonged with state governments and local school boards. “Federalism is the point of our bill,” Rokita told me last year. “It restores local control in education.”

But Obama would not likely have put his pen to the new law if many of his Democratic colleagues in Congress hadn’t voted with Republicans and abandoned their previous insistence that the federal government be able to require states and localities to do right by students. But they did get on board, and for one main reason: teachers’ unions. Eager to end the Bush/Obama-era focus on school and teacher accountability and the probing light it cast on the performance of their members, the national teachers’ unions—the National Education Association (NEA) and the American Federation of Teachers (AFT)—inundated congressional Democrats and the White House with protests against “blaming and shaming” teachers, targeting the use of student test scores in teacher ratings as a way of discrediting school and teacher accountability en masse. They poured money into surrogate organizations like Fair Test to amplify their attacks against more-rigorous testing and encouraged parents to opt their children out of public school standardized testing (a campaign led, ironically, by the kinds of suburban white families who don’t think anything of spending thousands of dollars for hours of private tutoring to ready their children for college admissions tests). By the NEA’s own calculations, it sent 255,000 emails to Capitol Hill, made 23,500 phone calls, had 2,300 face-to-face meetings with lawmakers and their aides, and spent $500,000 on advocacy advertising in key states and congressional districts on behalf of the NCLB rewrite.

Nor did the Clinton campaign attempt to talk Obama out of signing the new law. Though both Bill and Hillary Clinton have supported higher standards and accountability in education, the teachers’ unions have been pouring people and money into her presidential bid. And the passage of the new federal education law deprives Clinton’s Republican opponents of an anti-Washington stalking horse and allows her to narrow her education focus to preschool instruction and college affordability, two middle-class educational priorities.

What’s most troublesome is that federal demands for standards- and accountability-based reforms, though far from perfect, seem to have made a difference. Since the NCLB’s signing in 2002, scores for black and Latino students on the National Assessment of Educational Progress (NAEP), the gold standard of national yardsticks, have risen, as they have for white students. And arguments from teachers’ unions and many middle-class parents that students are being subjected to excessive testing because of federal accountability rules have turned out not to be true, at least in comparison to other countries. A recent study of the educational practices of seventy developed nations by the Organisation for Economic Cooperation and Development in Paris found that the U.S. ranks “just below average” in the amount of testing its students undergo. Moreover, there’s a strong case to be made that the Obama administration’s hard-nosed demands for improvement on two fronts, academic standards and teacher performance, have paid off.

Before passing the NCLB, national leaders had been trying for decades to get local educators to ratchet up standards in response to the changing workplace and societal demands—especially the recognition that “local control” of education had long resulted in disadvantaged and minority students receiving substandard instruction. Back in the 1980s, when the federal government first proposed more rigorous high school curricula, states and school districts responded by channeling tens of thousands of students into watered-down courses where they earned English credits for “word processing,” science credits for “auto body repair,” and math credits for “commercial food preparation.”

But in the face of persistent performance gaps along racial and class lines, President George W. Bush built a bipartisan congressional coalition that pushed through the No Child Left Behind Act in 2001. The law directed states to create “rigorous” standards, test students’ mastery of those standards, and hold local educators accountable for the results—a model Bush had used in Texas. It brought more transparency to public education and made all of the nation’s educators directly responsible for their students’ achievement for the first time.

The NCLB had plenty of flaws. It defined school success narrowly (the percentage of students meeting a state-set score on standardized tests rather than the improvement in students’ performance over the course of a school year), and remedies kicked in even if a relatively small number of students in a school lagged—causing states to lower standards rather than have many schools get on the law’s wrong side. The NCLB’s tight schedules for reporting student test results led state legislators to buy simplistic multiple-choice tests that were easily administered and quickly scored—but drove down the level of instruction in some classrooms, as teachers taught to the tests. The law’s unrealistic demand that 100 percent of the nation’s students achieve proficiency in math and reading by 2014 was a big tactical mistake that undermined the law’s credibility. And the school choice remedy proved beyond the reach of many students.

With Congress gridlocked, Obama and his secretary of education, Arne Duncan, let states work around these problems in exchange for reforms that included the introduction of the new Common Core standards. And there’s evidence that the moves are working and that standards are finally on the rise.

In a recent study from Harvard’s Kennedy School of Government, the researcher Paul Peterson and colleagues reported that no fewer than forty-five states have raised their standards for student proficiency in reading and math since 2011, when the Obama reforms started to take hold. Peterson, who advised Mitt Romney’s presidential campaign on education issues, attributed the gains to Obama’s reforms and has written that the Common Core “has achieved phenomenal success in statehouses across the country.” Similarly, Achieve, Inc., a Washington policy shop, recently reported that a long-standing “honesty gap” in public education—the sometimes striking differences between the high percentages of students rated “proficient” under many state tests and the much lower proportions of students in the same states rated “proficient” under the more demanding federal NAEP tests—has closed significantly.

Obama used the same incentive strategies to persuade states and school districts to get serious about measuring teacher performance. Before the administration stepped in, school systems were spending over $400 billion a year on public school teacher salaries and benefits without any real sense of what the money was buying. The standard evaluation model for the nation’s more than three million teachers was a single, cursory visit once a year by a principal wielding a checklist looking for neat classrooms and quiet students—a superficial exercise that didn’t focus directly on the quality of teacher instruction, much less student learning.

Now, more rigorous teacher evaluation systems are under way in more than three dozen states. The deployment of the new systems has greatly strengthened many school districts’ focus on instruction. School officials have incorporated student achievement into calculations of teacher performance on a wide scale for the first time in the history of public education. The new systems are increasingly prioritizing ways to help teachers improve their practice rather than merely identifying bad apples. And the best of them are providing a foundation for a wide range of new, performance-based teacher roles that are making teaching more attractive. These changes simply wouldn’t have happened at anywhere near the scale they have without federal intervention, which is now being withdrawn under the new law.

The Obama strategy hasn’t been perfect. Many of the new teacher evaluation systems are in the early stages and there are plenty of implementation problems, especially with student-achievement measures. Secretary Duncan’s decision to have states use student test scores in new teacher evaluation systems and at the same time introduce the Common Core standards and the new testing regimes—a move some of his senior aides advised against—alarmed and angered much of the nation’s teaching corps and intensified anti-testing and anti-Common Core sentiment.

But instead of building on the Obama successes and addressing the weaknesses in the administration’s initiatives and in No Child Left Behind, the new law effectively gives states and school systems a free pass on educational excellence.

After heavy lobbying by school reformers and civil rights groups fighting what amounted to a rear-guard action against Alexander’s plan, the new federal law retains the NCLB’s requirements that states set standards, test students, report the results, and work with the lowest-performing schools—“guardrails,” in the words of the law’s sponsors. But the law demands only that state and local officials go through the motions in those areas; it’s silent on results and consequences. States can ultimately set standards where they please. And while they’re directed to improve their lowest-performing 5 percent of schools and their high schools with high dropout rates, there’s no meaningful federal enforcement if they decide to go easy on schools. Indeed, the new law makes it virtually impossible for the U.S. secretary of education to prescribe, enforce, or even incentivize rigorous academic expectations, quality tests, school performance standards, and the measurement of teacher performance—core improvement levers.

And while it’s good news that a relatively high percentage of states are expected to retain the Common Core in some form—now that they have started expending resources to implement the standards—the pressure from the Tea Party and the teachers’ unions has already led states to withdraw from the Obama-funded networks building the rigorous new testing systems based on the Common Core. Membership in one of them, the Partnership for Assessment of Readiness for College and Careers, or PARCC, has plunged in four years from nearly half the states to seven and the District of Columbia.

If there’s an argument against federal accountability, it’s that national policymakers shouldn’t micromanage schools. In the end, schools are only as good as the people in them and the culture in which those people work. So it’s crucial to get school communities invested in reform. National policymaking should be about expectations rather than execution.

But the new federal education law both gives local educators more day-to-day flexibility and liberates them from external expectations, a strategy that risks returning many students to second-class educational status. Rather than being a path toward a new paradigm in public education where all students are taught to high standards, it invites a capitulation to traditional race- and class-based educational expectations, half a century after the passage of federal civil rights laws and just as the nation is transitioning to a minority-majority school population.

When “local control” in education is looked at through the lens of what’s best for students rather than through the filter of adult agendas, it’s clear that we’re not going to get many of the nation’s students where they need to be without explicit expectations for higher standards in much of what schools do, and without the policy leverage needed to ensure that educators deliver on those expectations.

The post How Obama Got Schooled appeared first on Washington Monthly.

Talk of the Toons https://washingtonmonthly.com/2016/03/13/talk-of-the-toons/ Sun, 13 Mar 2016 20:56:26 +0000 https://washingtonmonthly.com/?p=930

A selection of political cartoons from the past few weeks.

The post Talk of the Toons appeared first on Washington Monthly.

The VA Isn’t Broken, Yet https://washingtonmonthly.com/2016/03/13/the-va-isnt-broken-yet/ Sun, 13 Mar 2016 20:55:00 +0000 https://washingtonmonthly.com/?p=971

Inside the Koch brothers’ campaign to invent a scandal and dismantle the country’s most successful health care system.

The post The VA Isn’t Broken, Yet appeared first on Washington Monthly.


In past presidential primaries, when candidates wanted to win the votes of veterans they would trek to American Legion halls and Veterans of Foreign Wars conventions in far corners of Iowa and New Hampshire. While there’s been a little of that in the current primary contest, a new pattern has emerged, at least on the Republican side.

Over the last year, every major GOP candidate with the exception of Donald Trump has made a pilgrimage to gatherings put on by Concerned Veterans for America (CVA), a group that had barely formed during the 2012 primary cycle. Whereas candidates back in the day were under pressure from the old-line veterans’ groups to promise undying support for the Department of Veterans Affairs (VA) and its nationwide network of hospitals and clinics, the opposite has been true this season. Candidates at CVA rallies have been competing with each other to badmouth the VA and its allegedly shabby treatment of veterans. And all have pledged fealty to the CVA’s goal of moving as many vets as possible out of the VA into private care. Even Trump is calling for more “choice.”

This may not at first hearing seem too surprising. Nearly the whole of the Republican Party has become more radically antigovernment in recent years. And since the spring of 2014, when headlines started appearing about long wait times and cover-ups at some VA hospitals, a strong narrative has built up, including in the mainstream media, that the system is fundamentally broken. A recent front-page headline in the New York Times proclaimed, as if it were a matter of fact, that Bernie Sanders’s support for the VA during the controversy over wait times proved his poor judgment: “Faith in Agency Clouded Bernie Sanders’s V.A. Response.”

Yet beneath the surface of events, a far different, deeper, and more consequential story is unfolding. The CVA, it turns out, is the creation of David and Charles Koch’s network. The Koch brothers have famously poured hundreds of millions of dollars into think tanks, candidates, and advocacy groups to advance their libertarian views about the virtues of free markets and the evils of governments and unions. Seldom, however, has one of their investments paid off so spectacularly well as it has on the issue of veterans’ health care. Working through the CVA, and in partnership with key Republicans and corporate medical interests, the Koch brothers’ web of affiliates has succeeded in manufacturing or vastly exaggerating “scandals” at the VA as part of a larger campaign to delegitimize publicly provided health care.

The Koch-inspired attacks, in turn, have provided the pretext for GOP candidates to rally behind the cause—only recently seen as fringe—of imposing free market “reforms” on the federal government’s second largest agency. The attacks have also damaged the reputation of the VA among the broader news-consuming public, and, not coincidentally, undermined morale within the agency itself. And they succeeded in stampeding bipartisan majorities in Congress into passing legislation in 2014 that, under the guise of offering veterans “choice,” has instead created a deeply flawed and unworkable process of outsourcing VA care while also setting in motion a commission that seems intent on dismantling VA-provided health care altogether.

All this has been happening, ironically, even as most vets who use the system and all the major veterans’ service organizations (VSOs) applaud the quality of VA health care. Adding to the perverse twists of the story is a mountain of independent evidence, including studies mandated by the 2014 law itself, showing that while the VA has an assortment of serious problems, it continues to outperform the rest of the U.S. health sector on nearly every metric of quality—a fact that ought to raise fundamental questions about the wisdom of outsourcing VA care to private providers.

The long arc of the VA’s place in American life shows that the agency has always struggled against ideological enemies and against commercial health care providers who would stand to gain business from its being privatized. The only hope is that Americans will wake up in time to save the VA from those who are trying to kill it.

The federal government’s role in providing health care to veterans began during World War I, when tens of thousands of American “doughboys” came home with hideous, lingering physical and psychological injuries, including, most tragically, the effects of poison gas and what that lost generation called “shell shock.” Congress responded by creating the Veterans Bureau, a predecessor to the VA, which set about constructing specialized veterans’ hospitals and soldiers’ homes in major cities across America. These facilities were and still are distinct from the separate military hospitals run by the Department of Defense, such as the Walter Reed National Military Medical Center, which care for men and women still in the service.

During and after World War II, the VA experienced a period of reinvention and uplift as an agency. Not only did it successfully administer all the benefits of the newly created GI Bill—from VA mortgages to subsidized college tuition—it also began a deep collaboration with American medical schools that put it at the heart of modern American medicine.

Under this partnership, medical schools conduct practical training for medical interns and residents at VA hospitals, while many VA doctors hold joint appointments on the faculties of medical schools. Today, an estimated 65 percent of all doctors practicing in the United States have received all or part of their residency training in VA facilities, while doctors employed by the VA often engage in important medical research (and have even racked up three Nobel prizes in medicine).

During the Vietnam War and its aftermath, however, the VA health care system again came under acute stress. Part of the challenge derived from medical breakthroughs: thanks to improvements in combat medicine and air evacuation, many wounds that would have been fatal in previous wars now resulted instead in severe long-term injuries and disabilities that swelled patient loads of VA hospitals. But Vietnam vets also came home to VA hospitals that were woefully underfunded and run-down. Many Vietnam vets were further outraged by what they saw as the VA’s refusal to take seriously their complaints about post-traumatic stress and exposure to chemical agents like Agent Orange.

A new generation of veterans’ activists brought needed, if sometimes distorted and sensational, media attention to the deficiencies of veterans’ health care during the war and afterward. Recalling a lurid 1970 Life magazine photo essay about conditions in the Kingsbridge VA hospital in the Bronx, the activist Oliver Meadows would later admit that it “was totally contrived, we helped them all the way,” while at least one veteran interviewed for the story admitted that activists had staged scenes to make conditions in the hospital look more awful than they really were. Yet there is also no doubt that the disrespect American society generally showed Vietnam vets upon their return extended to the lack of funding and focus given to VA hospitals.

During the 1980s and ’90s, conservatives increasingly leveraged the damaged public image of the VA to make arguments against any move toward “socialized medicine,” including the universal health care plan laid out by Bill and Hillary Clinton in 1993. “To see the future of health care in America for you and your children under Clinton’s plan,” argued the conservative activist and author Jarret B. Wollstein, “just visit any Veterans Administration hospital. You’ll find filthy conditions, shortages of everything, and treatment bordering on barbarism.” Deploying rhetorical strategies it would later use against the Affordable Care Act, the libertarian Cato Institute, then heavily funded by the Koch brothers, piled on with a white paper proclaiming that “the history of the [VA] provides cautionary and distressing lessons about how government subsidizes, dictates, and rations health care when it controls a national medical monopoly.”

Yet just as the reputation of the VA was reaching a nadir, the organization was undergoing a transformation behind the scenes that, within a few short years, would result in its outperforming the rest of the American health care system in safety, adherence to evidence-based care protocols, and other standard metrics of health care quality. For example, in 2003 the prestigious New England Journal of Medicine published a study that used eleven measures of quality to compare veterans’ health facilities with fee-for-service Medicare. On all eleven measures, the quality of care in veterans’ facilities proved to be “significantly better” than private-sector health care paid for by Medicare.


Other studies began appearing in the early 2000s showing that the VA was light years ahead of the rest of the health care system in the meaningful use of electronic medical records, investment in disease prevention, and integration of care. In 2007, the prestigious British medical journal BMJ noted that while “long derided as a US example of failed Soviet-style central planning,” the VA “has recently emerged as a widely recognized leader in quality improvement and information technology. At present, the Veterans Health Administration offers more equitable care, of higher quality, at comparable or lower cost than private-sector alternatives.”
Bad optics: A 1970 Life magazine photo essay showed the lurid conditions Vietnam veterans faced in VA hospitals—but activists later admitted to staging scenes to make things look worse.

The change agent who led the turnaround was Dr. Kenneth W. Kizer, whom President Bill Clinton appointed as VA undersecretary for health in 1994. The story of Kizer’s transformational leadership of the veterans’ health system has been widely told in the peer-reviewed literature on health care quality. (It was first chronicled in the popular press by the Washington Monthly in a 2005 cover story by Phillip Longman, which subsequently led to a book, now in its third edition, called Best Care Anywhere: Why VA Care Is Better than Yours.)

Accounts of the Kizer revolution generally stress how he took advantage of the VA’s long-term relationship with its patients and its ability to operate as a patient-centered, integrated system. The VA tends to have its patients for life, often extending to long-term nursing home care. This means that it has an incentive as an institution to invest in prevention, effective disease management, and other measures that maximize long-term well-being. And because the VA is a large, integrated system, it has the ability to coordinate care among specialists, so that patients are treated as whole persons rather than as collections of failing body parts. Though no one used the term at the time, Kizer transformed the VA into what health care policy wonks today describe as an “accountable care organization,” or ACO, in which the incentives of providers are actually aligned with the well-being of patients.

To implement these changes, Kizer radically decentralized power within the VA, pushing it out of the central office in Washington and into the field. For example, Kizer embraced a dissident, previously persecuted subculture of front-line employees, known as the “Hard Hats,” who had been experimenting with using their personal computers to improve the practice of medicine. The result was software written by doctors, for doctors, that pushed the VA roughly twenty years ahead of the rest of the U.S. system in its use of what we today call electronic medical records and telemedicine. Kizer also allowed regional managers far more autonomy over the operations of local VA hospitals and clinics, while at the same time using the VA’s new capacities in information technology to hold managers accountable for meeting measurable performance metrics.

During the Kizer era, the major veterans’ service organizations joined health care policy wonks in applauding the high quality of care offered by the VA. Indeed, the American Legion, while initially skeptical of his reforms, would wind up giving Kizer a lifetime achievement award, while measures of patient satisfaction generally showed the VA outperforming Medicare and the private insurance plans. Veterans also applauded the decision made by the Clinton administration in 1996 to relax eligibility standards, so that any honorably discharged veteran could receive VA care for life, no questions asked.

Under George W. Bush’s administration, studies continued to show the VA outperforming the rest of the U.S. health care system on most metrics of safety, quality, health IT, and patient satisfaction. Often, the administration seemed not quite sure what to do with this fact. In 2004, when Bush announced an initiative to push for greater adoption of electronic medical records throughout the U.S. health care system, he did so by traveling to the Baltimore VA Medical Center and showcasing the world-class health IT in place there. “I know the veterans who are here are going to be proud to hear that the Veterans Administration is on the leading edge of change,” Bush explained, without showing any evident discomfort with praising the largest actual example of socialized medicine in the United States.

Yet behind the scenes, the administration took many measures to undo the quality transformation that had occurred under Kizer’s leadership, including the freedom given to front-line employees. Partly this was the result of the tendency of top managers in all large organizations to want to exercise control. But it also reflected the Bush administration’s commitment to outsourcing more and more VA functions.

Bush’s political appointees at the VA, for example, quickly squashed software innovation in the field by reconsolidating bureaucratic control over all things digital in Washington and then contracting with vendors of private, proprietary software. At one point, the VA even lost control of its own lab software system to Cerner, a private corporation that dramatically ramped up spending on lobbying during the middle of the last decade.

And, increasingly, Bush’s political appointees at the VA began outsourcing more care to private health care providers, often with unhappy results. For example, between 2002 and 2008, the Philadelphia VA outsourced its prostate cancer unit to a team from the University of Pennsylvania. Investigators later found that of the 114 patients who went through the treatment, ninety-two received either too much or not enough radiation to the prostate, and in some cases the physician missed the prostate altogether. Outsourcing also led to financial waste and fraud. For example, the VA inspector general found that in the last year of the Bush administration, 37 percent of the $3.2 billion the VA spent on outsourced care was improperly paid.

At the same time, many key changes that the VA needed to make to ensure access were neglected. One was fixing the worsening geographic misalignment of VA capacity. Nationwide, the number of veterans was shrinking, with the passing of the huge cohorts of World War II- and Korean War-era vets. The decline was, and continues to be, particularly steep in California and throughout much of New England, the mid-Atlantic states, and the industrial Midwest.

Reflecting this decline, as well as a general trend toward more outpatient services, many VA hospitals in these areas, including flagship facilities, want for nothing except sufficient numbers of patients to maintain their long-term viability. At the same time, however, large numbers of aging veterans have been moving from the Rust Belt and California to lower-cost retirement centers in the Sun Belt, where VA facilities often have more patients than they can easily handle.

Under Bush, the VA appointed a commission to recommend what hospitals and properties it needed to close or dispose of, and where it needed to build new capacity. But the commission’s recommendations were ignored. By 2008, the Government Accountability Office (GAO) estimated that the VA was spending approximately $123,000 per day to maintain vacant or underutilized assets. At the same time, no new VA medical centers came on line during the Bush years, even in high-demand areas.

This failure to build new capacity where it was needed made it more difficult to provide veterans with timely appointments in places like Tampa and Phoenix. Though there is no evidence of any harm to patient care, during Bush’s second term the VA inspector general was warning that front-line employees in eight VA facilities had been caught juking waiting lists to make them appear shorter than they actually were. In 2007, a follow-up report found that the practice was still occurring and that the VA had not fully implemented five of the eight steps the GAO had recommended to eliminate it.

The Bush administration also reversed the liberal eligibility standards that the Clinton administration had established. No longer were all honorably discharged veterans welcomed at VA hospitals; instead, to qualify for care veterans would have to prove that they were either indigent or suffering from a service-related disability. This gave rise to much more time-consuming and bureaucratic processes, as VA employees had to determine, for example, whether a veteran’s Parkinson’s disease was due to exposure to Agent Orange in Vietnam or to some other combination of environmental and genetic factors.

This, combined with the increasing volume of vets returning from Iraq and Afghanistan, contributed to a growing backlog of unprocessed eligibility claims. For those who managed to get into the VA, the quality of care continued generally to be demonstrably better than that found outside the system. A systematic review of thirty-six studies comparing the quality of VA and non-VA care found that as of 2009, “almost all demonstrated that the VA performed better than non-VA comparison groups.” But during the Bush years, access was becoming an increasing problem, causing many vets to become embittered, though often without understanding what the root cause of the problem was. As frustrations with red tape mounted among vets and the press focused on breakdowns in claims processing, the conditions were set for new attempts by conservative ideologues and corporate health care providers to privatize the VA.

A month after winning the election, President-elect Barack Obama nominated the retired four-star Army general Eric Shinseki to head the VA. He was unanimously confirmed the next month and proceeded to take on several major areas of need.

Under Shinseki, the VA reduced the number of homeless vets by a quarter. It also cut a backlog of unprocessed VA disability claims, swollen by wounded Iraq and Afghanistan vets, by 84 percent. Shinseki also helped convince Congress to make Vietnam veterans with chronic illnesses associated with exposure to Agent Orange automatically eligible for VA care.

Shinseki’s record also includes implementing a concept long sought by medical reformers—the patient-centered “medical home”—as a means of overcoming the dangerous fragmentation of care so often experienced by patients. Under his leadership the VA began providing each of its patients with a specific team of health care professionals—including a specific primary care physician, nurse, social worker, pharmacist, and health technician—who managed and coordinated the patient’s care in a continuous relationship.

Under Shinseki, the VA also fully integrated mental health professionals and substance abuse specialists into its medical home teams. This practice of treating body and mind together is virtually unknown outside of the VA because insurers, including Medicare and Medicaid managed care organizations, won’t pay for it. But the innovation was crucial in treating the VA’s patient population, 25 percent of whom suffer from chronic mental illness and 16 percent of whom struggle with addiction.

The VA also continued to excel over the private sector in its use of evidence-based therapies for mental illness. A study conducted in 2014 of how often appropriate drugs are prescribed to mentally ill patients found that “[i]n every case, VA performance was superior to that of the private sector by more than 30 percent.” The VA’s adherence to evidence-based mental health treatments saved thousands of lives. Between 2000 and 2010, rates of suicide increased by 40 percent among veterans who didn’t use the VA, but declined by 20 percent among those who did.

Each of these efforts brought real benefits to veterans. But some had the effect of also making the problem of access to the VA more challenging. The VA’s use of evidence-based mental health treatments, for example, entailed providing far greater numbers of individual therapy sessions than are generally available to patients outside the VA. This practice, in combination with surging demand for mental health services among younger vets, plus a nationwide shortage of qualified mental health professionals, caused wait times to see a VA psychiatrist or therapist to lengthen, sometimes tragically for individual vets in desperate need of mental health care.

Similarly, the VA’s heavy emphasis on primary care bumped up against the reality that the American health care system as a whole faces an acute shortage of primary care physicians. Indeed, the Department of Health and Human Services estimates that the demand for primary care physicians outstripped supply by 7,500 nationwide in 2010, and that the shortage is headed toward 20,400 by 2020. These factors, combined with more aging Vietnam-era vets coming in for treatment and surging enrollments among younger vets, put pressure on VA appointment schedulers in some VA facilities to keep wait times for all VA health care services from growing. Adding to that pressure was the fact that in 2011, the VA had set strict, publicly disclosed performance metrics for itself that were virtually unprecedented in American health care: it mandated that anyone enrolling with the VA for the first time be offered an appointment with a primary care physician within fourteen days regardless of whether the enrollee faced any urgent health care need.

It was a noble goal, and by and large the VA achieved it. Across facilities, veterans waited an average of just six and a half days from their preferred date of care to see a primary care doctor. As a point of comparison, consider that a private survey taken at the time by the consulting firm Merritt Hawkins showed that in fifteen major medical markets across the country, non-VA patients seeking a first-time appointment with a family practice doctor had to wait an average of 19.5 days. Access to private care is even more limited in most rural areas. Though precise comparisons are not possible due to data limitations (unlike the VA, most private health care providers aren’t required to make their performance numbers public), a recent study by the RAND Corporation has found that, given certain reasonable assumptions, “wait times at the VA for new patient primary and specialty care are shorter than wait times reported in focused studies of the private sector.”

In 2011, Republicans took control of the U.S. House. Jeff Miller, a Tea Party conservative from Florida, became chairman of the House Veterans Affairs Committee. Meanwhile, John Boehner became the new speaker. He had caused a controversy about twenty years earlier when he proposed privatizing the VA. He hadn’t talked much about it since, but that didn’t mean he wasn’t thinking about it.

Bern notice: Before his run for president, Bernie Sanders struck a deal with Republicans to outsource VA health care in return for funding increases.

Later that year—on Veterans Day, appropriately—Mitt Romney was at a barbecue in South Carolina with a group of vets, campaigning for the GOP presidential nomination. During his talk, he floated the idea of a voucher system for VA health care similar to one he was then proposing for Medicare. The Veterans of Foreign Wars was the first of the major veterans’ service organizations to object to the idea: “The VFW doesn’t support privatization of veterans’ health care,” its spokesman told the news website Talking Points Memo. “This is an issue that seems to come around every election cycle.” Indeed, when Senator John McCain was running for president in 2007, he too floated an idea—first developed by the Koch-funded Cato Institute—of giving vets a voucher, in the form of a “plastic card” that they could use to pay for care from private doctors. In the face of protests from VSOs, McCain stopped talking about the idea. Like McCain, Romney also quickly walked his voucher proposal back.

The VSOs were clearly the single biggest obstacle to the conservative dream of voucherizing the VA. Yet in many ways they were weaker than they appeared. Their base of members was shrinking and aging. Moreover, there was a basic ideological tension in their support of traditional VA health care. On the one hand, many VSO members were grateful recipients of VA health care, and even those who weren’t saw such care as a benefit they had earned through their service to the country and might someday use. On the other hand, many of those VSOs’ members leaned Republican, and VA health care is pretty obviously the closest thing America has to socialized medicine: health care delivered by a giant government bureaucracy whose employees are represented by eleven different unions. This internal conflict had long existed, but it had never been fully exploited, until Concerned Veterans for America organized in 2012 and found the funding it needed.

Though the CVA’s incorporation papers don’t reveal its donors, Wayne Gable, former head of federal affairs for Koch Industries, is listed as a trustee. The group also hired Pete Hegseth as its CEO. Hegseth is an Army reserve veteran of Iraq and Afghanistan with two Bronze Stars and degrees from Princeton and Harvard. He is also a seasoned conservative activist, having been groomed at a series of organizations connected to—and often indirectly funded by—the Koch brothers: the Princeton Tory newspaper, where he was publisher; the Manhattan Institute, where he was a policy specialist; and Vets for Freedom (VFF), an advocacy group where he was executive director prior to moving to the CVA.

At VFF, his main job was to counter liberal groups such as MoveOn.org and VoteVets.com that were calling for George W. Bush to bring the troops home. Smart, quick, and telegenic, he popped up frequently on conservative radio, Fox News, and mainstream media, and led clusters of like-minded vets to members of Congress to defend Bush and his “surge” in Iraq. During 2008, the VFF undertook a million-dollar campaign against then Senator Barack Obama, who had opposed the war in Iraq.

At the CVA, Hegseth played a similar role. He went on TV and wrote op-eds defending Mitt Romney and attacking Obama. But this time, his message was no longer just focused on military matters, but on a broad array of economic policy issues, with an unambiguous libertarian bent, including turning traditional military pensions into portable 401(k) plans. In an interview on CNN, he segued from veterans’ support for Romney to fiscal matters in the 2012 election: “We’ve got to be serious about reforming defense so we can also be serious about reforming entitlements like Medicare, Medicaid, and Social Security and getting our debt under control.”

Republicans were in disarray at the start of the second Obama administration in 2013, but the CVA was just hitting its stride. It sent Hegseth and his colleagues on tours around the country, with rallies and town hall meetings aimed at gathering members and garnering support for the organization’s agenda. It began a series of breakfast meetings in Washington with leading conservatives—one, sponsored by the Weekly Standard magazine, featured Hegseth and guest speakers North Carolina Senator Richard Burr, Florida Representative Jeff Miller, and Stewart Hickey of AMVETS, a conservative veterans’ advocacy group that supported most of the CVA’s proposals. It hired key behind-the-scenes players in the Republican veterans’ policy world such as Dan Caldwell, staffer to Representative David Schweikert, a newly elected darling of the Tea Party from Arizona, and Darin Selnick, a former political appointee in George W. Bush’s Department of Veterans Affairs.

Hegseth became a fixture on Fox and was a guest on Bill Maher’s show. He coauthored a Washington Post op-ed with Representative Duncan Hunter, the hawkish California Republican, calling for Shinseki to resign over the backlog of VA benefits—a backlog the VA chief was in the midst of resolving—charging, with no evidence, that veterans were dying while waiting for benefits. Reviving a tactic that conservatives had used in the early 1990s against “Hillarycare,” he also put out a steady stream of op-eds in which he trashed the Affordable Care Act by comparing it to alleged VA dysfunction. “If you really want to know what Obamacare is going to be like,” he wrote, “just look at the VA system.”

Behind the scenes, some leaders at the traditional VSOs became alarmed about the CVA. According to a longtime executive with one of the groups, “We didn’t know what they did, but they were the new go-to guys when [the media] needed a veteran’s perspective or a quote.” The VSOs were being marginalized, he said, and were too large and too slow to react.

By late 2013, Hegseth and the CVA were making the case that the VA needed “market-based” reform that provided vets with more “choice” to receive care from private doctors and hospitals (though they were careful not to use unpopular words like “vouchers” or “privatize”). They were also signaling their sympathy for another abiding cause of the Koch brothers: crushing the power of unions.

In his media appearances and op-eds, Hegseth blamed the agency’s problems on a lack of “accountability,” and argued that the VA was “unable to fire bad employees and reward good employees based on merit (instead of tenure).” The CVA was among the first groups to applaud legislation, introduced in February 2014 by Florida Senator Marco Rubio and Representative Miller, that would enable the VA secretary to fire underperforming managers by limiting the civil service appeals process. Unions like the American Federation of Government Employees, which represents many VA workers, objected, but got little notice. The CVA also launched a website calling for greater accountability at the VA and warning that veterans were dying because of poor-quality care at the agency.

Then, on April 9, 2014, at a hearing in the House Committee on Veterans’ Affairs, Representative Miller dropped the bomb. He announced that his staff had been quietly investigating the VA hospital in Phoenix and had made a shocking discovery: some local VA officials had altered or destroyed records to hide evidence of lengthy wait times for appointments. And worse, Miller claimed, as many as forty veterans could have died while waiting for care.

This latter charge guaranteed screaming headlines from the likes of CNN, but was later shown to be unsubstantiated. An exhaustive independent review of patient records by the VA inspector general uncovered that six, not forty, veterans had died experiencing “clinically significant delays” while on waiting lists to see a VA doctor, and in each of these six cases, the IG concluded that “we are unable to conclusively assert that the absence of timely quality care caused the deaths of these veterans.”* In other words, the reality behind the headlines had little, if any, more significance than the fact that people die every day while waiting for an appointment to see their tax accountant or lawyer.

Those who showed up on waiting lists usually turned out to have been waiting for a routine visit with a primary care doctor rather than facing an urgent health care problem. Moreover, among those shown as waiting to see a primary care physician, many turned out to be already under the active care of a VA or non-VA specialist. In only twenty-eight out of the more than 3,000 patient cases examined by the inspector general was there any evidence of patient care being adversely affected by wait times. During the worst of the “crisis,” fully 89 percent of patients received appointments within thirty days of their preferred date. There was a long backlog of people waiting to see a urologist, but the nation as a whole faces an acute shortage of specialists in that field.

Moreover, the wait times in Phoenix were not typical of the system as a whole. Capacity constraints, for example, were greater in Phoenix than in most of the rest of the country due to the large number of retirees who had moved to the area in recent years, including “snow birds” who used the Phoenix-area VA system only during the winter months. In most VA facilities, wait times** for established patients to see a primary care doc or a specialist were in the range of two to four days, which compares favorably to the experience of most patients seeking care outside the VA. For the VA system as a whole, 96 percent of patients received appointments within thirty days.

In short, there was no fundamental problem at the VA with wait times, in Phoenix or anywhere else. But there was evidence of specific VA employees in Phoenix and other facilities using unorthodox scheduling practices to make wait times look shorter than they were, just as had happened during the Bush administration. Under Shinseki, the VA’s central office tried to crack down by issuing flurries of admonishing memos. Unfortunately, however, these edicts had little effect, in part because Shinseki had upped the ante. His metric demanding that all newly enrolled patients within the VA be offered an appointment with a primary care doc within fourteen days was a benchmark worth striving for, and one that few other health care providers would dare hold themselves accountable for meeting. But in trying to impose this ambitious goal on already-overstrained employees and facilities, the VA made itself vulnerable to enemies who were already set to pounce.

The Arizona Republic blew out the wait times story with details on the VA executives who were cooking the books. CNN swept in and produced a story on a veteran described as having died of bladder cancer while waiting for an appointment to see the right specialist. Another had committed suicide while awaiting a callback from the VA to schedule an appointment. The White House was completely caught by surprise, and Democratic leaders felt blindsided by VA officials. No one, including individuals and groups who normally defended the VA, was sure just how far the scandal might go, and so they hesitated to respond while they sought out more information.

The CVA, however, jumped into action. Within days, the group was holding a veterans’ rally in Phoenix, demanding the resignation of Shinseki and two of his top lieutenants and criticizing Obama. The VA wait list issue dominated headlines for days, as the CVA’s Hegseth and Selnick were everywhere—on TV, radio, and Capitol Hill.

It wasn’t long before House and Senate leaders were holding hearings and backroom conversations about legislation to overhaul the VA. During a meeting of the Senate Committee on Veterans’ Affairs, its chairman, Bernie Sanders, asked representatives of each major service organization whether they still believed that the VA provided superior care; all said that they did. But the VSOs had to deal with their own agitated rank-and-file members, and in early May, the American Legion, the largest VSO, joined with the CVA in calling for Shinseki’s resignation. The call opened the floodgates, and Shinseki soon resigned. Hegseth’s statement read, “This is only the beginning . . . it’s essential for Congress to pass systemic reforms at VA in the coming weeks and months, bringing real accountability, transparency, and choice to the Department of Veterans Affairs.” A few weeks later, Congress complied. Each house passed similar versions of bills that would, among other things, make it easier to fire underperforming VA employees and allow vets to get private health services outside the VA system.

Days after these bills passed, Hegseth attended a meeting with Charles Koch, along with leaders of the brothers’ network of political organizations and other leading conservative donors, in Dana Point, California, for the Kochs’ annual summer strategy session. It was a private, invitation-only gathering, but someone taped the session; the recording was later made public by the website Undercurrents and written about in the Nation. Hegseth certainly had every incentive to impress his donors; but even allowing for that, the speech he gave to the group is worth quoting at length. It reveals much not only about the CVA’s central role in promoting the VA scandal and subsequent legislation, but also about its broader plans to undo worker protections and, ultimately, gut Big Government and unions.

Concerned Veterans for America is an organization this network literally created to empower veterans and military families to fight for the freedom and prosperity here at home that we fought for in uniform on the battlefield. . . . Now, unless you’ve been living under a rock for the last couple of months, you know about the crisis at the Department of Veterans Affairs. What you probably don’t know is the central role that Concerned Veterans for America played in exposing and driving this crisis from the very beginning.

After years of effort behind the scenes privately and publicly, the scandal eventually made national headlines when initially in Phoenix it was exposed that veterans were waiting on secret lists that were meant to hide the real wait times veterans had at VA facilities of months and months and months. Veterans literally dying while waiting on secret lists that benefited only bureaucrats.

In driving [inaudible] and monitoring this crisis, we utilized the competitive advantage that only this network provides: the long-term vision to invest and the resources to back it up. We focused relentlessly on both exposing the failures of VA bureaucracy and improving the lives of veterans, meeting our people where they’re at.

The Concerned Veterans for America issue campaign pushing for systemic reform of VA bureaucracy is of critical importance, we think, for three key reasons. First, it is going—it has produced and will produce more market-based public policy victories that will improve the lives of veterans and their families; second, it provides the perfect opportunity to educate the American people about the failures of big government; and three, to position us for the long term as a trusted, effective, and credible grassroots organization we can build upon. . . .

Two pieces of groundbreaking VA reform legislation passed the House of Representatives with an overwhelming majority. . . . And Nancy Pelosi and the majority of collectivists voted for them. They didn’t like the bills, but they had to vote for the bills because they were outnumbered by a new, nimble, and principled movement of veterans. . . .

Ten days ago, the Senate struck a historic deal, a deal that Concerned Veterans for America was central to in every aspect, literally ensuring that the language stay focused on real market-based reform, and we pushed the ball across the finish line. . . . This bill would empower the secretary to actually fire a manager for cause . . . [and veterans] will literally get a card and the ability to visit a private doctor if they need.

The latter reform, which seems like a no-brainer to everyone in this audience, is a huge development, rocking the core of big-government status quo in Washington. The option for veterans to choose private care upends how the VA has fundamentally done business for the last seventy years, attacking the very heart of the failed top-down, government-run, single-payer health care system that’s failed veterans.

Throughout this effort, Concerned Veterans for America, along with our network partners, have intentionally broadened the debate to include big-government dysfunction generally, further fortifying a new skepticism that AFP [Americans for Prosperity, the Koch-funded political advocacy organization] and others have brought to what government-run health care does.

Hegseth closed out his remarks with a personal thank-you to Charles and David Koch and their team.

In Washington during the summer of 2014, lawmakers worked feverishly to iron out differences between House and Senate versions of the VA overhaul legislation. Republicans, especially in the House, wanted to roll back civil service protections in a way that would make dismissals for poor performance easier to apply to employees deep within the bureaucracy. Labor unions pleaded with the Democrats to limit the rollback to top VA managers.

Libertarian warrior: Pete Hegseth, an Iraq and Afghanistan veteran turned prominent conservative activist, was a fixture on cable news and in congressional offices.

Meanwhile, major private medical institutions and health care systems had been waiting for this opening, circling like vultures over the idea of dividing up the VA’s multibillion-dollar budget. In July, a group of medical center leaders told Representative Miller’s committee that they would be more than happy to take on more VA business. The president of the American Hospital Association, Rich Umbdenstock, talked about how to apply “best medical practices” from the private health care sector to the VA, saying private hospitals “are harnessing the power of collaboration to dramatically improve the quality and safety of patient care,” and urging lawmakers to ensure that the new law would not create barriers to outsourcing. A representative from the Duke University Health System even speculated that the VA could become a “hybrid” of government and private providers and their shared facilities.

Behind the scenes, the lobbying was even more intense. According to Democratic and Republican Senate staffers, directors of major medical institutions were calling their senators and representatives to talk about what private medicine could do, and what it would mean in terms of jobs and economic growth in their states and congressional districts. A longtime staffer to a member of the GOP leadership who was involved in VA legislation told me, “My boss had a lot of invitations to golf in his district, and at some of the finest golf clubs in America, from big hospital system CEOs” who wanted to talk about how their facilities were ready to absorb veterans in their area for certain kinds of treatment—usually expensive, usually in their new wings.

As the lobbying for more outsourcing heated up, the major VSOs, including the American Legion, the VFW, Paralyzed Veterans of America, and Disabled American Veterans, finally got off the sidelines and began to reach out to Hill offices. They said they and their members did not want the VA to be dismantled and that they were worried that the “plastic card” envisioned in the legislation would be the first step in total VA privatization. They reminded anyone who would listen that the wounds of combat, seen and unseen, required doctors and nurses experienced in treating such injuries and sensitive to the unique life experiences of those who serve in the military.

Their case against privatizing the VA was, ironically, further underscored when President Obama signaled that he was considering putting Dr. Delos “Toby” Cosgrove, CEO and president of the world-renowned Cleveland Clinic, in charge of fixing the VA. Cosgrove unexpectedly withdrew his name on June 7, hours after the journal Modern Healthcare broke a story revealing a long history of safety problems at the Cleveland Clinic—including a suture needle left inside a patient after surgery. The problems had been so severe that the federal government had repeatedly threatened to cut the $1 billion in Medicare funds that flowed annually to the clinic.

Fortified by their VSO and union allies, the Democrats, led by Bernie Sanders, managed to minimize their losses in their negotiation with the Republicans. Under legislation that came to be known as the Veterans’ Access to Care through Choice, Accountability, and Transparency Act, they secured increased funding for the VA. And Republican demands to loosen civil service protections for VA employees were limited to those in the Senior Executive Service. But the law also called for the creation of a “Choice Card” system that was designed as a first step toward privatizing VA health care services. Under the statute, veterans who lived more than forty miles from a VA hospital or had to wait more than thirty days for a VA appointment were promised that they could use their Choice Card to receive care from a network of private providers, much like one would use a private health insurance card.

The basic idea of the VA partnering more with private providers was not flawed in principle. Indeed, the agency already had programs through which it contracted private doctors to perform certain kinds of specialty care or care in remote regions where it lacked facilities. The VA also had an extensive history of collaborating with academic medical centers. Done right, closer collaboration between VA and non-VA providers could improve care for everyone in many areas. But the new legislation set in motion a “choice” program in which the government would be paying for bills submitted by private providers for care that was unmanaged, uncoordinated, and, to the extent that it replicated the performance of the private health care system, often unneeded. This is the very opposite of the integration and adherence to evidence-based protocols that has long made VA care a model of safety and effectiveness.

Worse, implementation of the Choice Card was a disaster from every point of view. Congress gave the VA only ninety days to stand up the program. Largely because of that insane timeline, the VA was able to attract bids from only two companies. Each of these has a sole contract that gives it a monopoly wherever it operates, and each put together networks that were so narrow and poorly administered that for many months vets who received Choice Cards typically could not find a single doctor who would accept them.

Over the course of 2015, many of these problems of implementation were at least partially sorted out, but the basic flaw in the model remains. Where does this leave the VA going forward? The privatizers are doubling down, and it looks like they may well prevail.

As part of the Choice Act, Congress established a Commission on Care to examine the long-term future of VA health care. Under its enabling statute, any recommendation the commission makes must be enacted, provided it is “feasible.” The commission met for the first time in September of 2015. Its members, chosen by congressional leaders and the White House, lean heavily in favor of privatization.

Four of the fifteen represent major medical centers that stand to gain from the outsourcing of veterans’ care to private providers. One of those is Toby Cosgrove, the Cleveland Clinic CEO who dropped out of the running for the VA job. Meanwhile, Concerned Veterans for America effectively has two seats: one for its official Darin Selnick, and one for the head of AMVETS, which is closely aligned with CVA positions. Only one member on the commission—a retired representative from Disabled American Veterans—is from a mainstream VSO. The American Legion and the VFW are not represented. (Also on the panel, appointed by Senator Harry Reid, is the health care analyst Phillip Longman of Johns Hopkins University and New America, who is also a senior editor at the Washington Monthly.)

The first item on the commission’s agenda was to review the $68 million worth of reports about the VA that numerous management consultancies had put together as mandated by the Choice Act. The reports identified many challenges facing the VA. Predictably, they found the organization suffering from an acute crisis in morale. The agency is enduring wave after wave of retirements, and recruitment of quality managers has become nearly impossible. Who would want to go to work at a place the press has routinely described as “broken,” and that Republicans have routinely promised to privatize? The consultants also reported that the VA’s remaining employees have developed an extremely “risk-averse” culture, as they hope to survive simply by keeping their heads down. They also pointed out other well-known deficiencies of the VA: its aging infrastructure, its lack of sufficient capacity in high-demand areas, and the continuing dysfunction of its increasingly outsourced health IT system.

Yet if the VA has taken a licking, it still keeps on ticking. The consultants’ independent research reviewed by the commission also found that the VA still generally outperforms or matches the rest of the health care system on most measures of quality. For example, work done by RAND compared VA care to private-sector care on six quality measures of inpatient safety, six of inpatient safety outcomes, thirty of effectiveness (split between inpatient and outpatient care), and eleven of patient-centeredness in the inpatient setting. RAND also conducted “descriptive analyses” of the individual facilities.

In the end, RAND said, “The average performance of VA facilities was the same or significantly better than the average performance of non-VA care on the majority of quality measures analyzed for inpatient and outpatient settings.” This finding, as we’ve seen, matched numerous other analyses over many years. In February, a new study in the Journal of the American Medical Association found that VA hospitals compare favorably to non-VA hospitals in treating older men with three common conditions: heart attacks, heart failure, and pneumonia. The VA also continues to excel over the rest of the health care system in treating conditions that particularly affect combat veterans, such as polytrauma and traumatic brain injuries.

Nonetheless, Cosgrove and all but one of the commissioners representing private health care providers have recently put themselves on record as being in favor of getting the VA entirely out of the business of providing health care to vets. In their vision, which appears to be shared by a majority of the commission members, the VA will become a pure “payor” of bills sent to it by private providers.

The ultimate fate of the VA will likely be determined in the coming months. The Commission on Care, which has been holding hearings throughout the winter of 2016—hearings that have received no attention in the mainstream press—is scheduled to announce its recommendations in June. The president might act on those recommendations—or, more likely, will leave the job to his successor. While all the remaining GOP candidates, including Donald Trump, have come out in favor of more outsourcing of VA care, Hillary Clinton and Bernie Sanders have both spoken out strongly against it. So far, the VA issue has played out on the margins of the presidential race. But given the stakes and the ideological valence of the subject, that could change, especially once the commission’s recommendations are made public. In the end, it may be the voters who—knowingly or not—decide the future of the VA, and of the quality of the health care afforded millions of veterans.

[*Language clarified for accuracy.]
[**Correct hyperlink added.]

The post The VA Isn’t Broken, Yet appeared first on Washington Monthly.

]]>
Mend, Don’t End, Fannie and Freddie https://washingtonmonthly.com/2016/03/13/mend-dont-end-fannie-and-freddie/ Sun, 13 Mar 2016 20:54:24 +0000 https://washingtonmonthly.com/?p=926

Conservatives blame the mortgage giants (wrongly) for the financial crisis, and both parties want them dead. But to finish the job of financial reform without destroying the housing market and costing taxpayers billions, we need to let them live.

The post Mend, Don’t End, Fannie and Freddie appeared first on Washington Monthly.

]]>

It’s been almost a decade since the slow-rolling financial crisis, which reached its grand finale in the fall of 2008, got started. But as the response to The Big Short, the Academy Award-nominated film about the crisis based on Michael Lewis’s book of the same name, shows, there’s still a big fight about what actually went wrong. Some on the right immediately decried the movie, which focused on Wall Street’s greed, for ignoring the problems with government policies encouraging homeownership—specifically, the role of the so-called government-sponsored enterprises (GSEs), the Federal National Mortgage Association and the Federal Home Loan Mortgage Corporation, better known as Fannie Mae and Freddie Mac.

While most Americans don’t know what Fannie and Freddie do, many of us are in an intimate financial relationship with them involving the most important financial instrument in our lives—our mortgage—and the most important asset—our home. The way we finance housing, which makes up some 20 percent of the U.S. GDP, affects anyone who has a stake in our economy.

The idea that these little-understood but critically important companies caused the crisis is just the icing on top of the controversy about Fannie and Freddie, which were created by Congress to serve the dream of the United States as a society of individual homeowners. The two are essentially giant insurance companies. They stamp mortgages made to American homeowners with a guarantee that they’ll pay the principal and interest if the homeowner can’t. Their stamp makes it possible to package the mortgages backed by homeowners’ monthly payments into securities, which are then sold to investors, who otherwise wouldn’t want to bet their money that you and I will pay in full and on time. For years, although Fannie and Freddie had all the trappings of normal companies—shareholders, boards of directors, stocks that traded on the New York Stock Exchange—they were also, in part, government agencies, with a congressional mandate to foster homeownership. Everyone always believed that if there were a crisis, the government would rescue them. Critics hated their government-granted political and financial power, their structure—wasn’t it impossible for them to serve both shareholders and homeowners?—and the very idea that the government needed to be involved in the housing market.

Most people who weren’t paying close attention probably date the beginning of the global financial crisis to September 15, 2008, the day Lehman Brothers declared bankruptcy. But a few days earlier, on September 6, the U.S. Treasury put Fannie Mae and Freddie Mac into a status called “conservatorship,” a kind of government life support system hooked up because the rapidly swooning mortgage markets had put Fannie and Freddie in mortal peril, and their failure would have caused global economic chaos. The Treasury gave Fannie and Freddie an immediate $200 billion line of credit.

The conservatorship was orchestrated by Hank Paulson, then secretary of the treasury, who told President George W. Bush in a meeting at the Oval Office that it was, in essence, a “time out.” According to the rhetoric in Washington at the time, that time out was supposed to end with the death of Fannie and Freddie and the creation of some better, less conflicted, more pure way of financing homeownership. “This is an opportunity to get rid of institutions that shouldn’t exist,” said Paul Volcker, the revered former chairman of the Federal Reserve, in 2011. Said President Barack Obama in 2013, “I believe that our housing system should operate where there’s a limited government role, and private lending should be the backbone of the housing market.”

And yet, here we are in 2016, and—surprise!—the companies are still very much with us. The Dodd-Frank Wall Street Reform and Consumer Protection Act, which was supposed to reshape the financial sector and which President Obama signed into law in the summer of 2010, quite deliberately did not deal with Fannie and Freddie. Nothing has happened since then, either. The GSEs remain wards of the government. As the longtime housing analyst Laurie Goodman wrote in a 2014 paper, “The current state of the GSEs can best be summed up in a single word: limbo.” It turns out that what to do with Fannie and Freddie is the most difficult unresolved problem of the financial crisis.

Meanwhile, the mortgage market in the United States has effectively been nationalized, too. That is precisely the opposite of what President Obama said he wanted. According to Goodman, from 2008 to 2013 the government, mainly in the form of Fannie and Freddie, was the major source of credit for most people who got mortgages. This trend hasn’t changed. Goodman recently noted that the “private label” market—mortgages packaged into securities by Wall Street, rather than by Fannie and Freddie—which hit $718 billion in 2007, plunged to $59 billion in 2008 and has not been above $64 billion since.

Nor have Fannie and Freddie shrunk. They still have some $5 trillion in securities outstanding. By one important measure, they are in more precarious shape than they were in the run-up to the crisis: thanks to a 2012 amendment to the terms governing their conservatorship, the government is taking almost every penny of profit that the two companies generate, so Fannie and Freddie have not been allowed to rebuild any capital, which could absorb losses in the event of another downturn in the housing market. “The two mortgage funders are effectively federal bureaucracies, stripped of their independence, with basically zero capital, but still dominating the market for mortgage financing,” wrote the conservative pundits Alex Pollock and James Glassman in a recent Politico piece. “We are faced with running this business with really no cushion. It is a challenging situation for us,” Fannie Mae CEO Timothy Mayopoulos said on a conference call in early 2015. “It’s the last unsolved issue of the financial crisis, and the ramifications are enormous for everyone,” says Ryan Israel, a partner at a hedge fund called Pershing Square.

Not only is the issue unresolved, but signs of movement toward resolution are few. The omnibus spending bill President Obama signed in December contains a provision that effectively prevents the administration from taking any action on the GSEs, leaving the matter to Congress. And the issue has barely been mentioned by any of the 2016 presidential candidates.

This broad silence reflects the genuinely thorny nature of the problem, but also the fact that virtually everyone in Washington supports “solutions” that are ideologically or politically convenient but don’t make sense as policy. Tea Party Republicans favor killing off Fannie and Freddie and replacing them with nothing—a move that will, at best, hand the mortgage market over to the big banks and, at worst, crater the housing sector. The Obama administration and establishment types in both parties support eliminating Freddie and Fannie but replacing them with . . . something else. Something perfect! Something that preserves all the benefits provided by Fannie and Freddie, but eliminates the old controversies and doesn’t create new ones, and, oh, by the way, the money to fund this something will miraculously appear, and Fannie and Freddie’s existing $5 trillion in liabilities will miraculously disappear, without any unpleasant ripples. A third option, which no one in Washington supports openly but all do operationally by their own inaction, is to keep Fannie and Freddie as they are: crippled government cash cows that will have to be bailed out (again) with the next (inevitable) cyclical decline in home prices.

There is, however, a fourth option: fix the flaws in Fannie and Freddie and let them operate, as they did—effectively—for more than half a century, as the main public-private guarantors of the thirty-year mortgage. This idea might sound sensible to most Americans. But in Washington it is considered, if not completely insane, then at the very least a political nonstarter. Yet it does have some backers, including certain reform-minded financial analysts, think tank scholars, civil rights groups, lobbyists for small banks, and, curiously, a few hedge fund billionaires who bought Fannie and Freddie stock low and stand to make a killing if the companies are revived. While this odd assortment of players isn’t getting much of a hearing right now, their idea has one advantage over all the others: it would actually work.

Freddie or not: Conservatives unfairly scapegoated the two government-sponsored behemoths for the financial crisis.

It’s impossible to understand why Fannie and Freddie are such a difficult problem to solve without going back to long before the financial crisis—even before anyone had thought to invent mortgage-backed securities.

Homeownership is deeply ingrained in the American psyche, in part because our politicians have always stressed its importance. But for most of the early years of our history, the government wasn’t involved. There were huge ups and downs in real estate, and great variability in the cost and the availability of credit. By the 1920s, mortgages were typically three to ten years in length, and required high down payments—sometimes as much as 50 percent. Homeowners often only paid off the interest, not the principal, so the mortgage had to be repaid or refinanced at maturity in one big “balloon” or “bullet” payment. If someone lived on the West Coast, they might pay double the rate of a person on the East Coast, where more lenders were based.

The Depression, which set off a vicious circle of plunging home prices and lack of access to credit, made a historically bad situation seem completely untenable. By the peak of the Depression, the national delinquency rate was 50 percent, according to David Min, an assistant professor of law at the University of California, Irvine, and lenders—primarily mutually owned building-and-loan societies—were failing in large numbers.

And so the government stepped in. After President Franklin D. Roosevelt took office in 1933, Congress passed the National Housing Act, which created the Federal Housing Administration. The FHA offered to insure lenders against defaults on long-term mortgages with low down payments. It was meant to calm everything down by encouraging lenders to lend—after all, the government bore the credit risk—and borrowers to borrow, by offering them certainty about the interest they would owe, and a long time to pay back the money. In 1936, the FHA reported to Congress that “the long term amortized mortgage has gained nation-wide acceptance at uniform lower interest rates in all sections of the United States.”

The National Housing Act also included a provision that created privately owned national mortgage associations that would buy the new FHA-insured mortgages from lenders. It wasn’t enough for lenders not to have to worry about borrowers defaulting. If they also knew that they could instantly turn their loans into cash, they’d be even more willing to lend. The associations were supposed to be funded by private capital, but in the three years after the new associations were authorized, none were set up. So to demonstrate proof of concept, in 1938 the FHA helped set up a government-owned entity to buy the loans it guaranteed. This entity soon became known as the Federal National Mortgage Association, or FNMA—or Fannie Mae.

In its sponsorship of a congressionally chartered company to help increase homeownership, the United States was, and is, unique. Around the world, the most common mortgage product is a shorter-term adjustable-rate mortgage. Indeed, the rest of the world offers no evidence that you can have a mortgage market like that in the U.S., with long-term, fixed-rate loans, without some sort of system that guarantees risks investors don’t want to take.

For consumers, mortgages are commonplace, even mundane. For investors, they are dangerous—very dangerous. Dick Pratt, who was the first president of Merrill Lynch Mortgage Capital, used to say, “The mortgage is the neutron bomb of financial products.” Mortgages come packed with risks, including credit risk (the risk that the homeowner won’t pay), interest rate risk (the risk that the lender will earn less on the mortgage than it could get investing its money elsewhere if interest rates rise), and prepayment risk (the risk that a homeowner will pay off a mortgage much earlier than expected, thereby forcing the lender to replace a high-paying asset with a lower-paying one). Of those risks, the one that most investors like the least is credit risk. The longer the term of the mortgage, the more risk there is for the lender. And so it’s come to be conventional wisdom that a fixture of American life, the thirty-year fixed-rate fully prepayable mortgage, would not exist for the wide swath of American consumers but for the presence of companies like Fannie and Freddie, which remove the credit risk and disperse the interest rate and prepayment risk to a wide set of investors. The only other country in the world that offers such a product is tiny Denmark.

It wasn’t until the 1960s that Fannie was reborn as what it was originally supposed to be—a private company with all the trappings like stock that could be bought and sold. This was done because in 1967, during President Lyndon Johnson’s administration, a budgetary commission recommended that the debt of agencies like Fannie Mae be included in the federal budget. Adding to the federal debt was no more palatable then than it is today, and so, in 1968, when Johnson signed the Housing and Urban Development Act, he effectively split Fannie in two. The Government National Mortgage Association, or Ginnie Mae, stayed in the government, and guaranteed the credit on only FHA and Veterans Administration mortgages. Fannie Mae, which sold stock to the public, was allowed to guarantee mortgages made to the great American middle class—and its debt stayed off the government’s books. “I was in the government when Fannie Mae was a government-owned institution,” Paul Volcker later told the interviewer Charlie Rose. “And it was created to take care of the mortgage market in times of stress. It was privatized for extraneous reasons. It was privatized to get it out of the budget. Ridiculous.”

At the same time, no one wanted to risk hurting Fannie’s ability to grease the mortgage market. And so the 1968 legislation also gave Fannie some special advantages. One was that the U.S. Treasury was authorized to buy up to $2.25 billion of Fannie’s debt, thereby sending a signal that this was no ordinary company, but rather one that had the support of the U.S. government. Thus began what Rick Carnell, an assistant treasury secretary in the Clinton administration, later described as a “double game.” What he meant was that while Fannie and Freddie were ostensibly private companies, their debt was viewed by investors as being akin to U.S. treasuries, because everyone believed that, if necessary, the U.S. government would bail them out. This was called an “implicit guarantee,” because it wasn’t written down anywhere and didn’t officially exist.

In the ensuing decades, Fannie and Freddie (which was created in 1970 at the behest of the savings-and-loan industry, which wanted its own company to which it could sell mortgages) became two of the largest, most powerful companies in the world. What triggered their growth was a Wall Street invention—a new way of financing homeownership by packaging up mortgages as securities. The Big Short shows how a Salomon Brothers trader named Lewis Ranieri made the once-stodgy business of selling bonds into a sexy, high-octane gusher of profits, and, while this is true, the real story is a bit more complicated. In essence, Ranieri needed Fannie and Freddie’s guarantee to make investors willing and able to buy his new securities. For a long time, theirs was a mutually beneficial arrangement, but it’s not as if Wall Street was ever happy about having Fannie and Freddie siphon off some of the profits in the mortgage market. “Wall Street had a love-hate relationship with them,” says one mortgage industry veteran.

But Wall Street couldn’t do much, because Fannie and Freddie had the ear of politicians who saw how fostering homeownership could help them, and the decade of the 1990s was, at least on the surface, a golden age. Although there was some regulatory pressure, Fannie’s political power helped ensure that the regulator was weak and the companies’ capital requirements were low. (The companies were also obligated to make sure that certain percentages of the mortgages they guaranteed went to lower- and middle-income homebuyers, a requirement that later became the key source of the controversy over their role in the financial crisis.)

The mortgage market exploded in size, from just under $3 trillion in 1990 to $5.5 trillion by the end of the decade. Fannie and Freddie, by setting the standards for what kinds of mortgages they would guarantee, effectively determined the sort of mortgage that much of the American middle class would get—and, of course, they took a toll, in the form of a guarantee fee, on every mortgage that passed through them. By the end of the 1990s, Fannie Mae had become America’s third largest corporation, ranked by assets. Freddie was close behind. The companies were ranked one and two respectively on Fortune’s list of the most profitable companies per employee. Fannie, in particular, became known as a place where Democratic operatives went to make fortunes.

The profits were not just from the business of stamping mortgages with a guarantee. In addition, Fannie and Freddie began to hold mortgages as investments on their own balance sheets. Because of that “double game,” they could make money on the difference between the yield on their mortgage portfolios and their low cost of funds. The “big fat gap” is what Alan Greenspan, the very powerful chairman of the Federal Reserve for almost two decades, who became one of the GSEs’ most powerful enemies, took to calling it.

Greenspan wasn’t their only enemy. Bill Maloni, Fannie’s longtime chief lobbyist, used to call the ideological opposition to the GSEs’ very existence the “vampire issue,” because it couldn’t be killed, try though Fannie might. Economists disliked the hidden subsidy in the form of the implicit guarantee. And increasingly, other players in the mortgage industry—the banks and mortgage insurers—were angry about the extent of the profits that Fannie and Freddie were siphoning off.

The Chairman: Franklin Raines went from balancing Bill Clinton’s budget to playing political hardball as head of Fannie Mae.

For most of this period, Fannie and Freddie were able to shut down the opposition to them. Under the leadership of Jim Johnson—whom the Washington Post described in a 1998 profile as “one of the most powerful men in the United States”—and then of Franklin Raines, a former financier who, as the head of the Office of Management and Budget in the Clinton administration, got great credit for balancing the budget, and who people once thought could be the country’s first black president, Fannie Mae developed a reputation for playing political hardball. “Fannie has this grandmotherly image, but they will castrate you, decapitate you, tie you up, and throw you in the Potomac,” a congressional source told International Economy magazine in the late 1990s. “They are absolutely ruthless.” Gene Sperling, who was the director of the National Economic Council in the Clinton administration, used to joke, “If you think a bad thought about Fannie and Freddie, you can hear the fax machine going.” When Richard Baker, then a Republican congressman from Louisiana, began trying to get new, tougher regulation of Fannie and Freddie passed, Fannie squelched it.

The political power provoked a backlash. Even some of those who might have been expected to be on the GSEs’ side were offended by what they saw as the companies’ abuse of power. “The GSEs brought out a conservative side of me,” says Sperling. “The thing that turned me, that made me unwilling to do anything personally for them, is when you see that dynamic where a company is completely dependent on the U.S. government for their profit and they spend so much money and time focused on lobbying the U.S. government. It really gets kind of sick.” The fact that executives like Raines and Fannie’s chief financial officer, Tim Howard, made tens of millions of dollars only heightened the anger.

But in 2004, a scandal over the accounting at Freddie, and then at Fannie—over the charge, essentially, that Raines, Howard, and other executives had manipulated their companies’ results to please investors—led to the decapitation of the top executives at both companies. The long-standing, slow-burning resentment of the two companies exploded into the open. Fannie’s regulator even called Fannie a “government sponsored Enron.” And yet Fannie’s executives were never criminally charged, and in 2012, after eight years, sixty-seven million pages of documents, and testimony from more than 150 witnesses, a civil suit against Howard, Raines, and another executive ended with the federal judge dismissing all the charges and concluding that there was no evidence that either Raines or Howard had purposefully tried to deceive anyone.

The result was a complete tangle: Fannie and Freddie’s stable management was gone; their institutional reputations were badly tarnished; but no one among the GSEs’ many critics had the nerve—or the political support—to create anything positive out of the mess. So the GSEs rolled on, deeply wounded, with thin levels of capital and ever-more-onerous requirements to make riskier loans as the mortgage market entered its most dangerous period in history.

The Adviser: Gene Sperling, who worked in the Clinton and Obama administrations, said Fannie and Freddie “brought out a conservative side of me.”

By the mid-2000s, so-called subprime lending, which had started in the 1990s, was taking over the industry. The mortgages were sold to Wall Street, not to Fannie and Freddie; within the industry, another term for subprime was “nonconforming,” because the mortgages didn’t conform to the GSEs’ standards. As an executive from a major subprime lending company called New Century told Congress in early 2004, subprime lenders were necessary to the economy, because they provided credit to “customers who do not satisfy the stricter credit, documentation, or other underwriting standards prescribed by Fannie Mae and Freddie Mac.” He went on to point out that while over 40 percent of New Century’s loans were made to borrowers who didn’t have to verify their income, Fannie and Freddie “have more stringent income documentation guidelines.”

Indeed, as subprime mortgages proliferated, and were sold to Wall Street, Fannie and Freddie were rapidly becoming irrelevant. Their market share fell from 57 percent in 2003 to 37 percent in 2006, according to data gathered by the Financial Crisis Inquiry Commission, which was tasked with investigating the causes of the 2008 financial crisis. A 2005 internal presentation at Fannie Mae noted, with some alarm, “Private label volume [meaning mortgages that were sold to Wall Street, not the GSEs] surpassed Fannie Mae volume for the first time.”

If Fannie and Freddie had stuck to their original business—guaranteeing mortgages made to people who (mostly) could pay—there would have been no reason for a bailout. There will always be people, including Frank Raines and Tim Howard, who will insist that if the seasoned executive teams at the GSEs hadn’t been ousted just as subprime lending was crescendoing, history would have been different. There is no way, of course, to prove that.

One piece of evidence seems to point against it: even before the accounting scandals, both Fannie and Freddie had begun acquiring hundreds of billions of dollars’ worth of Wall Street’s private label securities as investments that they would own on their own balance sheets. They did this both because the securities seemed to be a profitable investment at the time, and because—in an incredibly perverse twist enabled by regulators—these loans counted toward the congressionally mandated goals for loans to middle- and lower-income people that Fannie and Freddie had to meet.

But it wasn’t until after their executive teams were ousted that the GSEs also began guaranteeing unconventional mortgages that were supposedly less risky than subprime, like so-called stated income loans, in which the borrower simply states her income. They did this because they were under immense pressure from all sides, particularly shareholders, to win back the market share they had lost. In a presentation for a 2005 executive retreat, Tom Lund, who was then the head of Fannie’s single-family business, put it this way: “We face two stark choices: stay the course [or] meet the market where the market is.”

As the financial crisis gained steam in 2007 and 2008, Fannie and Freddie’s regulator continued to tell the market that everything was fine. “The companies are safe and sound, and they will continue to be safe and sound,” said Jim Lockhart, the Bush appointee who by then ran the agency that regulated the companies, in the spring of 2008.

But at the same time, the government was quietly pressuring the companies to raise capital. Between the start of 2007 and the summer of 2008, Fannie and Freddie sold a combined $22 billion in so-called preferred stock, bringing their total outstanding preferred stock to $34 billion. (Preferred stock pays a dividend like a bond.) The buyers, at least initially, were individual investors in search of dividends, and community banks, which were encouraged to hold GSE securities to bolster their own capital. This preferred stock would turn out to be a huge problem for the government.

By the end of the summer, Fannie and Freddie’s stock prices were plummeting, and it was becoming harder for them to sell the debt they needed to fund their operations. On September 5, Paulson pulled what he later called an “ambush.” At Freddie, executives were in New York for board meetings when then CEO Dick Syron received what another executive calls a “nasty gram” from Lockhart, taking back all the things the regulator had just said about the company being safe and sound, and instead leveling a host of charges at it. They were told to come to Washington for a meeting at the regulator’s offices at five p.m. that day. They had no idea what was coming until they walked into the fourth-floor conference room, where they had all been many times before, and saw not just Lockhart but also Paulson on his left and then Federal Reserve chairman Ben Bernanke on his right. There was a provision in the law that if the directors agreed to conservatorship, they were immune from legal action by shareholders or creditors, making it difficult for them to do anything but agree. The management teams were told to go, and both Fannie and Freddie had to immediately fire all their lobbyists. Paulson later called the decision to take over Fannie and Freddie the “most impactful and the gutsiest thing we did.”

In a recent piece in the New York Times, Gretchen Morgenson noted that the bailout terms were “draconian” compared to those soon offered to the big banks. The government got the right to take 79.9 percent of the common stock of both Fannie and Freddie. Why not just nationalize them and take 100 percent? “If the U.S. government were to own more than 80 percent of either enterprise, there was a sizable risk that the enterprises would be forced to consolidate onto the government’s balance sheet,” explained the analyst Laurie Goodman—meaning that the federal government’s debt could skyrocket. Although the Treasury would provide no up-front cash, it committed to putting in a great deal of money—up to $200 billion—as needed over time. Fannie and Freddie would have to pay a 10 percent interest rate on any funds the government advanced. Any money the Treasury put in would become senior preferred stock, which would have to be paid before any investor in either the preferred stock that had just been sold or the GSEs’ common shares got anything. Although these shares continued to trade, their worth plummeted to pennies.

Of course, these were the shares that community banks had just been encouraged to buy (while the regulator was saying Fannie and Freddie were safe). The Federal Reserve later estimated that more than 600 depository institutions in the United States were exposed to at least $8 billion in investment losses from these securities, and that at least fifteen failures resulted. “In effect, for the small lenders serving Main Street, it was let them eat cake,” wrote the Independent Community Bankers of America in a letter addressed to the Wall Street Journal’s editorial board. “Treasury’s takeover [of the GSEs] is crafted to protect the giant players.” What the ICBA meant was that big Wall Street banks had billions of dollars in derivative contracts with the GSEs, so their failure would have ricocheted through the banking sector. But small banks? They could be sacrificed.

Things quickly got worse for the GSEs. During the presidential race between Barack Obama and John McCain, the charge, mostly promulgated by Republicans, that the GSEs were the sole cause of the crisis, and Wall Street just an innocent bystander, first emerged. McCain called Fannie and Freddie “the match that started this forest fire.” It got so bad that Freddie employees were told not to wear anything with a corporate logo, and the company offered its top executives twenty-four-hour security protection. In the spring of 2009, Freddie’s acting CFO committed suicide.

The appeal of blaming the GSEs was, and is, obvious—it’s a way to blame Democrats for the crisis, because, thanks to Johnson, Raines, and others, Fannie was regarded as a Democratic company. And, of course, if the GSEs caused the crisis, and Wall Street is blameless, then no new regulation is needed, and we can repeal the Dodd-Frank financial reform bill.

But that narrative isn’t supported by the GSEs’ loss of market share as subprime lending took off, or by the loss figures. According to an analysis by the Financial Crisis Inquiry Commission, mortgages turned into securities by Wall Street defaulted at a rate that was almost four times higher than comparable mortgages guaranteed by the GSEs, making it awfully hard to argue that the GSEs led a race to the bottom. Nor is it true that loans made to lower-income borrowers caused the crisis. A study published by the National Bureau of Economic Research in early 2015 found that the wealthiest 40 percent of borrowers obtained 55 percent of the new loans in 2006—the peak year of the bubble—and that over the next three years, they were responsible for nearly 60 percent of delinquencies.

The Failed Assassin: Tennessee Republican Bob Corker sponsored bipartisan legislation that would have killed Fannie and Freddie, but it never passed.

In Washington, it’s far from clear that the real lessons matter. “I wish it was simply a matter of telling the truth,” says John Taylor, the president of the National Community Reinvestment Coalition. “This is a political issue. It means you don’t have to rely on facts. You can make up your own.” “People have a visceral reaction to [the GSEs],” marvels one longtime mortgage investor. “People want to say ‘I killed them.’ ”

So if everyone wants the GSEs dead, and they were such a bad idea, why aren’t they dead? “Making policy on this was one of the hardest things by an order of magnitude for the administration,” says a former official. “The danger is that it leads to all kinds of narratives that feel good but ultimately don’t lend themselves to reality. It’s fucking terrible to explain to the public. Both the politics and substance are much more complicated than anyone expected.” He adds, “And if you get the substance wrong, it could be really problematic. This is a major segment of the economy supporting the major asset most Americans have.”

One of the narratives, which is appealing to those on the right, is that we can get the government out of the housing market with the flip of a switch. In 2013, Jeb Hensarling, the Tea Party Republican representative from Texas, authored a bill that would kill the GSEs and, with the exception of some support for very low-income housing, not replace them with anything. While no one knows for sure what would happen—Fannie Mae has been around since the 1930s, after all—most analysts and market participants agree that the downside is that a great swath of the middle and lower classes probably would get five- to fifteen-year mortgages with floating rates, rates that would vary significantly depending on income and geography. Homes would be less affordable, so housing prices would likely fall. Consider that with interest rates at 3.75 percent, a $200,000 home with a 20 percent down payment and a ten-year fixed-rate mortgage on the remaining $160,000 would have a monthly payment of $1,521. With a thirty-year fixed-rate mortgage, the monthly payment is $752. Mortgage capital might be hard to come by in times of stress. Under the new system, not much would change for wealthy borrowers, but the effect on lower- and middle-income Americans could be significant.
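The arithmetic behind that affordability gap is just the standard fixed-rate amortization formula. Here is a minimal sketch in Python, using the $160,000 balance and 3.75 percent rate from the example above; the published monthly figures may reflect slightly different rate or fee assumptions, so treat these outputs as approximations of the general point rather than a recomputation of the article’s exact numbers.

```python
# Standard fixed-rate amortization: payment = P * r / (1 - (1 + r)**-n),
# where r is the monthly rate and n is the number of monthly payments.
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

principal = 160_000   # $200,000 home with a 20 percent down payment
rate = 0.0375         # the 3.75 percent rate assumed above

for term in (10, 30):
    print(f"{term}-year fixed: ${monthly_payment(principal, rate, term):,.0f} per month")
# The ten-year payment comes out to roughly twice the thirty-year payment,
# which is the affordability gap the paragraph above describes.
```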

A recent paper by the University of Chicago economist Benjamin Keys shows that when mortgages are guaranteed and turned into securities by the GSEs, the interest rate that borrowers pay doesn’t vary much from region to region, even if the economic health of those regions varies. In contrast, the cost of mortgages that are securitized by Wall Street varies much more and is less predictable. This is because the GSEs, with their national reach, engage in cross-subsidization so that, say, borrowers in a struggling region aren’t hit with higher mortgage costs.

Whatever the appeal of the “free market,” the housing industry, including the real estate agents and the home builders, still has enough clout to scare politicians about the consequences of destroying their businesses. In addition, even right-wing politicians are afraid of being accused of decimating homeownership opportunities for their constituents. Real estate agents, who are fairly evenly split between Democrats and Republicans, came out against Hensarling’s bill, and it went nowhere.

There’s a deeper problem with the purportedly free market approach. Barring a total restructuring of our whole financial system, getting rid of the GSEs would turn over the mortgage market to the biggest banks. But they were bailed out in 2008, too. Dodd-Frank may have addressed (if not fully fixed) the “too big to fail” issue by, for instance, demanding higher capital requirements on larger institutions. But if such big banks control the nation’s mortgage market, does anyone think they’ll be allowed to fail in the next crisis? In which case, how are they not government-supported entities, as well? Not to mention entities whose political power would make the old Fannie Mae look like a pipsqueak.

There’s an argument, most prominently made by the think tank Bipartisan Policy Center as well as some former administration officials and analysts, that we should be able to put in place a perfect new system, one without Fannie, Freddie, or big banks. “It is simply not true that we are forced to choose between one system dominated by Fannie Mae and Freddie Mac and another dominated by a few huge banks,” wrote Jim Parrott, a former administration official who now consults for various financial services companies, including Bank of America, and Mark Zandi, the chief economist at Moody’s Analytics, who also serves on the board of a large mortgage insurer. (“A Revolving Door Helps Big Banks’ Quiet Campaign to Muscle Out Fannie and Freddie” was the headline of another recent piece by the New York Times’s Gretchen Morgenson.) Newspapers with editorial desks that are opposed to the GSEs, including the Washington Post, often opine on how this “new system” will simultaneously get rid of the GSEs, preserve access to affordable housing, not give control to the big banks, and protect taxpayers. In short, nirvana!

It would be nice if we could achieve nirvana in housing finance. But if it is possible, no one has shown precisely how it would work. With the housing finance system, the devil is often in the missing details, and the one bill that Congress did seriously undertake (a bill that was supported by both Parrott and Zandi) shows how difficult those details can be. In 2014, bipartisan legislation sponsored by Tennessee Republican Bob Corker and Virginia Democrat Mark Warner passed the Senate Banking Committee. The legislation would have killed Fannie and Freddie but preserved a government backstop for the mortgage markets in the form of a new entity. Small lenders were opposed to the bill, because despite reassuring language about how this wasn’t a big-bank giveaway, they viewed it as precisely that. Affordable-housing activists opposed it because while it offered subsidies for the poor, it did nothing for the bulk of lower- to middle-income Americans because it didn’t offer the cross-subsidization created by the GSEs. The housing finance expert Joshua Rosner, who is a managing director at the research consultancy Graham Fisher, noticed that although there was a requirement that private capital bear risk ahead of the government, in the fine print there was a provision that the requirement could be waived—meaning that in bad times, all the risk would go to the government. And, of course, the government was still backstopping the mortgage market. While some in the Obama administration, most notably Gene Sperling, who was then serving as the director of the National Economic Council, worked hard to pass the bill, the administration as a whole didn’t put its weight behind it—President Obama didn’t make any calls to senators who were on the sidelines. The bill ultimately stalled.

Supporters of the bill insist that it is fundamentally different from the current GSE system—but the key component, which is a government backstop, would remain. Says one Wall Streeter about the bill, “It’s like ripping up the whole national highway system just to build another one next to it.” We take for granted the functioning of the highway system, just as we take for granted that the price we’re quoted for our mortgages is going to be in place when we go to the closing—indeed, that the mortgage will be available at all. Even if an untested new system eventually worked, there would for sure be glitches along the way.

As a large aside, neither this bill nor any other proposal has addressed how to capitalize the new system, or what to do with the existing $5 trillion in GSE securities, 15 percent to 20 percent of which are in the hands of foreign banks.

The Co-Conspirator: Virginia Senator Mark Warner joined Corker in trying to take down the GSEs.

In short, it’s easy to say we should kill the GSEs until you start thinking about the alternatives. This is why Frank Raines used to say privately in the wake of the crisis, “We might call them Dick and Harry, but give it ten years, and there they will be.”

Keeping Fannie and Freddie in any form is an outcome to which many, including the Obama administration, are furiously opposed. Some of it is due to the widespread belief that the bailout of Fannie and Freddie proves that their business model was fatally flawed, even though many of the people who say this don’t say the same thing about the big banks. Some is due to the legitimate fear that any entity that has access to any type of government subsidy (even if the guarantee is explicit, rather than implicit), and that operates in an area as politicized as homeownership, will inevitably become corrupt. Some of it is due to the personal animus toward the GSEs that exists in much of Washington. “When they were in their prime, they rolled over a lot of people in [Washington],” one close observer says. “Now, people are getting even. There’s a lot of that out there. I don’t care which side of the aisle you’re on.” And some of it is due to the power of the idea that the GSEs’ low- and middle-income housing goals were solely responsible for the crisis.

But some of the opposition to Fannie and Freddie is, ironically enough, a direct function of who is pushing to bring them back. Surprisingly, it is some of the most powerful hedge funds in the country.

You might recall that when the government took over the GSEs, it left roughly 20 percent of the common shares outstanding and trading, as well as the preferred stock that had been sold in the run-up to conservatorship. Despite the rhetoric surrounding the GSEs, Lockhart even said that the goal was to return Fannie and Freddie “to normal business operations” and that “both the preferred and common shareholders have an economic interest in the companies . . . and going forward there may be some value in that interest.”

The Profiteer: After making nearly $4 billion from shorting subprime securities, John Paulson bought up shares in GSEs.

In the dark years following the bailout, the GSEs appeared to be racking up tens of billions of dollars in losses. But some investors noticed that the situation wasn’t nearly as dire as it appeared. Under the terms of their bailout, Fannie and Freddie were required to draw money based not on current cash losses or needs, but on when their net worth fell below zero. Net worth is an accounting concept that takes into account estimates of future losses, so Fannie and Freddie were required to draw money based on estimates that they would lose billions in the future. But these estimates turned out to be way too high.

In addition, the bailout forced Fannie and Freddie to pay a 10 percent dividend back to the Treasury on any money they took. Because the dividend payment further reduced their net worths, they had to draw additional money from Treasury to fill the hole created by the dividend payment. According to an FHFA official, around $45 billion of Fannie and Freddie’s $187 billion bailout consisted of draws that took money from Treasury only to round-trip it right back to Treasury to pay the dividend. “It was a complete payday-lender situation,” says someone close to the situation.
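To make the circularity concrete, here is a deliberately simplified sketch, in Python, of the pre-2012 draw mechanics described above: a draw is triggered whenever net worth (an accounting estimate, not a cash need) falls below zero, and the 10 percent dividend owed on cumulative draws itself pushes net worth down, forcing further draws that go straight back to Treasury. The loss figures are hypothetical and the model ignores operating earnings; it illustrates the mechanism, not the companies’ actual books.

```python
DIVIDEND_RATE = 0.10  # senior preferred dividend owed annually to Treasury

def simulate_draws(estimated_losses):
    """Return (total draws, portion of draws that existed only to cover the dividend)."""
    cumulative_draw = 0.0
    dividend_driven = 0.0
    for loss in estimated_losses:
        dividend_owed = cumulative_draw * DIVIDEND_RATE
        shortfall = loss + dividend_owed       # net worth falls by estimated losses plus dividend
        if shortfall > 0:
            cumulative_draw += shortfall       # draw from Treasury to get net worth back to zero
            dividend_driven += min(shortfall, dividend_owed)
    return cumulative_draw, dividend_driven

# Hypothetical loss provisions (in $ billions) that later prove too high, then stop.
total, round_tripped = simulate_draws([50, 40, 20, 0, 0, 0])
print(f"Total draws: ${total:.1f}B; draws made just to pay the dividend: ${round_tripped:.1f}B")
```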

Ultimately, Fannie Mae took almost $116.2 billion and Freddie Mac $71.3 billion from the U.S. Treasury, a total of $187.5 billion. One analysis done on behalf of a major investor shows that most of the losses were caused by non-cash charges such as provisions for loan losses that didn’t materialize. During the period in which the GSEs lost money, from 2007 to 2011, the provisions for losses exceeded the actual losses by $141.8 billion. According to this analysis, the combined equity deficiency of the GSEs was really only about $10 billion.

A handful of investors realized that when accounting rules required that the estimated losses be reversed, the GSEs would post staggering profits. And so they began buying up those preferred shares, which were still priced near zero. Some of them, like the hedge fund Perry Capital, had made fortunes betting against, or shorting, subprime mortgages in the run-up to the crisis. Paulson & Co., run by John Paulson, who made almost $4 billion from shorting subprime securities, bought shares. So did a hedge fund run by the Carlyle Group, a politically connected Washington, D.C.-based private equity firm. “We expected the political rhetoric,” says one investor. “We thought, ‘It’s easy for you to say you want to kill them, and that they are an endless black hole.’ But once they were profitable, we thought the rhetoric would change.”

Rhetoric aside, conservatorship is supposed to be governed by the law, which in essence says that the conservator must either “preserve and conserve” the GSEs and release them back, or throw them into receivership, in which case their assets would be distributed to shareholders. The investors argue that even a few years after the crisis, there were sufficient assets that the preferred stockholders would have gotten all the money they were owed.

But then on August 17, 2012, a sleepy summer Friday, Treasury and the FHFA changed the rules of the game. Going forward, instead of paying a 10 percent dividend, Fannie and Freddie would be required to send every penny they made to Treasury. If everything went to the government, then there was no value left for investors. Both the common and the preferred shares plunged in price.

The official explanation for this change is that the administration had no idea that the GSEs were about to become so wildly profitable, and so they executed the sweep of profits to prevent the GSEs from owing money they couldn’t pay. The sheer amount of money the GSEs started making immediately following the sweep makes it hard to believe this.

Another explanation is that the change in the deal came a year after the huge fight in Congress over raising the debt ceiling. Since that time, battles over spending have become commonplace. The profits generated by Fannie and Freddie, which go straight to Treasury, have at critical times helped buy breathing room, or, as Treasury Secretary Jack Lew said in recent congressional testimony, “As a practical matter, it’s what has helped us to reduce our overall deficit.” Thanks to the GSEs’ profits, federal spending was underreported by a combined $178 billion in 2013 and 2014, according to a paper by the Heritage Foundation. Not incidentally, there is no accountability for how the profits from Fannie and Freddie are spent; and once the money is spent, it is gone and cannot be used to buffer any losses they might suffer again.

Eventually, investors, including Perry Capital and Fairholme Capital Management, which manages around $10 billion on behalf of some 180,000 individual investors and a few institutions, sued. To date, around two dozen lawsuits have been filed, some of them by big investors, but others by individual stockowners and pension funds like the City of Austin Police Retirement System. What’s happened in the courts is a drama all its own, but the upshot is that it is impossible at this stage to guess what the outcome might be.

But the lawsuits are in some ways a sideshow to the question of what should be done with the GSEs, and this is the real battleground. What some investors really want is a stake in a recapitalized, albeit reformed, version of Fannie and Freddie, which, they argue, is the right solution—as well as one that would increase the value of their stock.

In response, the Obama administration has made it clear that they will not bring Fannie and Freddie back in any way, shape, or form. Officials refer derisively to the investors’ plans as “recap and release,” meaning that the GSEs would be allowed to build capital, and then we’d send them back out, exactly as they were before the financial crisis. At the Mortgage Bankers Association’s annual convention in October, Michael Stegman, former counselor to the secretary for housing finance policy at the Treasury and now senior policy adviser for housing at the White House, said recapitalizing the companies would be “turning back the clock to the run-up to the housing crisis.” He added that investors had bet big that the companies would be allowed to exit conservatorship “and they are doing everything they can to make sure those bets pay off.” Other officials speak in broad terms about “comprehensive housing finance reform.” As Antonio Weiss, the counselor to the secretary of the treasury, wrote in a recent op-ed, the administration “wants to transition to a better system, one that provides broad access to housing supported by a sound and robust mortgage market, without exposing taxpayers to another rescue.” Once again, nirvana! But, of course, without any details.

One reason for the unwillingness to consider any plan that releases Fannie and Freddie is that politicians don’t want to give up the stream of money flowing into Treasury from the GSEs. It’s also clear that the administration does not want to see investors get paid. (A Treasury official even wrote a memo to then Treasury Secretary Geithner before the 2012 profit sweep citing the “administration’s commitment to ensure existing common equity holders will not have access to any positive earnings from the GSEs in the future.”)

But everyone involved in the housing finance debate—most notably, the big banks that this administration has done so much to protect—has money at stake. Stegman has repeatedly referred to the “failed” GSE business model. But the idea that the GSEs failed relies on inflated loss figures. And if the bailout means the business model failed, then what about the big banks? Isn’t theirs a failed business model too?

The real issue isn’t whether investors get paid. It’s whether we have a housing finance system that makes sense. The investors aren’t the only ones who would like to see Fannie and Freddie reformed rather than eliminated. Others include civil rights organizations like the NAACP, which are worried about the plunge in minority homeownership rates since the crisis; affordable-housing advocates, who worry about what the world will look like without the GSEs (this summer, the Census Bureau reported that the homeownership rate had fallen to 63.4 percent, the lowest level in forty-eight years); and community banks and other small lenders, who don’t want to lose all their business to the big banks.

The Reformer: The author and analyst Josh Rosner has proposed a solution that would treat GSEs like utilities.

The best idea, whose most prominent backer is Graham Fisher’s Josh Rosner, is that the GSEs would operate as utilities, much like your electric utility, with a cap on the return they are allowed to earn, and regulated as such by a competent regulator with real teeth. The regulator, as Rosner writes, would “ensure that the firms employ their benefits of scale to minimize the costs to end-users while allowing them to earn acceptable, rather than excessive, rates of return.” They would operate somewhat as they did in the 1980s, before all public companies faced inordinate pressure to grow their earnings and please investors. They would be well capitalized at a level consistent with that of other large financial firms, and they would no longer be able to hold mortgage securities on their own balance sheet. (Their portfolios of such securities have already shrunk dramatically.)

Rosner also writes that it is important that the GSEs serve as “countercyclical providers of liquidity.” What he means is that if the market is going crazy, and Wall Street is happily providing mortgage capital, the GSEs can and should stand back. That way, they will have dry powder if there are problems and private capital flees the market. There’s already a taste of how that might work. Today, the GSEs are selling a portion of the risk they insure to other investors. The current way the GSEs sell risk is not without its flaws, but it is a start to doing exactly what President Obama said he wanted, which is getting private capital in front of the government.

This idea isn’t perfect, especially if you believe any government involvement in business opens the door to eventual corruption. It also requires regulatory competence, which is something that has been in short supply in modern times.

One of the major objections is that there’s a conflict inherent in the GSE business model: they are publicly traded companies that owe a duty to investors, but they also have a congressional mandate to encourage homeownership. Critics say that it is impossible for a company to serve two masters. The utility structure would ease that tension, since investors in such a business would be looking for stability rather than turbocharged growth, but the two masters would remain.

But it’s also worth asking whether this conflict is truly the problem that critics make it out to be. Plenty of companies today talk about “stakeholder value” instead of “shareholder value.” Indeed, you can argue that a monomaniacal focus on shareholder value hasn’t served our markets so well. Isn’t there a counterargument in which the two mandates—serving homeowners, but with a focus on the bottom line—balance each other? After all, a company with a duty to homeowners but without any responsibility to shareholders could be very dangerous indeed. The bottom-line responsibility, at least in theory, not only keeps the companies conscious of the risks they are taking but also helps attract a different sort of employee than a pure government bureaucracy might. And that is important. The mortgage market is fierce and fast-moving. The old Fannie and Freddie could hold their own with Wall Street traders, who are looking for any and every opportunity to make money off slow-moving government institutions. We do not want companies that are completely neutered to serve in this role.

It is true, though, that some wrinkles would emerge in this business model. While requiring the GSEs to get rid of their portfolios will make them less risky, it also means that they will be less profitable, which in turn means less money for affordable housing. While some investors say privately that they support the utility model—and it’s worth noting that there are none who advocate for a simple “recap and release”—it’s also not clear what sort of value owners of the GSEs’ common stock would be able to extract from this model. It’s quite possible that the odd alliance between investors and affordable-housing groups would break down in a bitter fight over who gets what piece of a much smaller pie.

But for citizens and taxpayers, it’s the right answer. We know that the basic infrastructure of the GSEs works, and worked well for fifty-plus years. On the other side, the argument that we shouldn’t settle for anything less than perfect sounds a whole lot less compelling once you realize that no one has a vision of perfect, let alone a plan to get there, nor any clue about what glitches or outright corruption might emerge in a new model. And there’s this: all the talk about “comprehensive” reform is just empty words. Reform is now in Congress’s hands, and one industry lobbyist says that everyone in Washington knows that, after the failure of Corker-Warner, the chance that Congress will act is nil. All the words are a pretext for doing nothing.

Yet there’s also a risk to doing nothing. The private market cannot resume functioning, even if it otherwise could, until the government decides what its role will be. More importantly, because the government has been taking all their profits, the GSEs at this point have less than $5 billion in equity supporting their more than $5 trillion in liabilities, leaving them with a capital ratio of roughly 0.1 percent. To put that in context, when the Federal Housing Administration, which is fully owned by the government, saw its capital fall below 2 percent, there was a political uproar over the potential loss to taxpayers. The situation is painfully ironic, given the widespread belief that capital is the one thing that makes the system safer. The largest banks are now required to have a capital ratio close to 5 percent. If there’s a recession and housing prices fall again, or if there’s a big swing in interest rates, Fannie and Freddie will have to be bailed out by taxpayers again. Don’t we deserve more of a plan than that?
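
For readers who want to check the arithmetic, here is a minimal sketch, purely illustrative, that uses only the round figures cited above (not official balance-sheet data) to compare those capital ratios:

```python
# Back-of-the-envelope comparison of the capital ratios cited above.
# Illustrative only: these are the article's round figures, not official
# balance-sheet data.

def capital_ratio_pct(equity: float, liabilities: float) -> float:
    """Equity as a share of liabilities, expressed as a percentage."""
    return equity / liabilities * 100

gse_equity = 5e9         # less than $5 billion in equity
gse_liabilities = 5e12   # more than $5 trillion in liabilities

print(f"GSEs:                 {capital_ratio_pct(gse_equity, gse_liabilities):.1f}%")  # ~0.1%
print("FHA uproar threshold:  2.0%  (capital below this sparked a political uproar)")
print("Largest banks:        ~5.0%  (roughly the current requirement)")
```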

In The Big Short, there’s a moment when Ryan Gosling tells the audience that he knows this stuff is really complicated, and it seems easier not to care, but that’s really dangerous, because what you don’t know can hurt you. When it comes to housing finance, Americans’ best interests have rarely dictated the answer, precisely because too few people care. That’s another thing the movie got right.

The post Mend, Don’t End, Fannie and Freddie appeared first on Washington Monthly.

Redlining from Afar: How Consolidation Killed Off St. Louis’s Exemplary Minority Lender https://washingtonmonthly.com/2016/03/13/redlining-from-afar-how-consolidation-killed-off-st-louiss-exemplary-minority-lender/ Sun, 13 Mar 2016 20:52:03 +0000 https://washingtonmonthly.com/?p=924

Just after Thanksgiving in 1996, a group of 146 St. Louisans, mainly from the black business community, boarded a chartered TWA jetliner and flew to New York City. Their goal? To block the takeover of their hometown Boatmen’s Bank by shutting down the New York Stock Exchange.

Led by the St. Louis lawyer Eric Vickers, the group included members of the St. Louis Minority Contractors Association, the National Black Chamber of Commerce, and the Minority Business Enterprise Legal Defense and Education Fund of Washington. Their ultimate target was Charlotte-based NationsBank, which had recently announced plans to spend $9.5 billion to buy the 150-year-old Boatmen’s, which many considered a cornerstone of the St. Louis business community.

According to company lore, the founder, George Knight Budd, started the institution in 1847 to assist the working-class longshoremen and boatmen who loaded and crewed the riverboats that plied the Mississippi. This tradition of focusing on working-class citizens and small businesses continued right into the 1990s, when federal bank regulators and black community groups praised Boatmen’s for having one of the nation’s most responsive minority lending programs—a bright spot in a city with a long, painful history of commercial redlining.

Bank regulators gave Boatmen’s an “outstanding” rating three consecutive times from 1992 to 1995 for its adherence to the Community Reinvestment Act (CRA) of 1977, which required lending institutions to meet the needs of borrowers in all communities. By 1993, Boatmen’s was providing $284 million per year under the CRA program.

The Small Business Administration’s St. Louis office ranked Boatmen’s as its top SBA lender in both the number of approved loans and dollar volume, a first for any bank in the Midwest. The bank’s Specialized Small Business Lender Program was designed specifically to increase lending to women- and minority-owned enterprises.

Vickers and a coalition of minority advocacy groups believed that NationsBank would dismantle Boatmen’s community lending programs. Their outrage was fueled by a lawsuit filed by John Relman, a fair-housing director at the Washington Lawyers’ Committee for Civil Rights and Urban Affairs, who alleged that NationsBank loan officers engaged in discriminatory lending practices nationwide. Minority customers in New York City, Memphis, New Mexico, and Texas, where Boatmen’s also had a notable presence, opposed the takeover too.

By the time Vickers and his group arrived in New York, police already had barricaded the streets around the New York Stock Exchange. Only employees who flashed their IDs could walk onto the trading floor. Harry Alford, president of the National Black Chamber of Commerce in Washington, protested, “The black community is getting shortchanged.”

Yet in the end, NationsBank barely had to sweeten the deal to get it past bank regulators at the Federal Reserve. The East Coast bank pledged $10 million in loans for affordable housing in St. Louis, and donated $4 million to Forest Park, the city’s preeminent, 1,300-acre green space.

Shortly thereafter, NationsBank cut 500 employees in its new St. Louis-area holdings, including fourteen senior managers. It also dismantled the local bank’s suite of minority-lending programs. Construction workers removed the old Boatmen’s signs, with their iconic paddlewheel riverboat logo, and replaced them with red-and-blue print that read, plainly, NationsBank.

Within a year, NationsBank merged with San Francisco-based BankAmerica, and the combined company took the Bank of America name. This time, the protests of black business leaders, in St. Louis and in cities across the nation, reached the White House. In July of that year, President Clinton joined Cathy Bessant, head of Bank of America’s community development banking group, on a platform near a newly built Walgreens store in East St. Louis. She outlined the bank’s plan to spend $500 million to “catalyze” investment in the region, and in other underserved areas across the nation.

But a year later, the grants awarded from the fund didn’t include any projects in the St. Louis metro area. Meanwhile, Community Reinvestment Act ratings indicate that Vickers and his supporters had the right idea when they flew to Wall Street two decades ago. The most recent performance evaluation for Bank of America’s St. Louis lending activity notes that “the geographic distribution of home mortgage loans is poor,” and that only one of its sixty branches serves residents in low-income areas.

Click here to read the main story, “The Real Reason Middle America Should Be Angry.”

The post Redlining from Afar: How Consolidation Killed Off St. Louis’s Exemplary Minority Lender appeared first on Washington Monthly.
