November/December 2011 | Washington Monthly

David Brooks’s mistake… Barack Obama’s successes… And his critics and their choice in 2012

Hidden capital

Consider the alternative

The company you keep

They know not what he does

The takeover

Christie as cynic

Regulation is not the problem

The danger of doing nothing

Learning on the job

When going public was bad for the public

The tune-up

How both sides got wiser

Medical laissez-faire

Who’s in charge?

How the Washington, D.C. area got rich

The right time to propose

Scandal in the Age of Obama

Why Washington feeding frenzies aren't what they used to be.

Barack Obama was not in office for more than a couple of minutes, it seemed, before conservatives began trying to cover him in muck. Yet for almost three years, the administration has been scandal-less, not scandalous. In a capital culture that over generations has become practiced at the art of flinging mud pies, Republicans and a few reporters have been tossing charges against a Teflon wall.

First there was the pathetic charge that Obama was born in Kenya and therefore ineligible to be president. The story was heavily stoked by the conservative media, aided by the non-denial denials of GOP leaders, and gave Donald Trump his fifteen minutes of presidential contender fame. It was never taken seriously by the mainstream press, but the president ended the story in April of this year by taking the unprecedented step of releasing his birth certificate, a tacit acknowledgment that the desperate effort to smear him had become a political distraction.

This came after two years in which Fox News and congressional Republicans tried to make the case that Obama had abused his power by hiring unaccountable “policy czars.” Though the agitation led to the ouster of a low-level White House policy aide, the broader charge never passed the Washington laugh test—policy czars have been fixtures in every administration since Richard Nixon’s.

During the 2010 campaign, Representative Darrell Issa, who would later become chairman of a key congressional investigative panel, called Obama “one of the most corrupt presidents in modern times.” He then walked the charge back for lack of evidence. Since taking over the committee, Issa has launched a series of investigations into alleged political malfeasance—including, for instance, that DNC fundraisers held in the White House violated the Hatch Act—that have so far yielded nothing.

Conservatives had especially high hopes about allegations that the Department of Health and Human Services (HHS) was illegally doling out waivers to the Affordable Care Act mandates to politically connected businesses and insurers. The conservative columnist Michael Barone called the health waivers part of Obama’s “gangster government.” But when the House Republicans demanded a GAO investigation, the GAO came back with bupkes.

Finally, at the end of August, the collapse of the solar energy company Solyndra into bankruptcy after a half-billion-dollar loan guarantee from the Department of Energy seemed to offer the Republicans the answer to their prayers. For several weeks, the creaky machinery of scandal in the press and on Capitol Hill revved up, with stories detailing how Solyndra executives had been in close contact with White House officials and how some of the company’s investors were high-profile political donors, having raised funds for Obama’s 2008 campaign. It certainly did not help that the president and vice president both touted Solyndra as a stimulus and clean energy success story, Obama doing so in a speech at the company’s factory last May.

But complications soon emerged. Though the loan guarantee was approved under Obama, the process to secure it had, in fact, begun in 2007 under President Bush. George Kaiser, an Obama fund-raiser initially implicated as the main influence peddler, was only one of the key investors in the firm—so were members of the Walton family, of Wal-Mart fame, and they had given generously to Bush. And some claims in the initial press reports—for example, that the Bush administration Department of Energy had rejected Solyndra’s loan application in its final days in office—later turned out to be flat-out wrong (the committee responsible for vetting the project delayed approval by two months, raising some concerns but also suggesting that “the project appears to have merit”).

As these wrinkles started to appear, Republicans did not help their cause when, predictably, they started overreaching. Instead of keeping a narrow focus on how the company obtained funding and then went bankrupt despite red flags, they began to use Solyndra’s failure to paint the entire loan guarantee program as a socialist Obama boondoggle. But it soon came to light that many of the loudest GOP critics, such as Senator Jim DeMint, had actually voted to establish the loan guarantee program back in 2005.

Investigations are ongoing and questions remain—for instance, what pressure, if any, did the White House exert on the Department of Energy to approve and maintain the company’s loan guarantee—so it’s possible that “Solyndra-gate” could explode into a full category 5 scandal. But on the strength of the evidence so far, that seems highly unlikely.

Obama’s presidency is nearly three years old. It has been nearly four years since the furor over Reverend Jeremiah Wright. During this period, Obama has been reviled by conservatives to a degree matched only by the venom directed at Franklin Roosevelt and, later, Bill Clinton. But the vituperation, distortions, and outright lies have been mostly about policies, alleged policies, conspiracy theories, and ludicrous fantasies.

In all that time—a record span, according to scholars—there has been no major Obama scandal to speak of. Some potential scandals—like Tom Daschle’s back taxes—have been nipped in the bud. (Two weeks after the inauguration, Daschle asked that his nomination to be HHS secretary not be sent to the Senate.) Other stories that might have paralyzed earlier administrations just fizzled.

The question is why.

First, a definition of terms. We’re not talking here about conspiracy theories or self-evident partisan sniping. A true scandal, as Dartmouth political scientist Brendan Nyhan puts it, is a “socially constructed event” in which elites decide that a sufficient perception exists that “a public figure has acted in a manner that contravenes established moral, political, or procedural norms.” In other words, a story becomes a scandal when the mainstream press begins treating it as such. According to Nyhan’s criteria, it’s when news stories that use the word “scandal” in the reporter’s own voice start appearing on the front page of the Washington Post. (That hasn’t happened yet with Solyndra.)

The question is, have Obama and his administration objectively engaged in less scandalous behavior, or has some combination of external forces kept scandals from spreading through the public consciousness? And if Obama has managed to build a scandal-proof administration, is that purely a good thing, or has it come at a cost?

Here I humbly offer some theories about how the Washington scandal machine works, and why there has been such a dearth of scandals in the age of Obama. There is, as usual, no one answer; the explanation is likely to involve some combination of the factors cited below.

THE PATTERN-OF-BEHAVIOR THEORY

During the 2008 campaign, Senator Obama, after some prodding by aides, made the death of his mother a part of his stump speech. While delivering his pitch on reforming health care, he told the moving story of Ann Dunham, who died at age fifty-two of ovarian cancer after spending the last months of her life fighting with her insurance company.

But in “A Singular Woman: The Untold Story of Barack Obama’s Mother,” Janny Scott, a reporter for the New York Times, tells a different story. She reports that Dunham’s fight with the insurance company CIGNA was not over medical coverage but over a disability claim related to missed work resulting from the cancer. In truth, Scott writes, Dunham had an employer-provided health insurance policy that paid her hospital bills directly, leaving her “to pay only the deductible and any uncovered expenses, which, she said, came to several hundred dollars a month.”

CIGNA behaved obnoxiously in penalizing Dunham for seeing an unapproved doctor in Hawaii, and she did spend some of her last days battling that insurance company, as Obama said. But his mother was not discriminated against because of a preexisting condition, and the disability insurance that was troubling her was not ultimately part of Obama’s health care proposal.

The fact that the president based a chunk of his campaign—and the centerpiece of his legislative program—on a story that wasn’t quite true never registered in the public debate. It never became a major topic of conversation, much less a scandal, and only in part because his mother is dead.

The bigger reason is that it didn’t conform to any perceived pattern in Obama’s behavior. A critical variable in aggressive press coverage is whether a story is consistent with what we think we already know about a politician. If it is, the story is more likely to resonate. If Obama had developed a reputation for tall tales about his background or his family, this story might have ignited.

Contrast this to Richard Nixon, who was seen as shifty and corrupt going back at least to when Dwight Eisenhower nearly dumped him from the GOP ticket in 1952. (He saved himself with the lachrymose “Checkers speech.”) So reporters, egged on by their liberal friends, were prepared to believe the worst when Watergate came along. Even then, it took many months after the June 1972 break-in and aggressive reporting by Bob Woodward and Carl Bernstein before the scandal broke.

Jimmy Carter came to office in 1977 surrounded by fellow Georgians and carrying a reputation for cold political calculation cloaked in piety. So when his budget director, a good ole boy named Bert Lance, ran into trouble with a bank he owned, the press pounced. New York Times columnist William Safire, determined to prove that Carter was as corrupt as his old boss Nixon, won a Pulitzer for his hounding of Lance (the two later became friends). Unlike Obama’s mother, the president’s clownish brother, Billy Carter, who farmed and owned a gas station, was fair game, especially after he began scheming for business deals with the government of Libya. The Billy Carter scandal coverage also fell into a well-worn genre of reporting on oddball presidential siblings (Sam Johnson and Donald Nixon) and presidential sons (Elliott Roosevelt and Neil Bush).

Ronald Reagan was routinely depicted as detached, even out to lunch. So the Iran-Contra scandal, in which an off-the-shelf foreign policy of trading arms for hostages was run by Oliver North from the White House basement without the president’s knowledge, fell in fertile soil. So did now-forgotten scandals involving a government contractor called Wedtech, Labor Secretary Raymond Donovan, and EPA Administrator Anne Gorsuch Burford, who was held in contempt of Congress and forced to resign for trying to gut her agency.

Reagan’s successor, George H. W. Bush, ran a relatively clean administration, but he inherited Reagan’s lax view of regulation, which gave some juice to a scandal at HUD in 1989. The savings-and-loan failures, and the resulting bailouts (which cost the taxpayers far more than it looks like TARP will), fit into a pattern of crony capitalism and were often covered as scandals. John Sununu had to resign as White House chief of staff after charging the government for unauthorized trips. That story might not have had legs if it hadn’t fit a different kind of pattern: the press thought Sununu was a jerk and was looking for a way to make him pay for it.

Bill Clinton was dogged by stories of womanizing (Gennifer Flowers) and real estate shenanigans (Whitewater) before coming to office, and the media obsession with them continued once he got there. In those pre-blogging days, the White House had no media defense against right-wing assaults and pseudo-scandals like the travel office firings (allegation: that the Clinton White House used the FBI to go after a former travel office official named Billy Ray Dale), trumped-up charges against former HUD Secretary Henry Cisneros, alleged Chinese spies, and on and on. As Hillary Clinton complained to me at the time, the so-called “liberal” media—the New York Times and NPR—did nothing to defend the Clintons against coordinated assaults by the Wall Street Journal editorial page, talk radio, and right-wing staffers on the Hill. Worse, this was the era of the special prosecutor, an institutionalizing of the scandal culture that gave reporters a steady stream of leaks, culminating in the Monica Lewinsky story.

Under George W. Bush, scandals like torture at Abu Ghraib prison, the outing of CIA operative Valerie Plame, cozy contracts with Halliburton (Dick Cheney’s former employer), and the corrupt firing of U.S. attorneys who didn’t toe the White House line all fit a pattern of abuse of power after 9/11. Cheney and others were determined to restore the pre-Watergate culture of unaccountable authority in times of war, and they largely succeeded.

With Obama, the perceived pattern of behavior that he carried with him into office was mostly positive. Being seen as a professorial type who stands above the fray hasn’t always endeared him to the public, but it hasn’t exactly set the stage for scandal either. That, plus his history-making debut as the first African American president and the intense news climate of 2009, may have given him a longer-than-usual honeymoon from the scandal machine.

THE NEWS CLIMATE THEORY

This is the political science theory of the case. There has been surprisingly little scholarship about scandal, but Nyhan, the political scientist, has set out to change that.

Once you get past the abstruse equations that have come to define modern political science, Nyhan’s explanation is simple, and he is on to something. His study of scandals going back to the Carter administration suggests that two factors are especially important for a scandal to catch fire: first, an opposition party that views the president more negatively than normal; and second, a slow news period that allows scandals to emerge. Under this theory, scandals are a “co-production” of the press and the opposition party, with each feeding off the other. The party needs the press to publicize allegations of wrongdoing, and the press needs quotes from partisans to legitimize scandal reporting and protect itself against charges of bias.

Nyhan says that Obama’s extremely low standing among Republicans is a “key risk factor,” but that the second variable, a slow news environment, hasn’t been present. He explains how Bush set the modern record of thirty-four months without a negative scandal story on the front page of the Washington Post—the period between his inauguration in January 2001 and the Valerie Plame scandal in October 2003—because of 9/11 and the Iraq War. Now, thanks to the financial crisis, the Deepwater Horizon oil spill, the Arab Spring, the shooting of Representative Gabrielle Giffords, the earthquake and tsunami in Japan, the killing of Osama bin Laden, the debt ceiling crisis, and the threat of a second recession, Obama has broken Bush’s record.

Writing in May, before the Solyndra affair broke, Nyhan predicted that Obama’s scandal-free streak would not last much longer. His forecasting model showed that the chances of an Obama scandal would exceed 95 percent by early 2012. Republicans, he says, are shifting to the National Labor Relations Board’s tiff with Boeing (right-wingers charge without evidence that Obama partisans have assembled an “enemies list”), and to the ATF’s bungled Operation “Fast and Furious” (in which straw purchases intended to track the flow of guns into Mexico let some weapons fall into the hands of criminals), which Representative Issa claims could become “Obama’s Iran-Contra.”

If Solyndra continues to heat up, it will be partly because it connects so directly to disappointment over Obama’s performance on green jobs in particular and the economy in general. In a way, it’s a story that feeds on the dominant news of the day—and the central jobs theme of the 2012 campaign.

Scandals unrelated to such major themes are unlikely to go anywhere in election years, as I learned firsthand many years ago. When Bill Clinton was running for reelection in 1996, I was a regular visitor to Dick Morris’s suite at the Jefferson Hotel. Morris was then a shadowy figure who almost never talked to the press. (When I interviewed him and his sidekick, Mark Penn, I had no idea that a few weeks later a supermarket tabloid would reveal that Morris was getting his toes sucked in the suite by a prostitute.) At the time, Morris’s client, Clinton, was the subject of daily scandal stories about secret Asian fund-raising. Morris waved off the stories, explaining that only a mammoth scandal like Watergate ever has any actual influence over voters. That’s especially true in a bad economy.

THE ETHICAL-TONE THEORY

A fish rots from the head. But it also navigates from the head. The direction a leader charts sends a message to underlings: You’d better follow. Even if the rest of the country is peeling off, the people who work for a president take their cues from him.

From the start, Obama has sent a message of intolerance not just of corruption but even of controversy within his administration. He shoots first and asks questions later when it comes to firing someone who hasn’t yet been proven to have erred.

In July, Assistant Labor Secretary Raymond Jefferson, a wounded Special Forces vet, was alleged to have steered contracts to friends. He was forced to resign, even though the inspector general of the Department of Labor never referred the matter to the Justice Department. Even less well known was the ouster of a National Endowment for the Arts communications official who reportedly tried to get artists to create pro-Obama works of art. In 2009, the White House forced the resignation of Louis Caldera, the director of the White House Military Office, after a screwup over an Air Force One publicity flight over New York. And of course there was the ousting of Van Jones, a White House adviser on green jobs, over some dubious past statements and untrue charges that he had been a “truther” (someone who believes that the U.S. knew of the 9/11 attacks in advance).

The most regrettable dismissal so far was of Shirley Sherrod over a “scandal” trumped up by Andrew Breitbart. In that case, Agriculture Secretary Tom Vilsack, not the White House, was responsible for jumping the gun. (And he acted in part because even the NAACP had believed Breitbart’s maliciously edited tape of Sherrod’s speech on race.) But the tone had been set by the man at the top.

Compare that hair-trigger willingness to toss out officials who are in any way fodder for scandal with Bush’s hesitancy to fire anybody, no matter how scandalous. His Enron-linked Army secretary, Thomas White, was the classic example. Joshua Green, who wrote the definitive piece about White and the Bush administration’s tolerance for corruption, in this magazine (“The Gate-less Community,” July/August 2002), noted that White himself was surprised at how long Bush let him stay in office.

If Obama is so adamantly focused on running a clean administration, it may be because he is familiar with the alternative. His political roots are in Chicago, but he isn’t a product of the Chicago machine, his employment of Bill Daley as his chief of staff notwithstanding. In 2005, he and his wife got the help of soon-to-be-convicted influence peddler Tony Rezko on the purchase of a piece of property adjacent to his house on Chicago’s South Side, but he otherwise steered clear of the lowlifes of Illinois politics. He was never a governor and thus never enmeshed in the “pay to play” system especially endemic to governorships, whereby campaign contributors pony up in not-so-subtle anticipation of state government contracts. (Beyond disgraced Illinois Governor Rod Blagojevich, the most conspicuous current example of sleazy pay to play is Texas Governor Rick Perry.) When Blagojevich went down, Obama noted that people go into politics for two reasons: to make money or to serve the public. It’s a simple and useful distinction; whatever one thinks of his presidency, Obama is clearly in the latter category.

From the beginning of his administration, Obama was determined to set a high ethical standard. Yes, when the economy was on the brink of a depression, he stood behind Treasury secretary-designate Tim Geithner even though the man almost certainly chiseled on his taxes. And he granted waivers in a few cases to officials who had been lobbyists (for example, Deputy Defense Secretary William Lynn, a former Raytheon lobbyist and procurement expert). But the relevant part of the revolving door is what happens at the exit. And here Obama signed an executive order barring everyone serving in his administration from lobbying their former colleagues until the president leaves office. His vetting process was so over the top (thanks in part to Senate Finance Committee staffers) that it prevented many good people from taking government jobs because of minor infractions. (One White House aide-designate had to withdraw her nomination because of a meaningless lien on a piece of property.) But the high bar no doubt also prevented some bad people from slipping through.

It’s a trade-off: Obama got fewer swashbuckling entrepreneurs from outside of Washington who have real-world experience and perhaps a few creative ideas—but he also got fewer sleazeballs.

THE FAMILY-MAN THEORY

Of course the most entertaining and explosive scandals involve sex, which reporters and pundits will ride all day and night. But you’ve got to give them something to work with. As far as we know, the president, the vice president, the top White House staff, and the Cabinet members are either committed family men and women or single. Nowadays you need flagrant adultery—or Anthony Weiner-style weirdness—to get some traction with sex. Barack Obama and an intern? Highly unlikely. The first lady would kill him, cover it up, look fabulous at the state funeral—and no one would be any the wiser.

THE OVER-THE-TOP-OVERSIGHT THEORY

When scandals aren’t about sex, they’re usually about money. In Obama’s Washington, the real money is in the American Recovery and Reinvestment Act (ARRA), better known (to the dismay of the White House) as “the stimulus.” The stimulus totaled more than $787 billion, an enormous sum and far more in constant dollars than Franklin Roosevelt spent in his first year in office. But where FDR’s early New Deal spending programs were so wasteful that a word was coined for them—”boondoggles”—Obama’s stimulus, while much maligned for not producing more jobs, has been astonishingly clean so far.

The explanation lies in two complementary areas: tough management and transparency. Obama assigned Vice President Biden responsibility for supervising the stimulus, and Biden has performed well, working closely with Earl Devaney, who runs the ARRA oversight. Devaney is a former Secret Service agent and crime-busting investigator for Treasury, Interior, and the EPA who helped put Jack Abramoff away. His Recovery Board includes twelve fellow inspectors general (with seventeen others helping), an oversight structure that didn’t exist during the New Deal and has often been weak in the past.

So far there’s been no major fraud—only a couple million dollars stolen out of the $787 billion. That’s sure to grow, but the 270,000 contractors know that there are a lot of eyes on them. “If you’re a burglar and one house is well lit, you’ll go instead to one that’s not well lit,” Devaney likes to say.

The way he’s lit the house is with Recovery.gov, a Web site with the subtitle “Track the Money.” If entities that have received stimulus money don’t file quarterly reports on how they’ve spent their awards, they will see their names publicized on the site. Sure enough, you can click on “Non-compliers” and learn the names of the 367 wryly named “Non-Compliant Award Recipients.” Almost all the other recipients of grants and loans have spelled out how the money was spent. That doesn’t mean it was spent wisely, but at least there’s less of a chance that it was stolen.

Of course, we can’t know for sure. The decimation of reporting at the state and local levels means there are few reporters anywhere who make it their business to scrutinize stimulus spending.

THE DISTRACTED-REPORTER THEORY

This is true at the national level, too. Except for a brief moment after Watergate, investigative reporting has never been especially fashionable in Washington. But it has reached a low ebb today. Talk is cheap and reporting is expensive, which means that cable networks, blogs, and even newspapers are moving away from working the streets to working the TV studios. Even experienced reporters often find big chunks of their days consumed by tweeting and trying to get on television. It’s awfully hard to break a scandal that way.

The old business model for journalism is dead, and a new one is still struggling to be born. “There’s less manpower working full-time on investigative reporting on this White House and Congress,” says Mark Feldstein, an investigative reporter turned journalism professor. Investigative reporters, who often work months on a single story, “are the most expensive to hire and first to go.” Feldstein’s recent book, Poisoning the Press: Richard Nixon, Jack Anderson, and the Rise of Washington’s Scandal Culture, depicts a time when politicians like Nixon and reporters like Anderson had no scruples about lying, cheating, and stealing to enhance their power. (Were Anderson alive, Feldstein says, he wouldn’t hesitate to hack voice mails for a good story.)

Scandals in the old days were juicy and full of names, thereby increasing demand for them. By collecting scalps, yesterday’s tabloid reporters galvanized change faster than many of today’s worthy media projects, which data-mine effectively but often feel more like GAO reports than sexy scoops. Today, Anderson and other inside-dope columnists like Safire, Rowland Evans, and Robert Novak have been replaced by … almost no one. That’s a good thing in terms of improved accountability and accuracy, but it means fewer stories challenging powerful interests. For every Dana Priest, Seymour Hersh, or Michael Isikoff breaking rocks in a painstaking process, there are 200 journalists for whom “shoe leather” means, well, shoe leather. To them, “Digg” is a popularity-based news aggregator, not a way to find stories.

THE INVESTED-INVESTIGATORS THEORY

Even at its peak, investigative reporting depended on officials with subpoena power. These sources in law enforcement and on Capitol Hill fed various reporters, who had few tools for uncovering wrongdoing on their own. But this system depended on sources who were, if not nonpartisan, at least committed to some version of the truth that stood above blatant political ax grinding. Nowadays everyone has an agenda, which makes the information offered the press more suspect.

For instance, Issa, who chairs the House Oversight and Government Reform Committee, has no credibility in his probes of the Obama administration. As the New Yorker and the New York Times have amply demonstrated, Issa is a sketchy character with shady business interests that he has used his public office to advance. He promised a new investigation of Obama every week, but each area of inquiry looked like chicken feed even to Fox News, which always stands ready to inflate the tiniest story about the Obama administration into the new Watergate.

Even when Fox and its allies in the powerful conservative media establishment get traction on a supposed scandal, it doesn’t grip the nation like scandals of old because of its origins in partisan reporting. The stories are combated by fierce blogging on the liberal side. For instance, when Van Jones was wrongly accused on Fox of believing 9/11 was a U.S. plot, a counterassault by liberals kept the mainstream press from taking up the story in a big way. It was seen in a partisan context, which had the effect of protecting the president from a feeding frenzy.

This is a paradox of our hyper-partisan culture. On cable, noise gets the ratings. But at the networks and big papers, some of the old rules still apply. Disinterested journalism makes for more interested readers. The stories they report have more heft when they seem motivated by nothing more than a commitment to good government.

THE OBAMA PARADOX THEORY

An even bigger paradox involves Obama and the power of the presidency. The essence of power is getting people to do what they don’t want to do with carrots and sticks. The carrots can morph into bribery and the sticks into blackmail and extortion, as earmarks and campaign contributions become catnip for corruption.

But some of these techniques are the flip side of political success. Feldstein notes that neither FDR nor LBJ was a lawyer. They were interested in results, not strict adherence to the law. “You don’t get the sense that Obama relishes exercising power,” says Feldstein. “He’s both cleaner and less effective than some of his predecessors.”

I’d amend that argument on the effectiveness front; the president has won more than he’s lost over the last couple of years. But whatever his successes and failures in office, he is, as Joe Biden got in trouble for saying in 2007, “articulate and bright and clean.” Polls consistently show that the public agrees. Integrity is a nice calling card in a bruising election. If he manages to get reelected amid sky-high unemployment, this will be a big reason why.

A Geography Lesson for the Tea Party

Even as the movement’s grip tightens on the GOP, its influence is melting away across vast swaths of America, thanks to centuries-old regional traditions that few of us understand.

When 2011 began, the Tea Party movement had reason to think it had seized control of Maine. Its candidate, Paul LePage, the manager of a chain of scrappy surplus-and-salvage stores, had won the governor’s mansion on a promise to slash taxes, regulations, spending, and social services. Republicans had captured both houses of the state legislature for the first time in decades, to the surprise of the party’s leaders themselves. Tea Party sympathizers had taken over the GOP state convention, rewriting the party’s platform to demand the closure of the borders, the elimination of the Federal Reserve and the U.S. Department of Education, a prohibition on stimulus spending, a “return to the principles of Austrian Economics,” and a prohibition on “any participation in efforts to create a one world government.” A land developer had been put in charge of environmental protection, a Tea Party activist had been made economic development chief, and corporate lobbyists served as the governor’s key advisers. A northern New England state’s rather liberal Democrats and notoriously moderate Republican establishment had been vanquished.

Or so they thought.

Less than a year later, it’s Maine’s Tea Party that’s on the wane. Prone to temper tantrums and the airing of groundless accusations, Governor LePage—who won office by less than two points in a five-way race, with just 38 percent of the vote—quickly alienated the state party chair and GOP legislative leadership. His populist credentials were damaged when it was revealed that much of his legislative agenda—including a widely condemned proposal to roll all state environmental laws back to weak federal baselines—had been literally cut and pasted from memos sent to his office by favored companies, industrial interests, or their lobbyists. His economic development commissioner was forced to step down after allegedly insulting several (previously friendly) audiences, while a court ruled that his environmental protection nominee violated conflict-of-interest provisions. He triggered international media coverage, a lawsuit, and large protests after removing a mural depicting the history of Maine’s labor movement from the Department of Labor because an anonymous constituent compared it to North Korean “brainwashing.” Eight of twenty GOP state senators blasted the governor’s bellicose behavior in an op-ed carried in the state’s newspapers, the largest of which declared in April that “the LePage era is over.” Power in the state’s diminutive capital, Augusta, now resides with the senate president, a Republican moderate who was Senator Olympia Snowe’s longtime chief of staff.

The Tea Party itself has been all but destroyed in Maine by its association with the debt ceiling hostage takers in Washington, according to Andrew Ian Dodge, founder of the organization Maine Tea Party Patriots and the state movement’s most high-profile activist. “There were people saying, ‘Yes, I think we should default,’ and there were the rest of us saying, ‘You’re insane,’ ” says Dodge, a dark-horse challenger to Snowe. “Now I’m emphasizing my Tea Party links even less because a lot of people think they are the crazy people who almost drove us off a cliff.”

Indeed, in much of the northern tier of the country, the Tea Party has seen a similar reversal of fortune. Wisconsin Governor Scott Walker—who won by just 6 percent—has faced powerful resistance to his deregulatory, antiunion, antigovernment agenda, including the recall of two of his allies in the state senate; his political future is uncertain. In Massachusetts, Tea Party-backed Senator Scott Brown has emerged as a moderate Yankee Republican along the lines of Snowe. In New Hampshire, Tea Party organizer Jack Kimball stepped down as state party chair this September after losing the confidence of the state’s leading Republicans. “This is the establishment Republicans versus the Tea Party that helped get them into office,” one angry Tea Party activist said of Kimball’s departure. “They rode us in, now they’re bringing us back to the barn.”

When the Tea Party burst onto the national scene in the summer of 2010, it looked like a national movement. From Wasilla, Alaska, to Augusta, Maine, it dominated GOP rhetoric and produced candidates at virtually every level of government and in every section of the country. But over the past year, even as its grip on the national GOP has strengthened, its influence has melted away in large swaths of the northern half of the continent, its activists forced to confront the fact that their agenda and credo are anathema to the centuries-old social, political, and cultural traditions of these regions. The Tea Party agenda may hold sway over large parts of the South and interior West, and with the economy and the president in such a weakened state a Tea Party favorite like Rick Perry could conceivably win the White House. But the movement has no hope of truly dominating the country. Our underlying and deeply fractured political geography guarantees that it will never marshal congressional majorities; indeed, it almost guarantees that the movement will be marginalized, its power and influence on the wane and, over large swaths of the nation, all but extinguished.

We’re accustomed to thinking of American regionalism along Mason-Dixon lines: North against South, Yankee blue against Dixie gray or, these days, red. Of course, we all know it’s more complicated than that, and not just because the paradigm excludes the western half of the country. Even in the East, there are massive, obvious, and long-standing cultural fissures within states like Maryland, Pennsylvania, Delaware, New York, and Ohio. Nor are cultural boundaries reflected in the boundaries of more westerly states. Northern and downstate Illinois might as well be different planets. The coastal regions of Oregon and Washington seem to have more in common with each other and with the coasts of British Columbia and northern California than they do with the interiors of their own states. Austin may be the capital of Texas, but Dallas, Houston, and San Antonio are the hubs of three distinct Texases, while citizens of the two Missouris can’t even agree on how to pronounce their state’s name. The conventional, state-based regions we talk about—North, South, Midwest, Southwest, West—are inadequate, unhelpful, and ahistorical.

The real, historically based regional map of our continent respects neither state nor international boundaries, but it has profoundly influenced our history since the days of Jamestown and Plymouth, and continues to dictate the terms of political debate today. I spent years exploring the founding, expansion, and influence of these regional entities—stateless nations, really—while writing my new book, American Nations: A History of the Eleven Rival Regional Cultures of North America. It demonstrates that our country has never been united, either in purpose, principles, or political behavior. We’ve never been a nation-state in the European sense; we’re a federation of nations, more akin to the European Union than the Republic of France, and this confounds both collective efforts to find common ground and radical campaigns to force one component nation’s values on the others. Once you recognize the real map, you’ll see its shadow everywhere: in linguists’ dialect maps, cultural anthropologists’ maps of the spread of material culture, cultural geographers’ maps of religious regions, and the famous blue county/red county maps of nearly every hotly contested presidential election of the past two centuries. Understanding America’s true component “nations” is essential to comprehending the Tea Party movement, just as it clarifies the events of the American Revolution or the U.S. Civil War.

Our regional divides stem from the fact that the original clusters of North American colonies were settled by people from distinct regions of the British Isles—and from France, the Netherlands, and Spain—each with their own religious, political, and ethnographic characteristics. For generations, these discrete Euro-American cultures developed in remarkable isolation from one another, consolidating their own cherished principles and fundamental values, and expanding across the eastern half of the continent in nearly exclusive settlement bands. Some championed individualism, others utopian social reform. Some believed themselves guided by divine purpose, others championed freedom of conscience and inquiry. Some embraced an Anglo-Protestant identity, others ethnic and religious pluralism. Some valued equality and democratic participation, others deference to a traditional aristocratic order modeled on the slave states of classical antiquity. Throughout the colonial period and the Early Republic, they saw themselves as competitors—for land, settlers, and capital—and even as enemies, taking opposing sides in the English Civil War, the American Revolution, and the War of 1812. Nearly all of these regional cultures would consider leaving the Union in the eighty-year period after Yorktown, and two went to war to do so in the 1860s. Immigration enriched these nations—or, more accurately, the nations that were attractive to immigrants—but it did not fundamentally alter the characteristics of these “dominant” cultures; the children and grandchildren of immigrants didn’t assimilate into an American culture, instead tending to assimilate to the norms of the regional culture in which they found themselves. There’s never been an America, but rather several Americas, and there are eleven today.

Yankeedom
Founded on the shores of Massachusetts Bay by radical Calvinists as a new Zion, Yankeedom has, since the outset, put great emphasis on perfecting earthly society through social engineering, individual self-denial for the common good, and the aggressive assimilation of outsiders. It has prized education, intellectual achievement, community (rather than individual) empowerment, and broad citizen participation in politics and government, the latter seen as the public’s shield against the machinations of grasping aristocrats, corporations, and other tyrannies. From its New England core, it has spread with its settlers across upper New York State, the northern strips of Pennsylvania, Ohio, Illinois, and Iowa, parts of the eastern Dakotas, and on up into the upper Great Lakes states and Canada’s Maritime Provinces.

New Netherland
Established by the Dutch at a time when the Netherlands was the most sophisticated society in the Western world, New Netherland has displayed its salient characteristics throughout its history: a global commercial trading culture—multiethnic, multireligious, and materialistic—with a profound tolerance for diversity and an unflinching commitment to the freedom of inquiry and conscience. Today it comprises Greater New York City, including northern New Jersey, western Long Island, and the lower Hudson Valley. Like seventeenth-century Amsterdam, it emerged as a leading global center of publishing, trade, and finance, a magnet for immigrants, and a refuge for those persecuted by other regional cultures, from Sephardim in the seventeenth century to gays, feminists, and bohemians in the early twentieth. Not particularly democratic or concerned with great moral questions—it sided with the South on slavery prior to the attack on Fort Sumter—it nonetheless has found itself in alliance with Yankeedom in defense of a shared commitment to public-sector institutions and a rejection of evangelical prescriptions for individual behavior.

The Midlands
America’s great swing region was founded by English Quakers, who believed in man’s inherent goodness and welcomed people of many nations and creeds to their utopian colonies on the shores of Delaware Bay. Pluralistic and organized around the middle class, the Midlands spawned the culture of Middle America and the Heartland, where ethnic and ideological purity have never been a priority, government has been seen as an unwelcome intrusion, and political opinion has been moderate, even apathetic. An ethnic mosaic from the start—it had a German rather than British majority at the time of the Revolution—it shares the Yankee belief that society should be organized to benefit ordinary people, but it rejects top-down government intervention. From its cultural hearth in southeastern Pennsylvania, southern New Jersey, and northern Delaware and Maryland, Midland culture spread through central Ohio, Indiana, and Illinois, northern Missouri, most of Iowa, southern Ontario, and the eastern halves of South Dakota, Nebraska, and Kansas, sharing the border cities of Chicago (with Yankeedom) and St. Louis (with Greater Appalachia).

Tidewater
Settled in many cases by the younger sons of southern English gentry, Tidewater was meant to reproduce the semifeudal manorial society of the countryside they’d left behind, where economic, political, and social affairs were run by and for landed aristocrats. These self-identified “Cavaliers” largely succeeded in their aims, turning the lowlands of Virginia, Maryland, southern Delaware, and northeastern North Carolina into a country gentleman’s paradise, with indentured servants and, later, slaves taking the role of the peasantry. Tidewater has always been fundamentally conservative, with a high value placed on respect for authority and tradition, and very little on equality or public participation in politics. The most powerful nation in the seventeenth and eighteenth centuries, today it is a nation in decline, having been boxed out of westward expansion by its boisterous Appalachian neighbors and, more recently, eaten away by the expanding Midlands.

Greater Appalachia
Founded in the early eighteenth century by wave upon wave of rough, bellicose settlers from the war-ravaged borderlands of northern Ireland, northern England, and the Scottish lowlands, Appalachia has been lampooned by writers and screenwriters as the home of rednecks, hillbillies, crackers, and white trash. It transplanted a culture formed in a state of near-constant warfare and upheaval, characterized by a warrior ethic and a deep commitment to personal sovereignty and individual liberty. From south-central Pennsylvania, it spread down the Appalachian Mountains and out into the southern tiers of Ohio, Indiana, and Illinois, the Arkansas and Missouri Ozarks, the eastern two-thirds of Oklahoma and on down to the Hill Country of Texas, clashing with Indians, Mexicans, and Yankees along the way. Intensely suspicious of lowland aristocrats and Yankee social engineers alike, Appalachia has shifted alliances based on whoever appeared to be the greatest threat to its freedom; since Reconstruction and, especially, the upheavals of the 1960s, it has been in alliance with the Deep South in an effort to undo the federal government’s ability to overrule local preferences.

The Deep South
Established by English slave lords from Barbados as a West Indies-style slave society, this region has been a bastion of white supremacy, aristocratic privilege, and a version of classical republicanism modeled on the slave states of the ancient world, where democracy was the privilege of the few and enslavement the natural lot of the many. It spread apartheid and authoritarianism across the southern lowlands, ultimately encompassing most of South Carolina, Georgia, Alabama, Mississippi, Florida, and Louisiana, plus western Tennessee and southeastern Arkansas, Texas, and North Carolina. Its slave and caste systems smashed by outside intervention, it continues to fight for rollbacks of federal power, of taxes on capital and the wealthy, and of environmental, labor, and consumer safety protections.

El Norte
The oldest of the Euro-American nations, El Norte dates back to the late sixteenth century, when the Spanish empire founded Monterrey, Saltillo, and other outposts in what are now the Mexican-American borderlands. Today this resurgent culture spreads from the current frontier for a hundred miles or more in both directions, taking in south and west Texas, southern California and the Imperial Valley, southern Arizona, most of New Mexico, parts of Colorado, and the six northernmost Mexican states. Most Americans are aware that the region is a place apart, where Hispanic language, culture, and societal norms dominate; few realize that among Mexicans, norteños have a reputation for being more independent, self-sufficient, adaptable, and work centered than their central and southern countrymen. Long a hotbed of democratic reform and revolutionary sentiment, various parts of the region have tried to secede from Mexico to form independent buffer states between the two federations. Today it resembles Germany during the Cold War: two peoples with a common culture separated from one another by a large wall.

The Left Coast
A Chile-shaped nation wedged between the Pacific Ocean and the Cascade and Coast mountain ranges and stretching from Monterey to Juneau, the Left Coast was originally colonized by two groups: merchants, missionaries, and woodsmen from New England (who arrived by sea and dominated the towns); and farmers, prospectors, and fur traders from Greater Appalachia (who generally arrived by wagon and controlled the countryside). Yankees expended considerable effort to make it “a New England on the Pacific,” but were only partially successful: the Left Coast is a hybrid of Yankee idealism, faith in good government and social reform, and the Appalachian commitment to individual self-expression and exploration. The staunchest ally of Yankeedom and greatest champion of environmentalism, it battles constantly against Far Western sections in the interior of its home states.

The Far West
The other “second-generation” nation, this is the one part of the continent where environmental factors trumped ethnographic ones. High, dry, and remote, the Far West stopped the eastern nations in their tracks and, with minor exceptions, was only colonized via the deployment of vast industrial resources: railroads, heavy mining equipment, ore smelters, dams, and irrigation systems. As a result, settlement was largely directed and controlled by large corporations headquartered in distant New York, Boston, Chicago, or San Francisco, or by the federal government itself, which controlled much of the land. The region has been exploited as an internal colony for the benefit of the seaboard nations, and its political leaders have focused public resentment on the federal government (on whose infrastructure spending they depend) while avoiding challenges to the region’s corporate masters, who retain near Gilded Age influence. It encompasses nearly all of the interior west of the 100th meridian, from the northern boundary of El Norte to the middle reaches of Canada, including much of California, Washington, Oregon, British Columbia, Alaska, Colorado, and Canada’s Prairie Provinces, and all of Idaho, Montana, Utah, and Nevada.

Two other nations—the Inuit-dominated First Nation in the far north and Quebec-centered New France—are located primarily in Canada and are peripheral to this discussion. Their U.S. enclaves in northern and western Alaska and southern Louisiana respectively have scant electoral power, but they both have considerable sway in Canada and have come the closest to forming independent nation-states of their own (in Quebec and Greenland).

Nearly every internally divisive development in U.S. history in the past two centuries has pitted Yankeedom against the Deep South. Since neither of these regional “superpowers” has had a sufficient share of the population to dominate federal politics in this time period, they have sought to build and maintain alliances with other regional cultures. Some of these alliances have been remarkably durable, like those between Yankeedom and the Left Coast or between the Deep South and Tidewater, each of which has survived since before the Civil War. Others are younger and weaker, such as the axis between Greater Appalachia and the Deep South—cultures that took up arms against one another in both the American Revolution and the Civil War—or between the Deep South and the Far West, where resentment of corporate control may one day eclipse anger at the federal government.

During the Revolution, each of the regions fought to preserve its distinctive society. New Netherlanders—dependent on commerce and unaccustomed to self-rule—generally remained loyal to the Crown. Yankee citizen minutemen and mounted Tidewater gentlemen enthusiastically took up arms to maintain local control and institutions, while Deep Southerners reluctantly did so in response to fears the British would free their slaves. Midlanders tried to remain neutral, supplying both British forces in Philadelphia and American forces wintering at Valley Forge. Appalachian people sided with whoever was against their oppressors on the coast, who’d denied them representation in colonial assemblies and the Continental Congress; they joined the rebellion in Pennsylvania (at one point occupying Philadelphia and overthrowing the Midland elite) and the British in the Carolinas and Georgia (against the Deep Southern oligarchs, triggering a bloody civil war there). Only in Virginia and Maryland—whose gentry had extended them reasonable representation—did they find common cause with coastal regions against the British.

In the run-up to the Civil War, Yankees were isolated in their willingness to go to war to stop Deep South-controlled states from seceding. Most observers expected the country to split into three or four confederations, as the other regions had no desire to remain with either party. New York City Mayor Fernando Wood proposed that the city and its Long Island suburbs should become an independent city-state modeled on those of the Hanseatic League, a plan endorsed by at least one congressman, many merchants and bankers, and three major newspapers. The Midlands, Tidewater, and Appalachia sought to create a Central Confederacy that would act as a buffer state between the rival superpowers, a plan championed by Maryland Governor Thomas Hicks. Had Deep Southerners not attacked Fort Sumter—a move that instantly made enemies of most neutral regions—they would almost certainly have peacefully seceded. Instead, they wound up with only one ally, Tidewater, which shared a commitment to slavery and a racial mythology that cast the conflict as a reprise of the Norman invasion and the English Civil War, with southerners the descendants of the aristocratic, civilized Normans, and the Yankees the offspring of the crude Anglo-Saxons. (The Yankee “Roundheads,” Tidewater’s leading journal predicted in 1861, would “lose the last [battle] and then sink down to their normal position of relative inferiority,” freeing the Confederacy to create “a sort of Patrician Republic” ruled by people “superior to all other races on the continent.”) Appalachian people overwhelmingly sided with the Union, leading a successful secessionist movement to create (Unionist) West Virginia, and unsuccessful ones in eastern Tennessee and northern Alabama; a quarter-million men from Appalachian sections of the Confederacy volunteered for Union service, joining tens of thousands more from Pennsylvania, Maryland, Kentucky, and beyond.

Backed by the Midlands, the Left Coast, and the Far West, Yankeedom dominated the federation in the late nineteenth and early twentieth centuries, though Reconstruction lost them the support of Appalachia. In the following decades, alliances shifted around based on the fear of Yankee-directed federal power, but over the past half century the regional blocs have remained stable. Yankeedom, New Netherland, and the Left Coast have faced off against the Deep South, Tidewater, Greater Appalachia, and the Far West over civil rights, the Vietnam and Iraq wars, the environmental and gay rights movements, health care and financial reform, and the last three presidential elections.

The “northern” alliance has consistently favored the maintenance of a strong central government, federal checks on corporate power, and the conservation of natural resources, regardless of which party was dominant in the region at any given time. (Recall that prior to the civil rights struggle of the 1960s, the Republicans were the party of Yankeedom.) The presidents they have produced—John F. Kennedy, Gerald Ford, George H. W. Bush, and Barack Obama—have all sought to better society through government programs, expanded civil rights protections, and environmental safeguards. All faced opposition from the Dixie-led nations, even from within their own parties. With the southern takeover of the GOP, all three nations have become overwhelmingly Democratic in recent years.

The goal of the Deep Southern oligarchy has been consistent for four centuries: to control and maintain a one-party state with a colonial-style economy based on large-scale agriculture and the extraction of primary resources by a compliant, low-wage workforce with as few labor, workplace safety, health care, and environmental regulations as possible. Not until the 1960s was it compelled by African American uprisings and external intervention to abandon caste, sharecropper, and poll tax systems designed to keep the disadvantaged majority of the region’s population out of the political process. Since then, its leaders have relied on fearmongering—over racial mixing, gun control, illegal immigrants, and the alleged evils of secularization—to maintain support. In office they’ve instead focused on cutting taxes for the rich, funneling massive subsidies to agribusiness and oil companies, rolling back labor and environmental programs, and creating “guest worker” programs and “right to work” laws to ensure a cheap, compliant labor supply. Tidewater, weakened to satellite status over the past 150 years, has fallen in line. But keeping Greater Appalachia and, now, the Far West in the coalition has been trickier, as both have strong populist and libertarian streaks that run counter to the interests of the modern-day southern aristocracy.

Which brings us to the Tea Party movement and the recent debt ceiling debacle.

The Tea Party movement is active across the country, but it has had only limited success in the three nations of the northern alliance. Of the sixty members of the House Tea Party caucus, only three hail from Yankeedom, and not one comes from the Left Coast or New Netherland. The three Yankees have had a tough go of it; in the seven races they have collectively won, only twice did one of them achieve a margin of victory of greater than 5 percent (Michele Bachmann in 2006 and 2010). One, Illinois freshman Joe Walsh, won his seat by just 291 votes and has since been gerrymandered into lame-duck status by local Democrats. Add to that the previously mentioned setbacks in Wisconsin, Massachusetts, New Hampshire, and Maine, and the movement’s prospects in Yankeedom appear bleak. From the Puritan migration of the 1630s to the debt ceiling debate, as noted above, Yankees have championed individual self-denial for the common good, investment in strong public institutions, and governmental projects to improve society; the Tea Party is unlikely to ever take deep root in such inhospitable soil.

By contrast, the Tea Party has encountered little resistance to its agenda in the four nations of the Dixie bloc, as it is a carbon copy of the Deep Southern program of the last two centuries: reduce taxes for the wealthy and services for everyone else, crush the labor unions, public education, and the regulatory system, and suppress voter turnout. The four nations account for fifty-one of the sixty members of the House Tea Party caucus—or 85 percent of them—with the Deep South alone accounting for twenty-two. Of the sixty-six House Republicans who refused to support the final compromise on the debt ceiling—roughly half of whom were not members of the Tea Party caucus—fifty-three hailed from the same cultural regions. Debt ceiling lunacy was a regional phenomenon. The Dixie-led bloc has produced many of the Tea Party’s most influential politicians, including Senators Jim DeMint (Deep South), Mike Lee (Far West), and Rand Paul (Appalachia), former Governor Sarah Palin (Far West), secessionist-minded Governor Rick Perry (Greater Appalachia), and FreedomWorks boss (and former House majority leader) Dick Armey (Deep South). Tea Party activists can be found most anywhere in the country, but only within this four-nation bloc have they had significant and sustained political success.

Our cultural balkanization ensures that the Tea Party movement—and radical political movements generally— will never achieve lasting success on the national stage: they simply won’t be able to build a lasting coalition. It’s also the reason U.S. elections have become such nail-biters, decided by the shifting allegiances of a relatively small number of voters from a small and recurring cohort of (mostly Midlander) battleground counties in a handful of swing states. It can also inform winning strategies to defeat the destructive and ultimately undemocratic Deep Southern program, whether it travels in Confederate gray, Dixiecrat suits, or leggings and tricorn hats.

There are two ways to hasten the Tea Party agenda’s demise. One is to draw one or more weakly aligned regions away from their coalition. The other is for progressives to cultivate a lasting partnership with El Norte or the Midlands, the two great “swing regions” on today’s political map. The smartest strategy would be to do both simultaneously, in each case focusing on the lowest-hanging fruit. If the Democratic Party is to be the vehicle to accomplish this, it will need to retune its message accordingly.

The Dixie bloc is far from solid. Of the Deep South’s partners, Greater Appalachia is the most reliable after Tidewater, sharing a dominant Protestant religious culture that focuses on individual salvation in the next world and discourages efforts to perfect the current one, condoning slavery in the nineteenth century, the racial caste system in the twentieth, and laissez-faire capitalism throughout. But this culture also prizes personal freedom and resents domination by outsiders, be they mining companies or federal regulators. Significantly, Appalachia has had a near monopoly on the production of “southern” populists (LBJ, Ross Perot, Sam Rayburn, Mike Huckabee) and progressives (Cordell Hull, Bill Clinton, Al Gore). Meanwhile, the Far West, once a bastion of progressive politics, has parallel strains of colonial grievance and libertarian individualism, and its most powerful religious force—Mormonism—has Yankee roots and is firmly committed to the notion of improving the present world (just as the early Puritans were). Neither culture supports “regulation” or “taxation” in the abstract, as these are seen as encumbrances on individual liberty. However, both are eager to strike back at forces—particularly outside forces—that seek to exploit them.

If progressives were to campaign in these regions on promises to bring rogue bankers, mortgage lenders, mining interests, health insurers, seed companies, and monopolistic food processors to heel, they would have far wider appeal; here, regulation can be sold as a matter of justice, the closing of tax loopholes a matter of fairness. Calls for new government programs are unlikely to win many hearts and minds in these two regions, but improving the efficiency and fairness of both the government and the marketplace can. The potential dividends will likely be modest in Greater Appalachia, but small gains at the margins in places like south-central Pennsylvania, southern Ohio, or western Virginia might tip the balance of an entire state in a presidential or Senate race. In the Far West, the gains could be dramatic, potentially tipping many mountain states out of the Dixie camp. In the aftermath of the 2008 financial collapse, an outsider who spent nearly his entire adult life in Yankeedom (Obama) was able to defeat a Far Western native son who chose to run on the Dixie-bloc platform (John McCain) in Colorado and Nevada, and almost captured Montana as well. The Far West is ready to leave the Dixie coalition—and the Tea Party—if someone offers them a palatable alternative.

Simultaneously, the northern alliance stands to benefit from the increasing political power and consciousness of El Norte. Hispanics have reasserted political control of the borderlands after more than a century of imperial subjugation. The Dixie agenda has always been unpopular there, while the Tea Party in the borderland states has been a vehicle for white fears that they are losing “their” country to Hispanic Americans and Mexican and Latin American immigration. (“Immigration attitudes are an important predictor of Tea Party movement support in the West,” a recent study of polling data by two Sam Houston State University political scientists found, as were “economic issues related to minority relations.”) So long as northern-alliance political leaders continue to champion cultural inclusiveness—and the Dixie bloc does not—they can count on political and electoral support from this fast-growing region. The Hispanic population is expected to triple by 2050—accounting for most of the nation’s overall growth—and most of that will take place in El Norte. This will result in a commensurate decrease in Tea Party influence in the legislatures and congressional delegations of Texas, California, Arizona, and New Mexico. (Currently, the House Tea Party caucus has just two members from El Norte—both anti-immigration whites from Orange County.)

The people of the Midlands generally want their communities left alone to get on with their lives, but in the midst of a crisis they can be counted on to defend the federal union from authoritarianism, bigotry, or dismemberment. The region has been generally apathetic about the Tea Party movement, providing just two members of its House caucus. But were the Tea Party to actually implement its agenda—slashing Social Security, Medicare, and federal spending on public education—Midlanders would rally to their northern neighbors, just as they did after—and only after—Deep Southerners opened fire on Fort Sumter.

In short, the Tea Party and the Deep South may do the country serious harm, but they will not take it over. They may hobble the workings of Congress, inject flat-earth thinking into Senate debates, or even capture the presidency next year. But their policy program will never win the hearts and minds of a clear majority of Americans, and it carries the seeds of its own destruction. The political pendulum will indeed swing back. How far it goes—and how long it stays there—will depend on how many of America’s cultural regions the Deep South’s opponents can attract to their cause.

The post A Geography Lesson for the Tea Party appeared first on Washington Monthly.

Shovel-Ready Clinics https://washingtonmonthly.com/2011/10/23/shovel-ready-clinics-2/ Sun, 23 Oct 2011 14:00:03 +0000 https://washingtonmonthly.com/?p=27830

A job creation idea so obviously good even Washington couldn't possibly say no... could it?

Barack Obama has spent most of his first term as president wrestling with three enormous tasks: kick-starting the economy to create jobs again; standing the banking sector back on its feet; and providing health care to the 40 million Americans who lack insurance. He’s made progress on all these fronts.

But let’s be honest. Despite billions in federal stimulus money, the American jobs machine is barely functioning, and millions of previously hardworking Americans, especially in construction and the “trades,” are sitting idle. Despite billions in bailouts, America’s banks are barely lending, especially to small businesses. And while Obama did pass health care reform, those very reforms actually threaten to overwhelm an already severely strained primary health care infrastructure with a huge wall of new “customers” demanding health care services.

In 2014, a little more than two short years away, the provisions in the Affordable Care Act (ACA) that are designed to expand coverage will kick in, initiating a deluge of insurance-card-carrying Americans into the health care system. These disproportionately low-income, newly insured people will live in every state and community in the country. Unless we act now, they stand to join the ranks of the “medically disenfranchised”—the more than 50 million already insured Americans who have no regular access to primary health care for lack of physicians and facilities in their local communities. Think our transportation infrastructure is under stress? Our health care infrastructure is like an already clogged highway system that’s about to take on 32 million new vehicles overnight.

These three problems—the economy’s failure to create jobs, the banking sector’s unwillingness to lend, and the health care system’s lack of capacity to meet an accelerating rise in demand—might seem intractable, especially in a deadlocked Washington where no new money is likely to be put on the table. But if we could take off our ideological blinders for a moment—if conservatives could stop seeing every federal action as an assault on freedom, and liberals could get beyond their belief that spending more federal money is the way out of every problem—we would find a modest answer to all three of these problems staring us in the face.

Part of the solution is relatively uncontroversial. As Congress and the president have acknowledged, the way to meet the flood of new patients coming down the pike is to expand the nation’s existing network of community health centers— nonprofit clinics that offer primary care to the medically under-served, often in rural areas or inner cities. But to get this done, there’s no need to appropriate billions more in direct government spending. Rather, there is a way to lure skittish banks into lending private capital to finance a health center construction boom in all fifty states, simply by tweaking the language of an existing federal lending program. Doing so would save money in the long run by providing cost-effective primary care to those who desperately need it. And it would quickly create tens of thousands of jobs, many of them in the hard-hit construction sector. Moreover, unlike the roads, bridges, and other complex infrastructure projects the Obama administration wants to fund, few of which are shovel ready, health center projects could get the hammers swinging in months, not years.

Community health centers may sound like a liberal pet project—they originated under Lyndon Johnson’s Great Society, after all, and offer care regardless of patients’ ability to pay—but they have long enjoyed the steadfast support of Republicans and Democrats alike. President Obama invested $2 billion in health centers through his stimulus package, and the ACA—his signal legislative achievement—rightly carves out a huge role for them in handling the brunt of the newly insured patient load in 2014. But before the age of Obama, perhaps the greatest recent champion of health centers was, of all people, George W. Bush, who doubled their capacity under his watch. Richard Nixon, the first President Bush, and generations of GOP lawmakers also supported them.

Republicans like health centers in part because many of them serve the kind of rural districts that make up the conservative base, but also because they represent a fundamentally cost-effective institutional model. By providing comprehensive care under one roof, with primary care doctors at center stage, health centers treat patients at a cost that is 41 percent lower than what other providers rack up, according to a recent study by the National Association of Community Health Centers. This translates into savings for the entire health care system of up to $17.6 billion annually. A low-income patient served at a community health center costs the federal government just $125 a year in direct subsidies. Moreover, that patient is far less likely to turn to the emergency room of a private hospital to receive basic care. (For this very reason, private hospitals also love community health centers.)

Today there are about 1,200 registered community health centers in America, serving 19 million patients with branches in 8,000 towns and cities. To meet the coming wave of patients newly insured under the ACA, the Department of Health and Human Services (HHS) estimates that community health centers will have to serve more than twice as many people as they do now—going from 19 million today to 40 million in 2015. To help them meet that goal, the ACA sets aside $11 billion to fuel nationwide health center expansion. There are two big problems with the ACA’s approach, however.

One is that big pots of money in Washington are always vulnerable to having their bottoms drilled out. In a fit of budget-cutting zeal in April, Congress slashed the annual funding that supports the operations of existing community health centers by $600 million. Hence HHS, which administers all the federal aid to health centers, has had to raid the “new expansion” piggy bank just to keep the old health centers running. This August, HHS was slated to announce grants to 350 health centers so they could open new branches. Instead, it named only sixty-seven.

The other problem with the funding set aside by the ACA is that the vast bulk of it can only go toward paying the operating expenses of a center, not its construction costs. That means a new center may use that federal money to pay rent on a new facility, but not to build or buy one. Thus the new centers will wind up operating as many do now, in make-do spaces like defunct big-box stores, rickety trailers, and even—in at least one case—a retrofitted gas station.

This is a very inefficient and, over the long run, expensive way for centers to expand. Needless to say, a converted gas station is hardly the ideal physical plant for a medical facility. But more importantly, rents only go up with time. If centers could build, own, or sign long-term leases for their own space, they would level out their occupancy costs, rather than watch those costs threaten to eclipse revenues year in and year out. Better yet, if health centers did so now, they would buy into a depressed real estate market with rock-bottom prices.

But direct spending from the federal government won’t get them there. Recently, a firm called Capital Link, which helps health centers put together the financing necessary for expansion, estimated that, to sustainably meet the goal of 40 million patients, 5,775 facilities would need to be built or refurbished, at an average cost per health center of $14 million. That amounts to about $16.6 billion in capital spending. Combined, the ACA and the stimulus only provide about $3 billion in grants for capital improvements, and friends in Congress have told health centers not to come looking for more. That leaves a gap of more than $13 billion for health centers to come up with on their own.

When left to their own devices, community health centers can succeed in building new facilities without grant money from the federal government. But a variety of factors conspire to make it devilishly hard for them to raise capital. I saw firsthand how arduous and time consuming this can be when I recently spent six years on the board of the National Cooperative Bank, which makes loans to health centers and other similar nonprofits. Generally speaking, health centers have steady revenue streams but tight operating margins, which means they don’t have large balance sheets for down payments, and they can’t afford to take on very large debt burdens. Therefore, when contemplating a new building project or the acquisition of an existing commercial property, they often have to start by raising as much as half of the cost through donations. But securing those donations isn’t easy. Universities have wealthy, grateful alumni, and big hospitals have wealthy, grateful ex-patients, but community health centers have no natural fund-raising constituency, since almost all of their patients are poor. It can take years and years of bake sales, silent auctions, and phone drives just to raise enough money for a down payment on a new building. In my conversations with the leaders of health centers serving Latino communities in the San Diego area, low-income groups in the inner city of Detroit, and farm communities in the rural heartland, I’ve heard tales of fundraising drives that lasted eight years or more, just to come up with the 50 percent of equity capital necessary to buy or build a new facility.

Nor is it easy for health centers to secure loans from banks once they have raised their equity capital. When examining any particular community health center—a clinic, say, in an economically distressed inner-city neighborhood serving a mixture of Medicaid patients and the uninsured, or one in a depressed heartland town where real estate prices are spiraling downward—a lending institution may balk. (Health centers also report that banks are made skittish by Washington’s erratic attitude toward Medicaid.) When a health center finally does manage to arrive at a financing package, the deal often pulls together an alphabet soup of government or not-for-profit agencies that have provided small slices of funding—resulting in complex arrangements that always incur substantial legal fees. At times the mere cost of structuring such a deal can run to more than 10 percent of the total project cost. Normally, in property development, you’d expect such non-construction costs to account for just 2 or 3 percent. The upshot: community health center projects get financed and constructed through a remarkably inefficient, costly, and time-consuming process that—when looked at from a business perspective—simply makes no sense.

And yet there is no reason why community health centers could not be financed more efficiently in the private sector. Out of the 1,200 community health centers in America today, only one or two have ever defaulted on a loan. As modest as their revenue streams are, they are nothing if not reliable. And health centers’ cash flows are only going to improve with time. Today, some 38 percent of health center patients are uninsured. By 2015, that number should be down to about 20 percent. (After Massachusetts passed its health care reform legislation five years ago, the proportion of uninsured health center patients dropped from 36 percent to 20 percent, and the patient base at health centers across the state increased by about 30 percent, as more people flocked to receive the primary care they couldn’t afford before.) If health centers were businesses, they would have a stellar outlook.

Nevertheless, given the current tools at their disposal, health centers are hobbled in their attempts to build capital. Looking to the federal government to come to the rescue with more direct spending—as the debacle with the ACA funding shows—is not the answer. Unless someone can figure out how to appropriate $13 billion more from the federal treasury (good luck with that), we need to find another way.

A good start would be to look beyond the nonprofit world, to see how certain small private-sector businesses put together capital for expansion. One of the most successful devices the government has to stimulate commercial development in low- and moderate-income communities is the Small Business Administration’s 504 loan program. Here’s how it works: When a small business wants to expand by buying, building, or renovating a facility, it approaches a local Certified Development Company (CDC)—a nonprofit lender that is approved by the Small Business Administration to issue low-interest, fixed-rate, government-backed bonds to finance up to 40 percent of a project. Provided the small business meets certain criteria—that it can promise to create or retain one local job for each $100,000 loaned, for instance, and that it will occupy the facility being built—the CDC approves the deal and then partners with a bank, which puts up another 50 or so percent of the project financing. That leaves the borrower—the small business—needing to put down as little as 10 percent as a down payment. The program has one of the government’s most exceptional track records of providing return on public investment. And it is a self-funding loan guarantee program, so its cost to the government is virtually nil.
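For readers who want the arithmetic spelled out, here is a minimal, purely illustrative sketch of how a 504-style financing package might break down. It assumes the nominal shares described above (roughly 40 percent CDC bond, 50 percent bank loan, 10 percent borrower down payment) and uses the $14 million average project cost cited earlier; actual terms vary from deal to deal.

```python
# Illustrative sketch only: splits a hypothetical project cost using the
# nominal SBA 504 proportions described above. Figures are assumptions,
# not the terms of any actual loan.
project_cost = 14_000_000  # average health center project cost cited above

cdc_bond_share = 0.40      # low-interest, government-backed CDC bond (up to 40%)
bank_loan_share = 0.50     # conventional bank loan (roughly 50%)
down_payment_share = 1.0 - cdc_bond_share - bank_loan_share  # borrower's ~10%

cdc_bond = project_cost * cdc_bond_share          # 5,600,000
bank_loan = project_cost * bank_loan_share        # 7,000,000
down_payment = project_cost * down_payment_share  # 1,400,000

print(f"CDC bond:     ${cdc_bond:,.0f}")
print(f"Bank loan:    ${bank_loan:,.0f}")
print(f"Down payment: ${down_payment:,.0f}")
```

The point of the split is visible in the last line: instead of spending years raising half the project cost, a health center (or its developer partner) would need to bring only about a tenth of it to the table.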

The problem is that the SBA 504 loan program is only open to certain for-profit businesses. In order to make it available to community health centers, Congress would have to change the statute governing the program. Here’s how: explicitly list community health centers as eligible to receive 504 loans, and further designate that a private developer or holding company may also qualify to receive a 504 loan if it signs a long-term commercial lease, equal to the term of the loan, with a community health center. If Congress decided to promulgate these minor changes, with one stroke it could set off a minor building boom in America.

Banks, which have been sitting on the sidelines of the commercial real estate market since 2008 for lack of demand and good deal flow, would almost certainly jump at the chance to fund the construction of new community health centers. They are already well accustomed to working with CDCs and the Small Business Administration. And financing health centers under this arrangement would allow banks to amass “community reinvestment credits,” thus meeting a federal requirement that they contribute a certain amount each year to the development of low- and moderate-income communities.

At the same time, local real estate developers and construction companies across the country—which were doing a brisk business building strip malls, doctors’ offices, and the like before the recession—are dying to get back to work. Faced with a growth industry and access to capital, these commercial developers could be counted on not only to build what’s handed to them, but also to take the initiative to get as many projects as possible moving. Private developers could put up the equity themselves, design and build the facilities in joint venture partnerships with community health centers, and then rent out the facilities to the nonprofits on a long-term lease or through various lease-to-own arrangements. Indeed, hungry developers and construction firms would find any number of ways to get the hammers swinging—in small towns and blighted cities across the country—faster than Congress could ever ordain.

And of course, going the SBA route would alleviate many of the headaches and inefficiencies that currently make it so vexing for health centers to fund construction and expansion capital projects. Because SBA 504 loans are routine and efficient to process, the cost of structuring a new project’s financing would plummet. And the interest rates available to beneficiaries of an SBA 504 loan are among the lowest on the market. Right now, they are better than ever.

All those factors would dramatically lower the total cost of any given health center’s commercial real estate acquisition or construction project, which means centers would be able to finance a much higher proportion of the cost through loans. And that, in turn, means many fewer years running silent auctions and bake sales to scrape together a down payment. This is good, because time is of the essence. With real estate values depressed and interest rates at record lows, any health center that managed to receive an SBA 504 loan for a construction project today would wind up with a bargain. Better yet, the center would wind up with a debt burden that would only become more manageable with time, as the center’s revenues rise with inflation, but its payments remain fixed.

It’s hard to imagine Congress appropriating any more direct spending to fuel the construction of health centers. But there’s no good reason why they shouldn’t change a few words in a statute to achieve the same end. Not only would it quickly create much-needed jobs in the construction trades, it would also spark economic activity over the long run in some of the places in America that need it most. In a climate in Washington that is consumed with rhetoric about reducing government expenditures, creating jobs, and the need to get the private sector moving, this small change is one of the most surefire steps that Congress could take to show that government can both lead and get out of the way.

The post Shovel-Ready Clinics appeared first on Washington Monthly.

Taxing the Kindness of Strangers https://washingtonmonthly.com/2011/10/23/taxing-the-kindness-of-strangers-2/ Sun, 23 Oct 2011 14:00:02 +0000 https://washingtonmonthly.com/?p=27825

Foster parents like us willingly pay a heavy price. The GOP wants us to pay more.

One day in the summer of 2009, during a walk alongside a neighborhood creek with my wife and our one-year-old son, I mused out loud that maybe we should become foster parents. I had just finished seminary and was working part-time. We weren’t quite ready for a second child of our own, and though we had long considered adopting, it seemed like more than we could commit to at that unsettled moment in our careers. What we did have was a house in the suburbs of Chicago, a pair of modest middle-class incomes, relatively flexible work schedules, bleeding-heart tendencies, and a son whose newly docile sleep habits and sweet disposition probably made us feel more seasoned at parenting than we really were. Fostering seemed like a good way for us to help meet an urgent social need, even if only for a while. “Somebody has to do it,” I remember saying.

After hearing me out, my wife agreed that we should look into it. A couple more walks and some days of research later, she was driving the process more vigorously than I was. She requested all the information, got us fingerprinted, and sent out our reference forms. By spring, we were licensed by the state to receive a child.

It was nonetheless a shock when someone from the Illinois Department of Children and Family Services interrupted my dentist appointment one afternoon that summer with a phone call to ask if I wanted to add a one-year-old girl to our household. The baby had come into the hospital a few days earlier under circumstances that aroused the suspicions of the staff, so the state had taken her into temporary emergency custody. She would be released from the hospital that evening to the first available foster home that agreed to take her.

Not being in the habit of doubling the number of children under our care without my wife’s full participation, I frantically called and texted her. After a long ten minutes with no response, I decided to say yes to the state investigator, knowing that my wife would be more upset with me if I let the girl go elsewhere. (“Never mind, said yes,” she read on her phone after getting out of the pool at the gym.)

That left us about four hours total for nesting—about the amount of time we had spent discussing the relative merits of different breast-pump models when we were expecting the birth of our son. We bought some formula and baby food, shoved some books out of the way in our spare room, and updated our Facebook status. Then my wife went to pick the little girl up from the hospital and came home at seven p.m.

Sophia turned out to be somewhat less than a year old, small and toothless, with sparse, wispy black hair. (Some details throughout this story, including Sophia’s name, have necessarily been changed. A foster child’s case is confidential, and a foster home is in some cases a kind of safe house.) We have a picture from that first night: our son, then age two, is looking down at her warily while she sits in my lap gumming a rattle from Ten Thousand Villages, wearing a stained onesie and a fiberglass cast that covers her right leg. The only thing we knew about her injury was that it had somehow brought her to our house. It was the first time I’d seen a baby with a broken bone.

Sophia slept poorly. Within a week I had settled into a nightly routine of driving her out to the western reaches of the Chicago suburbs to lull her unconscious. She did not cry, but only shrieked at discomfort or confusion. She did not like to be held. But these challenges were, it quickly became clear, only half the struggle in our new role.

In a way that we never really anticipated, welcoming Sophia into our home led us into the wilderness of red tape and frustration navigated every day by low-income parents who struggle to raise children with the critical help of government programs. That same week, the office of the bone specialist who had treated Sophia’s broken leg at the hospital tried to get out of scheduling her for an urgent follow-up appointment. Like many medical practices, his endeavored at all costs to avoid working for Medicaid’s paltry reimbursement rates. (The office went so far as to deny ever having treated her; eventually, however, they gave in.) We went through a similar amount of stress trying to put Sophia into daycare. We had to run down a pile of government paperwork, prove our employment, and then simply wait and hope that our daycare center would accept the state’s stingy pay. And yet, frustrated as we were, we couldn’t exactly blame the doctors and daycare providers for being heartless. As the state’s stinginess pushes more of the costs of caring for foster children onto them, it’s no surprise that they start to balk.

It’s a major bureaucratic process to remove a child from her home and family. The state insures the child, pays for daycare, investigates the claims of abuse, and retains legal custody, but it cannot actually put a baby to bed at night. And so, on the other side of this most intimate public-private partnership are usually people like us, left alone with a stranger’s child and a garbage bag full of clothes and wondering what’s going to happen next. And what happens next depends, to a stomach-churning degree, on the state’s willingness and ability to keep up its half of the bargain.

So it was with an unusual sense of urgency and dread that our family watched the 2010 Republican wave and the austerity budgeting that has followed in ceaseless progression. When Paul Ryan’s budget, approved by 235 Republicans in the House, proposed dramatic cuts to federal Medicaid spending, it was as if they were trying to make it even more hopeless for us to find a doctor to treat Sophia’s health problems. When Scott Walker in Wisconsin sought to cut the workforce that administers foster care in his state, we went up to Madison to join the protests in solidarity, because we knew how helpless we would be if there were no caseworker on the other end of the phone to answer our own urgent pleas for help and guidance. And the threats have continued, as House Republicans repeatedly propose cutting trillions of dollars in domestic spending to reduce the debt while making room for sustained upper-income tax cuts. The way this hits home for us is simple. A foster parent joins hands with the state in order to take care of a dispossessed child. For the last year, the state has been trying to slip free of our grasp.

When you tell someone that you’re a foster parent, the response often goes something like: “I could never do that; I would get too attached to the kids.” While superficially admiring, this line takes on an odd ring after a while, with its implication that we must be emotionless creatures. While the idea of an elite corps of radically detached substitute parents may hold a certain appeal, my wife and I came into the system out of a sense of attachment bordering on the maudlin. She had been a chaplain in a juvenile detention center and I used to run programs for at-risk youth in Chicago schools. Children who are afraid of their own homes leave an impression.

Foster parenting had been in the back of my own mind since my family first started telling stories about my grandfather. He went into his first foster home in 1932, when he was twelve years old. The “orphan trains” that had brought an estimated 200,000 big-city children to the farms of the Midwest since 1854 had only stopped running three years earlier. In agrarian America, home-based foster care often functioned as a way to match orphaned or abandoned children with homes that needed additional labor. This approach, mercenary though it may seem in our more sentimental age, often counted as a meaningful improvement over orphanages and homelessness. (The population in foster homes did not exceed the population in orphanages until 1950.) My grandfather, whose biological mother had herself lived in an orphanage for five years, did not, however, appreciate the historical dialectic at work. He ran away from a series of foster homes where he had been housed in barns and worked like a hired hand. Then he landed with a pious Roman Catholic family in Kiel, Wisconsin. There, the only woman he ever called “Mother,” whom he met when he was fifteen, prioritized his graduation from high school over farm chores. What they had managed to do for him, I wanted to do for someone else.

A lot has changed since then. The county cold-called local families to see if they would take my grandfather; in the decades that followed, foster homes would increasingly be licensed and professionally supervised. State and federal support for the children in foster care replaced local charities. Abuse and maltreatment became more rigorously defined and aggressively pursued. Perhaps most importantly, children generally stopped providing needed labor for the household economy and began requiring financial and emotional investments unknown to the farm families of Depression-era Calumet County. (I sometimes wonder what my grandfather would think if he saw me diligently encouraging our foster daughter to waltz with a teddy bear at a Music Together class.)

When my wife and I took a nine-week training course as part of our preparation for becoming foster parents, we got a glimpse of our peers in the program. We were not a notably diverse group. Five married couples and two individuals, all of us churchgoers and late-model-car owners, all but one of us white, all but the youngest of us with biological children of our own. No one missed their turn to bring refreshments for the class. We epitomized the combination of genuine earnestness and social privilege that has driven child welfare in America from the start. The sessions took place in a small evangelical social agency’s suburban office, whose pastels and wood accents only added to the facade of gentility. But then the classes started, and we began learning things like how to respond to the behavior of children who had been raped by their parents. The operative lesson seemed to be that our earnest sentiment and social privilege were bound to be tested. “We don’t want you to have problems and call up your caseworker and say, ‘Come pick them up, it’s not working out,’ ” the trainer told us—an acknowledgment that such a temptation would arise, and that nothing short of adequate preparation and commitment would stop us from yielding to it.

Why people choose to become foster parents is something of a mystery. In the sparse literature on foster parents and their motivations, they report unfulfilled desires for biological children and the intention to adopt, a sense of obligation toward a family member entering the system, or the usefully vague “altruistic motivations.” One factor that turns up consistently is knowing a foster parent or being related to a foster child. Despite lingering popular impressions to the contrary, money does not seem to motivate many foster parents to participate. In most states, including Illinois, foster care reimbursement rates lag well behind the average cost of raising a child. This leaves child welfare advocates with a dilemma. Raising the board rates for foster children might attract and retain more foster parents, as well as ensure a better level of care. But it’s hard to argue for this when a substantial portion of the electorate considers foster parents to be in it for the money, and doubly hard to argue for it under conditions of severe austerity for safety net programs. (I have heard that some people do manage to turn fostering into a kind of cottage industry; I find it hard to imagine how.) “A strained economy and the perception among even a portion of the public that some foster parents are motivated by money may make enacting such legislation challenging,” a 2008 study of foster family finances suggested, “and it is likely that some people will continue to be skeptical of increasing payments for fear of incentivizing inappropriate arrangements.”

If the motives that bring foster parents into the system are hard to pin down, much less cultivate, the factors that drive them out are considerably more clear. Overburdened caseworkers and the lack of services for the children in their care are frequently mentioned. Foster parents don’t often cite low stipends as a source of frustration, but reading between the lines, it’s clear that miserly support amplifies the challenges inherent in providing care for someone else’s child. “Parents who want to make a contribution need better training and a better stipend,” Dr. Robert Goerge, an expert on foster care at the University of Chicago, told me. “So many foster parents have one kid and they’re out. They say, ‘I’d like to do it, but I need more support.’ ” A 2002 study by the federal Department of Health and Human Services put it more succinctly: “Every foster parent we spoke with said they had, at some time, considered leaving the foster care system.”

Children, once an economic necessity, have become a luxury. We are able to afford them—to feed, clothe, house, enrich, and educate them into their teens or twenties in a state of complete economic idleness—with considerable help from the development of the welfare state. This is true, albeit in different ways, for Americans across the class spectrum. The housing, health insurance, and daycare costs of middle-class and wealthy children are subsidized through the tax code. The needs of poor children are met (inasmuch as they are) through a patchwork of direct expenditures that includes Medicaid, nutrition programs, and housing vouchers. Sophia qualified for some of these services automatically by virtue of being in the foster system, and it was incumbent on us to make sure she got them.

In the basement of our county health department, two weeks after Sophia’s arrival, we waited for our first appointment with the Women, Infants, and Children (WIC) nutrition program administrators. We were the only native speakers of English on our side of the counter, and we had to contend with the alien experience of being asked to demonstrate our poverty and to provide a host of documents we did not possess. We finally established Sophia’s eligibility through a splendid transitive property of indigence whereby her Medicaid card was proof of her WIC status. Thus persuaded, a nurse examined her height, weight, and iron levels. We were handed a stack of coupons for formula, baby food, and a few other staples following a course on nutrition that a middle-class parent might be strongly tempted to find demeaning.

WIC coupons work like this: each coupon specifies both what may be purchased (usually the “least expensive brand”) and a total cash value, in case you are tempted to purchase items of your own choosing. Each coupon must be rung up separately, and no personal money may be used to top off an order (though more than once a sympathetic cashier has shaved a little off a total when I miscalculated the weight of $6 worth of fresh fruit). You can also forget about defrauding Uncle Sam by swapping the coupons for cash, since each one needs to be signed in person with a signature that matches the folder that accompanies them.

We adjusted rather quickly to being treated like morons and petty thieves by bureaucrats. The social anxiety that comes with buying welfare food among our fellow citizens was worse. Middle-class people like to think of themselves as self-sufficient. But after a few months of shopping with WIC coupons, and contemplating my own sense of shame at this, I came to realize that we are rather selective in the forms of dependence we disdain. People who would not give a second thought to claiming the child care tax credit or the mortgage interest deduction will blanch at getting a bag of frozen peas on the public dollar. A WIC order grinds the line to a halt and prompts me to feel all kinds of self-consciousness about my deportment, my children, and the purchases I make with my own money. I got to know which cashiers were least given to suspicion or contempt, and I gratuitously mentioned Sophia’s foster status to defuse my own irritation. I don’t relish using the coupons, but they really help. When poor weight gain necessitated supplementing Sophia’s diet with PediaSure (at $12 for six bottles), the coupons became more valuable still.

Over those first few months, Sophia’s broken bone healed, her complexion brightened, and her sleep habits settled down a little. Our son fell for her even harder than my wife and I did. By Christmas they were inseparable, laughing when the other laughed and going together, Spartacus style, into time out when the other was being punished. Even after her initial injury healed, however, Sophia was a sick little girl. In the fall she had a string of ear infections that brought us to the doctor at least twice a month. A specialist determined that she needed ear tubes and was willing to take Medicaid. But this time it was the state that was unwilling to pay—a fact we learned only days before the surgery was scheduled to take place. We ended up leading an impromptu lobbying effort with Sophia’s caseworkers to change the minds of the state’s Medicaid bureaucracy, an HMO of the damned. They relented, in the nick of time, and Sophia was spared more months of perforated eardrums.

We had the same procedure done for our son less than a year and a half earlier with much less drama. But his health care is secured by private insurance and subsidized by a huge income tax exclusion. Sophia’s health care will only become harder to secure as providers leave the field and state Medicaid programs face tightening budgets.

Both the subsidy for our son and the expenditure for our daughter expand the scope of the federal government, and both impact the deficit in the same way. Yet when the time came to strike a deal over taxes and spending in order to increase the debt ceiling in August, the expenditures that support the children of the poor were on the table while the expenditures that support the children of the middle class and wealthy, thanks to the unwavering insistence of Republican lawmakers, were not.

As the “super committee” goes to work, the same story is set to be repeated. The White House successfully insulated Medicaid from the “trigger” mechanism that will produce automatic cuts should the committee fail to reach an agreement. But in that scenario every other program for poor children will get hammered, from WIC to early childhood development assessment. At the same time, plummeting federal aid to the states will tempt state-level lawmakers to cut into their half of the Medicaid spending formula. Either way, the interests of poor children—and the tools that make modern foster parenting possible—are coming to a dangerous pass.

The reward for persistence in foster parenting tends to be more requests to provide foster parenting. In March, a caseworker asked if we would take two brothers, a one-year-old and a five-year-old, for ten days while their foster family took a vacation. “Well,” my wife said reluctantly, “ask the other foster families, and if you really need us …”

“We really need you,” she was told.

Three weeks later, we welcomed the two brothers into our home—which, with our son and foster daughter, already did not feel short on children. The boys came with fistfuls of prescription drugs, grocery bags full of clothes, state Medicaid cards, and a list of phone numbers. That was pretty much it. The one-year-old needed daily breathing treatments with a nebulizer, a face-mask contraption that helps asthmatic children inhale their albuterol while looking like tiny Darth Vaders. And the poor state of his older brother’s teeth shocked us. At bedtime, the older boy craved all the most sentimental storybooks we had about parental love. After I had read him the story Snuggle Puppy three times, enduring halitosis that no amount of brushing could conquer, and tucked him in and said good night, he simply sat in bed and recited it to himself. Outnumbered three to one, our son insisted that he, too, was “a foster boy,” and would not be persuaded otherwise.

Eventually the baby came down with conjunctivitis and I took him to the doctor. (Setting up the appointment required some haggling about who I was and whether he could be prescribed anything.) The nurses practically wept to see him, oozing prodigiously from his nose and eyes and limp from low oxygen levels. In his weakened state, he needed frequent and large doses of albuterol (twenty vials over the next three days). His doctor gave me the most tepid of reassurances: “He’s not doing great, but he’s doing well enough to go home.” And home—or what passes for it in this child’s life—is where we went. His brother woke up early the next morning to throw up, which he did repeatedly and with an uncanny lack of complaint. My parents, who had come from out of town to help me, supervised the emptying of his vomit receptacles while I caught up on some work. The next day, the boys’ regular foster mom—whose long-planned vacation had been taken up answering my frantic calls—picked them up.

Foster parenting takes a heavy toll on the idealism that drives it. We worked ourselves up to do a good deed for these boys, but it could hardly have seemed like a mercy to them. They were relatively new to foster care and had already been through one failed placement. Ours was the fourth roof they’d slept under in six weeks. We were just another pair of adults with an expired futon mattress, mismatched sheets, and unknown motives. Foster children obviously have suspicions about adults. “My parents don’t love me,” the five-year-old confided to my wife one night, after a day of gamely spinning fantasies about all the things they do for him. “I’m sure they do love you,” she told him, “but they can’t take care of you right now.” It was true, but it was cold comfort to a small boy.

Over a year later, Sophia is a vivacious chatterbox of a girl. The daycare staff who once quailed at her arrival now treat her as the darling of the paint-smock set. Visits to the doctor are, mercifully, rarer than they once were. And the economics of fostering have become a familiar part of our family’s accounting. She receives monthly WIC coupons for four gallons of milk, two loaves of whole-wheat bread, a jar of peanut butter, a dozen eggs, 36 ounces of cereal, 128 ounces of juice, and $6 worth of fresh, frozen, or canned fruits and vegetables (for a grand total not to exceed $49.41). The state sends us a reimbursement check for $392 each month for her care. Her doctor visits are paid for by Medicaid, as are prescriptions that would otherwise cost us hundreds of dollars out of pocket. The state pays for her to be transported to and from visits with her biological parents, and for her daycare. A caseworker supervises our home and handles our calls for help when Sophia has night terrors or a visit with her parents goes badly. A part-time nurse at the health department works with us to manage her health care. The state is paying for her dance class, and as she gets older, the state will send her to summer camp.

On the other side of the ledger, we’ve spent hundreds of dollars on diapers, clothing, and toys. We bought a lot of PediaSure and multivitamins when her growth was poor, and we paid for her prescriptions when we had an urgent need and were out of state (Medicaid does not travel well). We pay for all the food she eats apart from what WIC provides, including meals out prompted by desperation or celebration. We’ve thrown her two birthday parties. Cumulatively, we’ve driven her hundreds of miles for doctor appointments, and hundreds more to get her to sleep. We’ve spent five mornings in the basement WIC office when we were supposed to be working. And naturally we have given her whatever share of a happy, enriching childhood that we can, with countless trips to the zoo and the DuPage Children’s Museum.

It’s an irony of foster care in America that the only politician who has made this juggling act visible in recent years should be Michele Bachmann. The Minnesota congresswoman and Tea Party firebrand has often invoked her experience as a foster mother to twenty-three young women. She represents both the genuine evangelical zeal for at-risk kids that sustains the system and the hostility to social programs that threatens it. All of those girls were on Medicaid, which Bachmann voted to cut dramatically. The private virtue we claim to admire can’t escape its dependence on the public weal.

These days, when our kids instinctively comfort each other after a tumble at the town swimming pool, it’s easy enough to forget that our family is accidental and probably temporary. Parental affection can stretch itself farther than I could have imagined in those early days of round-the-clock shrieking. But we can never go long without realizing that Sophia’s difficult tendencies do not come from us, that she is likely to leave us someday, and that we are operating at the limits of our emotional, economic, and social capacity. Without a commitment by the state to cover the basic costs of her care, we would, like every other foster family, be asking ourselves daily whether we could keep doing it.

As social programs are unwound, foster parents watch our families being unwound with them. For most of us, our “altruistic motivations” always threaten to outstrip our resources. Foster parenting teaches us how to live as so many low-income families already live—check to check, coupon to coupon, appointment to appointment. The difference is that most foster parents hold middle-class passports, and they can cut short their sojourn among WIC recipients and Medicaid administrators at any time. No one knows what exactly will happen to Sophia and the nearly half-million kids in her situation if they exercise that privilege. If Republican lawmakers have their way, we may well find out.

The post Taxing the Kindness of Strangers appeared first on Washington Monthly.

The Cure https://washingtonmonthly.com/2011/10/23/the-cure/ Sun, 23 Oct 2011 14:00:01 +0000 https://washingtonmonthly.com/?p=27789

The politics of debt have gotten so insane that both parties are on the verge of gutting Medicare. The moment might be right to actually fix it.

While the partisan gap in Washington is wider than it’s been at any time in living memory, the two parties do have one remarkable agenda in common. Both have proposed cuts in Medicare so drastic that they would have been politically suicidal a decade ago and may still be. Yet neither party is backing off.

All but six Republicans in the House of Representatives have voted to turn Medicare into a voucher program—a vision endorsed by all the GOP’s major presidential candidates as well. Under the proposal, famously crafted by Representative Paul Ryan, each senior citizen would receive only a fixed amount of money (about $8,000 on average in 2022) to spend on private health care insurance each year, regardless of what his or her health care needs and costs might actually be. The Congressional Budget Office (CBO) estimates that under the plan, seniors would pay about 68 percent of their health care costs out of their own pockets in 2030, as compared to 25 percent to 30 percent under traditional Medicare.

Democrats rightly characterize this plan as “ending Medicare as we know it,” but both President Obama and party leaders agree that deep cuts in Medicare spending must happen soon. “With an aging population and rising health care costs, we are spending too fast to sustain the program,” the president told a joint session of Congress on September 8. As part of his most recent deficit reduction plan, he has proposed $248 billion in Medicare savings over the next ten years. This includes higher copays for many beneficiaries and steep cuts in payments to providers. If you think Obama and the Democrats are bluffing, consider that the health care law they passed last year came with hundreds of billions in Medicare cuts and includes a mechanism that could cut vastly more. And though the president in September came out against Republican plans to raise the Medicare retirement age to sixty-seven, in the debt limit negotiations earlier this year he signaled his willingness to go along with it.

Then there’s the new Joint Select Committee on Deficit Reduction—aka the “super committee”—on which the president has also put his signature. By the end of the year, Congress must take an up-or-down vote on the recommendations of a majority of the committee, which are likely to include steep cuts to Medicare and, possibly, increases in the retirement age and other restrictions on eligibility. In the event the committee deadlocks, across-the-board spending cuts, including some to Medicare, go into effect.

Why are both parties declaring war on Medicare when both know that it could lead to their own political annihilation? The reason is simple. While both Democrats and Republicans fear the wrath of the AARP and the exploding ranks of hard-pressed seniors—to say nothing of lobbies like the American Hospital Association—Medicare’s relentless squeeze on the budget seems to party leaders to give them no choice but to attack the program’s spending regardless of the political cost. Medicare’s ever-expanding claims on the treasury threaten to crowd out nearly every other priority on either party’s agenda, from bullet trains and decent public schools to, yes, avoiding future tax increases and draconian cuts in the military.

The U.S. wouldn’t even face a structural deficit, much less have to endure the downgrading of its credit rating, were it not for the cost of Medicare (and, to a lesser extent, Medicaid). Just the projected increase in the cost of these two programs over the next twenty years is equivalent to doubling the Pentagon’s current budget, and there is no end in sight after that. By contrast, Social Security will rise only gradually, from 4.8 percent of GDP to 6.1 percent in 2035, and then taper off as the large Baby Boom generation passes. Meanwhile, according to the same CBO projection, all other government programs—the military, the courts, farm subsidies, Amtrak, infrastructure spending, education, and so on—are on course to shrink dramatically as a share of the economy, from 12.3 percent of GDP in 2011 to 8.5 percent in 2035. As others have observed, the federal government is not so gradually being transformed into a giant, and insolvent, health insurance company.

We can at least be thankful that both parties are sane enough to recognize the problem and brave enough to offer politically courageous proposals to solve it. But here’s the bad news: neither side’s solution is likely to work. The GOP’s privatization plan won’t actually cut health care costs but will merely shift them to individuals. Meanwhile, the Democrats’ ideas, though offering more in the way of actual reform, are unlikely to bend the cost curve anywhere near far enough. Moreover, by focusing so much on cutting reimbursement rates to doctors without directly attacking the colossal inefficiency of the U.S. health care system, the Democrats’ approach runs the very real risk that it will lead to a severe shortage of doctors willing to treat Medicare patients.

Here’s a better idea—one that offers a relatively painless and proven fix that will also vastly improve the quality of U.S. health care. Approximately a third of all Medicare spending goes for unnecessary surgeries, redundant testing, and other forms of overtreatment, according to well-accepted estimates. The largest single reason for this extraordinary volume of wasteful and often dangerous overtreatment is Medicare’s use of the “fee-for-service” method of compensating health care providers that dominates U.S. medicine, under which doctors and hospitals are rewarded according to how many procedures and tests they perform. To fix this, the federal government should do the following: announce a day certain and near when Medicare will be out of the business of subsidizing profit-driven, fee-for-service medicine.

Going forward, Medicare should instead contract exclusively with health care providers like the Mayo Clinic, Kaiser Permanente, the Cleveland Clinic, Intermountain Health Care, the Geisinger Health System, or even the Veterans Health Administration. All these are nonprofit, mission-driven, managed care organizations widely heralded by health care experts for their combination of cost-effectiveness and high quality, including cutting-edge use of electronic medical records, adherence to protocols of care based on science, and avoidance of medical errors. Because doctors working at these institutions are not compensated on a fee-for-service basis, they are neither rewarded for performing unnecessary tests and surgeries nor penalized financially for keeping their patients well. And unlike for-profit HMOs, these institutions are not pressured by shareholders to maximize earnings through withholding appropriate care.

By the late 1990s, the spread of health maintenance organizations and other forms of managed care virtually eliminated health care inflation in the United States, providing a brief moment—not seen since—when the cost of health care did not outpace average wage increases. That triumph in cost containment had its downsides, to be sure, namely the corrupting entry of profit-driven institutions that undermined medical professionalism and often led to denial of needed care. But with the benefit of hindsight, we can avoid repeating those mistakes and reinvigorate the once idealistic and highly effective managed care movement by insisting that Medicare providers also be nonprofit institutions. If we have to control the cost of Medicare, why not do it this way?

The immediate answer might seem to be, “Because seniors would never stand for it!” But let’s examine that assumption, first by looking at the radical and far more painful alternatives that official Washington is now considering.

We’ve already seen how the Republican plan to “voucherize” Medicare would lead to seniors paying for nearly 70 percent of the cost of their health care, which is hardly insurance at all. This surely would save the federal government money, but it would bilk the American people. Nor would the plan do anything to improve the appallingly poor quality of health care received by Medicare beneficiaries. According to a study conducted by Medicare’s inspector general, every month 15,000 Medicare beneficiaries are victims of medical errors that contribute to their death. Another 8,000 a year do not survive hospital-acquired bloodstream infections, which the VA and other well-managed health care systems have shown are largely preventable. It’s hard to see how forcing Medicare patients to have more “skin in the game” will save them from being victimized by sloppy, dangerous, money-driven medicine—except, perhaps, by pricing more seniors out of access to infectious hospitals and the often fatal reach of money-chasing doctors.

Raising the Medicare retirement age to sixty-seven, a move favored by deficit hawks in both parties, might at first seem to be a reasonable adjustment. Since we are all living much longer, the idea goes, we can afford to wait longer to become entitled to Medicare. But the premise is false. For fully half of the U.S. population (specifically the poor and working-class Americans with earnings at or below the median), life expectancy at sixty-five is virtually unchanged since the 1970s. In many parts of the country, including much of the South, life expectancy at birth for black males is not yet even sixty-five, and in some places it is as low as fifty-nine.

As with plans to voucherize Medicare, the primary effect of increasing the age of Medicare eligibility would be to shift costs onto needy individuals, while also leading to worse health outcomes. Nor, in the grander scheme of things, would the proposal save the government much money, since most Medicare spending is concentrated on people well over the age of sixty-seven, and many of the people who would be cut from the Medicare rolls would wind up on Medicaid or qualifying for other means-tested government subsidies. The Kaiser Family Foundation estimates that if the proposal were fully in effect in 2014 it would generate only about $5.7 billion in net federal savings but would impose twice as much cost ($11.4 billion) on individuals, employers, and states.

Then we have the proposal generally favored by mainstream Democrats: cutting back on reimbursement rates for Medicare providers. To be sure, reimbursement rates need to be adjusted; Medicare pays far too much for many procedures of dubious value. By overpaying cardiologists relative to other providers, for example, the system encourages too many medical school students to go into cardiology rather than family practice. And in the process it also generates egregious rates of unnecessary and often harmful heart operations: as has been scientifically established for years, a million stents annually are placed in patients whose heart conditions would be better treated with drugs. By overpaying radiologists, Medicare fuels the unconscionable overuse of redundant scans that have little or no medical value and expose individuals to dangerous levels of radiation. But experience has shown that cutting back reimbursement rates doesn’t necessarily save money, let alone improve quality, so long as profit-maximizing providers remain free to game the system.

For example, after Medicare began restricting the amount it would pay for specific procedures in the mid-1980s, many providers responded by simply making it up on volume—by increasing the number of unnecessary tests and surgeries they performed. Often this takes the form of “up-coding,” the now widespread phenomenon whereby doctors diagnose patients as being sicker than they actually are so as to make more money on treating each one. Simply cutting prices in regions where Medicare spending is high due to overtreatment “will only cause providers in those regions to deliver more services,” notes Dr. Elliott S. Fisher, director of the Center for Health Policy Research at Dartmouth Medical School. Worse, cutting reimbursement rates, particularly if done crudely across the board, will create shortages of doctors who are willing to accept Medicare patients—especially vitally needed primary care doctors, who are already poorly compensated and in short supply.

At this point, defenders of the Affordable Care Act will be quick to assert that they have engineered solutions to these problems. First, they will point out that the act calls for the creation of the Independent Payment Advisory Board (IPAB), a new entity that will be charged with keeping the per capita growth in Medicare spending far below its historical average. IPAB will have extraordinary powers to fast-track cuts in reimbursement rates. Just as importantly, it will be able to use Medicare’s purchasing power to reform the way hospitals and health care networks do business—for instance, by the “bundling” of services into a single payment to encourage doctors to forego unnecessary tests.

While IPAB is arguably the most potent weapon the government has ever conceived to control Medicare spending and possibly improve its quality, there are strong reasons not to bet the farm on it. For one thing, Republicans are gunning to kill the proposed board with the usual talk of “death panels,” and more than a few Democrats are also conspiring to snuff it out. (See Sebastian Jones, “Friends Like These,” Washington Monthly, July/August 2011.) For another, the cost cutting will come slowly: IPAB can make no recommendations that affect reimbursement rates for hospitals until 2020, even though hospitals are the largest single category of Medicare spending. There’s also the ever-present danger that the board will eventually be captured, as many government oversight boards are, by the industries it’s meant to police. But even if IPAB survives politically and remains fiercely independent, it will be able to effect change only through the clumsy and imprecise leverage of Medicare reimbursement rules. Its new regulations might inspire the health care industry to reform itself, but, just as likely, providers will respond with new tactics to outfox the regulators, as they have in the past through schemes like making it up on volume. And given the magnitude of the cuts that would be required in the absence of vast improvements in the overall efficiency of the entire system, there is a serious possibility of creating severe shortages of physicians who will want to take Medicare patients.

But not to worry, say defenders of “Obamacare”; we’ve got a plan to speed up those reforms. The ACA contains billions of dollars to incentivize the creation of “accountable care organizations.” Just what are they? It’s hard to say, since the language of the bill on this subject is so vague. An essential feature, though, is that an ACO is an institution that contracts with Medicare to serve a specific population and promises to deliver specific quality metrics, such as keeping infection rates down or offering primary care services to patients. In return, it receives the right to retain a large share of any resulting savings.

So far, ACO pilot programs have proved disappointing, producing little if any savings. And there are good reasons to believe that most ACOs will never deliver the quality and cost-effectiveness of truly integrated nonprofit health care systems like the Mayo Clinic or the VA. Under newly minted regulations, there is nothing to prevent ACOs from being just loose networks of colluding, profit-driven, fee-for-service providers who go through the motions of pursuing quality. Even stalwart defenders of ACOs now acknowledge their large potential for abuse. As Donald Berwick, administrator of the Centers for Medicare and Medicaid Services, recently told a forum at the Brookings Institution, “There will be parties out there who want to repackage what they do and call it an ACO.”

Berwick went on to warn, as have many others, that many ACOs are likely to be effective monopolies in their local markets, given the massive consolidation already going on in the health care industry. This means they will be tempted to abuse their market power by, for example, raising their rates for non-Medicare patients. This “would ultimately undermine any short-term savings achieved by Medicare,” notes Merrill Goozner of the Fiscal Times, “since increases in a region’s top line health care tab would eventually force Medicare to raise its own rates.”

Even if all these and other pitfalls of ACOs are avoided, there still remains an objection that no one can rebut: any benefit ACOs might bring will at best be only gradual. Unless a more immediate and certain reform is applied, most of the Medicare population will continue to be treated—for years if not decades to come—by the status quo of deeply fragmented, wasteful, and dangerous fee-for-service care, the cost of which everyone now agrees is unsustainable. If we’re going to avoid financial Armageddon, we have to do better than that.

As it happens, ACOs are not the first to attempt to provide higher-quality outcomes while lowering the cost of treatment. For ten years during the 1980s and ’90s, Americans embraced and then rejected HMOs and managed care. While the experiment in widespread managed care ultimately failed to reshape American health care, much can be learned by examining what worked and what didn’t.

Today, many Americans view HMOs simply as organizations designed to make money by denying them care. And it’s a sad fact that many HMOs have wound up doing just that, or else using clever marketing techniques to make sure they cherry-pick only young and healthy customers who are unlikely to get sick. But it is important to remember that HMOs and other forms of managed care came into existence in large measure because of a big problem that is still with us and getting worse—namely, vast amounts of poorly coordinated, excessive, and dangerous treatment.

The original vision of those who championed HMOs was that this new model of care would vastly improve the quality of American medicine and only incidentally lower its cost. Paul Ellwood, a pediatrician who more than any other single advocate built the case for HMOs starting in the late 1960s, put it this way: “My own most compelling interest as a physician was in the integration of health care, quality accountability, and consumer choices based on quality first and, secondarily, price.”

What Ellwood and other reformers wanted more specifically was an “integrated delivery system” in which primary care physicians would coordinate care in large, multispecialty medical group practices that would in turn be part of a coordinated system of hospitals, labs, and pharmacies. Moreover, to address the problems of overtreatment and lack of prevention, care providers would be prepaid a set amount per patient. As Alain Enthoven, another champion of managed care, once wrote, this would give “doctors an incentive to keep people healthy.”

Such were the highly idealistic and data-driven concerns and issues behind the emergence of HMOs. What went wrong? Eventually, HMOs morphed into many different forms and hybrids. Some were nonprofits, others were publicly traded companies answerable to Wall Street. Some were “staff models” that put physicians on salary and effectively eliminated the problem of intentional overtreatment; others became little more than loose networks of doctors on contract. Some were run by idealists, others by shysters, crooks, and knaves who convinced themselves that the road to riches could be found by low-balling on prepaid contracts and then denying their patients necessary care.

Even the many HMOs that tried to do the right thing often ran into a fundamental flaw in their business model. Most remained small enough that the majority of their customers changed plans every few years, either because they moved to a different market or because their employers switched to a cheaper plan. For all but the largest HMOs, this circumstance demolished the business case for prevention and effective management of long-term conditions like diabetes. Before any returns from investing in a patient’s long-term health could be realized, the patient was likely to be enrolled in some other plan. According to Lawrence P. Casalino, a professor of public health at Weill Cornell Medical College who has extensively interviewed HMO executives, the common view in the industry is “Why should I spend our money to save money for our competitors?”

By the 1990s, most people who were enrolled in any particular HMO had little or no choice in the matter; they were there because their employers were trying to save money. It didn’t help that many fee-for-service doctors felt threatened by the growing dominance of HMOs and other managed care providers and complained to their patients about it. Neither was the industry’s image helped by the negative press and lawsuits that some HMOs attracted.

The result was a public backlash. But with the benefit of hindsight, we can see that it didn’t have to turn out this way. We only have to look at the big exceptions to the often poor performance of managed care organizations over the last several decades. These are institutions with high levels of patient satisfaction that are also lauded by health care quality researchers for their patient safety, adherence to evidence-based protocols of care, and general cost-effectiveness. They include integrated providers like Intermountain Health Care, the Cleveland Clinic, the Mayo Clinic, Geisinger Health System, Kaiser Permanente, and the VA, the last of which ranks highest of all on most cost and quality metrics and is in effect the largest, and purest, nonprofit, staff-model HMO in the land (though, of course, government run and open only to veterans). The VA’s cost per patient is about 21 percent below what it would cost under Medicare to serve the same population with the same level of benefits. Until the wars in Iraq and Afghanistan heated up, the VA was also holding increases in its cost per patient down to just 1.7 percent a year, compared to annual increases of nearly 30 percent for Medicare.

What do these exceptions to the rule have in common? First, they are all large enough to achieve significant economies of scale. The VA’s scale, for example, has also been an important precondition for the deployment of its highly effective system of electronic medical records, the cost of which it has been able to spread across a large base of hospitals and clinics. So too with Kaiser Permanente and the other examples of “best-practice” health care delivery systems mentioned above. The size of these institutions also means that the data generated by their digitalized information technology about what works and what doesn’t has far greater scientific value because the records are drawn from a very broad population. And their scale allows them to integrate and coordinate care among a broad range of specialists who all work for the same institution and use the same patient records so that the care patients receive is far less fragmented (and dangerous) than found generally in fee-for-service medicine.

Furthermore, large size gives these institutions substantial market power to negotiate favorable deals with drug companies and other medical suppliers. The VA enjoys a 48 percent discount in the price it pays for frequently prescribed drugs compared to those obtained by even the next-biggest health care plans. The size of the VA also allows it to push past the cartels, known as group purchasing organizations, that control the prices paid by smaller health care providers for hospital supplies, from hypodermics to bed linens. (See Mariah Blake, “Dirty Medicine,” Washington Monthly, July/August 2010.)

Finally, and just as importantly, the size of these institutions allows them to hold on to a significant portion of their customers year after year. This, along with their nonprofit status, preserves a business case for prevention and investment in long-term health. Unlike for-profit HMOs, they are not under pressure to maximize short-term profits by withholding appropriate care; instead, all their incentives are aligned toward providing enough care, and no more than is necessary, to keep their patients healthy over the long term.

We should set a date when the Medicare system will stop covering fee-for-service medicine. Medicare beneficiaries would instead have the choice of deciding among competing managed care organizations that meet specific quality requirements. These organizations wouldn’t be standard for-profit HMOs. And they would not receive the inflated, no-questions-asked reimbursement rates that have prevailed under the Medicare Advantage program. Nor would they be anything as amorphous and underdefined as an accountable care organization.

Instead, providers qualified for reimbursement under Medicare would have to be nonprofit organizations to start with. They’d also have to use salaried doctors, deploy integrated health information technology as the VA and other best-in-class health care providers do, adhere to evidence-based protocols of care, and operate under a fixed budget. Specifically, for every Medicare patient who decided to join their plan, the government would pay a specific annual reimbursement based on that patient’s age. These Medicare-certified providers would not be allowed to turn away patients on Medicare or kick such patients out of their plans. In order to stay in the program, they would have to meet strict safety and quality requirements on such measures as hospital-acquired infection rates. And they would have to be at least of a certain size to participate.

The latter requirement would allow them to achieve the economies and other benefits of scale described above. With enough large institutions participating, the government could assure that no single one monopolized a local market and that seniors always had a choice of plans.

The best of our integrated health care providers would instantly qualify. With that advantage, top-flight regional providers like Mayo, Intermountain, and the Cleveland Clinic would have an incentive to expand geographically. The VA, which is already national in scope, could be allowed to expand by serving the many older veterans who are currently excluded from the system because they lack service-related disabilities or are not poor enough to meet the VA’s means test. By allowing these older vets to use their Medicare entitlement for VA care, and perhaps their elderly spouses as well, everyone would win.

Meanwhile, many existing health care providers that didn’t qualify would face a choice: they could merge with institutions that already deliver the high-quality health care necessary to become a Medicare-certified provider and adapt to their cultures and protocols of care, or they could reform themselves. Under the threat of losing their ability to collect from Medicare, they would find it much easier to stare down greedy, profit-driven specialists and others resistant to change and gain the power they needed as an institution to do the right thing.

Raising any capital needed to reform an existing institution, or to create a new one eligible to treat Medicare patients, should not be a serious obstacle. Banks and investment firms would gladly extend credit and capital to any institution that could show a reasonable plan for meeting the requirements, because such institutions would have a predictable future revenue stream that could be used as collateral. Our financial system routinely does this for other nonprofit entities that have predictable revenue streams, from cities and counties to universities, as well as certain hospitals with assured earnings.

Indeed, institutions that became certified to serve the Medicare population under this proposal could reasonably hope to attract many younger Americans, especially those who will be required by the ACA to purchase health coverage starting in 2014. Benefiting from an inherently efficient model of care, these institutions will be the thrifty option for fulfilling the individual mandate, while also happening to be the smart option. They may also be attractive to middle-age Americans contemplating retirement, who may want to transition early into the system that will wind up treating them into old age. Indeed, the government may even want to encourage this kind of behavior, given that the longer an HMO is on the hook for a patient’s care, the more financial incentive it has to keep the patient healthy. These and other effects could ripple through the system, hinting at a bigger truth: if you reform the delivery of Medicare, you just might reform the entire health care system.

Would there be resistance to such a proposal? Of course. But compared to what?

Let’s start from the point of view of individual citizens. Yes, many current Medicare beneficiaries would be upset by any change to the status quo. But these folks could and probably should be allowed to stay in traditional Medicare; the changes outlined here will take some years to put in place in any event. The people who will be affected first are those eligible for Medicare in, say, ten years.

Most of us who are now approaching retirement age or are younger have spent our entire lives living with, and largely accepting, some constraints on our choice of doctor, if only through the limits imposed by preferred provider networks. Personally, not once since I was still young in the early 1980s have I been part of a health insurance plan that allowed me to choose any doctor I wanted without paying a financial penalty, and I’ve had what by the standards of the times has been “gold-plated” coverage. Almost the only people left in America who don’t face such restraints are current beneficiaries of fee-for-service Medicare.

That said, a plan like this could still provide future Medicare beneficiaries with plenty of options. In addition to being able to choose among competing Medicare-eligible HMOs, seniors should also be free to use their own money to pay to see any doctor they want or to access experimental drugs or unproven treatments that the HMOs (wisely) won’t cover. If the “price” of preserving Medicare is that some of us will be sometimes forced to go “out of network” and pay more of our own money to receive some kinds of care, then I think younger Americans already inured to the practice will almost certainly be willing to pay it.

To those who disagree, we could offer an additional choice: If you wait until you are, say, age seventy to apply for Medicare, then the system will cover you for the same wasteful fee-for-service medicine your parents currently get. But if you want to be covered at age sixty-five, you’ll have to agree to receive your care from a Medicare-certified nonprofit HMO.

These are tough choices, no doubt. But ask yourself: Do they sound all that onerous when compared to the competing policy proposals already on the table, such as turning Medicare into a voucher program that leaves all of us responsible in old age for paying 70 percent of our own health care costs, or seeing Medicare reimbursement rates reduced to the point that we can’t find a doctor who will treat us, or having to wait until age sixty-seven before being eligible for Medicare at all?

We can certainly expect lots of opposition from well-heeled practitioners of for-profit medicine—all those cardiologists making a killing doing unnecessary stent operations, for example. And we’ll hear from many prestigious academic medical centers, an unfortunate number of which engage in massive amounts of overtreatment because they are dominated by specialists who look down their noses at doctors engaged in “mere” primary care.

Yet as difficult as these challenges will be, reformers are now armed with abundant, peer-reviewed proof of just how dangerous and wasteful fee-for-service medicine has become, and the public has begun to catch on as well. Ten years ago, for example, researchers were just beginning to document how medical errors, hospital infections, and inappropriate treatment had conspired to make contact with the health care system the third leading cause of death in the United States. Today, these facts are widely accepted by health care experts and generally understood by policy makers at the highest levels of government. Educated Americans have read about them in the newspapers, and most citizens who have spent any time in a typical hospital trying to make sure a loved one gets her proper medicine on time have experienced firsthand the extent of routine system breakdown.

Some conservatives, no doubt, will instinctively align themselves with the forces of for-profit, fee-for-service medicine, or be lured into doing so by heaps of campaign contributions. Many Democrats as well can be counted on to carry water for prestigious but deeply wasteful and dangerous academic medical centers, which tend to be concentrated in Deep Blue zones like New York, Boston, and Los Angeles. So yes, enacting this proposal will not be easy.

But then, ask yourself again, compared to what? Both parties have already signed on to changes to Medicare that are hardly less radical, will be resisted by powerful interest groups, and risk the wrath of voters. Moreover, these proposals are not really solutions, because they either shift the inflating cost of health care onto individual Americans or cut reimbursement rates to a point where Medicare is “saved” on paper but in the real world has little value to elders who can’t find a doctor. By contrast, this approach directly attacks the root problem, which is the waste and inefficiency caused by fee-for-service medicine.

And as politically difficult as the road to this solution may be, it does give each side things it wants. It allows Democrats to say that they will not cut benefits to Medicare recipients. And Democrats should also like that these nongovernmental organizations serving the Medicare population will have the freedom to do things liberals have long wanted Medicare itself to do, like bargain with drug companies for lower prices. Meanwhile, Republicans who support this proposal will be able to boast that it takes vast decision-making power out of the hands of “unelected bureaucrats in the federal government” and puts that power in the hands of private organizations that compete with each other for customers. Under this approach, Medicare officials won’t have to figure out how to write regulations on what specific drugs and procedures are not appropriate medicine; they’ll be contracting out those details to private-sector organizations and simply holding them accountable for results, such as keeping a high percentage of their patients healthy and managing their conditions effectively.

Let’s close by stressing the positive. America is still a rich and productive country. Compared to Europe or Japan, it has a youthful population and no real long-term debt crisis except that caused by huge volumes of wasteful and dangerous fee-for-service medicine. So once again in our long history, Americans can have their cake and eat it too. We can improve our health care while lowering its cost, and in the process eliminate our long-term deficits and resume building for the future.

So why don’t we feel more optimistic? Because there is this feeling of despair, especially among policy makers and the chattering classes, that we don’t know how, politically, to bring health care costs in line. We know that all other developed countries get better health care for less money, and that it is no real mystery how they do it. But all their approaches seem—or can be spun as—socialistic, paternalistic, and fundamentally un-American, and therefore impossible to consider.

Yet we have within our reach a solution that is not imported from abroad, and that has been proved on our own shores by all-American institutions, from our best nonprofit HMOs to the VA health system. We may not currently have the political will to use these institutions as the model and means to fix the health care crisis, and hence eliminate our long-term fiscal problems. But we shouldn’t fool ourselves into thinking it can’t be done.

The post The Cure appeared first on Washington Monthly.

Tilting at Windmills https://washingtonmonthly.com/2011/10/23/hidden-capital/ Sun, 23 Oct 2011 13:18:28 +0000 https://washingtonmonthly.com/?p=27769

Hidden capital

There is a problem with government accounting that drives me around the bend, but that I rarely see noted elsewhere. When corporations build a factory, it is considered a positive, the creation of a capital asset. But when government builds a road or a school, it’s an expense. It’s simply spending money, treated the same as the most frivolous waste. Why can’t we have a system of accounting that gives government credit for the creation of genuine assets like bridges and schools?

Consider the alternative

I agree with those who say both parties are responsible for the sorry state of Washington, but isn’t it time to face the fact that much more than half the guilt lies with the Republicans? I say this even though I agree with much of the criticism of the Democratic Party and of Barack Obama that I find in the words of the liberal commentariat, in the centrist Matt Miller’s call for a third party, and even in Ron Suskind’s new book Confidence Men. But I also think it is foolish for thoughtful Americans to waste much more time focusing on the shortcomings of the Democrats and of the president. They indulged in a similar orgy of faultfinding in 2010, with the result that too many of them failed to vote and the country elected the worst House of Representatives in memory. There is a real danger that next year the Democrats will lose not only the presidency but also the House and the Senate.

To wake up to the danger, think about your choice. Obama and the Democratic Congress gave us national health care and Wall Street reform. Both were admittedly far from perfect. But do you really think that a Republican administration and Congress would have done—or will do—better?

Who gave us gays serving openly in the military, which had been such a great goal of liberals? It wasn’t Bill Clinton or any Republican. Who got Osama bin Laden, which was a goal of conservatives, as well as the rest of us? It wasn’t George W. Bush. Who was the first president to take on the issue of teacher quality, one of moderate Matt Miller’s main concerns? Neither Clinton nor Bush faced it and did something about it, as Barack Obama and Arne Duncan have done with their Race to the Top program.

The company you keep

One of the most disturbing trends is the one away from Obama among so many liberal American Jews. Are they going to let themselves be swayed by the right wing that has taken over Israel? That Rick Perry and his ilk are standing 100 percent behind Israel’s present leaders—see Perry’s recent Wall Street Journal op-ed “The US Must Support Israel at the UN” and the article “House GOP Finds a Growing Bond with Netanyahu” in the New York Times—should be warning enough against following Netanyahu and his American apologists. Thomas Friedman puts it bluntly, calling Netanyahu’s “the most diplomatically inept and strategically incompetent government in Israel’s history.”

They know not what he does

One little-noted Obama accomplishment was recently acknowledged by Kevin Sack in the New York Times. He reports that, according to the Centers for Disease Control, the number of uninsured young adults aged eighteen to twenty-five has dropped by 900,000. This reduction was recorded just one year after the effective date of the Affordable Care Act, which made parents’ health insurance cover their dependents up to age twenty-six. Before, insurers had typically dropped coverage at age eighteen or twenty-one.
The Times ran this story on its front page. Unfortunately, no other newspaper I’ve seen did the same, a fate typical of the national media’s treatment of Obama’s policy successes, which emphasizes politics over substance.

The result is a recent New York Times poll that showed only 34 percent of Americans approved of Obama’s handling of the economy but also found that a majority supports every item in his current stimulus proposal, including 80 percent who think it’s “a good idea to spend money on the nation’s infrastructure like bridges, airports and schools.”

The takeover

In previous columns I have noted that congressional staffers now dream not of becoming members, as they once did, but of earning big bucks as lobbyists. Now comes a study with the hard evidence, produced by the transparency advocacy group LegiStorm, finding that almost 5,400 current and former staffers “have gone through the lobbying ‘revolving door’ in the past decade alone.”

Lobbying firms also have reverse influence by farming out their employees to serve as staff on influential congressional committees. For example, the Washington Post has reported that there are thirteen former lobbyists on the tax-writing House Ways and Means Committee and twelve serving as staff for the debt reduction “super committee.” To top it all off, the Project on Government Oversight reports that a former Goldman Sachs official is now employed by the House Committee on Oversight and Government Reform, “helping the committee chair, Rep. Darrell Issa (R-California), write letters to banking regulators questioning the need for new derivatives oversight.”

Talk about letting foxes into the chicken coop!

Christie as cynic

Back to Obama. I agree with the criticism that he has spent too much time reaching out to Republicans. But I do understand and admire his desire to find common ground. His own experience at the Harvard Law Review and in the Illinois legislature seemed to prove that he could get conservatives and liberals, Democrats and Republicans to work together. But of course he underestimated the extent to which the rigid right has come to dominate the Republican Party in Washington.

Maddeningly, Governor Chris Christie, in his recent speech at the Reagan Library, asked, “What happened to State Senator Obama? When did he decide to become one of the ‘dividers’ he spoke of so eloquently in 2004?” This is so cynical! Christie is not stupid. He knows that Obama tried again and again to reach out to the Republicans, only to be rebuffed again and again. Christie says he “thought hard” about the Reagan speech, which means we can be relieved by his final “no” to running for president.

Regulation is not the problem

Conservatives with minds as capable of subtlety and irony as David Brooks’s are rare. Recently, however, Brooks fell into an outrageous right-wing cliché, asserting that “a growing government sucked resources away from the most productive parts of the economy—innovators, entrepreneurs and workers—and redirected it to the most politically connected parts. The byzantine tax code and regulatory state has clogged the arteries of American dynamism.”

What sucked away resources from innovators and entrepreneurs was Wall Street’s emphasis on trading and creating exotic new financial instruments instead of helping new businesses get started and existing ones expand. And money that the government might have spent on financing new jobs through investment in schools and infrastructure had to be devoted to wars that Bush’s tax cuts did not pay for. As for regulations, our present economic distress stems far more from too little than from too much.

A survey of small businesses conducted by McClatchy newspapers came closer to the truth than Brooks had done. Though one owner declared that “higher taxes aren’t good for business,” another argued that “the rich have to be taxed,” and the study concluded that “there was little evidence” that a “fear of higher taxes” was responsible for tepid hiring.

Finally, the study found that “none of the business owners complained about regulation in their particular industries.”

The danger of doing nothing

“Imagine a football field packed 20 feet high with highly radioactive nuclear waste,” as Mark Moremond of the Wall Street Journal recently asked his readers to do. That, he explains, is the amount of nuclear waste sitting around at various sites in this country. The bad news is that nothing is being done about it. There is no good news.

Learning on the job

If Solyndra was a mistake, it was, as my friend Joe Nocera made clear in a recent New York Times column, an understandable one. But it also illustrates Obama’s greatest weakness as he began his presidency: a lack of understanding of the executive branch that, for example, led him to leave too much of the stimulus spending to the Department of Energy. As this column noted in 2009, the DOE has a terrible record when it comes to effective spending. Indeed, as of mid-September of this year the DOE had “only two weeks left [in the fiscal year] to commit the [stimulus loan] program’s remaining $9.3 billion,” according to the Washington Post‘s Joe Stephens and Carol Leonnig. Obama has since acknowledged his early innocence about whether a project was truly “shovel ready” or not, so I am hopeful that he will do better in the future. I am fortified in this view by his appointment of Jack Lew to succeed Peter Orszag as head of the Office of Management and Budget. Lew, as a knowledgeable OMB veteran, understands the bureaucracy much better than his predecessor.

When going public was bad for the public

If the age of greed did not officially begin until the 1980s, there were some early signs that it was on its way. One was the craze for “going public” that took root a few years earlier. A Wall Street firm would descend on prosperous businesses controlled by families or small groups of backers and tell them that they should let the firm assist them in selling their stock to the general public. The idea was that the sale would produce enough money to enrich the owners and, incidentally, give a hefty cut to the Wall Street firms. If the company appeared to be in good shape, the formula usually worked, and everyone did well. Often very well. There was almost no liberal criticism of this practice because “going public” sounded so thoroughly virtuous.

There was, however, a downside: the former owners found themselves at the mercy of Wall Street’s habit of rating companies on the basis of constantly growing quarterly earnings. This made it difficult, for instance, for the original owner to keep all his employees on the job during a business downturn. The pressure from Wall Street and the stockholders was to cut expenses—like payroll—in order to protect earnings and make the company’s bottom line look good.

Thus many of these companies in the current recession have found themselves eliminating jobs, often losing employees that they would like to keep. On the other hand, a family-run business, as long as it manages to break even, has the option to keep everyone on the payroll. This is exactly what has helped soften the world recession in Germany, as Steven Rattner makes clear in a recent issue of Foreign Affairs. In Germany, family-owned businesses, called Mittelstand, are a major part of the medium-sized manufacturing sector. They can, as Rattner points out, “put a higher priority on employing Germans than do publicly traded companies,” because they are freer “to focus on long term growth than on short term profits.”

The tune-up

Speaking of Steven Rattner reminds me of another Obama triumph, the rescue of the auto industry, in which Rattner was the administration’s point man. This effort saved more than a million jobs in the auto and related industries—and, in helping inspire reform of the business, promises future growth with more jobs to come.

How both sides got wiser

Many of the reforms are in management, which is becoming more flexible and innovative. But one significant reform, reported by the Wall Street Journal, came from big labor. The UAW is now agreeing to link wages and benefits to company performance instead of, as had become union practice, demanding increases regardless. Incredibly, when the farsighted UAW leader Walter Reuther proposed just that kind of arrangement to the auto industry back in 1946, the industry turned it down.

Medical laissez-faire

“At least 15 drug and medical device companies have paid $6.5 billion since 2008 to settle accusations of marketing fraud or kickbacks,” reports the Washington Post. These kickbacks were typically paid to the doctors who prescribed the drugs. Yet, reports the Post, “not one of the doctors has been prosecuted or disqualified by state medical boards.”

This reminds me of the time when I was approached by a local physician after I’d written several items expressing skepticism about some of the lawyers who bring medical malpractice cases. The doctor thought I might become an ally in opposing these lawsuits. I told him I was ready to support replacing malpractice litigation with a system of no-fault compensation for injured patients, but that I had one reservation: without the threat of lawsuits, I saw no way of punishing unethical or incompetent physicians, who, to my knowledge, were rarely (meaning very close to never) disciplined by their local medical societies or state licensing boards. I said it was a problem that could be solved by adding enough independent members to the doctor-dominated groups that govern accountability in the medical profession. If physicians would support such a reform, I said, I would be glad to join their effort. I never heard from that doctor again.

Who’s in charge?

In case you live in the Washington area and worry about what would happen in another crisis like 9/11 or the Cuban missile crisis—during which the possibility of a nuclear attack on the city rose for a few days from the realm of possibility to the realm of probability—the Washington Post‘s Robert McCartney says your concerns are not unfounded. “Ten years after the Sept. 11 attacks, the Washington region still hasn’t decided exactly who’s responsible for ordering an evacuation of the District and its neighbors,” or how to “communicate such decisions to the public.” Who is in charge during an emergency? The answer is unclear. But not to worry: the Post says that “a regional working group is studying the matter.”

How the Washington, D.C., area got rich

In 2010, the Washington metropolitan area enjoyed the highest median income of any in the country. A major factor in the growth of our wealth has been government contracts.

The contracting out of the functions of the federal government was pioneered by the Pentagon as its military and civilian officials discovered that the process of contracting out enabled them to kill two birds with one stone, disguising the growth of their own bureaucracy while providing lucrative employment opportunities for their retirement years, which in the case of the military could begin in their early forties. Then, once Bill Clinton had declared an end to the era of big government, his Reinventing Government initiative had to focus on downsizing the number of federal employees. The result: other agencies quickly adopted the Pentagon’s solution of contracting out so that they could appear to downsize by transferring employees from direct hire to contract. Then the Bush administration responded to 9/11 with its giant national security program and the wars in Afghanistan and Iraq and their attendant reconstruction efforts, all of which provided vast new worlds of opportunity for contracting, especially with Republican officials loath to admit they were increasing the size of government.

Some contracts are for things government employees cannot do, like making planes or tanks, but many, including most of those in the Washington area, as Annie Gowen illustrates in a recent article in the Washington Post, are for personal services. And here is the problem: according to the Project on Government Oversight, “the government is now paying contractors nearly twice as much as it would have to pay federal employees to do the same job.”

The right time to propose

In only its second issue, March 1969, this magazine exposed one secret of the clever contractor. It was to make his proposal toward the end of a fiscal year, when the agencies usually had money they needed to spend, because otherwise it would revert to the treasury.

Even though there have since been several reforms, opportunities for this tactic still exist: the Department of Energy, for instance, did not commit the final $4.9 billion of its solar loan funds until September 30, the final day of the 2011 fiscal year.

The post Tilting at Windmills appeared first on Washington Monthly.

Sisyphus Gets to the Top https://washingtonmonthly.com/2011/10/23/sisyphus-gets-to-the-top/ Sun, 23 Oct 2011 12:50:31 +0000 https://washingtonmonthly.com/?p=27785 How America's forbidding political landscape made health care reform impossible for Clinton and nearly so for Obama.

No other nation organizes its government as incoherently as the United States…. Its policies are set to run a legislative obstacle race that leaves most reforms sprawling hopelessly in a scrum of competing interests. Those which limp into law may collapse exhausted, too enfeebled to struggle through the legislative tangle which now confronts them, and too damaged to attack the problems for which they were designed. The humiliation of the will of government is popularly reckoned no bad thing.

Thus begins Peter Marris and Martin Rein’s landmark 1967 book, Dilemmas of Social Reform: Poverty and Community Action in the United States. Their critique of the structural impediments of American government could easily have been written today by some liberal blogger lamenting the shortcomings of health reform. It is certainly a theme of Paul Starr’s new book, Remedy and Reaction: The Peculiar American Struggle over Health Care Reform. A Princeton professor and Pulitzer Prize-winning author, Starr was a senior adviser to President Bill Clinton during the 1993 battle over health care reform. While he is an ardent liberal and a supporter of the 2010 health bill, Starr has maintained a critical distance from the Obama camp. This unsentimental perspective serves him well in this outstanding volume.


Remedy and Reaction: The Peculiar American Struggle over Health Care Reform
by Paul Starr
Yale University Press, 335 pp.

Remedy and Reaction accomplishes several tasks in its brisk 300 pages. The first quarter of the book provides a tour de force 100-year history of American health care reform—a dimension often missing from current policy discussion. Along the way, one encounters many useful nuggets of information and finds the need to revise many commonly accepted accounts of how we came to our present predicament.

You may have heard, for instance, that employer-based coverage was created during World War II to evade wage and price controls—and, certainly, that history matters. Yet, as Starr relates, employer-based coverage started in earnest during the 1930s, as early Blue Cross plans addressed classic market failures in the provision of health coverage. Providing health insurance through large employers offered economies of scale and provided stable and favorable risk pools, mitigating the problems that perennially plagued insurance markets. Given this history, it’s important to keep in mind that any hasty effort to unravel employer-based coverage—such as candidate McCain’s 2008 plan—might prove quite harmful if these concerns were not addressed.

It’s common knowledge that, in signing the Affordable Care Act, President Obama succeeded at a task that had eluded his predecessors going all the way back to Theodore Roosevelt. Indeed, many presidents have sought to expand health services while in office, but, as Starr notes, only one previous president entered office with an explicit promise to provide near-universal health coverage through health care reform: Clinton, in 1993.

From his insider’s perspective, Starr describes the painful failures of the Clinton reform effort. Many liberals—not least many Obamans—blame one or both Clintons for the debacle: had they presented a bolder and simpler plan, had they shown more tactical savvy, had they dealt more effectively with key House and Senate leaders, had Daniel Patrick Moynihan been less destructive, had Bill Clinton been less of a cad, we would have enacted comprehensive health reform long ago.

If the experiences of the past three years didn’t suffice to debunk these aspersions, Remedy and Reaction certainly should. Viewed in historical context, President Clinton’s willingness to attack health care reform was more remarkable than the subsequent disappointing result, for three main reasons.

The first was changing partisan dynamics. In principle, Clinton’s efforts might have led to some sort of grand bargain among the diverse constituencies that make up our $2.6 trillion health care economy. That, in turn, might have resulted in an ideological compromise between liberals and conservatives, in which universal coverage was pursued within a framework of market incentives. In reality, this kind of transactional politics was supplanted by a much more poisonous partisan politics—which remains with us today. A new breed of ideologically motivated Republicans saw an opportunity to destroy the centerpiece domestic policy initiative of a Democratic administration, and they were unafraid to wield powerful procedural tools to accomplish this objective. Starr quotes political scientist Gary Jacobson’s astute summary: “The illusion of unified government put the onus of failure on the Democrats; the reality of divided government let Senate Republicans make sure the administration would fail.”

Second, as Starr notes, these same procedural tools allowed a skilled Republican minority to demoralize Democrats and heighten public cynicism and apathy—simply by dragging out the process. As commentators Ezra Klein and John Sides emphasized last year, and as John Hibbing and Elizabeth Theiss-Morse’s book Stealth Democracy: Americans’ Beliefs About How Government Should Work documents in greater detail, American voters hate the actual process of legislating. The electorate’s undiscriminating disdain for prolonged legislative bickering will always impose a large roadblock to ambitious reforms, many of which require prolonged bargaining. True to form, the Clinton plan slowly asphyxiated as it got stuck in Congress.

Third, beyond these generic roadblocks, Clinton faced other daunting obstacles rooted in the specific history of American health policy. Before Bill Clinton even reached grade school, President Harry S. Truman failed to achieve a near-universal program modeled after Social Security. After this defeat, liberals made two fateful decisions that brought permanent, largely unintended consequences. As Starr writes,

The United States took the critical steps in the formation of its health care financing system in the two post-World War II decades, when it turned decisively towards private, employer-based insurance and created separate programs for the elderly and for the poor. These were the years when the United States ensnared itself in a policy trap—a costly, extraordinarily complicated system which nonetheless protected enough of the public to make the system resistant to change.

In expanding coverage to the elderly, in measures that culminated with Medicare (and in providing large subsidies for capital investments), many supporters believed that they were taking the first steps to expanding help to the sick and the disadvantaged. That was, roughly speaking, the virtuous circle of Social Security. While that program began with serious gaps, it created favorable conditions for its own expansion. Medicare played out differently. Indeed, one could argue that the program has been a decidedly mixed blessing for American social insurance. It has proved a lifesaver for hundreds of millions of people, but the program’s high and growing costs (made worse by President George W. Bush’s poorly financed prescription-drug program) have led many policy makers and voters to regard any further expansion as fiscally irresponsible. (These fiscal concerns are, I believe, misdirected: Medicare has done no worse than the rest of the health care system in controlling spending growth. But the political reality remains.)

The shift in American interest group politics was equally acute. Private insurers had obvious reasons to resist Medicare’s expansion. More important, Medicare accelerated the emergence of the elderly as a distinct constituency. Meanwhile, many workers receiving employer-based coverage believed that they were getting better coverage than Medicare provides, often for free.

Although 50 million people remain uninsured, American health policy has thus fostered the rise of what Starr calls “protected publics.” As Cornell professor Suzanne Mettler has described in these pages (see “20,000 Leagues Under the State,” July/August 2011), these large constituencies often do not perceive the full costs of the substantial subsidies they receive. The present discounted value of Medicare benefits for seniors far exceeds what most beneficiaries have contributed to the system, and holders of employer-based coverage benefit from substantial tax expenditures which they scarcely know exist.

These protected publics also have reason to regard themselves in competition with other groups. Starr argues that millions of seniors, veterans, and others have come to see themselves as having earned their benefits, and many believe that their claims to public resources are more morally worthy than those of Medicaid recipients or the uninsured.

No comprehensive health care reform that would harm these protected publics is politically feasible. (Not coincidentally, Starr argues, no other advanced democracy has followed such a trajectory, establishing separate programs for the elderly before pursuing broader efforts for near-universal coverage.) The combination of vested constituencies and a political system designed to thwart comprehensive reform locks liberals into incremental measures that attach new protections, new coverage, and new cost and quality control efforts onto an already complex and inefficient health care delivery system. These political realities hinder efforts to craft sound policy and lead instead to reforms that are incremental, ideologically moderate, forbiddingly complex, and unlikely to generate grassroots excitement. The resulting encyclopedic legislation invariably exposes political vulnerabilities. Difficult compromises must be struck through precisely the kinds of sausage making that voters disdain.

A fragmented health system particularly hinders efforts to deploy government’s bargaining power to control costs—much to the benefit of many entrenched constituencies. Millions of Americans see reasons to resist measures that might constrain the care or benefits they receive, but because the full costs of health care are hidden, few perceive corresponding direct benefits to controlling these costs.

Given all of these realities, it’s remarkable that the Affordable Care Act was passed last year. The single most important factor in its success was the presence of strong Democratic majorities: President Obama enjoyed much broader and more unified congressional support than President Clinton did, even after Scott Brown’s victory deprived Democrats of their sixtieth Senate vote. Clinton’s reformers, Starr writes, also “lacked one critical element” that Obama and his team possessed: “an agreed-upon remedy.” As Starr writes in perhaps his most intriguing chapter, “Rise of the Reform Consensus, 2006-2008,” much political heavy lifting was done before and during the presidential primaries, as Democratic policy elites and the major Democratic presidential contenders coalesced around “minimally invasive reforms” influenced by recent efforts in Massachusetts.

By the time Obama entered office there was already a broad Democratic consensus on what health care reform would look like. Institutionally radical alternatives such as the Wyden-Bennett bill were effectively sidelined, since they threatened too many protected publics. Starr is cynical regarding the public option and Medicare buy-in proposals, writing that they were politically valuable to “bring the left within the fold.” Yet they faced fierce opposition from the supply side of the medical economy, as well as quiet opposition from many House members—some liberal—who worried about the impact such policies would have on providers in their districts back home.

Mobilizing a larger and more cohesive majority, Democratic congressional leaders shepherded the legislative process far more successfully in pursuing the Affordable Care Act than was possible in the Clinton years. The strong Democratic majority gave health reform an aura of inevitability that was an invaluable asset in striking bargains with insurers and with others. Because Democrats possessed a large majority, much of the cross-party negotiation that might otherwise have occurred took place within the party, between Democratic liberals and Democratic conservatives and moderates. (Starr cites with some irony the “bipartisanship within one party” that resulted in last year’s reform.) ACA provisions such as the individual mandate had appeared in past Republican bills—Governor Pawlenty’s ironic sobriquet “ObamneyCare” may provide the best description of the final outcome. Yet perhaps unintentionally, partisan polarization and Democrats’ subsequent intraparty bargaining gave moderate and conservative Democrats a strong stake in the bill. Figures such as Max Baucus and Kent Conrad were the marginal votes. Their personal reputations were on the line in securing its final passage. Republicans acquired no such ownership, with predictable consequences.

Starr finds much to celebrate in health reform, and much to worry about in pondering the ACA’s uncertain future. He lucidly describes Democrats’ most important political error: the extreme backloading of the program’s main provisions to 2014. The key provisions of the new law are highly unlikely to be repealed once health reform becomes part of the fabric of American life, but the law was designed so that at least two congressional elections and one presidential election will pass before pillars such as health insurance exchanges are fully implemented. As Republicans continue to profit from the nation’s economic woes, Obama’s reelection is in doubt—and with it, the fate of health reform.

Starr is one of many commentators to say that backloading was a consequence of the administration’s foolishly low ten-year budget targets. He also notes the more subtle reality that Democratic lawmakers, mindful of recent experiences in Medicare prescription-drug coverage, were leery of any policy that might produce embarrassing administrative glitches before the 2012 election. Whatever the motivation, the future of last year’s reform is dependent on the results of the 2010, 2012, and (in many cases) 2014 elections and beyond.

Starr is disgusted—as I am—that the ACA’s substantive moderation does so little to temper Republicans’ extreme and often dishonest attacks on the new law. He is also frustrated—as I am—that so many progressives disparage this historic (albeit incomplete) achievement.

Starr ends with a sober observation:

Repealing that law would not just mean denying insurance to more than 30 million people. It would also be a confession of political helplessness in the face of a problem that has nagged at the national conscience for a century. The search for a remedy would continue, but it would proceed under a shadow of uncertainty about whether Americans will ever be able to hold their fears in check and summon the elementary decency towards the sick that characterizes other democracies.

I closed this excellent book doubly worried—about the uncertain fate of health reform, and about the political lessons this story teaches. President Obama embarked on health reform with virtually unique political advantages. He risked catastrophic defeat, sacrificed more than a year of his presidency, and, remarkably, succeeded. Thus far, however, health reform has brought him and his party little political benefit. As a friend of mine said recently, the ACA’s passage was a “catastrophic victory.”

Future presidents will need to contemplate large measures to address our large national problems, from climate change to widespread unemployment. I fear these leaders will contract a kind of legislative “Vietnam syndrome” as they recall Obama’s and others’ difficulties in the struggle over health care reform. As the health debate has shown, our capacity for collective national action does not currently match the serious challenges we face. In health care and other areas, we will need to do better.


They Shall Reap the Whirlwind https://washingtonmonthly.com/2011/10/23/they-shall-reap-the-whirlwind/ Sun, 23 Oct 2011 12:48:31 +0000 https://washingtonmonthly.com/?p=27786 How religious zealots in the Israeli government are supporting a new generation of extremist settlers who hate the Israeli government.

Gershom Gorenberg begins his powerful and persuasive new book, The Unmaking of Israel, with a long-forgotten tale from the period immediately following Israel’s independence: In June 1948, Menachem Begin, the leader of the radical Irgun militia—which had carried out terrorist attacks on the British in Palestine and advocated seizure of “the entire Jewish homeland” on both sides of the Jordan River—resisted demands to hand over the group’s weapons to the new Israeli army. Begin and his Irgun fighters wanted to maintain their autonomy in the new country, a state of affairs that Israeli Prime Minister David Ben-Gurion believed would almost certainly lead to anarchy and civil war. “The Irgun saw itself as representing the purest Zionism, unwilling to concede any part of the Land of Israel,” Gorenberg writes. “The mainstream saw the Irgun as separatists and terrorists.” It was a battle over the future of the inchoate state, and it ended with violence: the Israel Defense Forces shelled the Altalena, a converted warship bringing guns and ammo to the separatists, killing a dozen men and forcing the rebels to surrender.


The Unmaking of Israel
by Gershom Gorenberg
Harper, 336 pp.

Ben-Gurion’s notion of the Israeli state has been wrestling with Begin’s more uncompromising vision ever since. And as Gorenberg argues, it has been losing ground. An American Orthodox Jew who made aliyah to Israel some thirty years ago, Gorenberg is the author of The Accidental Empire, an account of Israel’s reluctant colonization of the West Bank and Gaza in the first two decades after the Six-Day War in 1967. In his latest book, he takes the narrative one step further, examining how the relentless expansion of Israel’s West Bank settlements in recent years has not only warped the values of Ben-Gurion’s secular, inclusive, and democratic state, but also altered Israel’s approach to Arabs within its pre-1967 borders. Gorenberg’s book is partly a polemic, filled with righteous anger. But it’s also a finely documented piece of reporting in which he shows how the collusion of three powerful forces—the civilian government, the military, and the growing ultra-Orthodox movement—has solidified Israel’s hold on the occupied territories and made the prospect of withdrawal fraught with danger. Israel is moving backward, he writes, “returning to the moment of a fragile state facing an armed faction dedicated to fantasies of power and expansion.”

Gorenberg begins with an account of the obfuscations and self-justifications that allowed Israel to expand its settlement project in defiance of international laws and the objections of many of its own citizens. Through the creative use of Ottoman-era land records, and the careful burying of huge subsidies in the budgets of various ministries, Likud and Labor governments alike confiscated land, constructed housing, and built bypass roads linking them. Young families were lured with the promise of cheap, subsidized housing, and the population grew from about 50,000 after the Six-Day War to 300,000 today. Prime Minister Ariel Sharon disengaged from Gaza and a handful of remote West Bank settlements, and stopped issuing new housing permits, but his emotional attachment to the land remained unabated, and under his administration illegal outposts thrived. These outposts—often no more than a handful of mobile homes thrown up on a hilltop—typically were created by young radicals inspired by a divine vision of Israel’s destiny. As Gorenberg points out, true believers inside the government have quietly provided them with running water, electricity, and access roads, and routinely derailed attempts to close them down. “Cabinet ministers, officials and settlers have joined in pervasive disregard for the law and responsibility to democratic decisions,” Gorenberg charges.

Along with the illegal outposts has come the emergence of young extremists who reject the quasi-suburban comfort of previous generations of settlers and manifest disdain for the Israeli government, even while benefiting from its wink-and-a-nod support. “A new generation of settlers has come of age,” he writes, “as radical or more in its theologized politics, alienated from the institutions of the state that have so assiduously fostered its growth.”

The ideological fervor is often mirrored by the settlers’ chief protector, the Israel Defense Forces. Gorenberg writes of the hesder yeshivas—an IDF-sanctioned program for the ultra-Orthodox men who alternate Talmudic study with active duty. A spreading phenomenon in religious settlements, the hesders have helped forge a new generation of Orthodox Zionist soldiers. A 1990 IDF study revealed that just 2.5 percent of Israeli officers were graduates of Orthodox schools; in 2007, close to one-third of the new officers were. Many have inculcated their troops with the message that the West Bank is part of a Greater Israel, to be defended at all costs. Meanwhile, powerful ultra-Orthodox parties in the Knesset have ramped up subsidies for haredi scholars, creating a growing supply of religious ideologues. As Gorenberg writes,

[A] vicious circle is at work. Policing occupied territory and protecting settlers are military burdens, increasing the need for combat soldiers and officers who have no qualms about the occupation. To meet that need, the army depends on ever more recruits from the religious right. Yet this increases the danger of fragmenting the military when an Israeli government finally does decide to pull out of the West Bank.

Perhaps the most controversial part of Gorenberg’s book is his contention that Israel’s West Bank occupation has radically corrupted its relationship to the Arabs in Israel proper. Led by hard-line nationalists such as former Chief Rabbi Mordechai Eliahu and his son, Rabbi Shmuel Eliahu, ultra-Orthodox Zionists have “imported the settlement model,” he writes, to Jaffa, Lod, Akko, East Jerusalem, and other cities with mixed Arab and Jewish populations. These new “urban settlers,” as Gorenberg describes them, “were bringing a way of seeing the world back home, reimporting the message of ethnic struggle to each acre of land.” The methods include hurling stones at Arab cars and even establishing radical Jewish academies inside Arab neighborhoods, using them as a toehold from which to drive out Arab neighbors. There’s no question that attitudes have hardened in recent years, but Gorenberg doesn’t take into account other factors—economic pressures, jumpiness about the possible declaration of a Palestinian state—that might be responsible for the tensions. And while the rise of Avigdor Lieberman, Israel’s hard-line foreign minister, surely reflects a changing Israeli polity, Lieberman’s call for loyalty oaths and plan to swap Arab corners of Israel for West Bank territory—thus depriving Arab Israelis of their Israeli citizenship—have not gained traction in either the Knesset or Israeli society as a whole.

Gorenberg trots out the usual prescriptions for Israeli-Palestinian peace. He views the one-state solution as untenable, certain to result in a “nightmare” in which Arabs and Jews “do battle while the most educated or well-connected members of each group look for refuge elsewhere.” And he sees no other recourse for Israel but a two-state solution, with a handful of Palestinian refugees returning to their pre-1948 homes, and a full withdrawal to pre-1967 borders with the exception of East Jerusalem and a handful of settlements, such as Maale Adumim, that were built on land contiguous with Israel. Gorenberg insists, however, that domestic groundwork needs to be laid—the dismantling of the hesder yeshivas; the ending of state subsidies for pre-army Orthodox academies; the dissolving of Netzah Yehuda, an ultra-Orthodox IDF battalion—before the settlements can vanish and “[t]he hallucinatory expectations that have warped Orthodox Zionism may begin to fade.” Given the intransigence of the current Israeli government, and the rightward drift of Israel’s citizenry, the transition process is not likely to begin anytime soon. But Gorenberg argues convincingly that the longer Israel waits, the more it risks the civil war that David Ben-Gurion feared might happen sixty-three years ago.


Justice Served https://washingtonmonthly.com/2011/10/23/justice-served/ Sun, 23 Oct 2011 16:43:51 +0000 https://washingtonmonthly.com/?p=27787 John Paul Stevens

John Paul Stevens's Supreme Court tenure was marked by the firm belief that absolutism had no place on the bench.


On July 16, 2019, retired Supreme Court Justice John Paul Stevens passed away. This is a review of his memoir, originally published in 2011.

In an age of judicial philosophies, abstract methods of interpretation, and trite baseball metaphors, John Paul Stevens was a common-law judge. Justice Antonin Scalia practices textualism; Justice Clarence Thomas practices originalism. Chief Justice John Roberts is developing a sort of reactionary legalism. Even the Supreme Court’s liberals have gotten in on the game. In a head-scratching 2005 book, Justice Stephen Breyer professed his theory of “active liberty,” which has not exactly caught on as a beacon for progressive constitutionalists.

Five Chiefs: A Supreme Court Memoir
by John Paul Stevens
Little, Brown and Company, 340 pp.

Through the din of this nonsense one delighted to hear the strong plain chords of a Stevens, who harkened back to an earlier breed of jurist. His lights were not the so-called “neutral principles” hashed out in law review articles and perfected in warlike opinions by judge-partisans. They were centuries-old practices like judicial restraint, respect for the Court’s precedents and procedures, and, above all, an anachronistic faith in judges’ discretion. In his new book, Five Chiefs: A Supreme Court Memoir, Stevens favorably quotes Justice Potter Stewart, who famously said of obscenity, “I know it when I see it.” But where, cry the legal theorists, is the principle in that sort of decision making? Stevens might reply that it’s amazing how many cases a judge will get right when he has no dogma to uphold and no movement to lead.

Stevens’s retirement from the Supreme Court in 2010 after thirty-four years of service was a tremendous loss for the country. As the senior associate justice for sixteen years, he led the liberal wing through the Court’s highest and lowest moments since Watergate. The high point both for the institution and Stevens personally was the trio of war-on-terror cases in which the Court put a stop to President Bush’s lawlessness at Guantanamo Bay. Stevens wrote the two most important opinions—Rasul v. Bush (2004) and Hamdan v. Rumsfeld (2006)—and supervised a majority in a third, Boumediene v. Bush (2008). (He dissented in two other war-on-terror cases in 2004.) The low point was a pair of decisions that might best be described as institutionalized lawlessness. In Bush v. Gore (2000) the Court reached out and handed Bush the presidency, and in Citizens United v. Federal Election Commission (2010) it struck down most legal restrictions on corporate campaign spending. Stevens issued the two great dissents of his career in those cases, noting pointedly the damage the Court had done to itself. If only he were still there to help with the repairs.

The Court’s liberals stood behind Stevens in Citizens United—as they did throughout much of the 2000s. He proved a canny strategist and leader, assigning opinions in a way that preserved majorities and shaped future coalitions. He secured key victories in decisions limiting capital punishment and permitting affirmative action. In Five Chiefs he implies that he cultivated Justice Anthony Kennedy in gay rights litigation from the mid-1990s. Stevens assigned Kennedy to write the Court’s 1996 opinion in Romer v. Evans, which struck down a Colorado constitutional amendment that targeted homosexuals. He again gave Kennedy the honors in 2003’s Lawrence v. Texas, which invalidated Texas’s antigay sodomy law. It may simply be that Kennedy was the justice least sure of his majority vote and Stevens prudently gave him the assignment to solidify it. Then again, Stevens may have sensed that the subject matter would appeal to Kennedy, who never misses a chance to write for the ages. Regardless, in the gay marriage and Defense of Marriage Act cases that are sure to come, liberals can thank Stevens that we have a good chance at Kennedy’s decisive vote.

This recent leadership was a welcome surprise given that Stevens spent his early years on the Court as an unpredictable maverick. He arrived in 1975 as President Ford’s sole appointment and immediately displayed confident independence tempered with midwestern geniality. He politely declined to join the “cert pool,” by which the justices’ law clerks share the work of reviewing thousands of petitions for the Court’s attention. He dissented prodigiously and made a habit of filing concurring opinions to explain his quixotic views. As his biographers Bill Barnhart and Gene Schlickman note in John Paul Stevens: An Independent Life, during his first three terms Stevens “was the most prolific writer on the Court, authoring 65 dissents, 35 concurrences, and 36 opinions for the Court.” Uniquely among the justices, he did all his drafting himself. “John Paul Stevens has not yet begun to write,” went the saying at One First Street.

Stevens has a rare intellect, but unlike many of his colleagues he wore his learning lightly. Unlike Justices Breyer and Kennedy, he had no continental pretensions and did not look for the opportunity to speak a little French. Justices Roberts and Scalia are both brilliant in their way, yet they manifest that brilliance with disdain (Roberts) and shrill mockery (Scalia) of those who disagree. Stevens quietly but firmly pushed back, proving himself a match for any justice on the Court. In District of Columbia v. Heller (2008), which overruled a seventy-year-old precedent to hold that the Second Amendment creates an individual right to bear arms, he dissented with a historical analysis more persuasive than Scalia’s. In Five Chiefs Stevens bemusedly describes the “extensive and interesting discussion[s] of history” in Scalia’s opinions while making clear that such methodology is not the talisman that his brethren think. But he could play the game when he had to.

Another fine example of Stevens’s stolid fighting heft is his ninety-page dissent in Citizens United. His opinion was so thorough and devastating that the majority divided the task of responding to it among Roberts, Scalia, and Kennedy, each of whom took on a section. One of the main points of disagreement between the dissent and the majority was the conservatives’ assertion—in the face of 100 years of federal laws and Court decisions to the contrary—that the First Amendment does not permit distinctions between speech by corporations and speech by individuals. After listing an unanswerable litany of major distinctions—including the financial resources of corporations, their limited liability, and their perpetual “life”—Stevens wrote,

The Court’s facile depiction of corporate electioneering assumes away all of these complexities. Our colleagues ridicule the idea of regulating expenditures based on “nothing more” than a fear that corporations have a special “ability to persuade,” as if corporations were our society’s ablest debaters and viewpoint-neutral laws such as [McCain-Feingold] were created to suppress their best arguments.… In the real world, we have seen, corporate domination of the airwaves prior to an election may decrease the average listener’s exposure to relevant viewpoints, and it may diminish citizens’ willingness and capacity to participate in the democratic process.

Five Chiefs is a funny little memoir, as quirky and interesting as its author. Its conceit is a personal history of the Supreme Court arranged through the five chief justices Stevens has known. Two of them—Fred Vinson (who served from 1946 to 1953) and Earl Warren (1953 to 1969)—he hardly knew. Vinson led the Court when Stevens clerked for another justice during the 1947-48 term, and Stevens occasionally gave him a ride in his beat-up car. Warren was chief during Stevens’s years of private practice; Stevens argued an antitrust case before him in 1962. Hence the book’s early chapters contain fewer personal recollections and more general remarks on the Court as an institution.

There are some notable early pages, though. Ever opinionated, Stevens levels criticism at two major decisions of the Warren Court. Brown v. Board of Education (1954), he writes, unquestionably reached the right result, but “[u]nlike most admirers of the opinion, I have never been convinced that the benefits of its unanimity outweighed what I regarded as two flaws in the Court’s disposition of the cases.” Namely, the Court held Brown over for an additional term to let the parties debate a remedy, and then it ordered desegregation to proceed “with all deliberate speed”— a famously baffling directive that led to southern foot dragging. Stevens also offers choice words about Griswold v. Connecticut (1965), which laid the foundation for Roe v. Wade by invalidating a state law banning contraceptives. Justice Douglas’s opinion relied not on the text of the Constitution but on “penumbras, formed by emanations” that surround the guarantees of the Bill of Rights and “help give them life and substance.” Stevens calls this “virtual incoherence” and would have reached the same result on less mystical grounds.

Stevens joined the Court during Chief Justice Warren Burger’s tenure. Burger has been widely portrayed as a vainglorious boob: pompous, ineffectual in leadership, and incompetent in assigning and drafting opinions. Stevens decorously spends pages praising Burger’s stewardship of the Court’s heritage by commissioning just the right painting and so forth. But then Stevens reinforces the prevailing image by describing the way Burger withheld his views in the justices’ conferences and assigned opinions to himself or to others who did not command a majority, causing confusion and acrimony as a result. Stevens is no Scalia: he does not come right out and call Burger an ass. As he did in his opinions, Stevens makes his point with the subtle but telling comment. He writes that when he joined the Court, Justice Potter Stewart suggested that he “keep in mind the possibility that either the Chief or Harry [Blackmun], or possibly both, might not adhere to the position that he expressed at conference.”

The best sections of Five Chiefs concern the Court under the leadership of William Rehnquist and John Roberts (1986-2005, and 2005 to the present). Stevens genuinely liked both men and found them to be excellent administrators. Here, as elsewhere, the biggest value of Five Chiefs is its anecdotal color in filling in our understanding of the Court and its members. In a section on Bush v. Gore, Stevens recounts a story about the night Bush’s petition to halt the Florida recount arrived at the Court. Stevens happened to bump into Justice Breyer at a Christmas party; “we had a brief conversation about the stay application. We agreed that the application was frivolous.” The two parted ways “confidently assuming that the stay application would be denied when we met the next day.” The Court’s conservative majority thought otherwise and halted the recount in a flurry of opinions. Stevens concludes, with an understatement that belies the power of his famous dissent, “To the best of my knowledge no Justice has ever cited any of them. What I still regard as a frivolous stay application kept the Court extremely busy for four days.”

Similar comments reveal the limits of Stevens’s regard for Roberts. Stevens, a Chicagoan, built a vacation home in Michigan City, Indiana, in 1961. In 1969, he writes, “John Roberts was a high school freshman in a boarding school in LaPorte, Indiana,” only a few miles away from Michigan City. Stevens swore in the young chief justice, who told the Senate that judges, like umpires, merely call balls and strikes. Every lawyer in the country who heard that statement knew it was cant—especially Stevens, who was there at Wrigley Field in 1932 to see Babe Ruth call his shot. An umpire can’t send you to Guantanamo for the rest of your life with a sack over your head.

While Stevens clearly respects the abilities and achievements of both Rehnquist and Roberts, he uses Five Chiefs to dismantle several of their decisions. For Rehnquist he focuses on the trigger-happy death penalty jurisprudence and, more esoterically, the late chief’s enthusiastic development of the doctrine of sovereign immunity, which prevents individuals from suing state or federal governments and has frustrated many a civil rights plaintiff. Stevens contends that Rehnquist’s Eleventh Amendment cases—in which the Court constitutionalized the sovereign immunity doctrine without much regard for the amendment’s text—were the worst mistake of Rehnquist’s tenure.

For Roberts, Stevens singles out a case that was decided after his own retirement. In Snyder v. Phelps (2011) the Court overturned a jury verdict in favor of a plaintiff whose son’s military funeral was heckled by religious fanatics bearing posters saying “God hates fags” and “Thank God for dead soldiers.” The father sued under the tort doctrine of intentional infliction of emotional distress, but the Court held, 8-1 behind Roberts’s opinion, that the protesters’ speech was protected by the First Amendment. Stevens makes clear that he would have joined Justice Alito’s dissent. Common-law judge that he was, Stevens eschewed simple line-drawing for a detailed analysis of each case’s complexities. In Five Chiefs he notes a critical distinction overlooked by the Snyder Court:

It is easy to gloss over the difference between prohibitions against the expression of particular ideas—which fall squarely within the First Amendment’s prohibition of rules “abridging the freedom of speech”—and prohibitions of certain methods of expression that allow ample room for using other methods of expressing the same ideas.

In other words, the Court could have prevented the protesters from speaking at a certain location—a funeral—without taking the prohibited step of preventing them from uttering a certain message. The protesters could have said the same vile things elsewhere.

Both Snyder and Citizens United are First Amendment cases, and in them Stevens argues for less speech rather than more. This does not exactly put him at the vanguard of liberal constitutionalism. Nor did his dissents in the flag-burning cases of 1989 and 1990, in which he criticized the Court’s judgment that federal and state laws protecting the flag are unconstitutional. It is far more common in our legal tradition to celebrate First Amendment absolutists like Hugo Black than jurists who treat that provision with anything like nuance.

And therein lies Stevens’s tremendous appeal as a judge—regardless of whether one agrees with all of his decisions. He was an absolutist about nothing. Absolute positions on the law—be they on the subject of free speech or the framers’ intent—often require the judge to set down reason and common sense so that he can hold a banner with both hands. It is therefore unsurprising that the one justice about whom Stevens has no kind words in Five Chiefs is Clarence Thomas, who is more rigid in his vision of the Constitution than perhaps any justice in the Court’s history. If the Tea Party has taught us anything, it is that the absolutists will shout past each other until the whole damn operation grinds to a halt. Without Stevens, the Supreme Court is that much more likely to do the same thing.
