November/December 2012 | Washington Monthly

Up from Independence
November 13, 2012

Harry Truman was a classic American striver, and a failure, until politics intervened.

“He was one of our great presidents, wasn’t he?”

So remarked a leading local attorney I know—a longtime pillar of the Republican Party. He was talking about Harry S. Truman, reflecting what has become a widespread consensus, one that few could have imagined when Truman left the presidency in 1953.

Born in 1884, Truman, in many respects, exemplified the American experience of the late nineteenth century as the nation made a transition to the more structured world of the twentieth. His father, John Truman, scion of a poor Kentucky family, had made his way to Missouri before the Civil War and managed a living as a farmer and livestock trader. John was known for his short temper and readiness to use his fists to settle disputes. He eventually went flat broke speculating in grain futures.


Citizen Soldier: A Life of
Harry S. Truman

by Aida D. Donald
Basic Books, 288 pp.

Truman’s mother, Martha Young, one of several offspring of a prosperous freight hauler and landowner, was a notch or two higher socially than her husband. She had graduated from the Baptist College for Women, likely more a residential secondary school than an institution of higher education. She was the one who provided the culture in her family and encouraged her son to take up the piano. Both the Trumans and the Youngs had sympathized with the Confederacy during the Civil War, and both were staunch Baptists. (Indeed, Harry’s younger brother, John Vivian, was named for one of the founders of the Baptist Church in Kentucky.) And both families were Democratic partisans.

Harry grew to manhood in a rough-and-ready neo-frontier environment. But he did so as an effeminate boy wearing thick eyeglasses, taking piano lessons, and prohibited from contact sports for fear he would break his expensive spectacles. From adolescence on, he sought a manly identity, aspiring first to a military career, then entrepreneurial success, then, finally, politics. After a promising start as a clerk at Commerce Bank in Kansas City and a National Guard enlistee, Truman was pulled away to help his impecunious father manage a family farm owned by Grandmother Young and her son, Truman’s uncle Harrison. He spent a decade toiling there, joining local organizations, dabbling in politics, and courting Bess Wallace, a girl he had known and loved since they were small children in Independence. He also managed to lose what money he had managed to save or raise in ill-considered mining and oil ventures.

World War I saved him from a life of futility. After his Guard unit was called to duty, he became a captain of artillery, served with distinction in France, and discovered he could be a leader of men. When he returned, Bess married him. Partnering with an Army comrade, he established a haberdashery in downtown Kansas City. It lasted only three years, leaving him deeply in debt and seeking a living in politics.

His father had been affiliated with a Democratic Party faction linked to Kansas City boss Tom Pendergast. Pendergast’s nephew had been an officer in Truman’s Army unit and helped him secure a judgeship on the county court (actually an administrative commission). Soon Truman was the presiding judge, a post from which he would be the driving force behind the building of a modern road system and other progressive improvements. His personal honesty was widely acknowledged, but he had to function within an unsavory system of spoils politics made all the worse by machine alliances with the underworld. Anxiety, headaches, and unfocused anger all followed, setting a pattern for the rest of his life. Ultimately, Pendergast wound up in prison. Truman was held blameless, but could not escape the stain of his affiliation.

By then, he had been elected to the U.S. Senate, where he worked very hard, establishing himself as a populist New Dealer and an internationalist-minded backer of a strong defense. During World War II, he chaired a Senate committee that investigated fraud and mismanagement in defense industries. All the while, he maintained warm friendships with the more conservative Democrats. A living “Missouri compromise,” he was a widely acceptable candidate for vice president in 1944. Eighty-two days into Franklin D. Roosevelt’s fourth term, he was called to the White House and informed that the president was dead.

The years that followed were, as historian-journalist Robert J. Donovan has described them, “tumultuous”—what with the emergence of the Cold War and liberal-conservative gridlock at home. The little man from Missouri became a lightning rod for southern conservatives who resented his support of black civil rights, northern liberals who longed for Roosevelt, and Republicans who accused him of spending recklessly, tolerating corruption in government, coddling communists, and settling for stalemate in the Korean War.

Truman’s standing was at a low ebb when he left the presidency in January 1953, but as passions cooled and new generations moved on to fresh debates, a gaggle of historians placed Old Harry securely among the “near-great” American chief executives. David McCullough’s fine 1992 best seller Truman cemented the Man from Missouri’s new standing. The fumbling little fellow became a “man of the people” who battled the interests, espoused the cause of common Americans, sought liberty and justice for all, won a grand upset election victory in 1948, and contained the relentless expansionism of the Soviet Union. Academic evaluations of the presidents routinely placed Truman in the top ten.

Truman may have peaked in the rankings game. Robert W. Merry’s recent book on presidential ratings, Where They Stand, judges him a great success in a first term distinguished by victory in World War II, the establishment of a new national security structure, the extension of New Deal liberalism to black civil rights, and the containment of Soviet imperialism. But Merry then sees him as a flat-out failure in his second term with his mismanagement of the Korean War, a conflict that might have been avoided by a clear and consistent policy of either support or nonsupport for South Korea before 1950. Instead there was an invasion from the north, a snap decision to reverse declared policy and intervene, a pyrrhic victory over the North Koreans, Chinese intervention, and a maddening stalemate. All this destroyed the president’s standing and effectively neutralized the signal achievements of his first term.

Aida Donald, the former editor in chief at Harvard University Press, has produced a brief biography much in tune with Merry’s assessment. She gives us an American striver—born to an undistinguished family, ambitious and hardworking, unfailingly loyal to his kin, devoted to his wife and daughter, but unfocused and meandering in his ambitions, settling on politics as a last-resort vocation. Her Truman struggles throughout his life with stress and bouts of exhaustion brought on by the dogged pursuit of success.

She allots him due credit for his accomplishments, but is equivocal about Korea, deploring the militarization that followed in its wake and the establishment of “a permanent national security state at home.” Cold War grumps might quibble that the military aggression of North Korea, China, and the Soviet Union had more to do with those developments than the reactions of a president who had been trying to cut military spending to the bone.

This is a short book about a great life. It displays the strengths and weaknesses of a genre that has two markets—the college classroom and busy adults in search of a quick read. Such works typically attempt to synthesize a large body of literature and rarely have a significant primary research base. Donald does not seem to have set foot in the Truman presidential library, but does acknowledge the assistance of a researcher who worked for her there.

Unfortunately, she leaves the impression that she is the first person to make much use of Truman’s “Pickwick papers,” a series of often angry musings about his political career and associates produced on the letterhead stationery of Kansas City’s Pickwick Hotel in the early 1930s. In fact, they have been available to scholars for at least twenty years, and Donald makes intelligent, but not original, use of them.

Small books about great lives necessarily have to be selective and need to conflate events. Errors can result. There are some in this book; none are critical. The author’s arguments can be questioned, but none are outlandish. If at times she provokes thought, so much the better.

Brass Backwards
November 13, 2012

Thomas Ricks explains the declining competence of America's senior military commanders.

Tom Ricks, the former Washington Post military correspondent who covered the wars in Iraq and Afghanistan for the better part of a decade and currently edits the blog “The Best Defense” at ForeignPolicy.com, has become the go-to guy for understanding how the American military works. In 2006, Ricks published Fiasco: The American Military Adventure in Iraq, 2003 to 2005, a blistering (and definitive) indictment of George W. Bush’s Pentagon and its mishandling of the war in Iraq. Next, he wrote The Gamble: General David Petraeus and the American Military Adventure in Iraq, 2006-2008, a probing history of the surge. And now he has written a book that tries to explain what makes a great American general— that is, a general whom soldiers can follow, and not just to their deaths.


The Generals:
American Military Command
from World War II to Today

by Thomas E. Ricks
Simon and Schuster, 528 pp.

The genesis of this most recent book came atop a Sicilian ridge, where, on leave from covering Iraq, Ricks heard the story of Major General Terry de la Mesa Allen, a hugely successful World War II general who was relieved of command of the 1st Infantry Division (for lax discipline of his troops) soon after helping to win the Sicily campaign in July 1943. It wasn’t good enough just to be successful; the success had to come in the right way—otherwise, as the military leadership knew, disaster could loom later on. “I was stunned,” writes Ricks. “How could this be? [My] mind was still focused on [the Iraq] war, where even the most abject failure did not get a general fired.”

The Generals: American Military Command from World War II to Today is the history of this “remarkable group of men, the Army general officers of the past three-quarters of a century, and the wars they fought.” For Ricks, the World War II generation really was the greatest; his heroes are George C. Marshall and Dwight D. Eisenhower, two men who displayed both sound judgment and a strategic vision, qualities their successors did not always possess in abundance.

As Army chief of staff on the eve of World War II, Marshall—who would later become secretary of state, secretary of defense, and the architect of the Marshall Plan—inherited a force that was, by his own account, that of a “third-rate military power.” It consisted of fewer than 200,000 soldiers, many relying on World War I arms and munitions.

A scant five years later, under Marshall’s command, the Army had grown to almost eight million troops, with forty divisions in Europe and the Mediterranean and twenty-one in the Pacific—a force that brought down the Third Reich and forced Imperial Japan to surrender. Marshall was the first general to attain the five-star rank.

He pulled this force together in two ways. First, he confronted President Franklin D. Roosevelt with the urgent need for a capable fighting force in Europe and the Pacific. Second, he ruthlessly pruned the dead wood from the military ranks, forcing out at least 600 officers even before the United States entered the fighting. “I was accused right away by the service papers of getting rid of all the brains of the army,” he said. “I couldn’t reply that I was eliminating considerable arteriosclerosis.” But Marshall also believed in giving worthy officers a second chance. At least five Army generals from World War II were removed from combat command but later assigned another division to lead. Nor were senior officers to be micromanaged. They were given enough rope to prove their mettle—or to hang themselves with.

Another key to Marshall’s success, Ricks suggests, is that he adhered rigidly to a classic model of civil-military relations, and studiously avoided any personal or social relationship with his boss. (Indeed, the first time Marshall ever visited Roosevelt’s home at Hyde Park, Ricks says, was for the president’s funeral.) In addition, a durable firewall always stood between Marshall’s public service and his political life—Marshall didn’t vote while he served.

Roosevelt and Marshall were possibly the best wartime civil-military team the nation has ever experienced. FDR used to joke that he thought Marshall deserved to command the D-day landings, but that Roosevelt wouldn’t be able to sleep a wink without him in Washington. Ricks writes that “there also is evidence that Marshall’s presence was required in Washington because he was the sole Army officer capable of reining in MacArthur—and even then, just barely.” Ricks sees Douglas MacArthur as perhaps the worst general of his era. Unlike Marshall, MacArthur routinely smudged the line between his military and political aspirations, seeing himself as an American Caesar, possessed of infallible judgment and iron will, to which presidents should bend.

And then there was George S. Patton, who way overstepped the bounds of military protocol—denouncing Russia, for example, in a speech in England as preparations for D-day were taking place. But Eisenhower understood that the brilliant, relentless commander was indispensable in chasing the Germans out of France. “If today’s Army remains wary of the daring, dramatic, outsize personality, the record of MacArthur (and, to a lesser degree, of Patton) is a big part of the cause,” comments Ricks. “The new model for American generalship would be a quite different, and blander, figure.” (Patton would later be especially admired by President Richard Nixon, who regularly watched the film Patton during the Vietnam War.)

Ricks has high praise for Eisenhower, whom Marshall had steadily promoted and who was, in many ways, the template for what Marshall saw as a successful general. Like Marshall, Eisenhower emphasized teamwork and unity, making sure his subordinates got credit when credit was due.

This ethos began to change in the early 1960s. In particular, Ricks fingers General Maxwell Taylor, who served as chairman of the Joint Chiefs of Staff during the Kennedy administration, as a prime culprit for this syndrome in Vietnam. Taylor, Ricks writes, was the un-Marshall. He buttered up his superiors and consistently soft-pedaled the dangers of increasing America’s commitment to the war, partly by convincing President Kennedy that the Vietcong was not a serious military force—surely one of the greatest blunders in American military history. Generals who disagreed with Taylor were generally silenced; instead of getting them to work together, he exploited their mutual animosity by playing them off against each other. This was not how Eisenhower, who stressed cooperation, had behaved. Under the new system, Taylor figured out what he thought would maximize the military’s role—Vietnam looked like an opportunity to justify big budgets—and assured his civilian superiors that North Vietnam was easy pickings. Ricks’s indictment of Taylor is sweeping: “He made a habit of saying not what he knew to be true but instead what he thought should be said.”

Ricks accuses Taylor’s disciple, William Westmoreland, of providing “false evidence” to Congress that his strategy of attrition was working vis-à-vis the North Vietnamese. “Westmoreland,” Ricks observes, “would become the most prominent example of the Army’s shift from leadership to management.” (This shouldn’t have come as a surprise; Westmoreland was, after all, a graduate of the Harvard Business School, which he attended while on active duty.)

By “management” Ricks seems to mean that the generals had become careerists rather than bold innovators who would figure out how to take on the kind of counterinsurgency warfare that would have been required in Vietnam. The biggest problem, in other words, was that Westmoreland lacked a fundamental grasp of the kind of war he was fighting, which meant that he was unable to lead the country to victory—a task that may have been insuperable from the outset. A war, however, is not supposed to be a proving ground for doctrines; it should be fought quickly and effectively—a lesson the U.S. had to learn all over again in Iraq.

The Army had no choice but to change its doctrines after Vietnam, particularly with the abolition of the draft. It did not change, however, its conception of generalship. Ricks singles out Colin Powell as the exemplar of the political general, someone who was “a master implementer lacking a real strategy to implement.” Ricks sees a lack of intellectual thought about the relationship between military and political ends as at the core of America’s difficulties in both Afghanistan and Iraq.

Of Tommy Franks, he observes that “in a bizarre mutation of military thought, Franks seemed to believe—and to have been taught by the Army—that thinking was something others did for generals. In his autobiography he referred to his military planners, with a whiff of good ol’ boy contempt, as ‘the fifty-pound brains.’” One general who did think for himself was Eric Shinseki, George W. Bush’s Army chief of staff, who had the temerity to tell Congress the truth about the Iraq invasion: that several hundred thousand more troops would be necessary than were being requested by Donald Rumsfeld and Paul Wolfowitz. As a result of that testimony, Shinseki saw his career at the Pentagon come to an end.

It was only in 2006, after the drubbing the GOP received in the midterm elections, that Bush began to reassess the military leadership. Ricks stresses that the selection of David Petraeus and Raymond Odierno to lead the effort in Iraq marked a decisive shift in the military’s culture—the two men insisted on “taking more risks, moving more aggressively, and despite [italics mine] suffering an increase in casualties, radically improving the morale of American troops.” Their example prompts Ricks to call for reforms of the military, including unconventional career moves such as sending leaders to live overseas in Third World countries for a “sabbatical,” reinstating a policy of quickly relieving incompetent officers, and making all command positions probationary for six months.

Is Ricks too hard on the American military? Most armies seem to blunder their way to victory, or squander the fruits of it in the aftermath. Maybe Ricks is indulging in nostalgia—it is almost impossible to favorably compare later generals to Marshall and his generation of commanders given the immense victory that was World War II.

Still, Ricks’s call for reform is persuasive. His own heroes are the outliers—leaders like Petraeus, who began the job of shaking the military out of its complacency in Iraq and Afghanistan (though he was more successful in the former country than the latter). In the past, the lessons provided by someone like Petraeus might have been shunted aside, with the Army reverting to its former habits. But that’s unlikely to happen this time. The military is preparing for more unconventional warfare and knows that the days of limitless budgets are coming to an end, no matter what Mitt Romney and GOP lawmakers might promise. But as the military continues to reassess its performance, Ricks’s thunderous blast is likely to leave its own tremors behind. His book is not simply an acute account of the military’s difficulties; it is also a devastating one. One has only to read his dedication: “To those who died following poor leaders.”

Memoirs of an Academic Fraudster
November 13, 2012

Inside the shadowy business of ghostwriting college students' papers.

Academic paper mills—the companies that write papers for students—don’t really advertise. One doesn’t see their services in the backs of magazines or populating the margins of Web pages. If such companies market at all, it’s frequently done using spam text, with links, in the comments section of Web sites read by college students. On one such site recently, for example, “SolisSharon26” posted the following item:

Young people who are studying in the universities feel necessity for professional writing online because usually they do not have enough time so that deal with there assignments by themselves. Browse the site and you will find the firm which crew is accessible 24/7 to order essay.

I’m not sure I’d trust people who write like this with my credit card number, much less to take care of my Intro to American Government term paper. But there are more professional ads like this all over the Internet, where a cheating student can follow the link provided, send a fee, and in a few hours or days receive a paper. It’s pretty easy to picture the stressed-out or lazy students who buy this stuff. It’s harder to imagine the kind of people who make their living producing it.


The Shadow Scholar:
How I Made a Living
Helping College Kids Cheat

by Dave Tomar
Bloomsbury, 272 pp.

This world became a little less shadowy when, on November 12, 2010, the Chronicle of Higher Education ran an article, “The Shadow Scholar,” in which a writer using the pseudonym Ed Dante wrote that he’d been turning out American college students’ essays for the last decade. Dante had written some 5,000 papers. “I work at a company that generates tens of thousands of dollars a month creating essays based on … instructions provided by cheating students. On any day, I am working on upward of 20 assignments. You’ve never heard of me,” he explained, “but there’s a good chance that you’ve read some of my work.” At least if you are a professor.

A few readers thought Ed Dante was made up. One blogger wrote that the Chronicle piece, which became the publication’s most-read article, seemed to have been written by someone “skilled in the art of literary hoaxes.” In fact, he was very much a real person. Meet Dave Tomar, freelance journalist, Rutgers graduate, and Phillies fan.

Tomar used the Chronicle article as the basis for his new book, The Shadow Scholar: How I Made a Living Helping College Kids Cheat, the story of his life as an academic fraudster. Tomar wrote every day, and he wrote about anything. He wrote about the policies of the Jackson administration. He wrote lesson plans for gym teachers. He produced papers on cancer cell structure and how to develop appropriate study skills in elementary school children. He even wrote love poems and once helped someone edit her profile on Match.com. He’d do these pieces one right after another, routinely churning out five or six papers a day.

What could have been a depressing tale becomes, in Tomar’s hands, a funny and charming read. He writes of one Thanksgiving spent with a girlfriend’s family: throughout the meal, his girlfriend’s father berated him for helping people cheat; the next week, her mother called Tomar to ask him to write a paper for a friend’s daughter. Then there was the time he wrote an entire doctoral dissertation, 160 pages, for a psychology program. The graduate student gave him $4,000 and one page of instructions. It was, says Tomar, “like buying a used car on the specifications that it had four wheels and was blue.”

At a time when his friends were moving into condos and going to conferences and working at jobs with cubicles and 401(k) plans, Tomar was living a very different life:

On a romantic getaway, I sneak in a last-minute assignment while the lady gets dressed for dinner. When I ride the bus, I type furiously while apologizing to those around me for my flying elbows. I write papers in crowded bars. I write papers in the midst of drunken debauchery, pausing between paragraphs to hit the blunt going around the room. Sometimes during my Thursday-night poker games, I write a few sentences every time I fold a hand. Once I wandered through an antique garden in New Orleans searching desperately for a wireless Internet signal via which to submit my paper on toxicology. I battered my keyboard furiously at the edge of a hotel bed in Las Vegas, reasoning out an assignment on the cognitive psychology tool known as the Johari window just before the strippers showed up for a bachelor party. I wrote a paper on improving English curriculum design on a midnight flight to Chicago, buzzed on Valium, scotch, and acrophobia.

Reading this book is sort of like watching an indie movie take on the academic cheating industry. It’s a light romp—complete with irony, self-deprecation, fun regional adventures, and an understanding hipster girlfriend—through what one might ordinarily think of as one of the world’s worst jobs. Despite this lightness, The Shadow Scholar is ultimately an indictment not just of the paper mill industry—which has, let’s be honest, few real defenders—but of the contemporary higher education system, which allows the industry to flourish. For instance, says Tomar, he’d write a paper, often plagiarized but disguised with heavy use of a thesaurus, because the student has to submit it through plagiarism-detection software like Turnitin.com. The paper gets a passing grade, the student is happy, and presumably the professor is none the wiser. But it is hard to believe that colleges themselves are unaware that these tactics are widespread. And it is not clear, Tomar argues, that they have much motivation to investigate the problem. Indeed, there is a sort of sleaze triangle of academia at work here, with for-profit ghostwriting companies, for-profit plagiarism-detection sites, and universities—many of them for-profit these days—all making money in a fake academic exercise in which students pay for credits they did not earn.

Tomar suggests that today’s students are more likely to cheat because of a combination of factors unique to people born in the last twenty-five years. Students today, he writes, are characterized by a “sense of entitlement, a constant need for validation, and a mediocre work ethic.” At the same time, they expect fast and easy entertainment. They have short attention spans, and they’ve been constantly receiving gratuitous praise for minor accomplishments from their parents. Add in the convenience and power of the Internet, and the result is students very eager to have someone else do their work.

This is rhetorically rather compelling, but is it true? The book’s major flaw is that the proof for the level of academic dishonesty Tomar is exposing remains frustratingly anecdotal. He amasses a lot of evidence about student loan debt and class size and grade inflation, but he fails to demonstrate that students today are really cheating more. After all, organized systems of plagiarism have existed on campus for decades. Any decent fraternity will have in its library a filing cabinet stocked with papers on everything from the German Renaissance to Dr. King’s March on Washington. If there is more cheating today, the simplest explanation is not that today’s students are lazier or more dishonest but that the Web makes academic fraud easier to engage in.

Certainly all of the students with whom Tomar interacted seem to be selfish, demanding, dependent, and eager to cheat. But that’s a self-selected group of people, and it’s neither fair nor accurate to say that they represent American young people as a whole. Indeed, there’s every reason to believe that this industry attracts the very worst American students, intellectually, morally, emotionally.

Even if Tomar’s thesis is a little weak, however, he does manage to evoke how academic dishonesty really works now. The increasing size of the administration of American higher education has made college mind-numbing and impersonal for many students. (“If you’d like to pay your semester student fees by credit card, please press one.”) What Tomar manages to demonstrate is that the faceless bureaucracy of the American university extends beyond the school and into its affiliated industries—even the illicit ones.

Spread Too Thin
November 13, 2012

Scholars have discovered that certain everyday food items have played pivotal roles in the history of civilization. Apparently, peanut butter is not one of them.

The average person, I would wager, knows approximately three things about peanut butter: One, it’s not actually butter. Two, it pairs well with jelly. Three, there’s probably a jar of it in a kitchen cabinet.

But Jon Krampner is not the average person. Krampner, a writer and peanut butter historian, can tell you that not only is peanut butter not butter, peanuts aren’t even nuts. (They’re legumes.) He can expound about other foods with which peanut butter has been paired at various points in history. (Cheesecake, pickles, and French dressing, to name three.) And he can tell you the year in which you were least likely to have a jar of peanut butter handy. (1980, the year of the great Peanut Butter Crisis, when a poor peanut crop led to peanut butter shortages and price gouging.)


Creamy & Crunchy:
An Informal History of
Peanut Butter, the
All-American Food

by Jon Krampner
Columbia University Press, 320 pp.

Krampner presents these peanut-related facts and more in Creamy & Crunchy: An Informal History of Peanut Butter, the All-American Food, his new book from Columbia University Press. (Full disclosure: I am employed by Columbia University, and an essay of mine was anthologized in a collection published by Columbia University Press.) The book is Krampner’s attempt to trace peanut butter’s evolution as a kitchen staple, consumer good, and cultural touchstone. “Peanut butter is the staple of childhood, and it’s a comfort food,” he writes in the preface. “In times of economic distress or emotional uncertainty (like the present), Americans turn to it.”

Creamy & Crunchy is the latest in a recent string of popular histories that purport to examine broader cultural trends through the lens of a particular foodstuff. We’ve read about salt, and how it explained the world. Then there was cod, and sushi. One imagines aspiring pop historians rushing to their local Safeway, frantically scanning the aisles to see if any of the products there might sustain an entire book.

But while Creamy & Crunchy is well written and at times very witty, it ultimately lacks the narrative drive to appeal to the casual reader, the scholarly heft to tempt the academic, and the shamelessness to attract those who are looking for scandalous gossip about George Washington Carver, the scientist-educator who has often been (wrongly) credited with the invention of peanut butter. It is, however, the perfect book for condiment bores, or your relatives in the nut butter industry.

When it first appeared in the 1890s, peanut butter was a high-class health food, served at sanatoriums to rich women looking to reduce their waistlines. The mixture was beloved by turn-of-the-century nutrition fanatics like John Kellogg, who attempted to patent a terrible-tasting “food compound” similar to peanut butter, and Dr. Schindler, first name unknown, who supposedly prescribed peanut butter as a laxative.

Around the same time, a St. Louis entrepreneur named George Bayle realized that peanut butter had potential as a snack food. Initially, Bayle combined ground nuts with processed cheese to form an unappetizing spread called “Cheese-Nut,” which, perhaps predictably, nobody liked. Eventually Bayle subtracted the Cheese from the Nut, and ended up with peanut butter.

Food historians disagree on whether either Bayle or Kellogg deserves to be remembered as the true inventor of peanut butter. But you could make the case that the boll weevil is most responsible for its rise to glory. The invasive pest arrived from Mexico at the turn of the century, decimating southern cotton crops and prompting desperate farmers to plant peanuts instead. The federal government did much to encourage peanut cultivation at the time, but Krampner barely addresses its efforts; he is more interested in dispelling the notion that George Washington Carver had anything whatsoever to do with peanuts’ ascendance. (Carver is portrayed as an “Uncle Tom” who dispensed puzzling and inaccurate advice about peanut farming.)

By the end of World War I, peanuts were a valuable cash crop and peanut butter had quintupled in popularity (“with no help from Carver,” Krampner notes). But it didn’t become a pantry staple until the 1920s, when hydrogenation entered the picture. If you’ve ever eaten “natural” peanut butter, you’ve noticed that peanut oil collects at the top of the jar. In addition to being gross (at least in my opinion), this also makes it easier for peanut butter to spoil. Hydrogenation prevents peanut oil and peanut solids from separating, thus lengthening its shelf life.

The process led directly to the rise of major national peanut butter brands, and Krampner spends several chapters profiling the Big Three: Peter Pan, the first mass-produced hydrogenated peanut butter, which, like its spritely fictional namesake, would never grow old; Skippy, known for its exacting quality standards and Norman Rockwell-penned advertisements; and Jif, which wasn’t technically peanut butter at all.

When Jif first appeared in the 1950s, the product was about 25 percent hydrogenated vegetable oil; “no one had ever tried to market as peanut butter something that had so few peanuts in it,” writes Krampner. Its popularity prompted a series of FDA hearings in 1965, during which the government decreed that a product needed to contain at least 90 percent peanuts in order to be called peanut butter. This is interesting stuff, and I wish Krampner did more with it, or tried to make some broader point about the regulatory environment during the rise of industrial food.

But, to its detriment, the book consistently avoids making broader insights, maintaining a frustratingly narrow focus on peanut butter alone. Not quite a work of journalism, not quite an academic history, Creamy & Crunchy ends up at times being a surprisingly shallow read. Krampner asks the big questions, like “Why do Americans love peanut butter?” and “Why isn’t peanut butter popular in other countries?” Unfortunately, his answers are simplistic: “We like the way it tastes” and “People in other countries don’t like the way it tastes,” essentially. Of course, there’s more to it than that—the federal government promoted peanuts as a foodstuff, whereas in Europe peanuts were pressed into peanut oil. Rather than explore that angle, Krampner spends his “Peanut Butter Goes International” chapter listing other countries in which peanut butter is eaten, and describing how it is eaten there. (In the Netherlands, for example, peanut butter is known as “peanut cheese”; George “Cheese-Nut” Bayle would undoubtedly have approved.)

I don’t want to be too critical. Krampner’s a great writer, which counts for a lot, and the book is a fun, easy, interesting read. But it just doesn’t cohere. After the Jif chapters Krampner completely loses his narrative thread, and you can feel him scrambling to list everything he learned about peanut butter. There’s an interesting chapter about the Peanut Corporation of America, which distributed salmonella-tainted peanut butter in the late 2000s. A chapter titled “Where Are the Peanut Butters of Yesteryear?” addresses industry consolidation while offering a wistful look at various defunct peanut butter brands. (Long’s Ox-Heart Peanut Butter, we hardly knew ye.) Throughout the book, Krampner includes several odd peanut butter-related recipes (my favorite being peanut butter garlic bread, which is just what you think it is), which you can make at your own risk. Any of these topics would have made for a killer magazine article. But they don’t come together here. The book could use some serious hydrogenation of its own.

Earlier this year, Beacon Press published a social history of white bread, which makes some sense, because there’s a case to be made that processed white bread is a foodstuff of some larger societal importance, its widespread adoption a lens on the rise of obesity and processed foods and the decline of the locavore diet. The same cannot be said for peanut butter. (Well, it can be said, but Krampner doesn’t say it.) On one level it’s refreshing that Krampner doesn’t claim that peanut butter is the key to Western civilization, or anything like that. But a book touting its subject as “the All-American food” ought to at least successfully argue that it is the All-American food, rather than just an All-American one. Krampner fails to argue that peanut butter is any more relevant than Spam, or Crisco, or any other domestic grocery items that come in cans.

Instead, he wallows in peanut butter arcana, and the chapters lag as Krampner spreads fact after fact after fact. Did you know that, besides creamy and crunchy, there used to be a coarse, grainy type of peanut butter? That former Texas Governor John Connally, wounded by Lee Harvey Oswald in 1963, once served as King Reboog (“goober” spelled backward) in the Floresville Peanut Festival? That you could call a peanut butter and jelly sandwich an “Appomattox,” because “it represents the peaceful coming together of peanuts, grown in the states of the old Confederacy, and grapes, grown in such Yankee precincts as the Northeast, Midwest, and Washington state”? That peanut-processing plants can be dangerous? (“Peanut skins are spontaneous combustion waiting to happen,” warns one industry lifer.)

This is all interesting stuff, and if you are looking to bone up before attending a peanut-themed bar trivia night, then this is the book for you. But otherwise, I have trouble imagining a wide audience for this well-written, well-researched, and utterly superfluous book. The best Krampner does in terms of a rationale for why Creamy & Crunchy exists is in the preface, where he says that “remarkably, given its widespread popularity, there hasn’t been a book about peanut butter on the burgeoning shelf of pop food histories. Now there is.” The question is whether there needed to be.

Last Call
November 11, 2012

Industry giants are threatening to swallow up America's carefully regulated alcohol industry, and remake America in the image of booze-soaked Britain.

England has a drinking problem. Since 1990, teenage alcohol consumption has doubled. Since World War II, alcohol intake for the population as a whole has doubled, with a third of that increase occurring since just 1995. The United Kingdom has very high rates of binge and heavy drinking, with the average Brit consuming the equivalent of nearly ten liters of pure ethanol per year.

It’s apparent in their hospitals, where since the 1970s rates of cirrhosis and other liver diseases among the middle-aged have increased eightfold for men and sevenfold for women. And it’s apparent in their streets, where the carousing, violent “lager lout” is as much a symbol of modern Britain as Adele, Andy Murray, and the London Eye. Busting a bottle across someone’s face in a bar is a bona fide cultural phenomenon—so notorious that it has its own slang term, “glassing,” and so common that at one point the Manchester police called for bottles and beer mugs to be replaced with more shatter-resistant material. In every detail but the style of dress, the alleys of London on a typical Saturday night look like the scenes in William Hogarth’s famous pro-temperance print Gin Lane. It was released in 1751.

The United States, although no stranger to alcohol abuse problems, is in comparatively better shape. A third of the country does not drink, and teenage drinking is at a historic low. The rate of alcohol use among seniors in high school has fallen 25 percentage points since 1980. Glassing is something that happens in movies, not at the corner bar.

Why has the United States, so similar to Great Britain in everything from language to pop culture trends, managed to avoid the huge spike of alcohol abuse that has gripped the UK? The reasons are many, but one stands out above all: the market in Great Britain is rigged to foster excessive alcohol consumption in ways it is not in the United States—at least not yet.

Monopolistic enterprises control the flow of drink in England at every step—starting with the breweries and distilleries where it’s produced and down the channels through which it reaches consumers in pubs and supermarkets. These vertically integrated monopolies are very “efficient” in the economist’s sense, in that they do a very good job of minimizing the price and thereby maximizing the consumption of alcohol.

The United States, too, has seen vast consolidation of its alcohol industry, but as of yet, not the kind of complete vertical integration seen in the UK. One big reason is a little-known legacy of our experience with Prohibition. From civics class, you may remember that the 21st Amendment to the Constitution formally ended Prohibition in 1933. But while the amendment made it once again legal to sell and produce alcohol, it also contained a measure designed to ensure that America would never again have the horrible drinking problem it had before, which led to the passage of Prohibition in the first place.

Specifically, the 21st Amendment grants state and local governments express power to regulate liquor sales within their own borders. Thus, the existence of dry counties and blue laws; of states where liquor is only retailed in government-run stores, as in New Hampshire; and of states like Arkansas where you can buy booze in drive-through liquor marts. More significantly, state and local regulation also extends to the wholesale distribution of liquor, creating a further barrier to the kind of vertical monopolies that dominated the United States before Prohibition and are now wreaking havoc in Britain.

Since the repeal of Prohibition, such constraints on vertical integration in the liquor business have also been backed by federal law, which, as it’s interpreted by most states, requires that the alcohol industry be organized according to the so-called three-tier system. The idea is that brewers and distillers, the first tier, have to distribute their product through independent wholesalers, the second tier. And wholesalers, in turn, have to sell only to retailers, the third tier, and not directly to the public. By deliberately hindering economies of scale and protecting middlemen in the booze business, America’s system of regulation was designed to be willfully inefficient, thereby making the cost of producing, distributing, and retailing alcohol higher than it would otherwise be and checking the political power of the industry.

When these laws were passed, America was a century closer to its English roots, and lawmakers remembered very clearly the effects that a vertically integrated alcohol industry had on pre-Prohibition America (and that it still has in the UK today). In the 1920s, Americans had learned the hard way that flat out banning drinking empowered the likes of Al Capone and was, on balance, unworkable. But it made no sense either to go back to the world of pre-Prohibition America, in which big, politically powerful liquor producers owned their own saloons and were therefore free to pour cheap booze into communities coast to coast, sweetening the doses with enticements ranging from rebates on drinks to cash loans, and frequently tolerating in-bar gambling and prostitution.

And so, for eighty years, the kind of vertical integration seen in pre-Prohibition America has not existed in the U.S. But now, that’s beginning to change. The careful balance that has governed liquor laws in the U.S. since the repeal of Prohibition is under assault in ways few Americans are remotely aware of. Over the last few years, two giant companies—Anheuser-Busch InBev and MillerCoors, which together control 80 percent of beer sales in the United States—have been working, along with giant retailers, led by Costco, to undermine the existing system in the name of efficiency and low prices. If they succeed, America’s alcohol market will begin to look a lot more like England’s: a vertically integrated pipeline for cheap drink, flooding the gutters of our own Gin Lane.

A moment’s thought makes it obvious that alcohol is different from, say, apples. Apples don’t form addicts. Apples don’t foster disease. Society doesn’t bear the cost of excessive apple consumption. Society does bear the cost of alcoholism, drink-related illness, and drunken violence and crime. The fact that alcohol is habit forming and life threatening among a substantial share of those who use it (and kills or damages the lives of many who don’t) means that a market for it inevitably imposes steep costs on society.

It was the recognition of this plain truth that led post-Prohibition America to regulate the alcohol market as a rancher might fetter a horse—letting it roam freely within certain confines, neither as far nor as fast as it might choose.

The UK, by contrast, spent most of the last eighty years fussing with the barn door while the beast ran wild. It made sure that every pub closed at the appointed hour, that every glass of ale contained a full Queen’s pint, that every dram of whiskey was doled out in increments precise to the milliliter—and simultaneously allowed the industry to adopt virtually any tactic that could get more young people to start drinking and keep at it throughout their lives. It is no coincidence that one of the first major studies to prompt a shift in Britain’s approach to liquor regulation, published in 2003, is titled Alcohol: No Ordinary Commodity.

The UK’s modern drinking problem started appearing in the years following World War II. Some of the developments were natural. Peace reigned; people wanted to have fun again; there was an understandable push toward relaxing wartime restrictions and loosening puritan attitudes left over from the more temperance-minded prewar years.

But other changes were happening that deserved, but did not get, a dose of caution. As the nation shifted to a service and banking economy, and from agricultural and industrial towns to modern cities and suburbs, social life moved from pubs to private homes and shopping moved from the local grocer, butcher, and fishmonger to the all-in-one supermarket. In the 1960s, loosened regulations led to a boom in the off-license sale of alcohol—that is, store-based sale for private consumption, as opposed to on-license sale in public drinking establishments. But whereas pubs were required to meet certain responsibilities (such as refusing to serve the inebriated), and had their hours of operation strictly regulated (for example, having to close their doors temporarily in the afternoon, to prevent all-day drinking), few limits were placed on off-licenses.

Supermarkets, in particular, profited from the new regime. They were free to stock wine, beer, and liquor alongside other consumables, making alcohol as convenient to purchase as marmalade. They were free, also, to offer discounts on bulk sales, and to use alcoholic beverages as so-called loss leaders, selling them below cost to lure customers into their stores and recouping the losses through increased overall sales. Very quickly, cheap booze became little more than a force multiplier for groceries.

When the supermarkets themselves subsequently underwent a wave of consolidation, the multiplier only increased. Four major chains—Tesco, Sainsbury’s, Asda, and Morrisons—now enjoy near-total dominance in the UK, and their vast purchasing power lets them cut alcohol prices even further. Relative to disposable income, alcohol today costs 40 percent less than it did in 1980. The country is awash in a river of cheap drink, available on seemingly every corner.

Part of the problem, too, was that Britain’s “tied houses”—drinking establishments that are owned by liquor producers—have remained, in one form or another, a dominant part of the country’s drinking landscape. From the time brewing industrialized in the late 1700s, brewers were permitted to operate their own pubs, which they owned outright or whose owners signed exclusive retail agreements with them in exchange for inventory discounts, no-interest loans, and other assistance. The result of this system, which also existed in the U.S. before Prohibition, was a glut of pubs, since each brewer needed its own tied house in a given neighborhood, and a race to the bottom ensued, with each pub competing to offer lower prices and lure customers in with extras like gambling and prostitutes. The problem of this beer-fueled mayhem—of the lager louts smashing up storefronts, beating up foreigners, and glassing one another—became so acute in the 1980s that Parliament finally acted to break up the tied houses, passing legislation in 1991 known as the Beer Orders.

But intense industry lobbying quickly watered down these reforms, and the result was a bitter farce. In the end, brewers were allowed to keep many of their tied houses, and wound up effectively controlling the rest through exclusive retail agreements and other corporate maneuvers. Some brewers simply split in two, with one side retaining the brewing operations and the other responsible for sales. Many other brewers instead sold off their brewing operations and repurposed themselves as giant landlords-cum-barkeepers, while continuing to enjoy exclusive—and lucrative—relations with their former partners. The Beer Orders thus had the unintended consequence of actually catalyzing comprehensive conglomeration and vertical integration, as a handful of giant firms snapped up thousands of independent pubs. This “rationalization” of the industry delivered economies of scale previously unknown, and soon drinkers in England found that booze was even easier to come by than it had been before. Far from vanquished, the lager lout had entered his heyday.

In the United States, the problem so far has not been one of vertical integration like that found in the UK. Here, the story so far has been mostly about horizontal integration—of one brewer buying another.

To be sure, the typical American beer drinker might have a hard time realizing the extent of horizontal consolidation that has already occurred. The shelves of your average gas-station convenience store offer not just Bud and Busch and Miller and Coors but Stella and Hoegaarden and Shock Top and Rolling Rock. At any decent grocery, Kirin of Japan sits beside Boddingtons of England, Peroni of Italy beside Pilsner Urquell of the Czech Republic. Basses shadow Red Hooks in the lee of Goose Islands. Blue Moon shines down on it all.

But all is not as it appears. Two giant companies—Anheuser-Busch InBev and MillerCoors—own, bankroll, produce, control, or have distribution rights to all of these brands and hundreds more. The truly independent brewers in the nation—there are about 2,000 of them, from tiny local outfits to national brands like Samuel Adams—account for just 6 percent of the market.

Almost all the rest belongs to Anheuser-Busch InBev and MillerCoors, which now together capture nearly 80 percent of beer sales in this country. Smaller conglomerates including Pabst, Heineken, and Diageo (owner of Guinness) take up much of the remainder, but even this doesn’t capture how consolidated the market has become. Pabst, for example, does not brew its own beer: that process is contracted out to Miller.

The market forces that eventually led to this massive consolidation among American brewers took root in the mid-1970s with the passage of the Consumer Goods Pricing Act of 1975, which made it illegal for producers to set minimum prices for their goods at retail. This was ostensibly “pro-consumer” legislation: the practice of allowing producers to set their own prices limited certain types of price competition, and so could be viewed as “hurting” consumers in an economic sense. But, of course, in this case we’re talking about consumers of alcohol and not apples, and when it comes to alcohol, cheaper is not necessarily better.

No longer required to set across-the-board prices for their goods, breweries learned that they could manipulate the much smaller wholesalers to extract more favorable terms, brand support, and profit by offering lower prices to those that did their bidding. The threat of higher prices could be used to force a wholesaler to drop competing brands. Conversely, lower prices might be offered to a wholesaler who promised to push a given brand more forcefully. This ability to use pricing to “discriminate” among wholesalers gave producers another valuable return: detailed knowledge of their wholesalers’ acceptable margins. That could be used to extract profit right up to the maximum feasible limit.

Something of a countertrend to consolidation seemed to appear in the 1980s, which saw a boom in small independent craft brewers, including the founding of such well-known brands as Sierra Nevada (1980), Sam Adams (1984), and Harpoon (1986), among others. Smaller brands and brewpubs added to the mix. But few of these brewers succeeded in gaining significant market share, or even in maintaining their independence. Since big brewers had been freed up to use price discrimination to reward and punish wholesalers, they could passively pressure wholesalers into keeping competitors—particularly small, independent brewers—off the market. Meanwhile, after the election of Ronald Reagan, the Justice Department cut back sharply on enforcement of U.S. antitrust law, setting in motion an unparalleled period of consolidation across virtually all American industries, including the beer industry.

In 1980, forty-eight breweries served the fifty states, and the largest of them had only a quarter of the market. Today, again, the market is overwhelmingly dominated by two: Anheuser-Busch InBev and MillerCoors.

Here’s how it went down:

Stroh Brewery Company, founded in 1850, entered the 1980s as the eighth-largest brewery in the nation. But after a sleepy first 130 years, during which it marketed a single brand, director Peter Stroh had come to recognize that “it’s either grow or go.” Released from antitrust constraints by the new Reagan regime, grow they would. In 1981, Stroh bought Schaefer, a big New York regional, and moved to seventh. Two years later, Stroh took over Schlitz, leaping to fourth place. By the mid-’90s, the company had also swallowed up Augsburger and G. Heileman, then the fifth-largest brewer in America.

Coors, famously secretive in its business dealings, began the Reagan era as the fourth-largest brewer in America, with a reputation for high quality and an almost chic image in the vast East Coast market as a great beer you could only buy west of the Mississippi. Then, in 1981, Coors crossed the river, crashed through the East Coast, and hurdled across the Atlantic. In 1994, Coors purchased El Aguila in Spain and founded Jinro-Coors in South Korea. And in 1997, Molson, Foster’s, and Coors partnered to bring the Silver Bullet to Canada for the first time. Coors was now number three.

Miller entered the 1980s riding the tremendous success of its innovative Miller Lite brand. Already the second-largest brewer in America, the company set its sights on expanding, purchasing Jacob Leinenkugel in 1988 and buying distribution rights to 20 percent of Canada’s Molson in 1992. Distribution rights to Foster’s and several other top imports followed later in the decade. With a market share of 21 percent, Miller had solidified its position as number two.

Anheuser-Busch, like Coors, was run by a family famous for its intensely private control of its business. The company entered the Reagan era as the number one brewer in America, and spent the next decade consolidating that position by leveraging its size, mostly via internal brand diversification, and by aggressively expanding its presence abroad. As the 1990s drew to a close, Anheuser-Busch remained by far the top brewer in the United States, with nearly 50 percent of the market, and one of the biggest brewers in the world. It is a testament to the size of the global beer market that even those eye-popping mergers left vast opportunities for other companies to play the same game. Three are of interest here:

In 1987, two of Belgium’s leading brewers, Artois and Piedboeuf, joined together as Interbrew. For fifteen years they quietly ate up dozens of other brands, and by 2001 they were the second-largest brewer on the planet.

In 1999, Brazil’s two largest brewers, Antarctica and Brahma, joined forces as AmBev, instantly dominating that country’s market and moving quickly to buy up smaller brands throughout South America.

And during the 1990s, South African Breweries, virtual monopolists at home with 98 percent of the market, moved decisively into eastern Europe, Russia, India, and China, establishing a formidable position on three continents.

So the 1990s drew to a close with four major players in America and three abroad—seven giant brewing conglomerates for six billion people. The contest to own the world’s beer market had entered its endgame.

In 1999, Stroh was split up and sold off.

Six left.

In 2002, South African Breweries bought Miller, creating SABMiller.

Five left.

In 2004, Interbrew and AmBev merged, forming InBev.

Four.

In 2005, Coors and Molson merged to form Molson Coors.

Three.

In 2007, Molson Coors and SABMiller created the joint venture MillerCoors to produce and distribute their products in the United States as a single entity.

Two and a half.

And in 2008, in a blockbuster $52 billion deal, InBev bought Anheuser-Busch to form Anheuser-Busch InBev. At the stroke of a pen, half the U.S. beer industry came under the control of an even more powerful firm—one with a huge inventory of international brands ready to ride Budweiser’s coattails into the American market. Then, in June 2012, Anheuser-Busch InBev announced plans to pay $20 billion to acquire the 50 percent of Grupo Modelo that it does not already own.

Two.

And soon one?

Industry analysts have recently floated the idea that Anheuser-Busch InBev might purchase MillerCoors. But even in the lax antitrust environment that currently prevails, it is almost impossible to imagine a single company being allowed to control—overtly—80 percent or more of the domestic beer market. This means both Anheuser-Busch InBev and MillerCoors have, for all intents and purposes, reached the limit of their horizontal expansion. As in the UK, the only direction to go now is vertical, with the first target being the wholesalers—the second tier of the three-tier system.

Prior to the 2008 takeover, Anheuser-Busch generally accepted the regulatory regime that had governed the U.S. alcohol industry since the repeal of Prohibition. It didn’t attack the independent wholesalers in control of its supply chain, and generally treated them well. “Tough but fair” is a phrase used by several wholesale-business sources to describe their dealings with the Busch family dynasty. Everyone was making money; there was no need to rock the boat.

All that changed quickly after Anheuser-Busch lost its independence. The executives from InBev who took over the company did things quite differently. During the negotiations to buy Anheuser-Busch, InBev made it clear that the Busch family would have to go, and at the old headquarters in St. Louis other changes soon followed. Executive offices were literally torn out and replaced by an open floor with matching desks. The private-jet fleet was put on the block. Company cars disappeared. So did 1,400 jobs, retiree life insurance, and contributions to the employee pension plan. Managerial pay was reduced to at or below the average for similar jobs in other industries, with bonuses tied strictly to performance. Salaried workers lost little perks like free beer every month, and hundreds of staff BlackBerrys were recalled. Cost cutting was the new imperative.

Then, after eliminating everything it could at home, the new regime turned to squeezing more out of its increasingly nervous partners, the wholesalers. And, today, with only one remaining real competitor, MillerCoors, the pressure it can put on its wholesalers is extraordinary. A wholesaler who loses its account with either company loses one of its two largest suppliers, and can no longer offer its retail clients the name-brand beers that form the backbone of the market. The Big Two in effect have a captive system by which to bring their goods to market.

Here’s how it works in practice. In 2011, Anheuser-Busch InBev (“A-B”) sent out a Wholesaler Family Consolidation Guide to each of its contractors. The language is blunt:

Do you share the same vision as A-B on issues of importance to the industry, including support on legislation that can affect our competitive position? …

Are you selling competitive products in a fellow A-B wholesaler’s territory?

The introduction to the guide begins:

We ask all wholesalers to use the guide’s self assessment tool to objectively consider their capabilities and goals. Wholesalers who aspire to be an Anchor Wholesaler can identify any gaps they have in these qualities and build a plan to address them. Some wholesalers might remain committed to their current market, but realize further acquisitions are not right for their business. Others might decide now is the best time to consider whether a sale is in their best interest.

There are many aspects of an aligned wholesaler, and an explicit focus on our portfolio of brands is paramount. Those who are aligned with us only acquire brands that compete in segments underserved by our current portfolio and that bring incremental sales, not brands that have a negative impact on the A-B portfolio.

The guide emphasizes the last point: an aligned wholesaler is one who “shares the company’s long-term vision for how to operate successfully and grow business in conjunction with Anheuser-Busch InBev’s strategy.” So distributors are caught in an impossible bind: they either do the brewer’s bidding, including selling their businesses to favored “Anchor Wholesalers,” or they lose Anheuser-Busch InBev as a client.

And if the wholesalers try to push back? Anheuser-Busch InBev will get rough. In Arkansas, to take a prime example, a state inquiry revealed that the company was charging as much as $5 more per case (a huge margin against the average price of around $15) to some wholesalers, an obvious effort to run them out of business. In addition, through a second practice called reachback pricing, the company retroactively reset the value of its wholesale contracts once its wholesalers’ retail terms were known. The technique allowed it to reduce wholesalers’ profit margins. And when the state legislature took up a bill to make these practices illegal, Anheuser-Busch InBev filed a letter of protest “on behalf” of its wholesalers, in effect forcing those who disagreed with its practices to identify themselves if they chose to give the bill their public support.
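
To see why a $5-per-case surcharge is so devastating, here is a minimal sketch of the arithmetic, in Python. Only the $5 surcharge and the roughly $15 average case price come from the Arkansas inquiry; the wholesaler’s resale price is an assumed figure for illustration.

# A minimal sketch of selective per-case pricing. Only the $5 surcharge
# and the ~$15 average case price come from the Arkansas inquiry described
# above; the assumed resale price is hypothetical.

AVERAGE_CASE_PRICE = 15.00   # what a favored wholesaler pays per case
SURCHARGE = 5.00             # extra charged to a disfavored wholesaler
RESALE_PRICE = 17.50         # assumed price the wholesaler charges retailers

def margin_per_case(cost, resale=RESALE_PRICE):
    """Wholesaler's gross margin on one case."""
    return resale - cost

print(margin_per_case(AVERAGE_CASE_PRICE))              # 2.5: thin but viable
print(margin_per_case(AVERAGE_CASE_PRICE + SURCHARGE))  # -2.5: a loss on every case

On these assumptions, the disfavored wholesaler loses money on every case it sells, and no volume of sales can make up the difference.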

Anheuser-Busch InBev’s efforts failed in this instance; the bill passed. But the door is open for similar behavior in other states. All that’s required is to get their legislatures to fall for familiar Chamber of Commerce arguments about regulation “hurting” businesses and consumers. Moreover, in some big states (notably California and New York, home to almost one in five Americans) brewers have already succeeded in finding loopholes that allow them to own wholesalers directly, giving them the chance to make vertical integration cut-and-dried rather than just a matter of strong-arm business practices. And given other trends toward consolidation at the retail level of the American economy, there is, as we’ll see, every indication they will do just that.

For a long time, brewers weren’t interested in distribution, because distribution was a challenging, tight-margin enterprise. Those who did it had to manage hundreds or thousands of accounts, maintain a fleet of delivery trucks, store products in expensively refrigerated warehouses, get new stock onto shelves and remove the expired stuff daily (usually eating the cost as they did so), and, in some cases, maintain the taps at their contracted bars and restaurants. In short, they ran a very complex show. But with the emergence of national chain retail stores, much of the complexity and cost of distribution has been eliminated.

Just as England’s four major supermarkets now dominate alcohol sales there, so major all-in-one box stores, like Walmart and Costco, now dominate beer sales in the U.S. And these stores typically manage their own logistics, gathering inventory at centralized distribution centers and stocking all their shelves in a region from there. So it would be no big task for Anheuser-Busch InBev to run a fleet of trucks from its breweries to the big-box distribution centers—and that is precisely the plan. Anheuser-Busch InBev’s CEO Carlos Brito openly declared it to investment analysts from UBS in 2009, saying that the company was aiming at making 50 percent of its sales directly to retailers. (Aware that at least some people believe that this would or should be illegal under federal law, spokespeople quickly claimed that his statement was being misinterpreted.)

But to Anheuser-Busch InBev, as well as to MillerCoors, achieving de facto if not actual vertical integration is too tempting a goal to give up. Such control allows for the elimination, in literal, physical terms, of almost all competing brands on store shelves. And if eliminating middlemen leads to greater “efficiencies” and therefore lower costs, both companies can build the market for alcoholic beverages by manipulating prices and more aggressively marketing to consumers—which is exactly what happened, with obviously disastrous effects, in the UK.

And so the onslaught continues, by direct and indirect means, with few Americans having even the vaguest idea of what’s going on. In Ohio, for example, MillerCoors tried unsuccessfully to negate the contracts that its component companies, SABMiller and Coors, had already signed with distributors, with the goal of forcing them to renegotiate terms with the more powerful merged venture. In California, the state attorney general declared MillerCoors’s efforts at wholesaler exclusivity a violation of state law. In Illinois, Anheuser-Busch InBev stands accused by the state’s distributors of holding an illegal interest in a top Chicago-area wholesaler. If Anheuser-Busch InBev wins the case, now being heard by the Illinois Liquor Control Commission, the company may be emboldened to argue for similar rights in other states. (On October 31, after this article went to press, the Commission ruled in favor of Anheuser-Busch InBev, effectively permitting beer makers to self-distribute in Illinois.)

In fact, by exploiting existing weaknesses in some states’ commerce laws, Anheuser-Busch InBev owns fourteen distributorships in ten states (New York and California, as mentioned above, plus New Jersey, Ohio, Massachusetts, Colorado, Oregon, Oklahoma, Kentucky, and Hawaii) and is part owner of two more. The biggest beer producer in America, Anheuser-Busch InBev is now by volume the biggest beer distributor, too.

At times, the Big Two don’t even have to lead the fight. Costco spent $22 million last year in a successful ballot initiative campaign that allows it to stock its shelves directly from wholesale warehouses, effectively eliminating the protective inefficiencies of the second-tier distribution system. Such mutually beneficial efforts by big-box stores and the Big Two are no surprise: they all work on a high-volume, low-margin profit model. And though three-tier laws prohibit direct collaboration between them, it’s also no accident that in a March interview with the trade publication Beer Business Daily, Anheuser-Busch InBev Vice President Dave Almeida described in perfect detail how retailers can maximize their profits by replacing craft brews with “premium” beer—its term for its mass-produced light lager. Synergy: it’s coming to a store near you.

Horizontal integration of alcohol production. Vertical integration of distribution and retail. Loosened local regulations. National chain stores. Streamlined marketing. Volume pricing. Alcohol as an ordinary commodity. America resembles Britain more with each passing day. How do you like them apples?

In recent years, the UK has started to reverse course as it struggles with its epidemic of alcoholism. After ten years of study and against vehement industry protest, the conservative, Tory-led government now appears serious about passing reforms aimed at weakening vertical monopolies in the British alcohol industry and forcing the cost of drinking upward through minimum-price laws. Eighty years late, Great Britain is recognizing the hard-learned lesson that our forebears enshrined in the 21st Amendment: that alcohol truly is no ordinary commodity, and must be handled with care. We would do well to recall that wisdom ourselves.

Obama’s Game of Chicken https://washingtonmonthly.com/2012/11/09/obamas-game-of-chicken/ Fri, 09 Nov 2012 20:08:09 +0000 https://washingtonmonthly.com/?p=20817

The untold story of how the administration tried to stand up to big agricultural companies on behalf of independent farmers, and lost.

In May 2010, Garry Staples left his chicken farm in Steele, Alabama, to take part in a historic hearing in Normal, Alabama, an hour and a half away.

The decision to go wasn’t easy. The big processing companies that farmers rely on for their livelihood had made it known that even attending one of these hearings, much less speaking out at one, could mean trouble. For a chicken farmer, that’s no trivial thing. Getting on a processing company’s bad side can deal a serious blow to a farmer’s income—and even lose him the farm entirely. Still, Staples, a former Special Forces commander, and a number of other farmers decided to risk it. Many felt it was their only chance to talk directly to some of the highest-ranking officials in the country, including Attorney General Eric Holder and Agriculture Secretary Tom Vilsack, about the abusive practices now common in their industry. It was a chance, finally, to get some relief.

Staples and other farmers described a system that is worse in certain respects than sharecropping. It works like this: to do business nowadays, most chicken farmers need to contract with a processing company. The company delivers them feed and chicks, which farmers raise into full-size birds. The same company then buys those same birds back when they are full grown. The problem is that the big processing company is usually the only game in town. So it can—and usually does—call all the shots, dictating everything from what facilities a farmer builds on his farm to the price he receives for his full-size chickens.

As Staples explained, a processing company can require a farmer to assume substantial debt to pay for new chicken houses, tailored to the company’s exact specifications. Staples said he himself had borrowed $1.5 million. Then the company will offer that same farmer a sixty-day contract that can be changed or terminated by the company for any reason at any time. If a farmer gets fed up with the chronic uncertainty and tries to negotiate better terms, the company can punish him by sending lousy feed or sickly chicks, thereby depressing his earnings. Or the company can simply undercount the full-grown chickens’ weight. Whatever the particular abuse, because there are now so few processing companies—often only one or two in a farmer’s geographic area—there’s little way out of the cycle. For many chicken farmers in America, the only real option is to accept the terms, even if those terms are slowly driving them out of business. And even if those terms keep them from publicly speaking their minds.

Staples told the crowd at the hearing that he feared that Pilgrim’s Pride, the processing company with which he contracts, might punish him for voicing his troubles. Later, Christine Varney, the government’s chief antitrust regulator at the time, who was sitting in front of an American flag, spoke up. “Mr. Staples, let me say, I fully expect you will not experience retaliation by virtue of your presence here today,” she said, handing him a piece of paper with her phone number on it. “But if you do, you call me.” The hearing erupted into applause.

The message seemed to be clear: the highest brass in the Obama administration was listening closely to how America’s independent farmers were being pushed around by big companies, and was no longer going to tolerate it.

For the next seven months, Holder, Vilsack, Varney, and other officials from the Departments of Justice and Agriculture toured the country, hearing from more farmers and rural advocates. Along the way, they learned about concentration in the seed, pig, cattle, and dairy industries, as well as in poultry. During this same period, the USDA also worked on revising and updating the main law that regulates the livestock industries to prevent many of the unfair and deceptive practices that now threaten the dignity and survival of farmers and ranchers. From dairy farms in Wisconsin to cattle ranches in Montana, hopes soared.

But today, two years on, almost nothing has changed. Big processing companies remain free to treat independent poultry, cattle, and dairy producers largely as they please. “You had farmer after farmer after farmer telling the same story, basically pleading for help, and absolutely nothing has come of it,” said Craig Watts, a poultry farmer from Fairmont, North Carolina, who drove 512 miles to attend the hearing in Alabama. Staples agreed. “We had really thought something might change.”

A generation ago, it seemed that Americans had solved the problem of monopoly in agriculture. Following the election of President Woodrow Wilson in 1912, the government gradually weakened the plutocrats’ stranglehold over most of America’s agricultural business.

The government’s primary tools were two pieces of law. One was antitrust law, which included the Sherman Antitrust Act of 1890 and the Clayton Act of 1914. In 1919, for instance, the Federal Trade Commission wielded the Clayton Act to reduce the power of the “Big Five” meatpacking companies. These companies, the FTC noted, “had attained such a dominant position that they control at will the market in which they buy their supplies, the market in which they sell their products, and hold the fortunes of their competitors in their hands.”

The other main piece of law was the 1921 Packers and Stockyards Act, signed by President Warren Harding. It broadly prohibited unfair and discriminatory conduct in the marketplace and established standards by which to hold meatpacking companies and stockyards accountable. Often called the “Farmer and Rancher Bill of Rights,” the act made it illegal for big meatpackers to pay farmers less than market value for their livestock or to arbitrarily advantage some farmers at the expense of others. As one congressman noted at the time, the Packers and Stockyards Act was “a most comprehensive measure,” possibly extending “farther than any previous law into the regulation of private business.”

Over the next few decades, independent ranchers and farmers thrived under the protection of these two bodies of law. For the most part, farmers were able to sell their products relatively freely on the open market, and prices were established transparently through open bidding, in public auctions attended by many buyers and many sellers. The effect on the structure of the market was dramatic. In 1918, the five largest meatpacking companies in the country controlled 55 percent of the meat market. By 1976, the four largest controlled only roughly 25 percent of it.

Over the last quarter century, this progress has been reversed. Today, the top four meatpacking companies control 82 percent of the beef market—an unprecedented share of the pie.

The worst abuses in today’s livestock industries can be traced back to two fundamental changes in the structure of the market, acting in combination.

Until the 1950s, most chicken farmers did business the same way their grandfathers had. They bought their chicks, feed, and assorted supplies from various dealers, raised the birds, and then hauled them to a marketplace, where they would sell to whichever butcher offered the best price. This system worked until World War II, when the government’s decision to ration red meat, but not chicken, catalyzed a boom in Americans’ poultry consumption. By 1945, Americans were eating three times the poultry they had been eating just five years earlier. This new appetite for chicken continued after the war. Farmers, though, had a hard time managing production given the short life cycle of chickens, and the result was drastic price fluctuations and volatility in the poultry market.

In the midst of this rapid change, many of the companies that supplied farmers with chicks and feed introduced a new way of doing business: the contracting model. Under this arrangement, farmers would buy all their chicks and feed from a single supplier, raise the birds, and then sell them back to the same company, which had already agreed, according to a contract, to buy the birds at market price. The contracting model, which promised to stabilize prices, hence income, for both farmers and processing companies, took off like wildfire. In 1950, 95 percent of broiler producers were selling into the traditional open market; by 1958, 90 percent were selling on contract. Gradually, the hog and cattle industries adopted the contracting model too.

Some farmers and ranchers mistrusted this new system. At a 1958 meeting in Des Moines, one hog farmer voiced the central worry: “Will we be able to control our own farming?” But through the 1960s and ’70s, such worries seemed largely unfounded. If a farmer didn’t like the terms offered by one company, he could, at the end of the contract period, simply switch to another. The basic balance of power between the farmers and the companies remained in place.

The change that finally upended this balance came in 1981. A group of Chicago School economists and lawyers working in the Reagan administration introduced a new interpretation of antitrust laws. Traditionally, the goal of antitrust legislation had been to promote competition by weighing various political, social, and economic factors. But under Reagan, the Department of Justice narrowed the scope of those laws to promote primarily “consumer welfare,” based on “efficiency considerations.” In other words, the point of antitrust law would no longer be to promote competition by maintaining open markets; it was, at least in theory, to increase our access to cheap goods. Though disguised as an arcane legal revision, this shift was radical. It ushered in a wave of mergers that, throughout the course of the following decades, would transform agriculture markets.

Although the change was strongly opposed by centrists in both parties, a number of left-wing academics and consumer activists in the Democratic Party embraced the new goal of promoting efficiency. The courts also soon began to reflect this political shift. In 1983, after Cargill, the nation’s second-largest meatpacker, moved to purchase Spencer Beef, the third largest, a rival meatpacker named Monfort filed a lawsuit claiming that the acquisition would harm competition in the industry. In a 6-2 decision three years later, the Supreme Court ruled in favor of Cargill. The decision set a precedent limiting competitors’ ability to challenge mergers, and helped catalyze a rapid series of buy-ups across the agriculture industry. In 1980, the four biggest meatpacking companies in the country controlled 36 percent of the market. Ten years later, their share had doubled, to 72 percent.

As mentioned above, today the share of the market controlled by the four biggest meatpackers has swelled to 82 percent. In pork, the four biggest packers control 63 percent. In poultry, the four largest broiler companies—Tyson, Pilgrim’s Pride, Perdue, and Sanderson—control 53 percent of the market. In all these sectors—but especially poultry—these numbers greatly understate the political effects of concentration. At the local level, which is what matters to the individual farmer, there is increasingly only one buyer in any region.

The practical result of all this consolidation is that while there are still many independent farmers, there are fewer and fewer processing companies to which farmers can sell. If a farmer doesn’t like the terms or price given by one company, he increasingly has nowhere else to go—and the companies know it. With the balance of power upended, the companies are now free to dictate increasingly outrageous terms to the farmers.

At the hearing in Alabama in 2010, poultry farmers laid out how the arrangement now works. Staples, for example, described how processing companies routinely demand equipment upgrades that push independent farmers into heavy debt. In order to keep up with the companies’ facility requirements, farmers often must mortgage their farms and homes. With contracts often lasting only sixty days, and no real option to switch processing companies at the end of the contract period, farmers must either accept the terms they’re given—and stay on the company’s good side—or risk bankruptcy. “[W]ith the contracts that we’re offered now it’s either a take-it or leave-it situation,” Staples said.

Tom Green, another Alabama farmer at the hearing, recounted what happened when he contested a contract that included a mandatory arbitration clause that would take away his right to a jury trial if a dispute arose. When he took issue with the clause, the processing company refused to work with him. Absent other options, Green and his wife, Ruth, lost their farm. “Ruth and I chose to stand up for our principles,” Green, a former infantryman and pilot in Vietnam, said at the hearing. “We did not give up a fundamental right to access the public court … which is guaranteed by our Constitution, regardless of price. I had flown too many combat missions defending that Constitution to forfeit it. It was truly ironic that protecting one right, we lost another. We lost the right to property.”

Of all the abuses farmers described to officials in Alabama, the one they kept returning to was the “tournament system,” a payment scheme designed, according to the processing companies, to promote efficiency among farmers. Unlike a traditional market, where every pound of chicken of the same grade fetches the same price, the tournament system allows companies to pit one farmer against another by ranking each farmer based on how he performs in “competition” against his fellow farmers. The idea is that the healthier and heavier the chickens a farmer produces with a set amount of feed, the higher he’s ranked in relation to the entire set of farmers who deliver their birds to the same processing plant on that same day. The higher he’s ranked, the more a processing company pays him per pound.
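
To make those mechanics concrete, here is a minimal sketch of a tournament settlement, in Python. The base rate, the adjustment, and the sample flocks are all hypothetical; actual settlement formulas vary by company and contract.

# A minimal sketch of a poultry "tournament" settlement. The base rate,
# adjustment, and sample flocks below are hypothetical; real contracts vary.

from dataclasses import dataclass

BASE_RATE = 0.05    # assumed base pay, dollars per pound
ADJUSTMENT = 0.01   # assumed bonus or penalty per pound

@dataclass
class Flock:
    farmer: str
    feed_lbs: float   # feed delivered by the company
    bird_lbs: float   # full-grown weight, as weighed by the company

def settle(flocks):
    """Pay each farmer per pound based on feed conversion versus the group."""
    # Feed conversion ratio: pounds of feed per pound of chicken (lower is better).
    fcr = {f.farmer: f.feed_lbs / f.bird_lbs for f in flocks}
    average = sum(fcr.values()) / len(fcr)
    pay = {}
    for f in flocks:
        # Beat the group average and earn a bonus; fall below it and take a cut.
        rate = BASE_RATE + (ADJUSTMENT if fcr[f.farmer] < average else -ADJUSTMENT)
        pay[f.farmer] = rate * f.bird_lbs
    return pay

flocks = [Flock("A", 95_000, 50_000), Flock("B", 105_000, 50_000)]
print(settle(flocks))  # farmer A beats the average and is paid more per pound

Because the ranking is relative, roughly half of the farmers delivering to a plant on a given day are penalized at every settlement, whatever the quality of the chicks and feed the company chose to send them.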

One problem with the tournament system is that no standards regulate the quality of feed and chicks that processing companies deliver to farmers, which means there’s no way for a farmer to know if he’s getting the same inputs as the other farmers against whom the company makes him compete. Another problem is that the processing companies often weigh the full-grown chickens behind closed doors, out of the sight of the farmer who raised them. This enables the companies to favor or punish whichever farmers they, or their local foremen, choose. Any farmer who complains about the system, or about the specific provisions of a contract, or who even signs some sort of petition that a processing company doesn’t like, risks seeing his “earnings” arbitrarily cut.

Farmers are still expected to own their own land and to bear all the risks of investing in facilities, like chicken houses, just as they did when they sold into fully open and competitive markets. But almost all the authority over how they run their farm and what they earn now belongs to the companies. “A modern plantation system is what it is,” said Robert Taylor, a professor of agriculture economics at Auburn University who has worked with poultry farmers for close to three decades. “Except this is worse, because the grower provides not just the labor, but the capital, too.”

In most other industries, labor law protects workers from such forms of manipulation and exploitation. Farmers, though, aren’t protected under labor law because—at least until recently—it was assumed that open market competition enabled them to take their business to another buyer. Today, however, even as they become more like employees, laboring for a single company, the law still treats farmers as if they were their own masters. “The shift to vertical integration means that farmers no longer own what they are producing,” explains Mark Lauritsen, director of the food processing, packing, and manufacturing division at United Food and Commercial Workers, the union that represents workers across many industries, including agriculture and food processing. “They are selling their labor—but they don’t have the rights that usually come with that arrangement.”

The specific type of contract and the payment scheme offered by companies vary by sector, and the hearings indicated that the worst practices are generally found in the poultry industry. What applies across the board—in cattle ranching and dairy and hog farming—is the stark and growing imbalance of power between the farmers who grow our food and the companies who process it for us, and how this imbalance enables practices unimaginable in any competitive market.

Watts, the farmer who drove from North Carolina to attend the Alabama hearing, says he and his fellow poultry farmers are independent only in name. “What I can make through my work is entirely dictated by many hands before it ever gets to me,” he said in an interview. “My destiny is no longer controlled by me.”

Farmers and activists have been fighting to restore fair agriculture markets since the 1980s with little to show for it. Both Democratic and Republican senators have periodically introduced legislation to level the playing field for independent farmers and ranchers, but those measures have repeatedly collapsed under the weight of corporate lobbies.

Most consequentially for farmers, the once-groundbreaking Packers and Stockyards Act has been weakened over the decades by both the courts’ and the executive branch’s narrow interpretation of its broad, sometimes ambiguous language. As a result, the act is no longer sufficiently powerful to protect their rights. The administration of George W. Bush essentially halted enforcement of the act entirely. In 2006 the USDA’s own inspector general reported that the agency responsible for enforcing the act, the Grain Inspection, Packers and Stockyards Administration (GIPSA), had been deliberately suppressing investigations and blocking penalties on companies violating the law. The inspector general found that Deputy Administrator JoAnn Waterfield was hiding at least fifty enforcement actions in her desk drawer.

In 2008, independent farmers seemed at last to have caught two big breaks. First, in the 2008 Farm Bill, Congress instructed the USDA to revise and update the eighty-year-old act on specific issues that it either had never addressed or had left overly vague. As the agency administering the Packers and Stockyards Act, the USDA, and, more specifically, its subsidiary body GIPSA, already had the power to revise and supplement the act’s rules. Now it had a political mandate to do so, too.

The second big break came during the 2008 campaign, when Senator Barack Obama spoke directly about the need to address such abuse of independent farmers. Four days before the Iowa caucus, he even organized a conference call with independent farmers to discuss their concerns. In the primary, the farmers’ votes swung toward Obama, helping him beat Hillary Clinton and making him a serious contender for the nomination. In the general election, the appeal may have helped Obama win some rural, traditionally Republican counties in Colorado and North Carolina.

Some farmers and activists criticized Obama’s choice of Vilsack, a former governor of Iowa, to lead the Agriculture Department, mainly because of his close ties to biotech companies, including Monsanto. But the administration soon balanced this out by appointing Mississippi rancher and trial attorney Dudley Butler to head GIPSA. Farmers and ranchers trusted Butler, who had been a private lawyer for thirty years and had long been on the front lines representing chicken farmers against processing companies.

In August 2009, eight months into Obama’s first term, the administration announced plans for a series of hearings the following year—the most high-level examination of agriculture in decades, overseen by the new antitrust chief, Christine Varney. At the opening event in Ankeny, Iowa, in March 2010, Attorney General Holder spoke boldly, assuring the crowd that reform was now a Cabinet-level priority. “Big is not necessarily bad, but big can be bad if the power that comes from being big is misused,” he said. “That is simply not something that this Department of Justice is going to stand for. We will use every tool we have to ensure fairness in the marketplace.”

Over the next nine months, officials held another four full-day hearings, in Alabama, Wisconsin, Colorado, and Washington, D.C., to investigate the poultry, dairy, cattle, and seed industries, as well as to look at the discrepancy between the price consumers pay for food and the price farmers receive for producing it. Each hearing featured several panels with a range of perspectives, and each included time for comments from many of the thousands of farmers, ranchers, industry representatives, activists, and academics who attended. In addition to the hours of testimony collected publicly, the administration provided computers in adjacent rooms where those reluctant to speak out could privately register their concerns and fears.

The administration also consulted experts like Taylor, the professor at Auburn University. At one point, the USDA sent an entire team of economists and lawyers to Alabama with a full day’s worth of questions. “It was clear these were conscientious, committed officials who had spent a lot of care investigating the issues,” Taylor said.

During the course of the hearings, the USDA also began to address Congress’s 2008 Farm Bill instruction that the department revise and update elements of the Packers and Stockyards Act. By midsummer, the USDA had rolled out a series of far-reaching revisions, addressing many of the farmers’ concerns. One of the proposed changes would have specifically banned company retaliation against farmers who tried to negotiate the terms of a contract. Another would have required any company that forced farmers to make capital investments to offer contracts long enough for the farmers to recoup some minimum amount of that investment. This series of proposed updates and revisions to the Packers and Stockyards Act later came to be known collectively as the “GIPSA rules.”

While updating an old law might not sound like a big deal, farmers widely regarded the proposed GIPSA rules as serious game changers. “Before, they would throw us a little bone once in a while,” Watts said. “But with these rules we knew they meant business.”

Because the USDA has the legal authority to revise the rules under the Packers and Stockyards Act, Congress didn’t actually have to formally vote on the new rules. Congress has the right to discuss them and request additional information, but it has no direct authority over them. In the Senate, Tom Harkin, Chuck Grassley, and Tim Johnson, longtime advocates of reform in the agriculture industry, voiced their support for the proposed updates. Many House members, however, began to attack the rules, especially once the processing companies came out strongly against them.

In July 2010, less than a month after the USDA published its proposed rules, the House Agriculture Committee, which was led by Minnesota Democrat Collin Peterson, called a hearing to question USDA officials on the revisions. At the hearing a group of mostly Republican lawmakers, joined by Jim Costa of California and a few other Democrats, assailed the proposed rules for their wide-reaching impact. They accused the USDA of ignoring the concerns of industry groups like the National Cattlemen’s Beef Association and the National Chicken Council, which represent processing companies like Cargill and Tyson. After the House hearing, the USDA agreed to extend the period for public comments on the proposed rules from the regular sixty days to a total of 150.

Then, in October, House members—led by Peterson; Agriculture Committee Ranking Member Frank Lucas, a Republican from Oklahoma; Livestock, Dairy, and Poultry Subcommittee Chairman David Scott, a Democrat from Georgia; and Subcommittee Ranking Member Randy Neugebauer, a Republican from Texas—delivered a letter to Vilsack. The letter argued that the USDA, despite nationwide hearings and dozens of investigations, interviews, and fact-finding missions, had not sufficiently justified the need for some of the new farmer protections, and urged the agency to subject the rules to more thorough economic analysis. The letter was signed by sixty-eight Republicans and forty-seven Democrats.

In the November 2010 midterm elections, a surge of successful Tea Party candidates handed Republicans control of the House. In the aftermath of the election, the administration continued its reform efforts. If anything, by the last of the five hearings in December the tone of the reformers had become more radical, centering on the political and moral nature of what many American farmers now suffer. “We’ve got to be looking at power,” explained Bert Foer, head of the American Antitrust Institute, at the hearing. “We’ve got to be looking at the negotiating realities that occur in the marketplace and not simply what the effect on the consumer price is going to be.”

But in the new year, a new political reality set in. In January 2011, Obama appointed Bill Daley, former commerce secretary and top executive at JPMorgan Chase, as his chief of staff. Part of a wider post-election shake-up at the White House, Daley’s appointment signaled that the administration was now intent on compromising with Republicans, especially on economic issues. Many Republicans, though, viewed the election as a mandate for even more radical obstruction.

In February 2011, the House Agriculture Committee again pushed Vilsack on the economic analysis of the proposed Packers and Stockyards rules, and over the next few months various subcommittees orchestrated hearings for trade groups to voice their objections. According to one industry report, paid for by the National Meat Association, the proposed USDA rules would levy a $1.64 billion blow to the meat industry and lead to 22,800 job losses. The report also claimed that the rules would, over time, decrease beef, pork, and poultry production across the board.

In May 2011, Costa, the California Democrat, Reid Ribble, a House Republican from Wisconsin, and Lucas, now the chairman of the Agriculture Committee, circulated a letter asking Vilsack to withdraw all proposed rule changes entirely. “[W]e are confident that any such rule will not be looked upon favorably by Congress,” the congressmen wrote. Though their letter was signed by 147 members—more than a third of the House, including twenty-five Democrats and thirty Tea Party Republicans—the USDA didn’t accede to the request. But officials did begin to water down the proposed rules.

The next month, in June 2011, the House Appropriations Committee included a crucial rider in its funding bill. The rider was designed to strip the USDA of the funds it needed to finalize and implement the strongest of the proposed rules. Farmers and activists tried to fight the rider, which was backed by corporate livestock and poultry lobbies. Advocacy groups flew in farmers from around the country to meet with members of Congress, and 6,000 people called in to the White House to express their support. During a debate over the rider, Ohio Democrat Marcy Kaptur, the only representative to come out strongly in favor of the rules, slammed the House for “standing with the few big meatpackers and against the many thousands and thousands of producers.” Even the American Farm Bureau, a group that often champions policies favorable to agribusiness, wrote an open letter to Congress opposing the rider.

But the farmers and activists found that they were now largely alone. By late 2011, the administration was in full retreat. “The White House and USDA became very timid and really didn’t do much to disabuse the critics spreading untruths about the reforms,” said Patrick Woodall, research director with Food & Water Watch, which organized some of the efforts in support of the proposed rules. “They all fell silent.”

The Senate supported the Packers and Stockyards revisions in its appropriations bill in September 2011. But the House, as Woodall put it, “went on a full-out offensive,” holding hostage everything from food stamps to food-safety measures. “Nobody wants to have to defend a policy position where the victims are low-income kids, and that’s where the balance ultimately was,” Woodall said. Even Senators Harkin and Johnson, who only a month earlier had strongly voiced their support for the GIPSA rules, backed down.

By November 2011, it was clear that the reformers had lost. The rider had passed. The rules as they had been intended were dead. The most ambitious, far-reaching campaign to reform the agricultural industry in forty years was over, less than two years after it had begun.

In early December, the USDA published four watered-down revisions and updates to the Packers and Stockyards Act. The only full-fledged rule to come into effect prohibits mandatory arbitration clauses in poultry farmers’ contracts—vindication for many, including Tom Green and his wife, Ruth, but hardly a sweeping victory. The other three revisions are vague “guidelines” for the USDA. None of them explicitly prohibit arbitrary and exploitative conduct by the processing companies under the notorious tournament system.

In January 2012, Butler resigned from the USDA. Then in May, the DOJ quietly published a report summarizing the five nationwide hearings conducted in 2010. The report detailed both a lack of competition in the industry and abusive behavior. It went on to claim that the DOJ couldn’t act to address these wrongs because, no matter how outrageous the conduct of the processing companies, their actions did not amount to “harm to competition” as defined by the current antitrust framework.

Administration officials who took part in the hearings say two factors thwarted their attempts to protect farmers from exploitation by processing companies. One was a deliberately obstructionist Republican-controlled House set on derailing countless reforms, not only in agriculture, and on protecting big industry from any tightening of regulation.

The other factor the administration blames is the weakened state of America’s antitrust laws. In the past, antitrust law was used to promote competition and to protect citizens from concentrated economic power. But today, enforcers say they are handicapped even when confronting markets that are no longer competitive. “However desirable, today’s antitrust laws do not permit courts or enforcers to engineer an optimal market structure,” the DOJ wrote in its recent report on the 2010 agriculture hearings. Far-reaching actions—like the Wilson administration’s challenge of the meatpacking industry ninety years ago—are, they say, simply unimaginable under today’s narrow antitrust framework.

Varney, who has since left the DOJ for private practice, says that the Justice Department pushed the law as much as it could under her tenure. “If you overreach in the courts you will lose, and the very behaviors you are calling illegal will be validated by the court,” she said. “This is not about a fear of taking risks or a fear of losing. It’s a fear of setting the producers back.”

One wonders, though, whether the administration’s actions—taken as a whole—did not set the farmers back as much as would a loss in court. By documenting the big processing companies’ exploitation of independent farmers, then failing to stop that exploitation and retreating in almost complete silence before entirely predictable resistance from the industry, the administration, for all intents and purposes, ended up implicitly condoning these injustices. The message to the processing companies is, after all, absolutely clear: you are free to continue to act as you will.

It is no stretch to assume that, from the perspective of the White House, the choice to abandon an apparently failed effort to protect independent farmers from such abuses may have seemed politically pragmatic. But over the longer term, it may prove to have been a strategic political failure. By raising the hopes and championing the interests of independent farmers against agribusiness, the administration effectively reached out to the millions of rural voters who don’t normally vote Democratic but whose ardent desire to reestablish open and fair markets for their products and labor often trumps any traditional party allegiance. Instead of translating that newfound trust into political capital, the administration squandered whatever goodwill it had begun to earn. Worse, the administration’s silent retreat amounts to a form of moral failure. Having amply documented the outrageous abuse of fellow citizens, it decided it was not worth expending more political capital to right this wrong.

The message to the farmers, it seems, is also clear. “A lot of farmers have gone pretty quiet around here,” Staples said, “from being scared.”

Drone On https://washingtonmonthly.com/2012/11/09/drone-on/ Fri, 09 Nov 2012 15:51:30 +0000 https://washingtonmonthly.com/?p=20818 It’s probably a matter of when, not if, al-Qaeda in Yemen successfully strikes the U.S. Yet the drone attacks currently keeping the organization at bay are also helping recruit more terrorists. Can you say “no-win situation”?

Early last year, wandering through the turbulent carnival of Change Square in Sana’a, Yemen, I found myself sharing a tent with an old jihadi, his tangled beard glowing orange in the filtered afternoon light. He said he’d fought in Afghanistan against the Soviets—“the infidels,” he called them, still spitting the word after twenty-five years—and would do it again, no question. But when I raised the topic of al-Qaeda in the Arabian Peninsula, which is based in Yemen and is the most dangerous of the diffuse terrorist network’s regional organizations, the old jihadi glowered. “Those young men are fighting a different war than we were,” he said, refusing to meet my eye. “It’s on a different scale, for different ends.”


The Last Refuge:
Yemen, al-Qaeda, and
America’s War in Arabia

by Gregory D. Johnsen
W.W. Norton & Company, 352 pp.

Then, for quite a while, my notes are sparse. The old jihadi and I talked about U.S.-backed drone strikes, and U.S. support for Israel and “the hypocrisy of the West,” until, eventually, we came back around to al-Qaeda. This time, he looked right at me. His generation had fought for Islam so they could “come home and live,” he said. “The young men of al-Qaeda today don’t care about living. For them, fighting is life,” he said. “Go and tell the Americans it’s never going to be over.”

That old jihadi’s chilling prediction emerges as one of the major themes in writer Gregory D. Johnsen’s excellent new book, The Last Refuge: Yemen, al-Qaeda, and America’s War in Arabia. Part modern history, part explanatory narrative, it begins in the chaotic aftermath of the Soviet defeat in Afghanistan in the late 1980s and ends in a smoldering al-Qaeda stronghold in southern Yemen earlier this year. In the intervening quarter century, we watch from the sidelines as Johnsen describes the birth and bloody unification of North and South Yemen in the early ’90s and the simultaneous emergence of al-Qaeda in the region, first as a controversial boys’ club for wannabe jihadis, and then as a deadly and increasingly well-oiled global force.

The young men who’ve formed al-Qaeda in the Arabian Peninsula (AQAP) in the last fifteen years are indeed, as the old jihadi in Change Square suggested, more fanatical, more uncompromising in their vision of jihad, and broader in the scope of who constitutes their enemies, than ever before. Many of these young men were educated in Yemen’s radical religious schools in the ’70s, ’80s, and ’90s, and had “grown up on stories of the jihad in Afghanistan,” Johnsen writes, “watching grainy videos from the 1980s as they listened to preachers extol the glory of fighting abroad.” By 2006, the generational shift that started at the end of the war against the Soviets in Afghanistan had widened into a schism, with today’s al-Qaeda leaders giving the old guard an ultimatum: either you’re with us in global jihad, or you’re an enemy, too. “It was time for them to pick a side,” Johnsen writes, summarizing a 2006 audiotape by Qasim al-Raymi, AQAP’s military commander.

In weaving together the emergence of modern al-Qaeda, the growth of former President Ali Abdullah Saleh’s power, and the sporadic, but persistent, role that the U.S. military, diplomats, and policymakers have played in both, Johnsen moves deftly between decades, continents, and languages. Major events in U.S.-Yemeni relations—like the bombing of the USS Cole in Aden in 2000, which left seventeen dead, the botched so-called “underwear bomber” attack on an airplane over Detroit on Christmas Day 2009, and the cartridge bombs sent via FedEx and UPS that were intercepted on their way to the U.S. in 2010—act as landmarks upon which the larger narrative hangs. We are treated to vivid, behind-the-scenes accounts of Saleh’s blustery frustration with the U.S.’s seemingly capricious disbursal of aid (leaving Washington after a diplomatic trip in 2005, Saleh “finally lost it, screaming at aides and firing his entire team of economic advisors within minutes of takeoff”); of AQAP’s fitful attempts to strike U.S. targets in its early years (in one botched attack in 2002, a young al-Qaeda operative accidentally, and quite literally, shot himself in the foot); and of the windfalls and bumbling missteps of the U.S.’s ongoing intelligence operations in Yemen’s tribal hinterlands (in May 2010, an American drone accidentally killed the deputy governor of Marib, who shared a last name with an al-Qaeda fighter. “How could this have happened?” an incredulous President Obama exclaimed).

Part of the success of this book lies in the extraordinary detail of the narrative. Johnsen, a PhD candidate at Princeton University and one of the most widely read bloggers and analysts on the subject of Yemen, relies for his research primarily on jihadist forums, al-Qaeda videos, audiotapes, and publications, and Western and Arab journalists’ published interviews and accounts of major events. The result is that while many of Johnsen’s anecdotes are not new or groundbreaking, they do offer contextualization, a glimpse of the larger picture—an invaluable quality, particularly in the story of Yemen, which is almost always parceled out to readers in bite-sized breaking news stories. For those who follow Yemen, the book delivers the same deep satisfaction of seeing a finished 1,000-piece puzzle intact on a table. You may have touched each of those pieces before, but you didn’t see the whole picture until now. You’ll want to open your palms to it, drag your fingers across its seamless grooves.

For example, when the teenage suicide bomber Abdu Muhammad al-Ruhayqah blew himself up in Marib in 2007, killing eight Spanish tourists and two Yemeni drivers, the story neither begins nor ends there. Pages before, Johnsen has already introduced us to Ruhayqah, as he is “napping in a grove of fruit trees” before the attack. “Lying on top of a thin blanket with his hair curling around his ears, Ruhayqah looked like a child,” Johnsen writes. Later, we see the explosion captured by an al-Qaeda cameraman, who “watches the smoke tumble upward like a raised fist before dissipating and eventually dispersing,” and we see the investigators spending days “painstakingly collecting body parts.” Later still, we witness the ensuing diplomatic scramble when Saleh, fearful of a backlash from the international community, marshaled his forces and surrounded an al-Qaeda safe house, leaving a “bloody mess of clothes and limbs inside the mud hut”—and in doing so launched a different diplomatic maelstrom, just on the domestic front.

Perhaps the most poignant of the many tragedies that arise in Johnsen’s retelling of the last twenty-five years in Yemen is how, nearly eight years ago, the U.S. had almost routed al-Qaeda in Yemen. With half its members killed and the other half in prison or marooned in isolated outposts around the country, al-Qaeda in Yemen was in its death throes. But, mired in both Iraq, which was worse than ever, and Afghanistan, which wasn’t improving, the Bush administration pulled its attention away from Yemen. Like a patient who doesn’t finish the prescribed antibiotics, the U.S. allowed those surviving al-Qaeda militants to return, and grow into a stronger, harder-to-kill version of what they’d been before. In a devastating chapter, “Resurrecting al-Qaeda,” Johnsen recounts the rebirth of al-Qaeda in 2006 and 2007 under the leadership of Yemen-born Nasir al-Wihayshi, “a tiny, frail-looking twenty-two-year-old with a sharp nose and sunken cheeks,” who is still the head of AQAP today.

In 2009, when the Obama administration turned its attention to Yemen, it drew upon the same cocktail of targeted drone strikes and cruise missile attacks that had helped the Bush administration beat back al-Qaeda in Yemen in the early part of last decade. But in many ways, it was too late. Under Wihayshi, al-Qaeda in Yemen has been rebuilt into a diffuse group of cells that communicate with a central leadership but operate independently on the ground. The organization is starfish-like: chopping off one arm—or killing a handful of leaders in a drone strike—no longer kills the center. “The surgical approach Obama and [chief counterterrorism adviser John O.] Brennan favored no longer seemed to be working. The U.S. kept killing al-Qaeda operatives in Yemen, but AQAP continued to grow,” Johnsen writes.

The broader discussion of targeted drone strikes and cruise missile attacks—alternatively referred to as signature strikes, terrorist-attack-disruption strikes, or TADS—is in some ways the best part of this book. As with all other major events in Yemen’s recent history, Johnsen offers a comprehensive description of U.S. signature strikes, some of which were extraordinarily effective in killing al-Qaeda leaders, some of which killed scores of innocent civilians, some of which killed U.S. citizens, and nearly all of which led to a sandstorm of unintended consequences. For example, in November 2002, a U.S. intelligence team tracked a cell phone belonging to Abu Ali al-Harithi, the so-called “godfather” of al-Qaeda in Yemen, and, within four hours, targeted and killed him and five of his companions in a car. At the time, Saleh was allowing the U.S. to pursue drone attacks within Yemen’s borders, so long as they were kept secret, but on November 3—two days before the 2002 U.S. midterm elections—the Bush administration broke its promise. “The Hellfire strike was a very successful tactical operation,” Deputy Secretary of Defense Paul Wolfowitz told CNN that night, Johnsen writes. As a result, what could have been an unequivocal victory in the U.S. war against al-Qaeda instead sent U.S.-Yemeni relations into a tailspin, torpedoed Saleh’s credibility on the ground, and handed AQAP readymade fodder for recruiting tapes for years to come.

As has been well documented in the news lately, one unintended consequence of drone attacks in Yemen and elsewhere is that they tend to galvanize popular opinion against the U.S. and drive new recruits straight into al-Qaeda's arms. In his detailed account of strike after strike, Johnsen makes clear that the calculus is more complicated than killing bad guys. When al-Qaeda fighters are killed by U.S. strikes, they are often quickly replaced from AQAP's growing ranks; when strikes succeed in driving al-Qaeda from certain towns or regions, fighters simply resettle elsewhere. In early September, when a U.S. drone missed its mark and killed thirteen civilians, including three women, a local activist quoted by CNN put it succinctly: "I would not be surprised if a hundred tribesmen joined the lines of al-Qaeda as a result of the latest drone mistake." In 2010, the Obama administration estimated that al-Qaeda in Yemen had "several hundred" members; as of this year, the State Department puts that number at "a few thousand."

In the last pages of the book, Johnsen describes the aftermath of one U.S. air strike earlier this year, which had been aided by three spies working with the U.S. and Saudi Arabia. When al-Qaeda fighters discovered the spies in their ranks, all three were sentenced to death. At one of the public executions—a crucifixion—al-Qaeda leadership had asked that a young child named Salim, the son of one of the men killed in the U.S. air strike, witness the ceremony. “Dressed in a light blue robe with childish curls in his hair and most of his baby teeth still in place, Salim looked to be about six years old,” Johnsen writes, and then describes the grisly scene in which a man is nailed to a cross and lashed to a street post. “As the crowd surged forward for a better view, one of the men picked Salim up and put him on his shoulders. ‘That’s the traitor who killed my father,’ the boy said, pointing at the crucified man.”

Johnsen, by and large, does not offer guidance on the efficacy or morality of drone strikes or cruise missile attacks, or on U.S. policy in Yemen over the last two decades. In general, he is a stater of facts, not a purveyor of opinions, and his book, by extension, offers the same. The Last Refuge is a cogent account of what the U.S. has done in the past twenty-five years—a bird's-eye view of those successes and failures, in all their shades of horrid gray—but it does not dispense advice to U.S. policymakers or predict the future. You can't blame Johnsen for shying away from the crystal ball, but the resulting lack of a clear policy solution—indeed, of any workable policy solution besides the status quo—is the most frustrating part of the book. By the end, we want nothing more than to be led by the hand down a prescriptive path to victory and peace, but, as Johnsen makes clear, there is no such path. In Yemen, there are no silver bullets.

If the U.S. stops its targeted drone and missile strikes, AQAP will begin to grow and metastasize as it did before. But continuing down this path of militarization doesn't seem to be working either. At best, the U.S.'s signature strikes are a stopgap measure, temporarily disrupting AQAP's activities while failing to address the root of the problem. According to a 2008 report by the RAND Corporation, more than 80 percent of the 268 terrorist groups that ended between 1968 and 2006 did so after police or intelligence agencies infiltrated them or after they reached a political solution with the state; only 7 percent were eliminated by military force. In Yemen, where local police and intelligence networks are unreliable and underfunded, and where local officers are sometimes in bed with al-Qaeda, counterterrorism options are severely limited. In the coming months and years, the U.S. will no doubt continue to pursue regular military strikes and increase its intelligence efforts on the ground. It should also continue to back Saudi and Arab-led counterterrorism efforts and ratchet up development projects in Yemen's extraordinarily impoverished villages in an attempt to win the battle for Yemeni hearts and minds—or at least to have a dog in that fight.

As it stands, AQAP is stronger and more sophisticated than ever before, having come close to attacking the U.S. on its own soil three separate times. In the chaos following the Arab Spring last year, al-Qaeda in Yemen was able to overrun Zinjibar, a town in southern Yemen, and pillage its military laboratories. "It later used those and other materials to 'transform the modest lab' which had produced the 2009 underwear bomb and the 2010 cartridge bombs into a 'modern' one," Johnsen writes. "By early 2012, al-Qaeda had plenty of bombs; what it lacked was individuals with passports that would allow them to travel freely in the West." In another attempted attack on the U.S., in April, al-Qaeda handed a bomb to a British undercover agent who had been posing as a young suicide bomber, and instructed him to blow himself up on a plane bound for the U.S. The bomb, with two triggering mechanisms and no metal parts, was more sophisticated than anything AQAP had used before. That particular attack was thwarted, but the upshot is grim: it very well could be a matter of when, not if, AQAP successfully strikes the U.S. or one of its allies. What might happen next is anybody's guess.

With a new generation of young men, boys like Salim in his light blue robe, being radicalized in al-Qaeda's shadow, and with U.S. policy failing to fatally cripple al-Qaeda's diffuse network, we are left at the end of The Last Refuge with a clear picture of a daunting, messy future, one that echoes that old jihadi's prediction in Change Square last year. This new generation is indeed fighting "on a different scale, for different ends," and while it may not last forever, it's clear the U.S.'s war in Arabia isn't going to end anytime soon.

Act of Recovery https://washingtonmonthly.com/2012/11/09/act-of-recovery/ Fri, 09 Nov 2012 15:45:13 +0000

Only one national reporter, Michael Grunwald, bothered to take a detailed look at how well the $787 billion stimulus was spent. What he discovered confounds the Beltway conventional wisdom.

Twenty-eight days after taking the oath of office, Barack Obama signed the American Recovery and Reinvestment Act, otherwise known as the stimulus, a $787 billion measure to combat the economic cataclysm then engulfing the American economy. Soon, the mainstream narrative coalesced into two opposing camps. Conservatives, denying both the economic consensus and their own previous positions (Mitt Romney proposed the largest stimulus of the 2008 campaign), settled on a message that the Recovery Act was a colossal waste of money that would create no jobs and do no good. Liberals, led by economist-pundits like Paul Krugman, insisted that the stimulus was far too small and that the Obama administration had committed high-order negligence in not securing a bigger one.


The New New Deal:
The Hidden Story of
Change in the Obama Era

by Michael Grunwald
Simon and Schuster, 528 pp.

A number of books by journalists have subsequently been based on the latter point, including Escape Artists: How Obama’s Team Fumbled the Recovery, by Noam Scheiber, and Confidence Men: Wall Street, Washington, and the Education of a President, by Ron Suskind, both written in the principals-in-the-room style used so successfully by Bob Woodward. They start from a similar premise—the economy remains weak, and therefore the stimulus failed in its stated goal to restore robust employment—and attempt to explain what went wrong in the negotiation of the stimulus. But—incredibly—very few journalists have taken a deep, sustained look at the stimulus itself to see what is in it and how it’s been working.

That is part of what makes Michael Grunwald’s The New New Deal: The Hidden Story of Change in the Obama Era such an achievement. An award-winning senior national correspondent for Time magazine, Grunwald is the only reporter in America who has invested the time and shoe leather necessary to really understand the stimulus as a program—not just how the legislation came together, but how the money was allocated and overseen and whether the projects and programs it has funded were worthwhile. In doing so he debunks much of the received Beltway wisdom about the program. It is the rare book that finds an enormous untold story hiding in plain sight, like a coworker discovering there has been a rhinoceros standing in your building’s lobby for the past four years.

After all, the stimulus was not some piddling appropriation for the Department of Commerce. It was a bill of stupendous size—equal to 4 percent of annual GDP, and 50 percent bigger than the entire New Deal in constant dollars—designed to confront head-on the largest economic crash since the Depression. With the possible exception of Obamacare, it is the president's signature achievement (though, given its unpopularity, not one he talks about).

The drama of its passage gripped the country for weeks. It offered the largest tax cuts in U.S. history, heavily weighted toward the poor and middle class. It provided badly needed extensions of unemployment insurance, food stamps, and aid to states, which were set to cut their budgets at the worst possible time. It created a $90 billion clean energy fund, which bought, for starters, the world’s largest wind farm, largest photovoltaic array, and largest solar thermal power plant, and jump-started the advanced battery manufacturing industry. It contained the $4.5 billion “Race to the Top” education reform program that has generated at least as much change in state and local education policies as George W. Bush’s No Child Left Behind law. It put $20 billion toward digitizing our medical records, a precondition for bringing down health care costs, the number one driver of long-term federal budget deficits. And for the most part it has worked—according to the Congressional Budget Office, the Recovery Act saved up to 3.3 million jobs. Had it not passed, the economy could well have fallen into full-blown depression.

Yet the mainstream narrative quickly lost sight of these gigantic facts. Grunwald is frustrated by this, and much of the book is taken up trying to explain how in heaven’s name public understanding of the stimulus became so badly garbled. At least part of the blame goes to the Obama team, which lost the propaganda war. While the Republicans were busily engaged in a relentless, nihilistic, scorched-earth opposition, portraying the stimulus as a vast waste of money and a socialist plot, endlessly repeating lies or debunked distortions, the Obama team simply didn’t fight back. (In fairness, Grunwald notes, the administration did have other things on its plate—two ground wars, a collapsing auto industry and banking system, and efforts to pass health care and financial reform, to name a few. And it was always going to be tough to convince voters experiencing disastrous economic times that without the stimulus things would have been worse.)

The national media comes off badly in Grunwald's telling, and rightly so. Practically all of the reporters who bothered to look into the Recovery Act have spent most of their time chasing down phantom boondoggles (and Grunwald conclusively demonstrates that there has been vanishingly little fraud or abuse in the implementation of the act). The press poured its effort into covering Solyndra, one of a handful of stimulus projects that haven't panned out (though the charge that political favoritism was involved is bunk, as Grunwald convincingly shows). But it has given almost no attention to the thousands of other stimulus expenditures, thereby missing some amazing success stories.

For example, the administration dusted off a tiny Bush-era experiment in preventing homelessness and ramped it up sixtyfold using stimulus dollars, ultimately helping 1.2 million Americans on the verge of being evicted from their homes. If even half of them had ended up on the street, the nation's homeless rate would have doubled. Instead, in the midst of the Great Recession, the rate actually declined slightly. Or consider that the stimulus financed the largest dam removal in U.S. history, to restore salmon runs on the Elwha River, as well as three huge infrastructure projects in Manhattan alone. There have been plenty of stories about the administration's ill-fated efforts to fund high-speed-rail projects in Florida and Wisconsin, where Republican governors turned down the money. But there's been almost no national coverage of stimulus-funded projects now under way in Illinois and Missouri that will shave an hour off Amtrak travel times between Chicago and St. Louis and between St. Louis and Kansas City—not quite high-speed-rail-level improvements, but possibly enough to make passenger rail a viable alternative to flying on those routes. And there are other stimulus investments that could have more dramatic long-term impacts. For instance, the Department of Energy's research agency ARPA-E has funded some truly bold green energy research, including a company that developed the most energy-dense lithium battery in the world, an innovation that could trim $5,000 from the price of the next generation of electric cars.

One would think these kinds of stories, available to anyone with a phone and a computer, would be catnip to the press. But, for whatever reason, they haven't been; absent this book, there has been no serious journalistic accounting of a breathtakingly huge and groundbreaking bill. (Personally, I find it unsurprising that Grunwald chooses to live in Miami Beach—well outside the reach of Beltway groupthink.)

The book also contains a well-crafted Woodwardian story of how the stimulus was negotiated and passed. Grunwald puts paid to the idea that the Obama team could have somehow wrung more stimulus money out of Congress. With nearly all GOP senators implacably opposed, a handful of persuadable moderates—Arlen Specter, Olympia Snowe, Susan Collins, and Ben Nelson—had all the leverage. They used it to set a completely arbitrary, non-negotiable $800 billion upper limit on the stimulus, a number far below the $1.2 trillion recommended by Christina Romer, chair of the president's Council of Economic Advisers. The only plausible way of obtaining more funds would have been a quick evisceration of the filibuster right at the start of the 2009 Congress, something Majority Leader Harry Reid ruled out even in 2010. The original stimulus should have been bigger, and the administration miscalculated in thinking that it could persuade Congress to pass a second big one later if needed (Larry Summers, in an on-the-record interview with Grunwald, even credits Paul Krugman for correctly predicting this). Still, Obama did manage to squeeze out a few mini stimuli in the form of extensions of the payroll tax cut and unemployment benefits, which put total stimulus spending closer to Romer's original recommendation.

Grunwald does sometimes get a bit overexcited about the scale of the Recovery Act's transformation of the economy; the medium- to long-term effects of the act have obviously not yet been firmly established. But slightly overselling the stimulus's longer-term impact doesn't negate the importance of what Grunwald has delivered. The depth of his reporting and his grasp of the sheer magnitude of the Recovery Act's accomplishments are both impressive. One of the most telling moments in the book comes when several Obama aides, veterans of the Clinton administration, pause their frantic work on the act to recall the knock-down, drag-out fight the Clinton administration had in 1993 over a relatively piddling $19 billion stimulus—a fight the administration lost. They note how strange it feels to remember the bitter arguments over line items of a few million dollars—the kinds of things that were inserted and deleted by the hundreds in the process of negotiating the Recovery Act. By American governance standards, the stimulus was colossal on a scale very difficult to grasp. We should be grateful that Michael Grunwald, at least, has gotten out his tape measure.



The Conservative War on Prisons https://washingtonmonthly.com/2012/11/09/the-conservative-war-on-prisons/ Fri, 09 Nov 2012 13:52:02 +0000

Right-wing operatives have decided that prisons are a lot like schools: hugely expensive, inefficient, and in need of root-and-branch reform. Is this how progress will happen in a hyper-polarized world?


American streets are much safer today than they were thirty years ago, and until recently most conservatives had a simple explanation: more prison beds equal less crime. This argument was a fulcrum of Republican politics for decades, boosting candidates from Richard Nixon to George H. W. Bush and scores more in the states. Once elected, these Republicans (and their Democratic imitators) built prisons on a scale that now exceeds that of such formidable police states as Russia and Iran, with roughly 3 percent of American adults behind bars or on parole or probation.

Now that crime and the fear of victimization are down, we might expect Republicans to take a victory lap, casting safer streets as a vindication of their hard line. Instead, more and more conservatives are clambering down from the prison ramparts. Take Newt Gingrich, who made more incarceration a plank of his 1994 Contract with America. Seventeen years later, he had changed his tune. "There is an urgent need to address the astronomical growth in the prison population, with its huge costs in dollars and lost human potential," Gingrich wrote in 2011. "The criminal-justice system is broken, and conservatives must lead the way in fixing it."

None of Gingrich's rivals in the vicious Republican presidential primary exploited these statements. If anything, his position is approaching party orthodoxy. The 2012 Republican platform declares, "Prisons should do more than punish; they should attempt to rehabilitate and institute proven prisoner reentry systems to reduce recidivism and future victimization." What's more, a rogues' gallery of conservative crime warriors has joined Gingrich's call for Americans to rethink their incarceration reflex. They include Ed Meese, Asa Hutchinson, William Bennett—even the now-infamous American Legislative Exchange Council. Most importantly, more than a dozen states have launched serious criminal justice reform efforts in recent years, with conservatives often in the lead.

Skeptics might conclude that conservatives are only rethinking criminal justice because lockups have become too expensive. But whether prison costs too much depends on what you think of incarceration's benefits. Change is coming to criminal justice because an alliance of evangelicals and libertarians has put those benefits on trial. Having discovered that the nation's prison growth is morally objectionable by their own conservative standards, they are beginning to attack it—and may succeed where liberals, working the issue on their own, have so far failed.

This will do more than simply put the nation on a path to a more rational and humane correctional system. It will also provide an example of how bipartisan policy breakthroughs are still possible in our polarized age. The expert-driven, center-out model of policy change that think-tank moderates and foundation check-writers hold dear is on the brink of extinction. If it is to be replaced by anything, it will be through efforts to persuade strong partisans to rethink the meaning of their ideological commitments, and thus to become open to information they would otherwise ignore. Bipartisan agreement will result from the intersection of separate ideological tracks—not an appeal to cross them. This approach will not work for all issues. But in an environment in which the center has almost completely evaporated, and in which voters seem unwilling to grant either party a decisive political majority, it may be the only way in which our policy gridlock can be broken.

Republicans’ rhetorical campaign against lawlessness took off in earnest during the 1960s, when Richard Nixon artfully conflated black rioting, student protest, and common crime to warn that the “criminal forces” were gaining the upper hand in America. As an electoral strategy, it was a brilliant success. But as an ideological claim, the argument that America needed more police and prisons was in deep tension with the conservative cause of rolling back state power. The paradox flared up occasionally, as during the National Rifle Association’s long-running feud with the Bureau of Alcohol, Tobacco and Firearms during the 1990s. But for the most part, conservatives lived with the contradiction for forty years. Why?

For one, it worked political magic by tapping into a key liberal weakness. Urban violent crime was rising sharply during the 1960s and liberals had no persuasive response beyond vague promises that economic uplift and social programs would curb delinquency. The conservatives’ strategy also provided an outlet for racial anxieties that could not be voiced explicitly in the wake of the civil rights movement. Sometimes, the racial appeals were impossible to miss, as when Ronald Reagan warned that “city streets are jungle paths after dark” in his 1966 California gubernatorial campaign. More often, anti-criminal chest-thumping played into the division of society between the earners and the moochers, with subtle racial cues making clear who belonged on which side.

Meanwhile, the more threatened ordinary Americans came to feel, the angrier they became at elites who appeared to side with the criminals, and the more they revered the people designated as society’s protectors. As a result, conservatives came to view law enforcement the same way they had long seen the military: as a distinctive institution whose mission somehow exempted it from the bureaucratic failures and overreach that beset school districts, environmental agencies, and the welfare office. Yet the two surging wings of the conservative movement—libertarians and religious conservatives—have since each found their own reasons to challenge long-standing orthodoxy about crime.

Antitax activist Grover Norquist appeared last year at a Washington confab on criminal justice billed as the “Last Sacred Cow” briefing. For years, Norquist said, conservatives were too busy rolling back government extravagances to worry about the workings of essential operations like crime control. But conservatives can no longer afford to direct their critique of government only at their traditional targets, he told his audience. “Spending more on education doesn’t necessarily get you more education. We know that—that’s obvious. Well, that’s also true about national defense. That’s also true about criminal justice and fighting crime.”

Once you believe that prisons are like any other government agency, it is natural to suspect that wardens and prison guards, like other suppliers of government services, might submit to the temptations of monopoly, inflating costs and providing shoddy service. And, of course, conservatives have long made such arguments to justify their pet project of bidding out incarceration to for-profit businesses. But the prisons-as-government critique has acquired a new force that makes the privatization debate almost irrelevant. Far from shilling for corporate jailers, conservatives now want to shrink the market. "We certainly don't need to be building new prisons, whether they're public or private," said Marc Levin, an analyst at the conservative Texas Public Policy Foundation. The American Legislative Exchange Council, long a proponent of privatizing prisons, no longer has an official position on that issue (nor does it have any prison corporations left as members). Instead, it is pushing bills that would reduce prison populations. For fiscal hawks, the point now is not to incarcerate more efficiently or profitably, but to incarcerate less. They are making that leap with a boost from two other camps: evangelicals and experts.

Over the last two decades, religious conservatives have increasingly come to see prisoners as people worthy of compassion and capable of redemption. “These people have committed crimes, but they’re still human beings, created in the image of God. Can we help them restore what’s left of their lives?” asks Tony Perkins, president of the Washington, D.C.-based Family Research Council. Perkins has doubted the efficacy of incarceration since serving as a guard in a Louisiana lockup as a young man. Though that experience also made him skeptical of jailhouse conversions, Perkins said, religious outreach behind bars has the benefit of making prisoners seem like real people— much as the pro-life movement has done with unborn children. “As more and more churches are involved in prison ministries, they begin the process of rehumanizing the criminal.”

Meanwhile, the tide of professional opinion is turning away from what had been a depressing consensus that warehousing prisoners was the best society could do. For many years, the hope that “rehabilitation” could change people’s behavior was dismissed as a liberal fantasy. The role of prisons was much simpler: to incapacitate reprobates and deter opportunists. The dean of this school of thought, former Harvard and University of California, Los Angeles, professor James Q. Wilson (who died this year), put it like this: “Many people, neither wicked nor innocent, but watchful, dissembling, and calculating of their chances, ponder our reaction to wickedness as a clue to what they might profitably do.” Social service approaches to criminal “wickedness” not only did not work, but they symbolized a society unwilling to stand up against violations of the law. Increase incarceration, conservatives argued, and potential criminals will get the message.

But in recent years, experts in criminal justice have become more optimistic about alternatives to prison. A promising example is Hawaii's Opportunity Probation with Enforcement (the HOPE program, now hopscotching to other states; see Mark A. R. Kleiman, "Jail Break," Washington Monthly, July/August 2009). HOPE has been shown to significantly cut drug offending by hitting users who are on parole or probation with swift, certain, and moderate sanctions, such as a few days of jail time, rather than arbitrary and draconian parole revocations. New technologies, from rapid-result drug tests to GPS monitoring, have also bred optimism, and professionals are even beginning to feel better about their ability to predict an offender's risk of recidivism. Because these approaches emphasize control more than therapy, they don't seem squishy or soft on crime, even as they make it easier to let criminals out of prison.

The world has also changed in ways that favor fresh thinking. In the 1990s, Democrats diluted the Republican electoral advantage on crime by pushing their own set of tough measures. Then-Arkansas Governor Bill Clinton oversaw the execution of a brain-damaged convict during his 1992 presidential campaign, and once elected president he pushed through a cast-iron crime bill that combined longer sentences, restrictions on gun purchases, and more cops on the street. While the subsequent drop in crime gave the GOP fodder to argue that punitive policies work, it has also drawn the venom out of the issue. And since the 1990s, terrorism has displaced crime as the nation's top security preoccupation and the honeypot for law-and-order zealots. Considered together, these changes have given conservatives more space to rethink their positions on crime. And so, with jailers newly suspect, inmates ripe for redemption, and alternative discipline ascendant, conservatives have decided prisons are a lot like schools: hugely expensive, inefficient, and in need of root-and-branch reform.

Such second thoughts are creating the first significant opening in years for a criminal justice overhaul. Neither Republicans nor Democrats can reform the system alone given the continuing fear of being tarred with the “soft on crime” label, said Gene Guerrero, a policy analyst at the Washington office of George Soros’s Open Society Foundations. It can only happen, he said, “if there is real leadership from both sides and if the reforms are developed and move forward on a bipartisan basis.”

Still, it’s conservatives who bring the most muscle to the job. A handful of liberal organizations have valiantly kept alive the argument for reform even through the dark days of the 1980s and ’90s—places like the American Civil Liberties Union, the Open Society Foundations, and the Public Welfare Foundation. By and large, however, it is conservative institutions that now pay the most attention to criminal justice, Guerrero said. In rare cases, Democratic politicians have proved willing to take up the cause, as when Michigan Governor Jennifer Granholm directed an overhaul of that state’s parole system during her first term—though her second-term push for broader reform legislation fizzled (see Luke Mogelson, “Prison Break,” Washington Monthly, November/December 2010). But most Democrats are still terrified of appearing timid before voters and are therefore loath to lead the way. At best, they can be persuaded to go along if the right gives them cover.

The right’s belated awakening to America’s incarceration crisis may seem little more than an obvious extension of libertarian and socially conservative philosophies. But logic rarely determines how movements put together their various ideological commitments. Making and changing positions is tough, entrepreneurial political work, especially when long-held, electorally successful ideas are being called into question.

Few people have done as much to subvert the conservative orthodoxy on crime as Pat Nolan, a former California state legislator who now works at the jailhouse ministry Prison Fellowship. Called “the most important person to make any of this happen” by Julie Stewart of Families Against Mandatory Minimums, Nolan has been so effective as a revisionist precisely because he was weaned on the traditional politics of law and order.

Nolan grew up in LA’s Crenshaw Boulevard neighborhood during the 1950s. “Everyone in my family and all of our neighbors had been victims of crime,” says Nolan. “I came from a family that was pretty pro-police, feeling as [though] they were kind of beleaguered.” When his family moved to nearby Burbank, Nolan signed up for the Police Explorers, a group for kids interested in law enforcement careers. He also joined Young Americans for Freedom, the conservative activist group that rallied behind Barry Goldwater in 1964. As a Republican California state assemblyman in the 1970s, ’80s, and ’90s, Nolan helped push through some of the nation’s most draconian sentencing laws. While he did visit prisons to investigate conditions there, he recalls, “I was very much the ‘We need more prisons’ type.”

That changed after Nolan got to see prison from the other side of the bars. In 1993, Nolan was indicted on seven counts of corruption—including accusations that he took campaign money to help a phony shrimp-processing business the FBI dreamed up as part of a sting. He ultimately accepted a plea deal and was sentenced to thirty-three months in prison for racketeering. Nolan maintained his innocence, but said he would take the plea to avoid the risk of longer separation from his family. Before he left, Nolan recalls, a friend told him, “View this time as your monastic experience”—a chance to follow generations of Christians who have retreated from daily life to work on their faith. Nolan, who is Catholic, resolved to follow that advice.

While Nolan was locked up, a mutual acquaintance put him in touch with Chuck Colson, the biggest name in prison ministry. Colson, a former Nixon aide, had gone to the clink for Watergate-related crimes and experienced what he described as a religious transformation behind bars. After his release in 1975, Colson founded Prison Fellowship, which provides religious services and counseling to inmates and their families. By the time Colson died this past April, he had become a star in the evangelical community, rubbing shoulders with the likes of Billy Graham, Rick Warren, and James Dobson.

Nolan enrolled his kids in a Prison Fellowship program for children of inmates and began corresponding with Colson. Even before Nolan got out, he had an offer to run the group’s policy arm, which had been languishing.

“I’d really been praying about, ‘Okay, Lord, what’s the next chapter in my life?’” Nolan recalls. “I’d seen so much injustice while I was inside that I felt I really wanted to address that. My eyes had been opened.” Nolan is devoting the rest of his life to opening the eyes of his fellow conservatives, getting them to see the tragic cost of putting so many Americans under lock and key.

When Nolan first arrived in Washington, the only real foothold reformers had in the conservative movement was with a small band of libertarians at places like the Cato Institute and Reason magazine, who objected to the prohibitionist overreach of the drug war but were treated as wildly eccentric by mainstream conservatives. To find allies with unquestioned right-wing credentials, Nolan prospected among two groups with whom he had credibility: evangelicals who admired Prison Fellowship, and his old friends from Young Americans for Freedom, some of them longtime crime warriors themselves.

Colson had already persuaded evangelicals that prisoners were appropriate objects of personal compassion, but had yet to find an angle that would convince the faithful that the criminal justice system was fundamentally flawed. Nolan hit upon two perfect issues in short order.

The Supreme Court opened the first window in 1997 by striking down most of a federal law intended to expand the religious freedoms of prisoners. The specter of wardens putting bars between inmates and God energized social conservatives. Prison Fellowship threw itself into the fight, and a revised law was passed in 2000.

Around the same time, Reagan administration veteran Michael Horowitz was casting about for a cause to show that conservatives have a heart. Previously known for his advocacy on issues like human trafficking and peace in Sudan, Horowitz decided to make protecting the victims of prison rape the next step in what he called his “Wilberforce agenda,” after the famous British evangelical abolitionist.

Prison rape was a natural issue to express conservatives’ humanitarian impulses. Evangelicals who think homosexuality is immoral can easily be persuaded that homosexual rape under the eyes of the state is an official abomination. More importantly, Horowitz had put his finger on a nightmare of massive proportions. Human Rights Watch had gathered evidence suggesting an epidemic of torture to which many wardens were turning a blind eye. Last May, the U.S. Justice Department estimated that more than 209,000 prisoners suffered sexual abuse in 2008 alone.

Horowitz proposed a bill designed to have cross-partisan appeal, with provisions for penalizing lagging states and shaming recalcitrant wardens. Evangelicals were sold right away. “Everyone has basic human rights, even if they are being dealt with and sanctioned for inappropriate social behavior, and prison should not take those away,” the Southern Baptist Ethics and Religious Liberty Commission’s Shannon Royce would explain to the Washington Post.

Horowitz focused on negotiations with a skeptical Justice Department and state corrections officials, while Nolan worked the corridors of the Capitol. The Prison Rape Elimination Act passed both houses of Congress unanimously in 2003.

Nolan then used this big win as a springboard to an issue where the moral lines were more blurred: helping released prisoners adjust to life back home and stay out of trouble by pumping money into “reentry” programs. Republican Congressman (and now Senator) Rob Portman agreed to champion legislation that would become known as the Second Chance Act. President George W. Bush endorsed the idea in his 2004 State of the Union Address, after lobbying by Prison Fellowship and Portman’s office, according to Nolan. Hammering out the bill took several more years, but the Second Chance Act was finally passed with solid conservative backing in 2007.

These measures all had bipartisan support, but they were not the product of centrists: the top Senate backers of the Prison Rape Elimination Act were Ted Kennedy and Alabama’s Jeff Sessions, who spent a dozen years as a tough-as-nails U.S. attorney and is ranked the Senate’s twelfth most conservative member by the National Journal. Liberal reformers did bargain with conservatives behind the scenes—the biggest example was an agreement that the Second Chance Act remain silent on funding faith-based reentry programs. But Nolan’s conservative allies were confident that bipartisan reform efforts brokered by Prison Fellowship would remain consistent with conservative principles, thanks to groundwork laid by the previous religious freedom and prison rape efforts.

Even as the Second Chance Act edged forward, Nolan was tapping old friendships to pull together more conservative dissenters. David Keene—then head of the American Conservative Union, now president of the National Rifle Association—was tracking post-9/11 encroachments on civil liberties and turning a wary eye to criminal justice. Richard Viguerie, a direct mail pioneer in the conservative movement, was a longtime death penalty opponent. Nolan began calling them for advice. Soon, antitax activist Norquist was being looped into the conversations, as was Brian Walsh, a Heritage Foundation analyst who studied the rapid expansion of federal criminal law. The group started holding regular meetings to brainstorm ideas. They toyed with proposing a federal criminal law retrenchment commission similar to the base-closure commission of the 1990s, or pushing congressional judiciary committees to demand jurisdiction over any bills that created new crimes.

Despite all of Nolan’s progress, it soon became obvious that the juice on criminal justice reform would not come from Washington. The real potential lay in the states, where a combination of fiscal conservatism and budget pressure was beginning to crack the status quo. The opportunity to turn those tremors into a full-blown earthquake would come from a very unlikely place.

“Don’t Mess with Texas” bumper stickers have long found their most extreme confirmation in the state’s criminal justice system. Over the last two decades, Texas has been one of the most avid jailers in the nation. It was home to the largest prison-conditions lawsuit in American history, a thirty-year ordeal that infuriated conservatives and led them to plaster the state with posters calling for the impeachment of Judge William Wayne Justice. And of course, no prison cooks have taken as many last-meal orders as those in the Lone Star State—until officials recently did away with that perk for the condemned. But even as Texas continues to buff its toughest-on-crime reputation, it is also becoming, unexpectedly, a poster child for criminal justice reform.

It started in 2005, when Tom Craddick, the first Republican speaker of the Texas House in more than a century, appointed Jerry Madden, a conservative from Plano, to run the House Committee on Corrections. As Madden recalls, the speaker’s charge to him was clear: “Don’t build new prisons. They cost too much.”

Madden was a corrections novice with a disarming, aw-shucks manner; his Senate counterpart, Democrat John Whitmire, was an old hand whose resume included being robbed at gunpoint in his garage. The greenhorn and the veteran soon agreed on what ailed the Texas criminal justice system: it was feeding on itself. Too many people flunked probation and went to prison. And too many prisoners committed new offenses shortly after being released, landing them back behind bars. To tackle the first problem, Madden and Whitmire suggested cutting loose veteran probationers who had proved reliable, thus allowing officers to focus their time on people at higher risk of screwing up. The legislature signed off, but Governor Rick Perry vetoed the bill.

At the start of the 2007 legislative session, legislative analysts predicted that Texas was on track to be short 17,700 prison beds by 2012 because of its growing inmate population. The Texas Department of Criminal Justice’s response was to ask legislators to build three new prisons, but Madden and Whitmire had other ideas. Not only did they bring back a revamped version of their probation proposal—they also took aim at the revolving-door problem by cranking up funding for programs such as in-prison addiction treatment and halfway houses. This time, Perry relented (persuaded at least in part, the duo contends, by a high-stakes meeting they held with him shortly before the opening of the legislative session). Since then, the prison population has not increased, and last year, the TDCJ closed a prison for the first time in decades.

Budget shortfalls do not explain this shift. In 2007, Texas was basking in a huge projected surplus, and the Great Recession was still a year away. Instead, Madden and Whitmire had different winds at their backs. For one thing, the policy context favored reform. One legacy of the state’s prison litigation trauma is that Texas has strict restrictions on overcrowding (unlike, say, California). Under Texas law, when the system approaches capacity, corrections staff must seek certification from the attorney general and the governor to incarcerate more prisoners. The approval process forces state leaders to confront the choice between more prisons and more diversion programming. The political environment had also changed since the GOP completed its takeover of state politics in 2003. As a longtime observer of the state’s criminal justice system notes, “Now … all the tough guys are Republicans. They don’t want to be outdoing each other on this stuff.”

Texas was not the first state to experiment with common sense. Several others had begun tinkering with their criminal justice systems in the wake of the 2001 recession. When the fiscal belt tightened on a swelling inmate population in New York, for example, corrections officials prevailed upon then-Governor George Pataki to take steps leading to earlier releases. But none of these initiatives reverberated like the Texas reforms.

The Texas turnaround created a golden opportunity to rebrand prison reform nationally. “People think if Texas does something, by definition it’s not going to be soft,” said Adam Gelb, director of a criminal justice initiative at the Pew Charitable Trusts. “There’s just this instant, deep credibility on the crime issue for Texas.” In 2005, the Texas Public Policy Foundation (TPPF)—the state’s premier conservative think tank—hired Marc Levin to become its first-ever crime wonk. The position was financed by Tim Dunn—a deeply conservative oilman, Republican donor, and Colson-inspired critic of the criminal justice system. Levin promptly threw himself into the Texas debates of 2005 and 2007, but his biggest contribution came later in building momentum for prison reform among conservatives across the country.

The TPPF is one of the most prominent members of the State Policy Network, which connects free-market think tanks in every state. Founded in 1992, the Arlington-based SPN zaps ideas—like Wisconsin-style restrictions on public employee pensions—from one member organization to another. Levin was and remains the only full-time crime analyst at any SPN member organization. As a result, he quickly became the go-to guy on the issue among state-level conservatives, fielding calls from curious colleagues, cowriting editorials and policy briefs, and making presentations at conservative conferences. Eventually, he decided to convert the effort into a formal campaign he called Right on Crime.

When Nolan heard about Right on Crime, he contacted Levin to offer his support—and his Rolodex. Nolan rounded up the members of his informal working group and other conservative luminaries to endorse a revised approach to crime control. Among the signatories: Keene, Viguerie, Gingrich, former Attorney General Ed Meese, and former drug czars Asa Hutchinson and Bill Bennett. Political scientist and long-time prison proponent John DiIulio is there, too, as is Grover Norquist. The Family Research Council’s Tony Perkins and other social conservatives also signed on. Right on Crime backers say explicitly that their goal was to lend their reputations to the effort and give conservatives political cover to launch reforms. “We wanted to create an atmosphere in which, amongst conservatives, there would be total legitimacy,” Nolan said.

Perhaps the surest sign that conservatives were embracing the new model came from the American Legislative Exchange Council—the conservative network of state legislators. In the 1990s, ALEC had peddled mandatory minimums, prison privatization, and the like to its members in statehouses across the country. But in 2007, ALEC hired Nolan’s friend Michael Hough to run its criminal justice task force, and Nolan soon persuaded ALEC to endorse the Second Chance Act. Within a few years, the trio of Hough, Nolan, and Madden had brought ALEC to the point of pushing out model bills based on proposals borrowed from Gelb’s criminal justice project at Pew, which has been dispatching teams of sentencing wonks to state capitals around the country to help reformers develop specific plans. All this work was done through the same ALEC committee whose advocacy for “stand-your-ground” laws prompted a backlash in the wake of the Trayvon Martin killing. ALEC announced in April that it would disband the committee, but, in fact, it ended up giving the panel a new mandate. The committee now focuses exclusively on sentencing reform and has dropped all of its unrelated model bills, from mandatory minimums to prison privatization, Hough said.

With conservatives less willing to defend the lock-’em-up status quo, prison reform now seems to have the momentum of an issue whose time has come. States from Kentucky to Pennsylvania to North Carolina have passed bipartisan criminal justice overhauls, preventing thousands of prison commitments. And the wave continues. In May, Georgia Governor Nathan Deal was on the verge of tears at a signing ceremony for legislation designed to keep nonviolent offenders out of prison. When his Ohio counterpart, John Kasich, signed a similar bill in June, he said it would “result in the saving of many, many lives.”

To be sure, the new conservative critique has so far largely overlooked the most glaring problem in American criminal justice—its profound racial skew. African Americans account for some 40 percent of the U.S. prison population, roughly three times their share of the general population. The liberal legal scholar Michelle Alexander, whose 2010 book The New Jim Crow compares mass incarceration with Jim Crow segregation, argues that the system will only be dismantled with a return to 1960s-style movement politics.

But it is also important not to underestimate how much the emerging conservative reform movement can do. For starters, conservatives did step into the terrain of racial justice when they took the lead in 2010 to reduce the disparity in federal sentences for crack and cocaine offenses. And reframing criminal justice in terms of efficacy and cost has already prevented many thousands of unnecessary prison terms.

Moreover, this line of argument can also open the door to more radical critiques. Just listen to Tim Dunn. The conservative Texas oilman declaims that the “purpose of the criminal justice system should be to secure liberty and promote justice between people rather than to enforce the power of the state over the lives of its citizens.” Or take Mark Meckler, co-founder of the Tea Party Patriots. “We’re destroying a significant portion of our own population, especially in the inner cities,” Meckler has written. Meckler and Dunn have appeared on MSNBC to endorse the work of David Kennedy, a liberal criminologist who has criticized the failure of the drug war in inner-city communities. And Meckler vows on his blog, “I’m all in on the fight for criminal justice reform here in the U.S.”

The story of how conservatives began to change their positions on incarceration holds lessons far from the world of prisons. Advocates of policy change, their funders, and well-meaning pundits regularly bemoan the ideological stiffening that bedevils efforts at bipartisan cooperation. The usual answer to hyper-polarization is to somehow rebuild the center. But the power of party activists (especially on the right) to control primary elections and discipline politicians who step out of line is not going to go away anytime soon. The center, it seems, will not hold—in fact, it barely even exists anymore.

The lesson of the slowly changing politics of crime on the right is that policy breakthroughs in our current environment will happen not through “middle-path” coalitions of moderates, but as a result of changes in what strong, ideologically defined partisan activists and politicians come to believe is their own, authentically conservative or liberal position. Conservatives over the last few years haven’t gone “soft.” They’ve changed their minds about what prisons mean. Prisons increasingly stand for big-government waste, and prison guards look more and more like public school teachers.

This shift in meaning on the right happened mainly because of creative, persuasive, long-term work by conservatives themselves. Only advocates with unquestioned ideological bona fides, embedded in organizations known to be core parts of the conservative infrastructure, could perform this kind of ideological alchemy. As Yale law professor Dan Kahan has argued, studies and randomized trials are useless in persuading the ideologically committed until such people are convinced that new information is not a threat to their identity. Until then, it goes in one ear and out the other. Only rock-ribbed partisans, not squishy moderates, can successfully engage in this sort of “identity vouching” for previously disregarded facts. Of course, there are limits to how far ideological reinvention can go. As political scientist David Karol has argued, it is unlikely to work when it requires crossing a major, organized member of a party coalition. That’s something environmentalists learned when they tried to encourage evangelicals to break ranks on global warming through the idea of “creation care.” They got their heads handed to them by the main conservative evangelical leaders, who saw the split this would create with the energy-producing businesses on which Republicans depend for support.

But that still leaves plenty of issues on which bipartisanship will be possible—as long as it doesn’t feel like compromise for its own sake. Defense spending, for example, is already being slowly transformed by the newly energized libertarian spirit in the Republican Party. On these matters, liberals are in a bind—while they may dearly long for partners on the right, they can’t call them into being, and getting too close to conservative mavericks may tarnish their vital ideological credentials. In this confusing world where those on the extremes can make change that those in the center cannot, liberals will have to learn that they sometimes gain more when they say less.

How We Could Blow the Energy Boom https://washingtonmonthly.com/2012/11/08/how-we-could-blow-the-energy-boom-2/ Thu, 08 Nov 2012 15:13:12 +0000

America’s vast new surplus of natural gas could lead to great prosperity and a cleaner environment. But if we don’t fix our decrepit, blackout-prone electric grid, we could wind up sitting in the dark.


For the first time in four decades, spanning the last eight presidents, America is poised to break free of its energy crisis. The country finds itself suddenly awash in domestic energy, especially new supplies of natural gas extracted from shale rock. The economic windfall is already enormous. According to a recent study by energy analysts, consumers saved more than $100 billion in 2010 alone as a direct result of the natural gas boom. Economists at Bank of America calculate that the boom contributes nearly $1 billion per day to the economy, equal to 2.2 percent of GDP—roughly the same as the economy’s rate of growth in recent years.
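Those two figures square with each other, as a rough back-of-the-envelope check shows (the roughly $16 trillion size of the 2012 U.S. economy is an assumption supplied here, not a number from the studies cited):

$$
\$1~\text{billion/day} \times 365~\text{days} \approx \$365~\text{billion per year},
\qquad
\frac{\$365~\text{billion}}{\$16{,}000~\text{billion}} \approx 2.3\%.
$$

Since the boom contributes "nearly," rather than fully, $1 billion a day, the result lands just about at the 2.2 percent of GDP the Bank of America economists report.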

The environmental windfall is also substantial. Relatively clean-burning natural gas is rapidly replacing coal as our primary source of electricity, leading to reductions in greenhouse gas emissions. Gas-fired power plants are also more easily integrated with renewable energy sources such as wind or solar, giving that industry a boost. The potential benefits of the gas boom also include the promise of a manufacturing revival in the U.S. based on the comparative advantage of lower energy costs, and the opportunity for the country to overcome its chronic trade deficits. This “energy dividend” could be, in other words, the biggest game changer in global politics and economics in a generation.

Yet this bright shiny future is hardly assured. While natural gas deposits could very well yield enough to sustain our energy needs for another century, there are many reasons to fear that we won’t succeed in maintaining adequate supplies of economically available natural gas, or in putting enough of it to optimal use—generating electricity.

More fundamentally, in order to capitalize on today’s energy dividend, we need to meet a second essential precondition: repairing, expanding, and modernizing our overstrained electrical grid. As it stands, the grid—an interconnected network of 360,000 miles of transmission lines and substations linking more than 6,000 power plants to customers nationwide—is an inefficient and increasingly blackout-prone tangle of 1950s technology. In its present dilapidated state, it is not only imposing unacceptable and avoidable environmental costs through its inefficiency but also leaving us vulnerable to an array of threats that could dramatically impair the U.S. economy tomorrow—regardless of how much surplus energy we have in the ground.

The gas boom could bring us nearly limitless potential for building a greener and more prosperous future. Yet without long-term planning and bold political leadership to fight for the right policies, America may wind up awash in cheap energy, while American homes and businesses are stuck in the dark.

Many people will come to this subject concerned about the environmental consequences of “fracking”—that is, hydraulic fracturing—of natural gas, and it’s certainly an important issue. But as this magazine has argued (see Jesse Zwick, “Clean, Cheap, and Out of Control,” May/June 2011), with the right regulations we can reduce the adverse environmental consequences without fundamentally altering the total volume of natural gas produced.

Properly deployed, natural gas has the potential to enable America to overcome its dependence on coal, an extremely dirty fuel, as our primary source of electricity. The result would be profoundly positive for the environment, simply because natural gas is substantially cleaner than coal. When used to generate electricity, natural gas emits about half as much carbon dioxide, produces some 80 percent less nitrogen oxide pollution, and releases negligible amounts of sulfur dioxide, particulate matter, and mercury compared to coal. Electricity generation accounts for 40 percent of all carbon dioxide emissions in the United States, so this transition alone provides us with a clear path over the next several decades to make dramatic reductions in greenhouse gas emissions.

Another, often overlooked, environmental benefit of using natural gas to produce electricity is that it facilitates the development of other energy sources that cause no emissions at all. Unlike coal-fired plants, those fueled by natural gas are easy to "turn up" and "turn down." This means they work well in concert with wind or solar power—when winds are calm or the sun isn't shining, natural gas turbines can keep electricity generation constant. In 2010, Florida Power and Light brought online a new breed of hybrid facility, connecting one of the country's largest solar thermal arrays to an existing natural gas complex.

Utilities nationwide are already accelerating the shift toward natural gas. So far this year, gas-fired generating plants have accounted for roughly 30 percent of national electricity production, up from about 20 percent in 2008. Old coal-fired power plants have been decommissioned this year at a record pace. Much of the electricity supply lost in decommissioning coal plants is being replaced by running existing natural gas plants harder—plants that operate at a much higher level of efficiency than coal plants and that currently run less than half of the time. America is therefore presented with a golden opportunity to accelerate the transition away from coal and achieve major greenhouse gas emission reductions, without needing to build all new power plants.

Yet the more our electricity supply comes to depend on natural gas, the more that fears about the long-term price and supply of gas could begin to retard the rate at which power companies convert from coal. Investors and industry executives remember the first big “dash to gas” by the electricity industry during the 1990s, when they invested billions of dollars in new natural gas-fired power plants. By the end of the decade, surplus natural gas supplies were gone, and many utilities and electricity generators were sent reeling by skyrocketing gas prices.

One reason for those fears is that gas drilling has historically been a boom-and-bust business. When the price of gas is high enough, wildcatters heed the call to "drill, baby, drill" and supply is assured. When the price is low, however, huge reserves of gas effectively disappear, because it costs more to get the gas out of the ground than the market will pay. Today's new drilling technologies lower the cost of producing gas, but they do not ensure that enough gas will always be economically available to serve as a reliable replacement for coal, nor do they change the long-term boom-and-bust nature of natural gas supplies and prices.

Shortsighted business practices or government subsidies could also end up squeezing the amount of gas available for the generation of electricity. Unlike coal, which is used almost exclusively for electricity generation, natural gas has many economic applications. Almost one-third of America's natural gas supply today is used as industrial feedstock in chemical plants and in fertilizer and other industries, and more than a quarter is burned in homes and businesses for heating and cooking. New England is increasingly using natural gas in place of home heating oil, and countries like Canada and Mexico are driving increased demand for exports of American natural gas. Electricity generators, which now account for more than a third of natural gas use, will have to compete in the marketplace with these and other uses to secure long-term supplies.

Our supply of natural gas currently outstrips demand, and it will continue to do so in the immediate future. While the best long-term outcome for the country is to maximize the amount of gas available to natural gas power plants for electricity generation, the natural gas industry is at the moment looking for ways to expand into new markets—perhaps analogous to the aluminum industry's invention of the need for aluminum siding on brick houses in the 1950s. Congress and the administration should be wary of proposals to subsidize potentially large diversions of natural gas in order to stimulate demand. For example, a bipartisan bill Congress considered this year would provide billions of dollars in tax credits to boost deployment of natural gas-powered cars and trucks, and to subsidize a program to build out natural gas fueling infrastructure across the nation. While there is a place for fleet vehicles and certain heavy-duty trucks powered by compressed natural gas, the relative inefficiency of natural gas engines—and the federal government's long history of failed energy boondoggles—argues against making natural gas the government-backed fuel of choice for passenger vehicles.

The important point is this: the opportunity for the electricity industry to lock in maximum amounts of long-term gas supply at a time of low prices is like the opportunity for American consumers to refinance their thirty-year mortgages when interest rates are low. Policymakers should favor this outcome, because it will benefit American electricity consumers with lower prices for decades to come.

Policymakers should also make sure that our existing regulatory apparatus is updated to reflect the growing interdependence of the natural gas and electricity industries. Currently, the Federal Energy Regulatory Commission (FERC) regulates both industries separately. Congress should direct FERC to develop a long-term integrated resource plan for the two industries together. The plan should be updated every five years and include a twenty-year outlook for supply and regulatory issues, to ensure that no natural gas supply shocks disrupt the American bulk electricity system due to a lack of foresight. FERC must be empowered, for instance, to ensure that both the gas and electricity industries take precautions to prevent short-term pipeline service interruptions resulting from severe storms, terrorist attacks, or other events. With an increasing dependence on natural gas for electricity, such pipeline disruptions could mean blackouts and power shortages for an entire region for days on end.

This brings us to the second, and much more menacing, precondition for capturing the full potential benefits of the current natural gas supply boom: we must fix our decrepit, vulnerable, and long-neglected electrical grid. Today, the average substation transformer in the U.S. is forty-two years old—two years older than its expected life span. A recent Department of Energy report warned that 70 percent of the largest high-voltage power transformers—each weighing up to 800,000 pounds—are more than twenty-five years old, and subject to an increased risk of failure. As of now, replacing one of these enormous transformers, should it be attacked, or simply break down, can take twenty months or longer. Even without any major attacks or breakages, most of the equipment on the grid is already so antiquated that roughly 500,000 Americans lose electricity for at least two hours every single day.

And, of course, the disruptions are often far worse than that. Many readers no doubt suffered through the latest major system failure: the blackout this past June, triggered by a freak "derecho" storm, left millions of Americans from Indiana through central Appalachia to the toniest suburbs of D.C. without power during 100-plus-degree weather. In the immediate aftermath of such storms, the press and angry customers see downed wires and (sometimes rightly) blame utilities for failing to respond quickly enough. But the culprit that causes such blackouts to linger for days is often a system-wide problem: the poorly maintained and overstrained local electric grid. In many places, the local grid is in such bad shape that even a minor disruption—a single downed power line, for example—can create a domino effect well beyond the damaged area. In the aftermath of the derecho, several areas of suburban Maryland remained dark for days longer than others, not because they were hit harder, but because the storm damaged antiquated equipment in many substations, including hundreds of transformers, triggering multiple failures down the line.

Similar problems extend into the bulk transmission segment of the grid. For example, in August 2003, a series of line failures in northeastern Ohio set off a cascade of power outages across the United States and Canada. The lack of adequate redundancy in transmission lines meant that those initial failures rippled through the system, knocking a total of 265 power plants offline, darkening an area of more than 9,000 square miles, contributing to almost 100 deaths, costing an estimated $6 billion, and leaving roughly fifty million people in the dark for up to four days. It was the largest blackout in North American history. Although utilities and regulators have since added new “fail-safe” procedures to reduce the domino effect of such outages across wide geographies, the root causes—grid congestion, old transformers, poor interconnections—remain an endemic problem throughout the entire U.S. electrical supply chain, from bulk transmission centers to local distribution lines.

Even beyond basic maintenance, the grid has also become increasingly vulnerable to software viruses and cyber attacks. The U.S. Cyber Command, the military's new cyber warfare organization (whose commander also directs the National Security Agency), found that cyber attacks on the electric grid and other strategic infrastructure increased seventeenfold from 2009 to 2011. Terry Boston, president and chief executive officer of PJM Interconnection, the regional transmission organization serving the mid-Atlantic states and parts of the Midwest, wrote recently that while a calamitous cyber attack on the grid is not inevitable, we should never "trust the security of our energy infrastructure to luck." A Homeland Security official said the department had constructed a scenario in which a successful terrorist attack on just six critical substations could cause blackouts across most of the country east of the Mississippi River. And in September of this year, Congress held hearings to determine the degree to which the entire national electricity grid is vulnerable to electromagnetic pulses, whether from high-altitude thermonuclear devices or from the sun.

Mass power outages don't just disrupt our day-to-day lives. Blackouts are estimated to cost the American economy about $150 billion each year in interrupted production, destroyed or lost products (like computers fried during power surges), and other losses—nearly $500 per person. The economy's increasing dependence on high-quality, uninterrupted electricity is underscored by the fact that fully 40 percent of all electricity used in the U.S. now goes to power computer chips and automated manufacturing—applications ranging from personal computers and "cloud" storage to so-called mission-critical machines used in factories, health care, and air traffic control. Experts predict that by 2015, nearly 60 percent of our electricity will go to such uses.
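
The per-person figure follows from simple division (the population estimate below is my assumption, roughly matching the 2012 U.S. population):

```python
# Sanity check on the per-person cost of blackouts.
# The ~314 million population figure is an assumed 2012-era estimate.
ANNUAL_BLACKOUT_COST = 150e9   # dollars per year, per the estimate above
US_POPULATION = 314e6          # assumed

print(f"${ANNUAL_BLACKOUT_COST / US_POPULATION:.0f} per person")  # ~$478
```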

These days, companies like Amazon and Google are forced to rely on banks of backup diesel generators to keep their servers up and running, turning what would otherwise be a "clean" business into a major source of air pollution. Some years ago, when I was chairman of the Clean Technology Venture Network, I listened to Andy Grove, Intel's longtime chief executive, address a group in San Francisco. He said that for Intel, which has hundreds of millions of dollars invested in high-tech equipment and highly paid engineers, the cost of electricity isn't the issue. It's the dual risk of power outages and poor electrical quality that threatens to devastate the company's bottom line.

For the government, the bottom line is this: we need to repair and maintain the entire infrastructure of the grid, and protect it against new threats that could cause catastrophic failure. Congress should give FERC an explicit mandate to set age and reliability standards for critical components of the grid, and to make sure that there are sufficient inventories of such components. This is necessary to ensure that weak links are not created in the transmission and distribution chain when individual utilities fail to replace badly antiquated equipment or to undertake necessary maintenance. FERC also needs sufficient power to deal with a national emergency that might befall the entire bulk electricity system. Congress should clarify that authority and direct FERC to develop national emergency plans that require utilities, power companies, and others to prepare for cyber and physical attacks. And, with the right regulatory framework, the investments necessary to meet all these challenges will be made by the private sector, not the government.

To its credit, the Obama administration has, over the past four years, become increasingly aware of the problems facing our basic grid and has taken some useful steps to address them. Recently, for example, FERC has issued a number of administrative rulings to promote more investment in new transmission lines in congested areas and in connections to new renewable energy sources, like wind farms. In addition, the administration has funded a plethora of renewable energy projects, including three of the world's largest renewable power plants: a wind farm, a photovoltaic solar array, and a solar thermal plant. And in August 2012, President Obama signed an executive order titled "Accelerating Investment in Industrial Energy Efficiency," which could help reduce transmission congestion and the need to build new power plants in the future. Steps like these are crucial. But they are not enough.

Thus far, the administration has neglected the basic infrastructure repairs necessary to keep the current grid up and running. It has also neglected the critical linkages between gas and grid. Instead, it has concentrated on adding what are in effect digital bells and whistles to a broken machine.

Take, for example, the American Recovery and Reinvestment Act of 2009, which allocated more than $90 billion in government investments and tax incentives “to lay the foundation for the clean energy economy of the future,” as a Department of Energy document states. Only $4.5 billion of this money went to what the DOE calls “grid modernization,” and $3.5 billion of that went to subsidize the installation of fifteen million “smart meters”—digital devices that automate the management of electricity for households and businesses. As Michael Grunwald notes in his new book on the effects of the stimulus bill, The New New Deal (see Ryan Cooper’s review in this issue), handing out smart meters without addressing the problems with the basic grid is like “handing out iPhones before there was a 3G network.” Replacing aging transformers may not be politically sexy, but just restoring the existing “dumb” grid to its condition in, say, 1974 has to come before adding smart grid features.

Once we’ve made adequate progress on that front, however, there is indeed the opportunity to take the grid to a whole new level of intelligence—one that will truly leverage our windfall of gas, while also vastly reducing the number of unsightly transmission lines and ugly generating plants that would otherwise need to be built. The smart grid involves the installation of sensors, digital controls, and analytical tools that can be used to automate, monitor, and control the two-way flow of electricity, from power plants to power lines to the electrical sockets in your kitchen. A reliable, secure grid, outfitted with smart components, would usher in a new era of electricity-demand management and end-user efficiency that could cut up to 22 percent of U.S. energy consumption.

With a smart grid, for example, it becomes possible to substantially lower the amount of electricity needed to run household appliances, from washing machines to refrigerators. Chips embedded within such appliances will communicate with the grid, taking advantage of periods when surplus electricity is available and thereby reducing spikes in usage.
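
To illustrate, here is a minimal sketch of the kind of scheduling logic such an embedded chip might run, assuming the grid broadcasts hourly prices that fall when surplus electricity is available; the names, prices, and schedule are all hypothetical:

```python
# Minimal sketch of demand-response logic for a smart appliance.
# Assumes the grid broadcasts a real-time hourly price signal; all
# names and numbers here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class PricePoint:
    hour: int             # hour of day, 0-23
    cents_per_kwh: float  # broadcast price for that hour

def pick_cheapest_start(prices: list[PricePoint], run_hours: int) -> int:
    """Choose the start hour that minimizes the average price over the
    appliance's run window -- e.g., delaying a dishwasher cycle until
    surplus (cheap) electricity is available."""
    best_start, best_cost = 0, float("inf")
    for start in range(len(prices) - run_hours + 1):
        window = prices[start:start + run_hours]
        cost = sum(p.cents_per_kwh for p in window) / run_hours
        if cost < best_cost:
            best_start, best_cost = start, cost
    return prices[best_start].hour

# Hypothetical day: expensive evening peak, cheap overnight surplus.
day = [PricePoint(h, 30.0 if 17 <= h <= 21 else 8.0) for h in range(24)]
print(f"Run the 2-hour cycle starting at hour {pick_cheapest_start(day, 2)}")
```

In practice such decisions would be made under far more complex utility tariffs and demand-response programs, but the principle is the same: shift flexible loads to the hours when power is cheapest and most plentiful.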

Utilities could also use these intelligent networks to incorporate, in real time, the variable output of renewable sources such as solar and wind, and to dispatch backup natural gas turbines as needed. Existing digital technology could likewise enable utilities to remotely manage widely distributed power sources—even drawing on the energy stored in the batteries of electric vehicles when they are plugged in and already fully charged.
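
The balancing act itself is conceptually simple. Here is an illustrative sketch, with invented numbers, of how dispatchable gas capacity fills the gap between demand and whatever the wind and sun happen to be providing at a given moment:

```python
# Illustrative sketch of how dispatchable gas turbines "fill in" around
# variable wind and solar output to hold total supply steady.
# All figures are invented for illustration.

def gas_dispatch(demand_mw: float, wind_mw: float, solar_mw: float,
                 gas_capacity_mw: float) -> float:
    """Gas output needed to cover demand net of renewables,
    limited by installed gas capacity."""
    needed = demand_mw - wind_mw - solar_mw
    return min(max(needed, 0.0), gas_capacity_mw)

# A calm, cloudy hour versus a windy, sunny one:
print(gas_dispatch(1000, wind_mw=50,  solar_mw=0,   gas_capacity_mw=1200))  # 950.0 MW
print(gas_dispatch(1000, wind_mw=400, solar_mw=300, gas_capacity_mw=1200))  # 300.0 MW
```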

Electrical infrastructure has typically been built to meet so-called peak load demands—the highest amount of electricity needed at peak times on peak days. Today, this leaves a large portion of the electricity industry’s capital stock vastly underutilized most of the time—the average natural gas power plant today is only running about 42 percent of the time. But with a smart grid enabled for real-time pricing of electricity, peak load demand could be smoothed out, and wasted generation reduced. Smart grid enhancements could also allow transmission lines to send 50 percent to 300 percent more electricity through existing energy corridors. These outcomes could reduce congestion on the now-overloaded parts of the grid, and reduce the number of expensive new high-voltage transmission lines that we need to build or replace.
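
The underlying arithmetic shows why smoothing matters: capacity must be built to meet the peak, so flattening the load curve delivers the same energy with far less hardware. A toy example, with invented load figures:

```python
# How peak-smoothing raises utilization: capacity is sized to the peak,
# so a flatter load curve means fewer plants for the same total energy.
# Load figures are invented for illustration.

def capacity_factor(hourly_load_mw: list[float], capacity_mw: float) -> float:
    """Fraction of installed capacity actually used over the period."""
    return sum(hourly_load_mw) / (capacity_mw * len(hourly_load_mw))

# A "peaky" day: 400 MW base load with a 1,000 MW evening spike.
peaky = [400.0] * 24
for h in range(17, 21):
    peaky[h] = 1000.0
print(f"Peaky day:    {capacity_factor(peaky, max(peaky)):.0%} utilization")   # 50%

# The same total energy, smoothed by real-time pricing shifting use off-peak.
smooth = [sum(peaky) / 24] * 24
print(f"Smoothed day: {capacity_factor(smooth, max(smooth)):.0%} utilization") # 100%
```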

Finally, a smart grid could facilitate the move toward more locally distributed electricity generation, like community cooperative solar farms, which are growing in popularity in states that require utilities to dispatch such local suppliers. By reducing the amount of electricity that has to travel from distant power plants to consumers, this in turn would also reduce the huge amount of energy that is lost in transmission and distribution, as well as the amount of pollution caused by generating the wasted energy in the first place.

If we are able to generate a large portion of our electricity using cheap natural gas and then distribute that electricity through efficient and cost-saving smart grid technologies, today’s children may well grow up to enjoy a higher standard of living than their parents, and the planet will benefit as well. But that’s a big “if,” especially if we don’t have even a basic plan for how gas and grid will work together.

The United States is at the cusp of what very well could be the biggest political and economic windfall in a generation. But to realize this windfall, we must ensure that natural gas is maximized as a source of electricity generation, and we must commit to a modern regulatory structure that mobilizes major investments in a reliable, secure—and, yes, “smart”—grid. While relatively straightforward, this is not an easy path. It will require presidential leadership to explain to the public what’s at stake and to provide a broad vision of a national energy strategy. The president will also have to face down an army of entrenched special interests to avoid squandering America’s energy dividend. If too many priorities and hare-brained schemes divert our political resolve, we may find that we have blown the energy boom.

The post How We Could Blow the Energy Boom appeared first on Washington Monthly.
