Today is Super Tuesday and we are all supposed to vote for the candidate who will bring “change” to Washington.  But what kind of change is actually possible?  Democrats wistfully long for an alternative universe in which Gore or Kerry had beaten Bush, but let’s seriously consider how our world would be different:

If a Democrat had been President for the past eight years….

1. No occupation of Iraq (although they all voted for it, Democrats would probably not have taken the initiative to falsify so much intelligence). Some of our contributors disagree, claiming that Secretary of State Holbrooke would not even have had Colin Powell’s minimal qualms about such a war. And think about the pressure Gore would have received from the right, which still controlled Congress (not to mention the AM radio waves). Gore would have been forced to show his toughness, his mettle. I think our tendency to imagine Gore would have acted differently is a superimposition of the new and improved Nobel Laureate Gore onto the old politician Gore. Also, let’s not forget that Lieberman, who is a clone of Dick Cheney on issues of foreign policy, would have been vice president.

2. Serious action on global warming
3. A more forward thinking and lucid foreign policy which better preserves American hegemony
4. Marginally increased foreign aid to the world’s poor
5. No torture of detainees or spying on Americans

How it would be the same

1. Absurd wealth concentration: 2% of the world’s population controls 50% of the wealth and 50% of the world’s population controls less than 1% of the wealth
2. Well over 1.5 billion people living on less than 1 dollar a day
3. Occupation of Afghanistan
4. Unconditional support for Israel
5. No serious attempt to deal with the consumerism and commodification which create the industrial conditions responsible for global warming
6. All foreign aid given out through the same, horrible USAID
7. We continue to pursue free trade policies that manufacture poverty

This is just a short list.  There are obviously many more items for both.  I hope these lists can illustrate that when we say this election is a nonevent, it is not to suggest that there is literally no difference between parties or candidates.  The question is one of emphasis.  Most Americans believe the differences listed above are VERY important.  Indeed, they are all that matters, since addressing the problems on the second list is “unrealistic.”

The disagreement on the Left about whether or not to vote then comes down to a question of what voting means, what it entails as a personal statement and as a moral action.  Such a discussion is beyond me right now.  I am toying with some Marxist/utilitarian/deontological comparisons, but none are wholly coherent.

Rather than a philosophical analysis, how about some feedback: Are you choosing to vote or to “sit this one out”?  Why?

By Andrew Hartman

The trade issue is central to the 2008 election.  I think some Democrats have improved their outlook on trade, namely John Edwards.  However, we could all benefit from a more historical and international perspective.  

I am not against trade in its crude sense.  You have apples, I have oranges, let’s trade.  But this is not the issue.  The common claim that free trade leads to more peace and prosperity is patently false. This argument, most famously known as Thomas Friedman’s “golden arches” theory, goes like this: countries that have McDonald’s, McDonald’s being symbolic of a country committed to free trade, don’t bomb each other.  Of course, Friedman’s cutesy formulation—part of his sloppy apologetics for corporate globalization, The Lexus and the Olive Tree—was blown to bits when US-led NATO bombed the crap out of Belgrade, golden arches and all (it should be noted that US planes also bombed the Chinese Embassy and Serbian television during that war).  But regardless of the Belgrade bombings, Friedman is easily unmasked as a charlatan with no concept of international economic history, which would be fine if he weren’t so damn influential.

Let’s momentarily ignore the domestic consequences of free trade—NAFTA—and instead look at its international history, focusing on the nineteenth century.  In this larger historical context, free trade should be thought of as a euphemism for unfettered capitalistic expansion, which has also been called imperialism, or globalization (pick your poison).
 
Historians John Gallagher and Ronald Robinson, in their famous essay “The Imperialism of Free Trade,” argue that a distinction between free trade and imperialism is made by those who conceptualize empire only as something formal—people who study “those colonies colored red on the map, [which] is rather like judging the size and character of icebergs solely from the parts above the water line.”  Take Britain in the nineteenth century, which the US now models itself after in many ways.  Throughout that century, Britain maintained its hegemony in a vast array of regions by either informal, indirect means or by formal, military means, including annexations—a method usually associated with the century’s later stages and the “scramble for Africa.”  As Gallagher and Robinson point out, “refusals to annex are no proof of reluctance to control.”  We need to think about US trade policy in this context, in the context of the US as an imperialist power.  This will allow us to get past the false assertion that “free” trade brings peace and prosperity, which is one of the central counters to those who critique NAFTA-like legislation for destroying the wages of American workers.

Karl Polanyi, in his epic The Great Transformation, best refutes the standard argument that a free market unleashes the forces of progress and innovation.  For him, human innovation is at its best when it is organizing new forms of protection against the intrusiveness of capitalism, when it is raising barriers against the unstoppable beast—innovative barriers like unions!   The worst element of the nineteenth century, which is now the worst element of the twenty-first century, was its crude utilitarianism—in Polanyi’s words, “the self-healing virtues of unconscious growth.”  This was the thinking behind the gold standard—the international monetary system created according to the (il)logic of a self-regulating market.  This (il)logic worked something like this: the Gold Standard was a way for each country’s accounts to balance without the heavy hand of government; a system that regulated according to the “market” rather than according to democratic processes, which, the theory went, disrupted the efficiency of the system.  But the Gold Standard system, although considered the heyday of global capitalism (until the 1990s), was inherently unstable, and is, according to Polanyi, what led to the world wars.  The shock of contracting economies led to new protective barriers, which led to imperialist expansion (overseas barriers), which in turn led to war.  Some peace and prosperity. 
 
I would suggest reading Eric Hobsbawm’s four-volume history of the modern world.  The titles alone refute the argument that “free” trade brings peace and prosperity.  1) The Age of Revolution: 1789-1848 (which details how the bourgeoisie came to rule European society, and eventually the world, which meant “free” trade was about to be globalized). 2) The Age of Capital: 1848-1875 (this was the first heyday of global capital, the age of the Gold Standard).  3) The Age of Empire: 1875-1914 (when informal empire—“free” trade—failed to take hold everywhere, formal empire stepped in).  4) The Age of Extremes: 1914-1991 (the cost of globalized capitalism… a century of war like none other). 
 
Let’s hope the 21st century proves to be more peaceful than the last.  But… no justice, no peace. 

 

By Andrew Hartman

Every four years, those on the left debate tactics regarding the presidential election.  This year is no exception, despite the singularly poor Bush administration, which would seemingly make any Democratic administration an important reprieve.  However, not everyone is inclined to agree.  Take, for instance, Penn political scientist Adolph Reed Jr., who recently argued in the pages of The Progressive that we on the left should “Sit Out the 2008 Election.”

Reed’s article, which argues that the Democrats as currently constituted don’t deserve our support, sparked a heated debate on a listserv I edit.  Some argued that Reed was too pessimistic in disavowing the current Democratic candidates.  I argued that Reed was essentially correct.  Here is how I framed my argument.

 

I made four basic points:

 

1)     In response to those who think Reed is too pessimistic, it should be pointed out that Reed is NOT arguing that we should expect more out of the Democratic Party.  In fact, he’s making quite the opposite point: that we should quit dedicating so much of our time, energy, and money to national election cycles, precisely because we can’t expect the Democratic Party and its candidates to pay attention to our desires.  Instead, we should focus on building a strong movement that will compel the Democrats to take our demands more seriously.  This is the only method for success; indeed, the only instances of effective social reform in US history have been the products of such movements: the Populists compelled some regulation of the new corporate behemoths at the turn of the twentieth century; the CIO and other working-class organizations forced the hand of FDR, and we got the New Deal; and the vast civil rights movement shut down de jure Jim Crow.  As Reed writes in the article: “Electoral politics is an arena for consolidating majorities that have been created on the plane of social movement organizing. It’s not an alternative or a shortcut to building those movements, and building them takes time and concerted effort… [T]hat process cannot be compressed to fit the election cycle.”

It must be remembered that Reed’s audience is the left: progressives, radicals. He’s writing to those who struggle over whether it makes more sense to support the Democrats on the grounds that they are superior to the alternative, the Republicans—which they undoubtedly are—or to opt out of the two-party “duopoly,” as Ralph Nader has long referred to it.  It’s a question of tactics.  In the context of tactics, Reed might be wrong.  The United States is a deeply conservative country, and there is currently no real possibility for structural changes along the lines of the New Deal or Great Society absent a serious crisis.  And there’s no guarantee such a crisis would result in a leftward shift.  I think a rightward shift (democratic fascism!) is more likely.  In this context, perhaps we’re better off swallowing our pride and voting for the Democrats.  And by “our” I mean the left, what’s left of us.

2)    This leads to my second point.  Though Reed might be wrong tactically, this does not detract from how correct he is in terms of historical and political analysis.  If some Democratic partisans seem viscerally offended by Reed’s analysis, that’s probably because it’s spot-on.  I would suggest that partisan attachment to the Democratic Party has kept some from seeing the forest for the trees, leading them to ignore the historical arc of US political history.

The Democrats have shifted to the right since 1976, at least on issues of substance, like economic and social policy.  Now, you might argue, this makes sense, since the nation itself has shifted to the right.  The Democrats have interpreted Clinton-style triangulation—tacking to the center, which has itself been moving to the right—as the only means to electoral success.  This is shortsighted and just plain wrong. The Democrats are locked into an electoral approach that is doomed to fail.  They’ve been pulling the rug out from under those who traditionally guaranteed them a majority, namely, unions. The triangulation approach that the party has pursued since Carter has never produced an electoral majority. In fact, Gore and Kerry got higher percentages of the vote as the 2000 and 2004 losers than Clinton got as the 1992 winner, and came very close to his 1996 share. Minus Ross Perot, we would not have had a Democratic president since Carter (who, it must be said, was a conservative Democrat).

3)    What is the evidence that the Democrats have shifted to the right?  Let’s examine the Clinton administration’s record on economic and social policy.  Clinton is as much to blame as Reagan for the intense polarization of wealth, which has grown larger than any such gap since the 1920s.  This is due largely to the fact that politicians have rewritten economic policy at the behest of corporations, who serve only one master: shareholders.  Let’s take NAFTA (1994).  This terrible piece of legislation benefited nobody other than powerful corporations, which were no longer restrained by pesky local and national laws.  NAFTA, among other neoliberal trade policies, decimated the industrial working class in the United States.  And yet Democrats continue to ask themselves why they struggle to win in states like Ohio and even, sometimes, Michigan.  DUH!  And NAFTA has not exactly helped most Mexicans, either, as is made evident by the huge number of them, driven off their land, who come to the US to work in the service industry.  Wow—Lou Dobbs might have a point regarding the close connection between trade policies that benefit the filthy rich and “illegal” immigration.

After the Clinton administration oversaw trade legislation that gashed living wages, it then proceeded to sponsor the Welfare Reform Act (1996), which ripped apart the already-limited safety net for the poor.  The new safety net became the prison system.  As Reed asks us to remember, the Clinton administration was responsible for “two repressive and racist crime bills that flooded the prisons” and “the privatizing of Sallie Mae, which set the stage for the student debt crisis” and “ending the federal government’s commitment to direct provision of housing for the poor.”  Fallout from Clinton-era policy continues, including in New Orleans, as 4,500 units of low-income public housing were recently razed, to the dismay of protesters–this, in a city with the worst housing crisis in the country.

Of course, if Democratic partisans conceded these arguments to me (and Reed)—which I doubt they would—they would then argue that the current crop of Democratic candidates should not be judged by the Clinton administration.  I suppose this is a decent point.  However, for the most part, nothing the front-running Democratic candidates are saying indicates they will work to reverse the horrible economic and social policies of the Reagan-Clinton-Bush era.  (John Edwards sounds pretty good on economic policy, sometimes, which has led the media to lampoon his populist message.)  Yes, all of the candidates have a plan for universal healthcare.  But none of the “big 3” remove the insurance and pharmaceutical industries from the equation, which is not much of a plan as far as I’m concerned.  On this issue, Michael Moore is on the money.

4)    For me, foreign policy is the single most important issue in US politics, the issue from which everything else branches out.  And on this issue I consider the majority of the Democrats cowardly, especially those who committed the original sin of twenty-first-century politics by voting for the war in 2002.  This is unforgivable in my eyes, no matter how hard any of them now try to explain it away.  The argument that everyone was working with bad intelligence does not fly.  Not only were plenty of people (such as Joseph Wilson, Hans Blix, and Scott Ritter) presenting evidence that Iraq did not have WMD, but the WMD issue was a true non sequitur.  It was completely and utterly beside the point.  Iraq was not responsible for 9/11, as everyone should have understood, and did not represent a threat to the US.  Even if Hussein had WMD, he was not a threat, because his one goal was to stay in power at any cost, and the quickest way to achieve the opposite would have been to use WMD against the US or its allies.

 

This is the central reason why I am unequivocally against the Clinton campaign.  There is also the fact that she is so closely tied to the foreign policy establishment that has been such a poor steward of the nation for the past sixty years, the unbreakable chain from Truman to Bush.  (The legacy of Truman is the creation of this consensus.)  The fact that Richard Holbrooke, Madeleine Albright, and Wesley Clark, among others, support Clinton’s campaign is a huge strike against her.  The so-called war on terror reads to me like the past sixty years, when the US involved itself in a number of catastrophic wars and interventions, from Korea to Vietnam to Serbia, all based on objectionable rationales.  The US needs to wake from its delusional dreams.  It cannot and will not control what other people do at the point of a cruise missile.  This is the lesson of the twentieth century.  The other lesson is that wars have unimaginable, unintended consequences that will haunt us for decades to come.  For instance, this current war will undoubtedly contribute to a huge wave of homelessness over the next forty or fifty years, platitudes about taking care of our veterans notwithstanding.

I will only actively support a presidential candidate who unmasks the war on terror for the sham that it is.  So-called wars on amorphous entities achieve one thing: fear… which then allows the powerful to run roughshod over the rest of us, which is what the Bush administration has done, predictably.  That the Democrats feign shock and indignation over Bush administration unseemliness is laughable.  It’s as if none of them has ever picked up a history book, as if none of them has ever heard of Joseph McCarthy or Richard Nixon (not to mention Truman, who was a clever scaremonger himself).

On foreign policy, Obama has a few advantages.  Most importantly, he was publicly against the war before it began.  Of course, had he been in the Senate in 2002, he might have voted alongside Clinton and Edwards; we’ll never know.  Second, Anthony Lake is Obama’s main foreign policy advisor, and he seems better than the Clinton crowd, more chastened by past US failures in Vietnam, Somalia, and Iraq.  That being said, I intend to vote for Kucinich in the primaries, when the pick-your-poison mentality of the general election does not yet apply.  In the debates, Kucinich is the only candidate (other than Ron Paul) who makes any sense on foreign policy issues.

 

In the context of presidential elections, it is important to remember the bipartisan consensus on US foreign policy.  I am one of the few people who think a Gore-Lieberman administration would have invaded Iraq.  Secretary of State Holbrooke would not even have had Colin Powell’s minimal qualms about such a war.  And think about the pressure Gore would have received from the right, which still controlled Congress (not to mention the ever-pervasive AM radio waves).  Gore would have been forced to show his toughness, his mettle.  I think our tendency to imagine Gore would have acted differently is us superimposing the new and improved Nobel Laureate Gore onto the old politician Gore.  Also, let’s not forget that Lieberman, who is a clone of Dick Cheney on issues of substance, would have been vice president.  On a broader scale, let’s remember that the Democratic Party’s record of getting the US into stupid wars is abysmal: WWI (Wilson); Korea (Truman); Vietnam (Kennedy and Johnson); Yugoslavia (Clinton).  In other words, my counterfactual about Gore and Iraq highlights the problems with the current state of the Democratic Party, and goes to the core of our discussion.

 

Andrew Hartman

The Enemy Within (the Ivory Tower):
How Conservatives Came to Despise the Academy

By Andrew Hartman

In a May 4, 2005 editorial in the Los Angeles Times titled “Neocons Lay Siege to Ivory Towers,” a UCLA professor of English warned of the “profound threat posed to academic freedom” by a California bill to enact the David Horowitz-authored “Academic Bill of Rights.” Horowitz, a repentant sixties radical, has become arguably the most influential conservative activist in the professed struggle against rampant anti-Americanism on campuses across the nation. His benignly named “Academic Bill of Rights,” fashioned into legislative bills in dozens of states, purports to protect students against professors who “take unfair advantage of their position of power over a student by indoctrinating him or her with the teacher’s own opinions.” In practice, the Horowitz bill would allow the state to regulate pedagogical practice, thus decimating academic freedom as the concept has long been understood.

Considering his powerful allies, the Horowitz quest is hardly quixotic. The influence of conservative groups such as the American Council of Trustees and Alumni—founded by Lynne Cheney and Joe Lieberman and dedicated to monitoring and exposing leftist sentiments among academics—has grown rapidly in the wake of September 11. For these conservative activists, the academy is suspect, a veritable fifth column. For instance, in his latest book The Professors, subtitled “The 101 Most Dangerous Academics in America,” Horowitz argues that a swarm of intellectuals is undermining national security through sympathy for terrorists.

To read the rest of this essay, please go to the United States Intellectual History blog.

The Making of an Educator

November 3, 2007

Andrew Hartman

It is easy to see, in retrospect, how I came to be interested in education. The most influential people in my life chose education as a career. This includes my grandfather, longtime swimming coach at Colorado State University, where he also taught physical education courses. It includes both of my parents, who are retired public school teachers. My mother continues to be involved in teacher education, even during her so-called retirement, as the director of the Colorado Writing Project, where she preaches best practices to thousands of teachers. Considering my family history, I might venture to say that education is in my blood. Or, since professional recapitulation is more the result of upbringing than genetics, I should rather say that I have been conditioned to the world of education. To this extent, it should not have been surprising when I enrolled in a teacher education program at Metropolitan State College in Denver (MSCD) in 1997, a few years removed from completing my bachelor’s degree in history from the University of New Mexico. But, at that time, still unsure why I wanted to be a teacher, my career decision left me feeling ambivalent at best, apathetic at worst. Education was not yet a calling. It was not yet a passion. This soon changed.

Luckily, while completing the program at MSCD, I happened upon Professor Charles Angeletti, who taught a methods course required of social studies teachers-in-training. Angeletti, a passionate, sometimes-churlish socialist from Oklahoma, sparked a proverbial fire in my belly that has yet to burn out. He challenged me to think about education—to think about the world—in ways that I had not yet imagined. From then on, I conceptualized teaching as a revolutionary act, as a means to project my desires for justice onto a world seemingly devoid of it.

My first experience in the classroom was as a student teacher at Denver West High School in 1999, where the mostly Latino student body comprised some of the most economically disadvantaged students in the state. It was then that I began to recognize what Jonathan Kozol described as “savage inequalities,” or what he would later term “the shame of the nation.” Relative to the high school I attended in a modest middle-class suburb of Denver, the conditions at West were appalling. In what has become an all-too-familiar description of our nation’s urban schools, West students lacked basic materials, their textbooks were antiquated, and the majority of their teachers had grown cynical and bored. To make matters worse, when the district feebly attempted to integrate West, in the form of a magnet program called the Center for International Studies, the students aptly renamed it the Center for Internal Segregation.

My first paid job as a social studies teacher was at Thornton High School, in a working-class suburb just north of Denver. Although the student population was classified mostly “urban”—a euphemism for minority—the conditions at Thornton were vastly superior to those at West. Unlike at West, the physical plant was not in disrepair, and many of the teachers seemed to enjoy their jobs. And yet, beneath the surface, I recognized problems—problems of the type I was increasingly reading about in the works of critical theorists such as Paulo Freire and Henry Giroux. In short, I came to understand that race and social class largely determined the education students received at Thornton High School, as elsewhere. Whereas the majority of the students enrolled in the advanced placement courses were white, my basic history courses were full of brown-skinned faces.

Although segregation by way of tracking was dispiriting enough, I soon discovered that race and class were problematic in ways even more insidious. For example, military recruitment was pervasive at Thornton, and the recruiters clearly profiled their targets, going after minority students deemed unlikely to attend college. In response, a student club I sponsored alongside my colleague Andres Martinez—Students for Justice—decided to draw attention to the issue of military recruitment. They distributed fliers listing the “top ten reasons not to join the military.” But our efforts were quickly met by resistance from the administration, which had heard complaints from the military, an institution that paid handsomely for unfettered access to our students.

I left my job at Thornton after two years having learned two important lessons. First, there are powerful forces at work shaping the supposedly safe confines of the school. In other words, as John Dewey correctly theorized a century ago, the divide between school and society is illusory. Second, educational politics drive otherwise reasonable people to behave in unpredictable, often belligerent ways. This was made evident when some of my colleagues shunned me in the wake of my efforts to shed light on the racist character of military recruitment. These two lessons followed me east to Washington, D.C., as I began work on my doctorate in history at the George Washington University. It is now clear to me that these two lessons have formed the foundation of my scholarship.

Two of my first published articles, projects that germinated in graduate seminars on educational history, sought to understand the political, historical, and theoretical roots of race and class in the context of education. “Language as Oppression: The English-only Movement in the United States,” searched for historical explanations as to why the agenda of the English-only movement emerged on the American political landscape in the 1980s, and why it garnered widespread support among Americans. I theorized that a majority of the white American citizenry subconsciously conflated whiteness and the English language with citizenship. Similarly, “The Social Production of American Identity: Standardized Testing Reform in the United States,” sought to unmask the standardized testing movement as rooted in the historical normalization of whiteness, richness, and maleness. I argued that standardized testing represented an important form of social production that has served the American political economy.

My dissertation, which laid the foundation for my book, Education and the Cold War: The Battle for the American School, focused less on the theoretical components of education—on how race and class shape education—and more on how political crises meld with educational crises in U.S. history. Education and the Cold War explores the ways in which Americans variously experienced the political crisis of the Cold War as a crisis in education. Beginning with John Dewey and the genealogy of progressive education in the late nineteenth century, and ending with the formation of New Left and New Right thought in the early 1960s, Education and the Cold War traces the postwar transformation in U.S. political culture. My book is rooted in the knowledge that Americans have frequently expressed their political aspirations and fears in educational terms.

Autobiographically, the most important discovery I made while researching my book was that I was not the first teacher to be treated poorly due to my political convictions. Thousands of teachers were purged from the public schools during the early Cold War for their political beliefs. My second book, which I am now researching, A War for the Soul of America: A History of the Culture Wars, will likewise be an examination of how American political culture shapes education, and of how people often act with animosity towards their political foes in the realm of education. The culture wars are a textbook case of the high degree to which educational politics rouse Americans.

If there is one thing I have hoped to draw attention to in this brief biographical narrative, it is that teaching brought me to scholarship. In this process, however, I discovered that my scholarship has made me a better teacher. I am currently an assistant professor of history at Illinois State University (ISU). Our department includes one of the largest history education programs in the nation. We train about 125 future history teachers per year. One of my central duties is to teach the methods course for our pre-service history teachers. In other words, I am to my students at ISU what Charles Angeletti was to me at MSCD. Like Angeletti did for me, I hope to inspire my students to see the liberating potential of education. This is where my scholarship proves helpful. I understand that attempts to change the world of education are fraught with risk. I am aware of how and why many Americans tend to look unfavorably on those who teach for social justice. Such knowledge, I hope, will allow me to help my students navigate the confounding terrain of educational politics, and yet not give up hope. Because, despite the nastiness of educational politics, it is a necessary battle to join.

The following post is the transcript of a talk I gave on October 17, 2007, at the weekly meeting of the Illinois State University International Studies Seminar.

The Unintended Consequences of US Wars (and other foreign interventions)

By Andrew Hartman

The single greatest war reporter of our time is Robert Fisk of the London Independent. What makes Fisk a cut above is not just his bravery, which is immense—he has been a first-hand witness to hell on earth, lucky to still be alive. What makes him great is not just his interest in empirical observation, or in counting the dead, although this is an important task for a war reporter. And it is definitely not any interest in glorifying war, like so many of his American counterparts who wrote home during the pre-“Mission Accomplished” stages of the Iraq War, their manic stories dripping with macho nationalism.

On the contrary, Fisk is a war reporter whose quaint mission is to end war, or, more humbly, point out the sheer folly in it. This is why he infuses his criticism of contemporary war with historical analysis. War must be understood historically. Thus, his recent massive book, despite being about the Middle East, is aptly titled, The Great War for Civilisation, paying homage to “the war to end all wars,” now known to us as World War I.

The military and political leaders who led the world to war in 1914 believed that their respective nations would achieve a swift and chivalrous victory. Instead, they achieved misery, both in war and in the so-called peace that followed. Fisk writes of World War I, his father’s war, and the new global borders resulting from the armistice: “In all, it was to take my father’s generation just twenty-three months to create these artificial borders and the equally artificial nations contained within them.” Fisk refers to the creation of Lebanon, Yugoslavia, Iraq, Northern Ireland, and the British Palestine Mandate—all created between August 1920 and July 1922. Fisk personalizes the connections between past and present when he writes: “It is, as I often reflect, a grim fact of my own life that my career as a journalist—first in Ireland, then in the Middle East and the Balkans—has been entirely spent in reporting the burning of these frontiers, the collapse of the statelets that my father’s war allowed us to create, and the killing of their peoples” (306).

This connection made by Fisk—between past and present wars—correlates to the premise of my talk. Wars have unintended consequences—consequences which are, more often than not, terribly destructive. This was particularly true of the First World War. Beyond the results of the partitions described by Fisk, we can also make the claim that the rise of Nazism was a consequence of World War I. Thus, so too was World War II. Less tragically, the modern intellectual revolt against progress and other Enlightenment grand narratives was a consequence of World War I. Intellectuals revolted against the civilization that could produce such a grotesque and meaningless waste of life. In this sense, to stretch this line of argument to an almost absurd level, we might argue that postmodernism is one of the many unintended consequences of World War I—for the typical graduate student compelled to read Derrida, one of the more painful such results.

To argue that wars have unintended consequences is not a new historiographic trend. Historians have long extended their explanations of causation beyond human intentions. To limit our inquiries to human intention is to believe in an omnipotent, hyper-rational humanity—a belief that betrays all empirical evidence. That being said, the sub-field of diplomatic history has to some degree lagged behind such a sensible historiographic trend. I don’t wish to overstate my case, but some diplomatic historians continue to put too much faith in their documentary evidence, namely, the diplomatic cable transcript. Such transcripts are too often read as self-evident conversations between two human beings who know exactly what they want, and exactly how to get it. In contrast, it must be stressed that policymakers, like the rest of us, often know not what they do. This is particularly the case when it comes to war.

To say as much is not to absolve the war-makers of blame. On the contrary, although wars have consequences that leaders do not intend, many such consequences are predictable. For instance, it was predictable that the use of military force to remove Saddam Hussein from power and to destroy the Iraqi Ba’ath Party would result in sectarian strife and in a newly empowered Iraqi Shiite population, who would logically align themselves closer to Iran. How do we know this was predictable? Because influential members of the Bush I administration, including Dick Cheney, predicted as much when making their case as to why US troops should not take Baghdad during the first Gulf War.

No, it is not my intention to absolve anyone of blame. Rather, I would argue that thinking more carefully and critically about the dangerous consequences of American foreign intervention—of wars, of covert operations, of bullying on all matters political and economic, and other such hubris—would perhaps lead to changed American behavior in the world.

I am now going to survey some of the more destructive consequences of US wars, and foreign policy and intervention more broadly speaking. I will begin with how early-twentieth-century foreign policy in China started the US on the path to Pearl Harbor and World War II.

Everyone in this room is probably somewhat knowledgeable about the immediate circumstances leading up to the attack on US naval forces at Pearl Harbor in December 1941. In order to carve out their own imperial niche in Asia, especially in China, the Japanese prepared for war against the western nations. They correctly believed that the US would resist their imperial designs. FDR attempted to impede Japanese war preparations by imposing petroleum and steel embargoes, to no avail, as made evident by Pearl Harbor.

This historical narrative is correct, but only so far as it goes. We must take it further back, and ask the question: Why was the US so invested in limiting Japanese expansion, especially into China? The false answer most commonly given is that the US is an anti-imperialist nation and was appalled by Japanese brutality in China, such as the Rape of Nanking. If this were the case, the US would have, presumably, been equally appalled by British colonialism in India and by French colonialism in Indochina, neither of which were benign. So again, why the interest in China?

The US interest in China goes back to the Spanish-American War. It might seem like a stretch to say that Pearl Harbor is one of the unintended consequences of the Spanish-American War of 1898, but that is precisely what I am about to argue. (I owe this interpretation to the wisdom of historian Leo P. Ribuffo.)

The US has never been isolationist. Historians of American Indians know as much. But the Spanish-American War did indeed open up a more intense phase of US imperialism. One of the spoils of the quick victory over the Spanish was the Philippines, although Filipinos did not see it that way, and thus revolted against US rule. This led to a violent war of occupation. 4,000 Americans and at least 200,000 Filipinos died. This war was somewhat similar to Iraq. In fact, some neoconservatives, including Max Boot, have argued that the US war against the Philippines should serve as the model for the early twenty-first century. President Bush even cited it in a speech on Iraq in 2004. I assume they cite it as a success story because they consider the Philippines a model nation now, inasmuch as it is a US ally, not because it in any way resembles a healthy society.

The occupation and war in the Philippines had a dialectical effect. The war was justified to a skeptical American population in the name of the Great China Market, which had the effect of enhancing interest in trade with China. But at the time, the European imperialist powers were considering chopping up China into spheres of influence, much as they had done in Africa, as foreign powers took enclaves along the Chinese coast. President McKinley determined that, rather than consent to these European designs—in no small part because all the best coastal enclaves had already been snatched up—the US would deny the legitimacy of the spheres and affirm the national integrity of China. This policy was announced in the famous “Open Door Notes.” The Open Door Notes asserted that the field of economic competition was not to be closed to the US, with the expectation that Americans had the ability to destroy their economic competitors. Secretary of State John Hay called this an “ideal policy” in that it would allow the US “to do nothing, and yet be around when the water-melon is cut.”

This might sound rather innocuous, and in relation to a spheres of influence policy, perhaps it was. But there were serious problems with the open door policy from the beginning. For one, it assumed that China was a stable, unified nation, which it clearly was not. It also assumed that the Chinese would consent to being dominated economically, which they did not, made clear by the Boxer Rebellion of 1900. The US committed 2,500 troops to the anti-Boxer forces, which crushed the rebellion and slaughtered countless civilians. But, above all, the chief danger of the Open Door Notes, from the American perspective, was that the US might begin to believe its own rhetoric about the need to preserve Chinese integrity. It might believe that its national interests were at stake in keeping the door open to China. The question that should have been asked: What happens if some nation—Japan, for instance—attempts to close the door? This is precisely what happened in the 1930s. It is in this sense that, indirectly, Pearl Harbor was an unintended consequence of a policy implemented forty years earlier.

This was not the last time US policies in Asia produced results the opposite of policymaker intentions. Take Vietnam, the greatest tragedy of American diplomacy on record, so far. One of the unintended consequences of US military strategy in the Vietnam War helped ensure US defeat. The large majority of US bombs were dropped on the peasantry in the south, the base of support for the Viet Cong. The cynical idea on the part of US strategists was that they would make the Vietnamese countryside uninhabitable for those who supported the enemy. As a US military commander infamously said, “Sometimes you have to destroy a village in order to save it.” This policy worked insofar as hundreds of thousands of Vietnamese peasants, those who survived the bombings, were forced to migrate to Saigon and other cities. However, this forced migration created levels of instability to which there were no military solutions, much like in Baghdad today. In short, it ensured US defeat. There is no military solution to urban chaos, to the wretched poverty and disorder created by massive human displacement.

Similarly, Cambodia is a textbook study of the potential destructiveness of the unintended consequences of war, especially of bombing campaigns. Orchestrated by Henry Kissinger, the Nixon administration secretly bombed Cambodia during the early 1970s. This was the US attempt to destroy the Viet Cong network—the Ho Chi Minh Trail that extended into neighboring Cambodia. Historians have demonstrated that this bombing campaign failed in its efforts to cut off Viet Cong supply lines. However, more to the point, the bombings managed to drive hundreds of thousands of Cambodians into the cities. This, of course, led to disorder and created a political vacuum into which stepped the genocidal Pol Pot. Pol Pot proceeded to force the peasantry back to the rural areas, killing nearly two million in the process, over a quarter of the Cambodian population. As the leader of the Khmer Rouge, Pol Pot used his agrarian relocation policy as a cover for his attempts to wipe out entire ethnic groups deemed enemies to the Khmer.

I’m not arguing that the US is directly complicit in this genocide. That would run counter to one of my main messages, that the US is not an omnipotent force. Rather, I’m arguing that the effects of bombing campaigns do not end when the bombs quit raining down from the sky. Counter-factual analysis is helpful here: can we imagine Pol Pot minus the Nixon-Kissinger bombing campaign? Or, more generally, minus the Vietnam War? This would require, in my opinion, a fanciful imagination. The dark results of this bombing campaign should be kept in mind with talk of a potential, so-called “preventive strike” on Iran and its fledgling nuclear program.

Speaking of Iran, and the Middle East more generally, many of the dangers and instabilities in that region of the world can be directly attributed to the history of US policy, near-sighted as it was. To paraphrase Marx, the tradition of past US policymakers weighs like a nightmare on the brains of current ones. This nightmare dates back to 1953, when the CIA mounted a coup against the democratically elected Prime Minister Mohammed Mossadegh. President Dwight Eisenhower authorized this coup—it was one of the first things he did when he took office in January of that year. The US then helped bring to power a brutal tyrant, the Shah, a despotic monarch who ruled with an iron fist for the next 25 years. The Shah was one of many murderous leaders in the world, one of the worst. Furthermore, the Shah was a US puppet, his power entirely dependent upon US money and weapons. It was common practice for the US to align with authoritarian leaders who would protect so-called US interests.

This history is instructive for our purposes in that it demonstrates two of the unintended consequences of US policy in the Middle East. First, just as the US backed itself into protecting the open door to China, it also backed itself into protecting authoritarian regimes across the region, and really, across the world. Once one of the corrupt little puppets to which the US had committed money and guns was threatened, the US believed it was in danger of losing “credibility” in the eyes of all of the other corrupt little puppets. This is the real “domino theory,” which had little to do with communism except rhetorically. The US became addicted to maintaining “credibility.” You still hear this today, time and again. US credibility is at stake in Iraq. US credibility will undoubtedly be at stake with regards to Iran’s nuclear program. They said they can’t allow for such a program, thus they must stop it by any means necessary or risk losing credibility.

Second, this history of intervention in Iran demonstrates that, even if US policy achieves short-term material gains, it often ensures long-term security losses. In the case of Iran, the short-term material gains were all about oil. Iran has the second or third largest exploitable oil reserves in the world, behind Saudi Arabia and perhaps Iraq.

The US helped overthrow Mossadegh because he decided to alter the preexisting arrangement Iran had with foreign oil companies. When Mossadegh came to power, British oil companies controlled most Iranian oil. The US wanted to change this. The US and Britain, although allies, were in essence competing over who would control Middle East oil for the foreseeable future. Mossadegh thought he could play the British and US off one another to leverage a much better deal. When this failed, Mossadegh and the Iranian parliament nationalized the oil. This spelled his demise. The US and the British then cooperated to get rid of him. But when the Shah was installed, the US removed British influence step-by-step. Thus, the US controlled Iranian oil. Material gains.

But what were the long-term losses? In short, the Iranian Revolution of 1979. A government hostile to the US, one that determines its own economic policy, one that controls its own oil, came to power, and remains in power. By destroying a democratic government with liberal sensibilities—Mossadegh’s—the US helped build the path to Iranian theocracy. The chickens came home to roost in the form of the embassy hostage crisis. Something similar resulted from US intervention in Soviet-occupied Afghanistan. This is a history that became all-too-familiar after September 11, 2001.

During the 1980s, the US committed billions of dollars in money, weapons, and experts to the mujahideen who were resisting the Soviets, most of it funneled through Pakistani intelligence services. For example, the CIA provided the Afghan resistance with satellite mapping intelligence and demolitions experts who were able to train the mujahideen in the use of delayed timing devices for C-4 plastic explosives. The US also provided the rebels with targeting devices for mortars linked to a US Navy satellite, wire-guided anti-tank missiles, and, eventually, the highly effective Stinger missiles. The CIA helped Pakistani trainers establish schools in guerrilla warfare and urban sabotage for the mujahideen. Sniper rifles were given to the rebels for purposes of assassination. The car bomb, used with deadly effectiveness in Iraq to this day, was one of the weapons the CIA helped train the mujahideen to use. In short, the US helped create some of the deadliest urban guerrilla warriors in the world—trained to use weapons of modern-day terror. This, as we now know, was not a very wise policy. The short-term gains, helping to defeat Soviet forces in Afghanistan, were outweighed by the long-term security losses, September 11 and beyond.

All of this ugly history is worth recounting. How else will we destroy the delusional belief that the US has the power, ability, and benevolence to shape the world? This is an extremely dangerous idea, and it is a threat to a peaceful future. I will leave you with an eloquent, yet angry passage from Terry Eagleton, one of my favorite writers:

“The United States has an exalted image of itself, and would be a far more morally decent place if it did not. A touch of skepticism and self-debunkery would work wonders for its spiritual health…It is its demented refusal of limit and finitude, its crazed, blasphemous belief that you can do anything if you put your mind to it, which lies at the source of its chronic weakness…Intoxicated by their own self-image, Americans can perceive nothing beyond themselves, and will find themselves in the most dreadful danger. They will become the enemies of civilization in the very act of seeking to preserve it.” (After Theory, 226)

Housing Bubble: A Broader View

September 26, 2007

A great deal of the national discussion of the crumbling housing market has focused on blaming individual Americans, which often amounts to blaming the victim. This tone is more than a little disturbing. We should instead be looking at the overarching problems associated with our unprecedented levels of debt.

I’m not saying that individual Americans have not, collectively, made some really bad personal financial decisions. But the key word here is “collective,” not “individual.” When historians attempt to understand phenomena, rarely do they attribute them to individual causation. For example, “Why did so many individuals go along with the Nazis?” An explanation to such a historical question cannot be made by reference to the fact that “Germans made bad personal choices.” However factually correct such a statement might be, it is not much of a historical explanation. When we attempt to understand phenomena, we should look for larger, structural causes.

In the case of the historical question I posed here about the Nazis, historians have looked at, among other things, how the punitive measures of the Versailles Treaty compounded by the global economic collapse helped make the Weimar Republic coalition insolvent, and the Nazis were able to fill this political vacuum. Their economic successes — built on military rearmament (military Keynesianism) — helped make them increasingly popular, especially to the rich and powerful, who feared the growing communist movement. That much of the world seemed against their national success from the start, despite the Munich agreement — that they had real enemies — helped in their successful attempts to scapegoat their perceived internal enemies, Jews.

So, in the case of the housing bubble question, or the phenomenon of collective American debt, future historians will likely look at larger, structural causes, rather than blaming financially illiterate individuals. Perhaps the answers are not clear cut, yet. It’s tough to see the forest for the trees. But here are some questions they might attempt to answer.

Why were Americans compelled to maintain a standard of consumption above and beyond their means? In other words, why is American cultural worth — status — dependent upon goods purchased and displayed? These questions would lead scholars to look at the broad history of marketing and the culture of consumption. Americans have long marketed the suburban life as the ideal, as a sort of utopia, made evident when then-Vice President Richard Nixon debated Soviet Premier Nikita Khrushchev about the best kitchen designs, the so-called kitchen debate. Using our houses as ATMs is not just a matter of individual stupidity. It is part of our broader culture. US real wages have been in a collective, long-term decline since 1973. So Americans leverage their over-valued houses to continue the suburban dream, in the face of financial reality.

There are plenty of other structural questions future historians will likely pose, some of them similar to the type posed by Dean Baker in many excellent articles he’s written on the Fed’s role in this mess. What role did the Fed have in all of this? How has the role of the Fed shifted alongside the so-called “Reagan revolution” as personified by Alan Greenspan, who believed that his main purpose as Chairman was to keep wages low?

What role has the decline of unions had in all this? As the national economy has shifted from one that produced material goods to one that “produces” immaterial goods (information and service), why haven’t the new sectors unionized like the old ones?

One question that will undoubtedly be asked: How does military power compel the rest of the world to fund American debt, and what are the limits of such power?

We should be posing these types of broad questions to help us grasp the complex problems of our financial mess. Blaming individuals gets us nowhere. In fact, it seems cruel and misplaced to blame the unfortunate who are merely emulating the actions of the nation as a whole.

Andrew Hartman

I’m thinking of putting together a panel for the HISTORIANS AGAINST THE WAR NATIONAL CONFERENCE to be held in Atlanta, Georgia, April 11-13, 2008. The title of the conference is, “WAR AND ITS DISCONTENTS: UNDERSTANDING IRAQ AND THE U.S. EMPIRE.” Anyone interested in joining me?

I think my talk would be on the development of the “stab in the back” theory in the aftermath of Vietnam, or, how conservatives came to understand that the loss in Vietnam was the fault of the antiwar movement and the Democratic Party of George McGovern. I would also discuss the implications of the “stab in the back” theory for Iraq and the current antiwar movement, such as it is.

If you would like to join me, we’d have to tie our topics together in some sort of logical fashion. The conference stresses that this is not just for academics. So, if there are any high school history teachers out there who would like to present on how to teach Vietnam from an anti-war perspective, or how to teach Vietnam while comparing and contrasting Iraq, that would be a good topic. Let me know.

Andrew Hartman
ahartma@ilstu.edu

The Brilliant and Schizophrenic Politics of Richard Rorty

Andrew Hartman
July 6, 2007

After philosopher Richard Rorty’s death a few weeks ago, my colleagues at the U.S. Intellectual History web log wondered aloud how important Rorty is to historical and contemporary social thought. I found myself shamefully unprepared to contribute to that discussion. Some agreed with a Harvard professor quoted by The Chronicle of Higher Education as follows: “It is scarcely an exaggeration to say that one could not be taken seriously as an intellectual in the 1990s without forming some kind of opinion as to Rorty’s views.” Others disagreed.

I was inclined to disagree since I consider myself well versed in U.S. intellectual currents yet not in Rorty. Sure, I knew the basics. I knew Rorty to be a voice of the American left. I also knew him to be a philosophic pragmatist carrying on where John Dewey left off. I understood that Rorty saw these two positions – leftist politics and pragmatism – as interrelated. In this regard, I knew all about how he had been criticized by Marxists such as Terry Eagleton and Slavoj Zizek for his neo-pragmatism, which they considered at one with a depoliticized postmodernism.

But before making the solipsistic and fallacious argument that Rorty is unimportant because I am unaware of him, I decided I had better read the man. I chose to read his Achieving Our Country: Leftist Thought in Twentieth-Century America (1998), because the title drew attention to two of my main interests: leftist politics and U.S. intellectual history. This book, based on a set of five lively lectures, is a passionate plea for a revived reformist left, a return to the left of Eugene Debs instead of that of Michel Foucault.

Achieving Our Country reveals Rorty’s brilliant political mind. I found his lecture on “A Cultural Left” particularly illuminating. In it, Rorty describes the historical genealogy of an academic left that, beginning in the 1960s, focused on ameliorating sadism – racism, sexism, and homophobia – instead of lessening economic inequality. This is the much-lamented shift to “identity politics.” This dichotomy is somewhat reductionist in that the two strands are not easily separated: sadist politics are often rooted in economic selfishness. This connection is what W.E.B. Du Bois and David Roediger have referred to as the psychological “wages of whiteness,” a phrase that can be extended to also mean the “wages of maleness” or the “wages of straightness.” However, such reductionism notwithstanding, Rorty’s lecture is highly instructive.

Unlike conservatives who decry identity-based affirmative action, Rorty understands the necessity of fighting sadism. He credits the cultural left for a nicer society. “Especially among college graduates,” Rorty argues, “the casual infliction of humiliation is much less socially acceptable than it was during the first two-thirds of the century… The adoption of attitudes which the Right sneers at as ‘politically correct’ has made America a far more civilized society than it was thirty years ago” (81).

But despite such success, Rorty laments the cultural left’s inability to fight against economic inequality. “During the same period in which socially accepted sadism has steadily diminished, economic inequality and economic insecurity have steadily increased” (83). It’s not that Rorty blames the cultural left for inequality, which he correctly attributes to the race-to-the-bottom ethos of corporate globalization, which is proletarianizing a good portion of the U.S. middle class. But he does argue that the cultural left is unable to engage in national politics due to severed links to unions and others who continue the struggle against the greedy corporations that dominate the political system. To some degree Rorty’s argument, made nine years ago, is a touch anachronistic: the academic left has been steadily increasing its involvement in the political-economic life of the nation. Yet, insofar as it is still relevant, I agree entirely: class matters.

Despite Rorty’s brilliance, Achieving Our Country is a deeply schizophrenic work. On the one hand, Rorty argues that the left needs to forgo its attachment to grand narratives such as Marxism and instead focus its political attention on piecemeal reformism. On the other hand, in his final lecture, titled “The Inspirational Value of Great Works of Literature,” Rorty argues that professors of literature, in their attempts to situate all texts within a Foucauldian discourse, have taken the life out of literature. Rorty cannot possibly reconcile these two positions, since to do so would require him to argue that literature is meant to inspire, but not political philosophy.

Rorty’s anti-Marxism is of course rooted in the fact that he was a liberal anticommunist in the vein of Arthur Schlesinger and Irving Howe. He loathed the sectarianism of the communist Marxists, many of whom had a habit of labeling anyone to the right of them a “reactionary.” Without delving into the much-discussed merits of liberal anticommunism (or lack thereof!), it needs to be pointed out that Rorty raises some important points about the stultifying effects of sectarian scrums. He is correct in his assessment that the American left cannot possibly afford continued infighting, which only serves to benefit the corporate bosses and their corrupt political charges.

That being said, Rorty is blind to his own sectarianism, as are all sectarians, who think that, whereas their opponents represent some narrow agenda, they represent the universal. In Rorty’s case, whereas Marxists represent a foreign doctrine, he and the liberal anticommunists represent a true “American” left, a left rooted in Emerson, Thoreau, Whitman, and, of course, Dewey.

It was once said that to know where one stands in relation to John Dewey is to know where one stands in relation to America. It might be more appropriate to argue that to know where one stands in relation to Dewey is to know where one stands in relation to the American left. The intellectual left’s sectarian lines are drawn between Dewey and Marx. We are allowed to like one or the other, never both.

At a recent academic conference I gave a paper that argued Dewey was part of the American Popular Front left of the 1930s that was wiped out by the Cold War. I was greeted by vehement disagreement. It seems people are so conditioned to understand Dewey as an emblem of liberal anticommunist thought that they can’t conceive of him as a radical.

At the same conference, I attended a highly entertaining talk on Dewey by a professor who rooted his analysis in the best of postwar social thought. His analysis lamented the truly huge gap between reality and perception, between the everyday existence of the modern organization society that is the United States and the delusional ideology of rugged individualism. In this sense, this professor asked all the correct questions, those asked by the likes of C. Wright Mills, Paul Goodman, and Christopher Lasch. Why don’t Americans understand that their society is not one in which liberty thrives? That being said, the professor gave, in my view, the wrong answer. He argued that John Dewey’s organizational pedagogy adjusted Americans to the corporate order while allowing them to maintain the fantasy that they were free.

I would argue that John Dewey was radical to a forgotten degree, and to this extent he was never as influential as his critics – left or right – make him out to be. Perhaps my different reading of Dewey speaks to the truly subjective nature of textual readings, but I think Dewey actually anticipated the best of postwar social thought in maintaining that individualism was over-hyped. Dewey called the ideology of rugged individualism an “unnamed form of insanity.”

I suspect that this professor’s problems with Dewey stemmed from the fact that Dewey recognized the organization society as a given, that there was no going back to the romanticized town hall of the pre-Civil War era. In this sense, perhaps the professor attacked Dewey from a left-libertarian rather than a Marxist position. Perhaps he thought Dewey was the opposite of Thoreau instead of, as Rorty argues, part of the same intellectual lineage. That said, despite the fact that Dewey wanted Americans to reconcile themselves to the organization society, Dewey also wanted the organization society to reconcile itself to social democracy. In the absence of social democracy — or what he categorized as economic democracy — Dewey believed that politics would continue to be the “shadow cast on society by big business.”

Obviously, Dewey never saw the U.S. become a social democracy; the organization society remained incredibly hierarchical. This is why Dewey grew increasingly angry over the state of U.S. politics and education, arguing in his Experience and Education against all those who urged that the schools become extensions of the industrial order. In a 1914 article Dewey wrote for the first issue of the New Republic, he was brutal in his assessment of David Snedden’s version of vocational education, which Dewey described as a practice that fashioned schools as “preliminary factories supported at public expense.”

In conclusion, I’m going to take Rorty’s advice and discipline myself against leftist sectarianism by ignoring his advice that I consent to purging Marxism from the American left. I guess this is how one properly responds to brilliant schizophrenia.

Andrew Hartman is an assistant professor of history at Illinois State University and author of the forthcoming book Education and the Cold War (Palgrave Macmillan).

Here’s a talk I gave a few months back to a peace group in Peoria. It’s still very relevant.

Target Iran: Historical Lessons Drawn from the U.S. Addiction to War and Intervention
Andrew Hartman
Bradley University
April 3, 2007

My talk tonight, made clear by my title, is going to focus on what the history of the US addiction to war and intervention can tell us about the likelihood of a coming war with Iran. My understanding of the history of US foreign policy is rooted in the work of a number of revisionist historians who came on the scene during the 1960s, especially William Appleman Williams and Gabriel Kolko. Their importance as historians transcended the ivory tower, as they spoke at a number of the so-called “teach-ins” that kick-started the 1960s anti-war movement, whose activists always understood the importance of historical knowledge to the movement. It’s in that spirit that I want to talk to you tonight.

With Iran being in the headlines with increasing frequency, and with the Bush administration rattling its sabers and talking about Iran in much the same way it discussed Iraq back in 2003, a lot of people have understandably been asking the all-important question: Is war with Iran imminent?

I’m afraid I don’t have a definitive answer. Historians are not in the habit of making such predictions. If you learn one thing from studying history, it’s that the future is contingent and unpredictable. Historical actors were rarely able to predict the future, especially with regard to war, the most unpredictable of human interactions.

I will say, however, that US policies have created a context that makes such a war more likely, and this is a very dangerous situation, with possible grave results for the region and for the United States. I’ll come back to this. First, I want to elaborate some basic historical background about US foreign policy that might allow us a clearer understanding of what’s likely to happen and of what’s at stake.

The United States has been addicted to intervening in the affairs of foreign nations throughout the twentieth century, but this addiction became much more pronounced after World War II, from which the US emerged as the world’s most powerful military and economic force. This addiction to intervention has made the world much less safe. In fact, many of the dangers and instabilities in the Middle East are directly attributable to faulty US policies. The history of US policy with regards to Iran is a prime example.

As I hope most of you already know, the CIA mounted a coup against the democratically elected Prime Minister Mohammed Mossadegh in 1953. President Dwight Eisenhower authorized this coup – it was one of the first things he did when he took office in January of that year. The US then helped bring to power the Shah, a brutal tyrant and despotic monarch who ruled with an iron fist for the next 25 years, one of the world’s most murderous leaders. The Shah’s power was entirely dependent upon US money and weapons. This history is instructive for our purposes in two ways.

First, it allows us to examine the motivations of US policymakers – the driving force of US foreign policy. Regarding the overthrow of Mossadegh, some of you more historically minded people might say, “It was 1953, the heart of the Cold War. The coup must have had something to do with communism and the Soviet Union.” Good guess, but no. The large majority of US interventions during the Cold War had very little to do with fighting communism. If this were so, we should have expected to see a decline in the number of interventions since the fall of the Soviet Union in 1991. But the opposite has in fact happened. The US military is larger and more expensive than ever, and has been intervening and going to war with increasing frequency since the end of the Cold War. So what did propel the US to overthrow Mossadegh?

As Bill Clinton would have said during his 1992 campaign, “It’s the economy, stupid.” Or rather, it’s the oil, stupid. Although most Middle East oil is exported to places other than the US (namely Europe, Japan and China), the nation that controls access to Middle East oil – and the nation whose currency is used to trade for oil – has greater ability to shape the global economy. Oil is power. Iran has the second or third largest exploitable oil reserves in the world, behind Saudi Arabia and perhaps Iraq. Thus Iran has long been of strategic consequence to US policymakers, which informed policy in 1953, when Mossadegh was overthrown.

Mossadegh was no communist and overthrowing him was not an anticommunist maneuver. Mossadegh was your basic liberal nationalist. He wanted to modernize Iran. But he needed funds to do this, and Iran was seriously in debt, so he decided to alter the preexisting arrangement Iran had with foreign oil companies.

What’s interesting is that, when Mossadegh came to power, British oil companies controlled most of the oil in Iran. The US wanted to change this. The US and Britain, although nominally allies, were in essence competing over who would control Middle East oil for the foreseeable future. Mossadegh thought he could play the British and US off one another, and perhaps the Soviets as well, to leverage a much better deal. This didn’t work – the US was far more committed to controlling and profiting from the oil than to ensuring Iran could pay off its debts – so Mossadegh and the Iranian parliament nationalized the oil. This spelled his demise. The US and the British then cooperated to get rid of him. But when the Shah was installed, the US removed British influence step-by-step. The Shah was a US puppet and US companies controlled Iranian oil.

You might then ask, if it’s only about oil, then why has the US intervened in literally dozens of nations that have no oil reserves? Also: why did the US wage a catastrophic fifteen-year war against Vietnam, which has no oil, dropping twelve times the tonnage of bombs on that nation as all warring sides combined dropped throughout World War II? Well, it’s somewhat complicated, because US motives are not very rational. Basically, the US has been committed to a naïve and utopian fantasy. It wants to remake the world in its own image. US policymakers, as has often been the case with the powerful throughout world history, confuse their interests with the interests of humankind. They think what’s good for Standard Oil and Halliburton is good for the people of the world.

Such an ideology, which has long been a part of US political culture, might otherwise be thought of as a way to rationalize a foreign policy driven by economic interests, or by what is known as the “Open Door Policy.” The US demands open and unfettered access to the markets, resources, and labor of the world, especially in the underdeveloped nations of the world. This has been the driving force of US foreign policy since at least 1900, when the US demanded an open door to China. No surprise, the US has been hypocritical in its application of the open door, never really allowing other nations a reciprocal open door but for a few exceptions.

But this doesn’t detract from the larger point: US policymakers believed that it was in the interests of everybody, everywhere, for the world to be remade in the image of the US. Of course, in the process of attempts to fulfill this fantasy, they often came to the hard realization that such a dream was impossible. Thus, US leaders committed themselves to the somewhat more realistic goal of aligning with authoritarian leaders who would protect US interests. People like the Shah.

Another problem then arises. Once one of the corrupt little puppets to which the US had committed money and guns was threatened, the US believed it was in danger of losing “credibility” in the eyes of all of the other corrupt little puppets to which it had committed money and guns. This is the real “domino theory,” which had little to do with communism except rhetorically. So the US became addicted to intervention and to “credibility.” You still hear this today, over and over again. Our credibility is at stake in Iraq, our leaders tell us. Our credibility will undoubtedly be at stake with regards to Iran’s nuclear program. We said we can’t allow for such a program, thus we must stop it by any means necessary or risk losing credibility.

So let’s review: the history of US intervention in Iran is instructive because it shows that US foreign policy is driven by economic pursuits, which are tied up in an ideology of what historians term “American exceptionalism.” According to those who view America as exceptional, the overwhelming power and goodness of the US means that it, unlike other nations, should not be constrained by reality. This leads us to the second way in which Iran is instructive for our purposes: US policy was and is in fact constrained by reality. Reality has a way of smacking even the most stubborn and intransigent nations in the face, even the US, which is the last remaining nation that believes it can act across the globe with impunity. And the reality was and is this: no matter how powerful, no matter how technologically advanced its weapons systems, the US could not and cannot dictate terms to the rest of the planet. Period. And attempts to do so, especially by force, have been disastrous.

In other words, wars have unintended consequences. The only certainty of war, other than death and destruction, is that there will be unintended consequences. In Iran, the unintended consequence of overthrowing Mossadegh in 1953 and replacing him with a brutal proxy regime was the Iranian Revolution of 1979. A government hostile to the US came to power, and remains in power. And say what you will about the theocratic nature of the Iranian government since 1979, it determines its own economic policy, unlike most of the other nations of the Middle East. And this is why it is an enemy of the US. Iran is a target because it is one of two remaining oil-rich countries in the Middle East – Syria being the other – that have refused to submit to US rule. Iraq under Baathist rule used to be such a nation.

Let me give you some other examples of the unintended consequences of war in the twentieth century, which saw more warfare than any century in human history. World War I – which, by the way, was launched by myopic leaders who wrongly believed that military victory would be swift, much as Bush wrongly believed with regard to Iraq, not to mention Truman in Korea and Johnson in Vietnam – created the instability that brought about both the Bolshevik Revolution and Nazism. In fact, where communists came to power they basically acted as a force for stability. They brought stability to areas of the world made unstable by war. Russia is a good example. So is China. It’s hard to imagine a Bolshevik Revolution without World War I and hard to imagine a Chinese Revolution in 1949 without the brutal World War II Japanese invasion and occupation.

What about Vietnam? One of the unintended consequences of US military strategy in that war was to ensure a US defeat. The large majority of US bombs were dropped on the peasantry in the south, the base of support for the Viet Cong. The cynical idea on the part of US policymakers was that they would make the Vietnamese countryside uninhabitable for those who supported the enemy. As a US military commander infamously said, “Sometimes you have to destroy a village in order to save it.” This policy worked insofar as hundreds of thousands of Vietnamese peasants, those who didn’t die, were forced to migrate to cities like Saigon. However, this forced migration created levels of instability to which there were no military solutions, much like Baghdad today. In short, it ensured US defeat.

At the risk of belaboring this point: one might say that Iran’s fledgling nuclear program is an unintended consequence of the war in Iraq. The Iranian government learned its lesson: whereas nations without the bomb (Iraq) are subject to invasion, nations with the bomb (North Korea) aren’t.

Along this same line of thinking, it’s easy to argue that increased US-Iranian tensions are a direct result of the occupation. Imagine a counter-factual scenario if you will: imagine that Iran had invaded and occupied Mexico in 2003, sparking a bloody factional civil war. Are we then also to imagine that the US wouldn’t find ways to influence this civil war, considering it would have the most to gain or lose (other than the Mexicans themselves)?

Another unintended and ironic consequence of the Iraq War is that it has made Iran much more powerful and influential in the region. The US destroyed the counter-balance to Iran that it helped cultivate during the 1980s: the Hussein regime. And now it has empowered the Shiite majority in Iraq, which has ties to Iran. This speaks to the seeming lack of logic behind recent Bush attempts to link Iran to the killing of US soldiers. The large majority of American deaths have come at the hands of the Sunni resistance.

So why is the Bush administration targeting Iran now? Perhaps because it is attempting to deflect attention away from the unmitigated disaster that is Iraq? Perhaps because it needs a scapegoat? Does the Bush administration actually intend to invade and occupy Iran? This seems unlikely given that support for the current war has fallen precipitously and US military capabilities are stretched thin. The more likely scenario is the current course. The US will continue to pressure Europe to impose economic sanctions on Iran that will hurt ordinary Iranians while doing nothing to lessen the power of the government. The US will continue to covertly aid anti-regime elements in Iran, hoping that such elements will create an atmosphere for civil war.

The US might even undertake “precision bombing,” an oxymoron if there ever was one. Of course, it’s easy to imagine that any of these scenarios could provoke a larger war – which the US would have to fight in order to not lose credibility. Bombing campaigns bring about horrible consequences. Take Cambodia, which the Nixon administration illegally bombed during the early 1970s in an attempt to destroy the Viet Cong network. This bombing campaign drove millions of Cambodian peasants into the cities, which led to incredible disorder. This then created a climate, or political vacuum, for the rise of Pol Pot, who proceeded to force the peasantry back to the rural areas, killing over a million in the process. A bombing campaign does not end when the target is destroyed.

We can talk more about the likelihood of war with Iran in the question-and-answer session if you like. For now, I want to leave you with three warnings. First, DO NOT believe a word that comes out of the mouths of the Bush administration. Governments lie, plain and simple, even those that are supposedly democratic like ours. Some might say that the Bush administration lies more than others, which is plausible. Many of those who have advised Bush over the past six years are disciples of the late University of Chicago philosopher Leo Strauss, the “father of neo-conservatism,” who, as a modern-day incarnation of Machiavelli, advised his would-be princes that subjects aren’t capable of understanding the complexities of the world. Therefore, political leaders must lie. Strauss was especially adamant that, in order for leaders to be given free rein, the people must be afraid. When people are afraid, they’ll believe just about anything said by those in power.

But the Bush administration has no monopoly on lying. Harry Truman knowingly lied in 1947, overestimating the power and threat level posed by the Soviet Union abroad and communists at home. In the words of Senator Arthur Vandenberg, he “scared the hell out of Americans” in order to convince a reluctant American people to commit to his extremely expensive foreign policy plans, which included the Truman Doctrine, the Marshall Plan, and the creation of NATO. Franklin Delano Roosevelt also deceived the American public, hiding from view his dangerous brinksmanship with Japan, which is why Pearl Harbor was a surprise to everyone but FDR and his closest advisors. Both Truman and FDR were Democrats, beacons of liberalism. Republicans and Democrats lie alike.

This leads me to my second warning. DO NOT trust for one second that things are likely to change if a Democrat gets elected president, whether it’s Hillary or Obama or whomever. There has been a solid two-party consensus on matters of foreign policy since World War II. These parties might at times differ in style – which is why the same Europeans who seemed so smitten with Clinton now hate Bush. But when it comes to the substance of war and intervention, these parties do not differ in any significant way. The institution of the presidency forms the person, not vice versa. This is not to say that this can’t change. Most of those who opposed the Iraq War from the beginning usually vote Democratic. But the Democratic Party leadership has yet to be compelled to jump ship from this two-party consensus.

Third and last warning: DO NOT trust the media to give you the correct answers to these questions. For example, the media has been incredibly helpful to the Bush administration in demonizing the Iranian leadership. Any comments made by Ahmadinejad are dubiously translated and meant to convince us that he is a crazy man, a loose cannon, when in fact he has little control over Iranian foreign policy, which rests in the hands of his superior, the Ayatollah Khamenei. Ahmadinejad says a lot of things, often contradictory, which the US media reports very selectively. When he calls for the destruction of Israel – front page. When he says that he agrees with the Arab League with regard to the two-state solution and to normalization of relations with Israel – buried, if even mentioned.

It is your obligation to read widely and independently in order to get some handle on the truth. Take the example of the New York Times, which claims to print “All the News That’s Fit to Print.” In the run-up to the Iraq War, the Times printed story after story that bought the administration’s argument hook, line, and sinker. This speaks to two things. First, the executive branch has incredible power to dictate the terms of the national discussion: because every major press corps has reporters whose only function is to cover the White House, when the White House makes an announcement, it automatically becomes a lead story. Second, reporters don’t then have to believe what the administration is telling them, especially those anonymous “senior officials.” But they do. Especially Times reporters. Two reporters are especially guilty of this: Judith Miller and Michael Gordon, who uncritically parroted the administration’s WMD argument. Since then, the Times has issued an apology, and Judith Miller is no longer with the paper. But Michael Gordon, as chief military correspondent, recently echoed administration claims that makeshift bombs used to kill American soldiers originated in Iran. Just as with WMD, he cites “senior officials” to support his claims without any useful or corroborating evidence.

This leads me back to one of my original questions: is war imminent? I can’t answer that, but it sure does feel like we’ve been here before.

Andrew Hartman