One advantage that historians have over journalists concerns time, not so much in the sense that they are free from urgent deadlines, but that they have the deeper perspective conferred by the years — or decades — between events and the act of writing about them. Twenty years is not a lot of time in historical terms, of course, but when it comes to understanding the war that the US launched against Iraq in March 2003, it is all we have.
Not surprisingly, even two decades after the war began, there is no consensus regarding its legacy. This is to be expected, because all wars are fought three times. First comes the political and domestic struggle over the decision to go to war. Then comes the actual war, and all that happens on the battlefield. Finally, a long debate over the war’s significance ensues: weighing the costs and benefits, determining the lessons learned and issuing forward-looking policy recommendations.
The events and other factors that led to the US decision to go to war in Iraq remain opaque and a matter of considerable controversy. Wars tend to fall into two categories: those of necessity and those of choice. Wars of necessity take place when vital interests are at stake and there are no other viable options available to defend them. Wars of choice, by contrast, are interventions initiated when the interests are less than vital, when there are options other than military force that could be employed to protect or promote those interests, or both.
Russia’s invasion of Ukraine was a war of choice. Ukraine’s armed defense of its territory is one of necessity.
The Iraq war was a classic war of choice: The US did not have to fight it. Not everyone agrees with this assessment. Some contend that vital interests were at stake, because Iraq was believed to possess weapons of mass destruction that it might use or share with terrorists. Proponents of the war had little to no confidence that the US had other reliable options to eliminate the purported Iraqi weapons of mass destruction (WMDs).
Moreover, coming in the wake of the Sept. 11, 2001, terrorist attacks, the decision reflected a staunch unwillingness to tolerate any risk to the US whatsoever. The idea that al-Qaeda or another terrorist group could strike the US with a nuclear, chemical or biological device was simply unacceptable. Then-US vice president Dick Cheney was the primary exponent of this view.
Others, including then-US president George W. Bush and many of his top advisers, appeared also to be motivated by additional calculations, such as the pursuit of what they saw as a new and great foreign-policy opportunity. After the Sept. 11, 2001, attacks, there was a widespread desire to send a message that the US was not just on the defensive. Rather, it would be a proactive force in the world, taking the initiative with great effect.
Whatever progress had been made in Afghanistan, where the US had invaded and removed the Taliban government that provided a safe haven to the al-Qaeda terrorists behind the Sept. 11, 2001, attacks, was deemed inadequate. Many in the Bush administration were motivated by a desire to bring democracy to the entire Middle East, and Iraq was viewed as the ideal country to set the transition in motion.
Democratization there would set an example that others across the region would be unable to resist following — and Bush himself wanted to do something big and bold.
I should make clear that I was part of the administration at the time, as the head of the US Department of State’s policy planning staff. Like virtually all my colleagues, I thought Iraqi president Saddam Hussein possessed WMDs, namely chemical and biological weapons. Even so, I did not favor going to war. I believed there were other acceptable options, above all measures that could slow or stop the flow of Iraqi oil to Jordan and Turkey, as well as the possibility of cutting Iraq’s oil pipeline to Syria. Doing so would have put significant pressure on Saddam to allow inspectors into suspected weapons sites. If those inspections were blocked, the US could have conducted limited attacks against those facilities.
I was not particularly worried about Saddam getting into the terrorism business. He ruled secular Iraq with an iron fist and considered religious-fueled terrorism (with or without Iranian backing) the greatest threat to his regime. He was also not the sort of person to hand WMDs over to terrorists, as he wanted to maintain tight control of anything that could be linked to Iraq.
Moreover, I was deeply skeptical that Iraq — or the wider region — was ripe for democracy, given that the economic, political and social prerequisites were largely missing. I also foresaw that establishing democracy would require a large, prolonged military occupation that would likely prove costly on the ground and controversial at home.
The war itself went better, and certainly faster, than expected — at least in its initial phase. After the invasion in mid-March 2003, it took only about six weeks to defeat the Iraqi armed forces. By May, Bush could claim that the mission was accomplished, meaning that Saddam’s government had been eliminated and any organized, armed opposition had disappeared.
While the US force that had been sent to remove the government was more than capable of winning the war, it could not secure the peace. Core assumptions that had informed the planning of the invasion — namely, that Iraqis would welcome the troops as liberators — might have been true for a few weeks, but not after that.
The Bush administration wanted to reap the benefits of nation-building without putting in the hard work it required. Worse still, those in charge disbanded the former Iraqi regime’s security forces, and ruled out political and administrative roles for the many Iraqis who had been members of the ruling Baath Party, even though membership of the party was often essential to employment under Saddam’s regime.
As might be expected, the situation on the ground deteriorated rapidly. Looting and violence became commonplace. Insurgent movements and a civil war between Sunni and Shiite militias destroyed what temporary order had been established. Conditions did not begin to improve until 2007, when the US deployed an additional 30,000 troops to Iraq in the famous “surge.” Four years later, Bush’s successor, Barack Obama, decided to withdraw US troops in the face of worsening political relations with the Iraqi government.
The results of the war have been overwhelmingly negative. Yes, a horrendous tyrant who had used chemical weapons against his own people and initiated wars against two of his neighbors was ousted. For all its flaws, Iraq today is better off than it was, and its long-persecuted Kurdish minority enjoys a degree of autonomy that it was previously denied.
However, the cost side of the ledger is far longer. The Iraq war took the lives of 200,000 Iraqi civilians and 4,600 US soldiers. The economic costs to the US were in the range of US$2 trillion, and the war upset the balance of power in the region to the benefit of neighboring Iran, which has increased its sway over Syria, Lebanon and Yemen, in addition to Iraq.
The war also isolated the US, owing to its decision to fight alongside only a few partners and without explicit backing from the UN. Millions of Americans became disillusioned with their government and US foreign policy, helping to set the stage for the anti-government populism and foreign-policy isolationism that have dominated US politics in the past few years. The war ultimately proved to be a costly distraction. Without it, the US could have been in a much better position to reorient its foreign policy to contend with a more aggressive Russia and a more assertive China.
The war’s lessons are manifold. Wars of choice should be undertaken only with extreme care and consideration of the likely costs and benefits, as well as of the alternatives. This was not done in the case of Iraq. On the contrary, decisionmaking at the highest levels was often informal and lacking in rigor. The lack of local knowledge was pervasive. It might seem obvious to suggest that it is dangerous or even reckless to invade a country that you do not understand, but that is exactly what the US did.
Assumptions can be dangerous traps. The decision to go to war rested on a worst-possible-case assessment that Iraq possessed WMDs and would use them or provide them to those who would. If foreign policy always operated on this basis, interventions would be required everywhere. What is needed is a balanced consideration of the most likely scenarios, not just the worst ones.
Ironically, the analysis of what would follow a battlefield victory in Iraq erred in the opposite direction: US officials placed all their chips on a best-case scenario, in which Iraqis, after rolling out the welcome mat to those who had liberated them from Saddam, would quickly put aside their sectarian differences and embrace democracy.
We know what happened instead. The fall of Saddam became a moment for violently settling scores and jockeying for position. Promoting democracy is a daunting task. It is one thing to oust a leader and a regime, but it is quite another thing to put a better, enduring alternative in its place.
Still, common critiques of the war get it wrong when they conclude that the US government cannot ever be trusted to tell the truth. Yes, the US government maintained that Iraq possessed WMDs, and my boss, then-US secretary of state Colin Powell, made that case before the UN. It turned out not to be true.
Governments can and do get things wrong without lying. More than anything else, the run-up to the Iraq War demonstrated the danger of leaving assumptions unexamined. Saddam’s refusal to cooperate with UN weapons inspectors was seen as proof that he had something to hide. He did have something to hide, but it was not WMDs; it was the fact that he did not have them. That revelation, he feared, would make him look weak to his neighbors and his own people.
Others have argued that the war was undertaken at Israel’s behest. That, too, is not true. I remember meetings with Israeli officials who suggested that the US was going to war with the wrong country. They saw Iran as the much greater threat. These officials held back from saying so publicly, because they sensed that Bush was determined to go to war with Iraq and did not want to anger him with futile attempts at dissuasion.
Nor did the US go to war for oil, as many on the left have often insisted. Narrow commercial interests are not generally what animate US foreign policy, especially when it comes to using military force. Rather, interventions are predicated on, and motivated by, considerations of strategy, ideology or both. Indeed, former US president Donald Trump criticized his predecessors for not demanding a share of Iraq’s oil reserves.
The Iraq war also contains a warning about the limits of bipartisanship, which is frequently touted in US politics as if it is a guarantee of good policy. It is no such thing. There was overwhelming bipartisanship in advance of not just the Iraq war, but also the Vietnam War. The 2002 vote authorizing the use of military force against Iraq passed with clear support from both major political parties.
Even before that, former US president Bill Clinton’s administration and the US Congress had come together, in 1998, to call for regime change in Iraq. More recently, we have seen bipartisanship in opposition to free trade, and in support of leaving Afghanistan and confronting China.
Just as broad political support is no guarantee that a policy is right or good, narrow support does not necessarily mean that a policy is wrong or bad. The 1990-1991 Gulf War — in which the US successfully led a UN-backed international coalition that liberated Kuwait at minimal cost — barely passed the US Congress, owing to considerable Democratic opposition. Whether a policy has bipartisan support says nothing about the quality of the policy.
In 2009, I wrote a book arguing that the 2003 Iraq war was an ill-advised war of choice. More than a decade later, and 20 years after the war began, I see no reason to amend that judgment. It was a bad decision, badly executed. The US and the world are still living with the consequences.
Richard Haass, president of the Council on Foreign Relations, is the author, most recently, of The Bill of Obligations: The Ten Habits of Good Citizens.
Copyright: Project Syndicate