It was Japan’s intention on Dec. 7, 1941, to notify the United States 30 minutes before the attack on Pearl Harbor began that it was breaking off diplomatic negotiations. However, because of mishandled communications between Tokyo and Washington, the message was not delivered until nearly an hour after the first bombs fell in Hawaii.
The Japanese declaration of war was published in Tokyo later that day. More than 70 years have passed, but Japan still carries the stigma of a “sneak attack” on Pearl Harbor.
It was a time when wars—the ones between major powers, at least—were undertaken with formality and protocols. The Hague Convention of 1907 had prescribed the declaration of war before the beginning of hostilities, but the practice dates back in military custom to the Roman Empire.
Since World War II, formal declarations of war have fallen into disuse, although there are exceptions. In 2012, for example, Sudan declared war on South Sudan.
That does not mean wars no longer happen. According to researchers at the University of Warwick in England and Humboldt University in Berlin, the frequency of “pairwise conflicts”—meaning independent states fighting each other—continues to increase steadily. Excitable commentators see an alarming increase in wars.
Closer examination reveals a more complex picture. The number of small conflicts is indeed increasing, but the number of wars—defined by the researchers as conflicts in which 1,000 or more lives are lost a year—is decreasing. Large nations avoid going to war with each other. The so-called “Long Peace” in Europe since 1945 is the longest such interval since the Middle Ages.
The main factor in raising the threshold of war was nuclear weapons and their potential to wipe combatant nations out of existence. The net effect was to establish two different thresholds, a very high one for nuclear war and a lower one for lesser conflicts. Even so, the nuclear-armed nations were careful to keep the smaller engagements from escalating to the nuclear level.
The question of when and how to enter armed conflict persists. No nation has agonized about it more than the United States, which has been chronically ambivalent about escaping its role as the world’s policeman. For better or worse, others are swept along by where the United States sets the threshold of war for itself.
An additional dimension of the problem is the proliferation of nuclear weapons among nations that may be tempted to use them. “The evidence suggests that the once strong firebreaks between nuclear and conventional conflict are narrowing and the taboo against nuclear use is growing weaker rather than stronger,” says Barry D. Watts of the Center for Strategic and Budgetary Assessments.
Winds of Change
Prior to the 20th century, most of the world’s power was concentrated in the hands of large nations and empires. Smaller states were relegated to the status of colonies or clients, or else existed in the spheres of influence of the great powers.
World War I marked the end of the Hapsburg, Ottoman, and Romanov empires. After World War II, the European colonial powers withdrew from Asia and Africa. This brought about sweeping gains in self-determination—but with a corresponding loss in the stability that the old regimes had enforced.
“In 1870, the world contained fewer than 50 independent states. By the end of the 20th century, there were more than 180,” says the report from Warwick and Humboldt universities. “As a result, the total number of possible country pairs in the world between whom relations of peace or war could exist grew from around one thousand to over 17,000.”
The United Nations, created in 1945, gave the new nations a voice in world affairs and a share of the political power. The emerging nations found frequent cause to use military force against external adversaries as well as rival factions within their own borders.
According to the latest report from Uppsala University in Sweden, which keeps track of the number of wars, there were 33 active conflicts causing at least 25 battle-related deaths in 2013. However, “conflicts claiming more than 1,000 lives, defined as wars, have declined by more than 50 percent, from 15 in the early 1990s to seven in 2013.”
Death and destruction from these conflicts do not approach the scale of the world wars. Total casualties for conflicts in 2012 were just short of 38,000, of which the civil war in Syria accounted for 14,700, almost 40 percent. Uppsala has not posted casualty numbers for 2013, citing a shortage of reliable information.
The Nuclear Firebreak
The US monopoly on nuclear weapons did not last long. The Soviet Union had the atomic bomb by 1949 and the hydrogen bomb in 1953. The mutual danger set up what strategic theorist Herman Kahn described as “large and very clear firebreaks between nuclear and conventional war.”
“War” became a loaded term. In June 1950, President Truman insisted that the US was “not at war” in Korea and that the combat operation there was a “police action” against a “bunch of bandits.” Truman did not seek any form of congressional authorization and committed US forces under the aegis of the United Nations.
At first, the atomic bomb was regarded as a weapon that could be used. Truman said there was “active consideration” of employing it in Korea. President Eisenhower’s “New Look” strategy in 1953 said that nuclear weapons were “as available for use as other munitions.” In practice, both Truman and Eisenhower were far more reluctant to initiate use of nuclear weapons than their statements would suggest.
Throughout the Cold War, both superpowers were constantly aware of the danger of escalation. In 1962, Soviet Premier Nikita Khrushchev pulled his missiles out of Cuba, fearing that a nuclear exchange could destroy his country, whereas the United States would survive, albeit with millions of casualties.
The old rules for declaring war no longer applied. A nuclear war would begin and be over too fast for that, especially after the introduction of ballistic missiles. ICBMs set a hair trigger for crossing the threshold of war through a policy, adopted by both sides, known as “Launch on Warning”: lest one’s missiles be caught on the ground by an attack, they would be launched while the enemy’s missiles were still in the air. There was a constant risk that a false warning of attack might lead to war. Both sides experienced such miscues during the Cold War but fortunately discovered the mistakes in time.
In the 1980s, activists in Europe and the United States clamored for NATO to adopt a “No First Use” policy on nuclear weapons. NATO had already pledged never to start a war, but it had not renounced the first use of nuclear weapons, an option needed to defeat (and deter) a large-scale attack by Warsaw Pact conventional forces, which greatly outnumbered NATO. A “No First Use” nuclear guarantee would have decoupled NATO from the extended protection of the US nuclear deterrent and removed any existential risk to the Soviet Union for sponsoring a conventional attack in western Europe.
Mutual nuclear deterrence was not a perfect strategy, but it worked well enough to keep the peace between the superpowers until the Cold War ended in 1991.
The “Weinberger Doctrine” led to a reconsideration of the threshold of conventional war. In Vietnam, more than 58,000 Americans had died in a war the United States did not regard as important enough to fight to win.
The point was underscored by the terrorist truck bombing in Beirut in 1983 that killed 241 US marines, who had no defined military objective in Lebanon and no mission there except for providing “presence.”
In his landmark speech in November 1984, Defense Secretary Caspar Weinberger listed a series of tests to be applied before US forces were committed to combat abroad. They should not go unless a vital national interest was at stake, clear military and political objectives had been established, and the nation was prepared to commit sufficient force to win.
Among those disagreeing with Weinberger was Secretary of State George P. Shultz, who said the US could not be “the Hamlet of nations, worrying endlessly over whether and how to respond.” Then and now, those who favor a foreign policy of interventionism assail Weinberger as being too reluctant to engage.
Weinberger’s critics often miss a critical point: He made no judgment about the interests or provocations for which the United States should cross the threshold of war, only that they be vitally important and that the nation be truly committed.
In his memoirs, President Ronald Reagan listed a set of principles “to guide America in application of military force abroad.” They are a close paraphrase of the Weinberger Doctrine.
The Gulf War of 1991 met every condition of the Weinberger Doctrine.
The Clinton Administration, which took office in 1993, did not agree with the restraints of the Weinberger Doctrine. Les Aspin, the new Defense Secretary, had been chairman of the House Armed Services Committee and rejected what he called the “All-or-Nothing” school of thought on the use of military force. Aspin was in favor of a “Limited Objectives” school.
On Aspin’s watch in 1993, 18 American soldiers were killed in the notorious “Black Hawk Down” incident in Somalia while trying to capture a local warlord, who, two months later, was riding around in US aircraft.
Madeleine Albright, ambassador to the UN and later Secretary of State, asked Gen. Colin Powell, Chairman of the Joint Chiefs of Staff, “What’s the point of having this superb military that you’re always talking about if we can’t use it?”
In 1998, referring to possible strikes against Iraq, Albright said, “We are talking about using military force, but we are not talking about war.”
In his annual report for Fiscal Year 1995, Secretary of Defense William J. Perry said that US forces could be used if “the United States has important, but not vital, national interests” at stake. In 1996, National Security Advisor Anthony Lake announced a list of circumstances in which military force could be used, including “to preserve, promote, and defend democracy.”
Bush and Pre-emption
George W. Bush began his presidency in 2001 promising an end to “vague, aimless, and endless deployments” to “places like Kosovo and Bosnia.” He said, “We will not be permanent peacekeepers, dividing warring parties. This is not our strength or our calling.”
Before the year was out, though, the perspective changed. Airliners hijacked by terrorists were flown into New York City’s World Trade Center, the Pentagon, and a field in Shanksville, Pa., and suddenly the United States was in the midst of a new and difficult kind of war. The initial US military response was in Afghanistan, but Bush, convinced that Afghanistan was only part of the problem, persuaded Congress and a coalition of allies to broaden the effort to Iraq.
Speaking at the West Point commencement in 2002, Bush announced a strategy of “Pre-emption,” with resounding consequences for the threshold of war. “If we wait for threats to fully materialize, we will have waited too long,” Bush said. “Our security will require all Americans to be forward-looking and resolute, to be ready for pre-emptive action when necessary to defend our liberty and to defend our lives.”
Pre-emption for defensive purposes was not an unprecedented idea. The most famous example is the Israeli air strike that took out Iraq’s nuclear reactor at Osirak in 1981. The United States had considered strategic pre-emption at various times in the past and had used it in tactical and operational situations in the course of wars already begun. However, pre-emption was not US national strategic policy until Bush made it so.
Bush first applied pre-emption in Operation Iraqi Freedom in March 2003. It soon ousted dictator Saddam Hussein but not without serious and longstanding side effects. The main justification given for the initial invasion was the belief that Saddam had weapons of mass destruction. That assumption turned out to be unfounded and the pre-emptive strategy lost much of its credibility.
US operations in Iraq and Afghanistan drifted on longer than expected and turned toward loosely defined objectives of nation building and counterinsurgency.
Obama: Reset and Reversal
When President Barack Obama took office in 2009, he shifted the emphasis to international solutions and a reduced role abroad for the United States. He sped up the withdrawal of US forces abroad, first from Iraq and then from Afghanistan. In 2010, he announced a “reset” in relations with Russia. In 2011, he declared that “the tide of war is receding.”
None of this has worked out as Obama hoped. Foes of the United States gained ground in Iraq and Afghanistan. Russia did not reciprocate on the reset, and the main result was to encourage the aggressiveness of Russian President Vladimir Putin.
As civil war engulfed Syria, Obama set a “red line” against the use of chemical weapons by the Bashar Assad regime. When challenged, he backed away from the red line, claiming that it had been set by “the world,” not by him.
In May 2014, Obama codified his rules for going to war in a speech at the West Point commencement, the same venue at which Bush had proclaimed the pre-emption strategy in 2002. “The United States will use military force, unilaterally if necessary, when our core interests demand it—when our people are threatened; when our livelihoods are at stake; when the security of our allies is in danger,” he said.
“On the other hand,” he added, “when issues of global concern do not pose a direct threat to the United States, when such issues are at stake, when crises arise that stir our conscience or push the world in a more dangerous direction but do not directly threaten us, then the threshold for military action must be higher. In such circumstances, we should not go it alone. Instead, we must mobilize allies and partners to take collective action.”
The United States should strike “only when we face a continuing, imminent threat and only when there is near certainty of no civilian casualties,” he said.
A month after Obama’s West Point speech, radical insurgents known variously as ISIS and ISIL pushed a large-scale offensive into Iraq and proclaimed an independent nation.
“Mr. Obama’s decision to stand back from Syria and Iraq has done much to create the present threat to the United States,” said an editorial in the Washington Post, which takes Obama’s side on most issues. “Continued passivity will only make it worse.”
Former Vice President Dick Cheney and his daughter Liz attacked “the collapsing Obama Doctrine” in a widely cited Wall Street Journal column. “Weakness and retreat are provocative,” they said. “US withdrawal from the world is disastrous and puts our own security at risk.” Neoconservative commentator Robert Kagan said “superpowers don’t get to retire.”
There was, however, a somewhat supporting view from an unexpected source. “Many of those clamoring for military action now are the same people who made every false assumption imaginable about the cost, challenge, and purpose of the Iraq war,” said Sen. Rand Paul (R-Ky.), one of the most conservative politicians in the nation. “They have been so wrong for so long. Why should we listen to them again?”
Obama has virtually nothing to show for his efforts to let the international community take a leading role in preserving peace. The state of world conflict is, if anything, worse than in 2008 and global stability has deteriorated. Three months ago, in early August, an anguished Obama authorized “targeted air strikes” in northern Iraq but limited them to protecting American diplomats and advisors and a “humanitarian effort” to save civilians in Kurdistan endangered by the ISIL advance.
In late August, Obama admitted that “we don’t have a strategy yet.” Not until September did he acknowledge a threat to the United States and state an objective to “degrade and destroy ISIL.”
The Fading Nuclear Taboo
The “Doomsday Clock,” maintained by the Bulletin of the Atomic Scientists, purports to indicate the relative danger of nuclear war in terms of “minutes to midnight.” The first posting in 1947 set the clock at seven minutes to midnight. The setting has been as short as two minutes (in 1953) and as long as 17 minutes (in 1991). The current setting is five minutes to midnight with the notation that “the potential for nuclear weapons use in regional conflicts in the Middle East, Northeast Asia, and South Asia are alarming.”
Almost everyone recognizes that the Doomsday Clock is a gimmick to promote a political point of view, but the rising danger of local nuclear war is widely regarded as fact.
Former US presidents, notably Jimmy Carter and Ronald Reagan, had expressed their desire to get rid of nuclear weapons, but in 2009, Obama made it official US policy, announcing a commitment “to seek the peace and security of a world without nuclear weapons.” In 2010, overruling his own Secretary of Defense, Obama said the United States would not develop any new nuclear weapons.
Putin saw Obama’s policy as an opportunity for Russia to improve its position in the strategic relationship.
Noting in 2012 that the Americans “have yet to modernize their nuclear arsenal,” Putin said that Russia plans to develop and deploy “an entirely new generation of nuclear weapons and delivery systems.”
This year, Putin raised Russian aggression to the highest level since the Cold War by annexing Crimea and invading Ukraine. Obama said, “We are not taking military action to solve the Ukrainian problem,” but set another one of his red lines some distance away, promising that the US would meet its “solemn duty” under NATO Article Five (an attack on one is an attack on all) should a NATO member be invaded.
Even so, the most immediate nuclear concern is that smaller states may be inclined to use tactical nuclear weapons in limited or local situations. Pakistan is reported to be ready to respond with a short-range nuclear missile if war should again break out with India. North Korea has threatened several times to use its nuclear weapons, and the high-strung regime in Iran seems likely to join the club of nuclear nations.
So far, the threshold for major global conflict remains high, but for other kinds of conflict—including local nuclear wars—the threshold is precariously low, with no shortage of belligerents prepared to charge across it.
John T. Correll was editor in chief of Air Force Magazine for 18 years and is now a contributor. His most recent article, “The Long Retreat,” appeared in the October issue.