An Air Force F-35 pilot commanding a flight of “uninhabited stealth wingmen” is tasked to destroy ballistic missiles protected by an advanced air defense system. She infiltrates the battlespace, getting constant updates on the threats’ locations from cyber and space assets. She sends her wingmen farther into enemy airspace for better targeting information, then releases long-range semiautonomous weapons, validating the targets are destroyed through data displayed in her helmet.
This scenario, laid out by Secretary of the Air Force Deborah Lee James at AFA’s Air Warfare Symposium in Orlando, Fla., illustrates how she and others envision integration and autonomy in future air warfare. But though much of the technology exists or is in development today, questions remain about exactly how and when it can be employed.
Certainly, multidomain operations are already happening in Operation Inherent Resolve. Lt. Gen. John W. “Jay” Raymond, the Air Force’s deputy chief of staff for operations, noted that a January strike on an ISIS cash collection point—televised on CNN and elsewhere—required persistent ISR, position navigation and timing, the ability to link up and synchronize communication systems, weather satellites, GPS to put the bomb on target, and other data.
“We have done such a great job … of integrating these capabilities into the fight, we cannot live without them. The question was: What would you do without GPS? It would be a really bad day,” Raymond said. “When the light switch goes on and space needs to be there, we need to make sure it’s there. … What we need to be able to do is be able to protect and defend those capabilities.”
That won’t necessarily be easy, said Winston A. Beauchamp, the deputy undersecretary of the Air Force for space.
“The secret is out regarding the US and allied dependence on space,” he said. “The advantages that we derive from space give us both an information advantage that is a qualitative and quantitative edge over adversaries—[and] also makes an irresistible target for potential adversaries.”
John G. Clark, director of focused technology roadmaps for Lockheed Martin Skunk Works, said that while remotely piloted aircraft provide more persistent information than in the past, and technology like F-35 sensors can provide important data, information doesn’t get shared across the battlespace. The Air Force needs to determine how to better take advantage of that information, he said.
“I think that RPAs have really helped pierce maybe a bubble, allowing us to think differently,” but there is still a long way to go in terms of getting the information out and exploiting it to win the war, Clark said.
What the US doesn’t need is “more data for data’s sake,” said John Goolgasian III, the director of the National Geospatial-Intelligence Agency’s Source Operations and Management Directorate.
As the ability to find new targets expands, Goolgasian said he’d like to see an automated process to track activities in “white space.” The idea is that someone would be alerted to abnormal activity in an area that may not have been a known focus before, instead of the military trying to collect and analyze everything.
Sensing, not collection, will drive the focus for analysis, he said.
“What I want is information that I can derive from this to provide to you to make your job easier, faster, and to hold targets at bay longer, from space,” Goolgasian said.
Eric S. Mathewson’s projected scenario is somewhat different from James’. A retired colonel who served as the director of USAF’s Unmanned Aircraft Systems Task Force at the Pentagon, Mathewson foresees an Air Force that no longer needs to deploy large numbers of people to launch and fly aircraft, and instead can place command and control nodes anywhere it wishes, sending automated technology to fly and fight.
The US is “in the midst of a revolution in military affairs,” Mathewson said, and remotely piloted aircraft are “the poster child of the revolution.”
Automation Is Not Autonomy
RPAs and automated technology “will change the paradigm of war,” Mathewson said.
“Automation is key,” he said. “Without automation, we’re wasting our time.”
Yet there is a difference between automation and autonomy, said Lani Kass, senior vice president and corporate strategic advisor for CACI International, who has served as senior policy advisor to the Chairman of the Joint Chiefs of Staff and special assistant to the Air Force Chief of Staff.
“You don’t worry that your car came from a fully automated assembly line,” because that assembly line is still controlled by a human, she said. “I believe you will change the nature of war, not just the conduct of war, if you completely remove a thinking, ethical human being from the control.”
Mathewson acknowledged the strong cultural resistance to fully automated systems, but said that if platforms can be controlled from anywhere in the world, it does not make sense to risk lives and equipment by forward deploying troops for something like launch and recovery.
“Why not just push a button someplace and let the thing take off?” he asked.
David A. Deptula, dean of AFA’s Mitchell Institute for Aerospace Studies, said he can’t see anyone moving to automate the decision to use weapons.
“You still have a human consenting to the employment of those weapons,” he said, noting that a pilot in a fifth generation aircraft doesn’t designate all the targets, but he or she does consent to the employment of weapons.
“We’re not going to turn the employment of weapons over to a machine … without any oversight,” Deptula said. “Technologically, it’s possible, but policywise, I daresay it ain’t going to happen.”
Still, Mathewson argued that while the US has “the luxury of control” in today’s fights, a large-scale war in the style of the world wars would not allow that level of control.
“If you write the program and you launch the aircraft, you’re basically making that consent,” he said. In a major force-on-force fight, “why do we limit ourselves?”
Right now, the US limits its use of RPAs and cyber tools because of ethics concerns, but Kass said the argument that the use of a certain weapon helps grow terrorists can be applied to virtually anything.
“Even if you covered [an enemy] in foam, you would have an argument that the child who watched his or her mother or father being covered in green goop is going to grow up to be a terrorist,” Kass said. “That argument can be made in relation to any weapon system that you choose.”
Because of those fears, and the Judeo-Christian principle of minimizing casualties on both sides, the US tends to tie its own hands on policy, she said.
In the cyber realm, Kass pointed out, the approval required to use a “cyber tool” against an enemy “is the equivalent of allowing the release of a nuclear weapon.” The idea is that “if we open that Pandora’s box, … there is no end to what the enemy could do to us.”
However, that box has already been opened, she said.
“We aren’t opening that Pandora’s box, the same way we didn’t open Pandora’s box on the reconnaissance-strike complex. We are using American ingenuity and American technology to … project power without projecting vulnerability,” Kass said.