An Air Force F-16 flown entirely by an AI-powered "brain" carried the military's top civilian official, Air Force Secretary Frank Kendall, through a simulated air-to-air dogfight against a human-piloted jet.
Kendall, 75, decades removed from his careers as an active-duty Army and Army Reserve officer, sat in the front seat of the experimental F-16 as it twisted and rolled under up to 5 G's of force during simulated air-to-air combat over Edwards Air Force Base, California.
The experimental F-16, which the Air Force calls the X-62A VISTA, flies and fights using what the service describes as "machine learning and live agent integration." The Air Force said neither Kendall nor the safety pilot in the back seat of the X-62 touched the plane's controls during the hour-long flight. The X-62 and its AI-powered brain are a joint development of the U.S. Air Force Test Pilot School's research division and the Defense Advanced Research Projects Agency's Air Combat Evolution program.
“While the potential for autonomous air-to-air combat has been imagined for decades, reality has remained a distant dream until now,” Kendall said in an Air Force statement.
According to the Air Force, the VISTA project began four years ago with the basic purpose of letting the F-16 simulate the flight characteristics of other aircraft. But over the years, the project has morphed into what the service claims is the military's first AI-flown fighter.
Kendall boarded the plane alongside a safety pilot.
Will AI fly and fight?
There are other examples of pilotless aircraft, such as a joint DARPA-Army program that flew an H-60 helicopter without a pilot in 2022. But Kendall said every tactic and maneuver the plane chose to perform in the dogfight was decided by its AI system without any human input.
“AI actually takes the most capable technologies you have, combines them, and uses them to solve problems that previously had to be solved by human decision-making. It’s an automation of decisions, and it’s very specific,” Kendall said.
But the arrival of advanced AI in cockpits and other military systems has fueled a growing debate in military and policy circles: Can we trust AI to pull the trigger?
Israel has drawn criticism for introducing AI into decision-making and targeting for airstrikes in Gaza, using a system called Lavender that reportedly identified as many as 37,000 Palestinians as targets in the early weeks of the war.
The United Nations and the International Committee of the Red Cross (ICRC) have issued a joint statement calling on political leaders to establish international rules on autonomous weapons systems.
"Autonomous weapons systems, generally understood as weapon systems that select targets and use force without human intervention, raise serious humanitarian, legal and ethical questions," UN Secretary-General António Guterres and ICRC President Mirjana Spoljaric said in a statement released last October. "In the current security climate, setting clear international red lines would benefit all countries," they added. "Human control must be maintained in life and death decisions."