San Francisco, Apr 29 (IANS): A group of US lawmakers has proposed a bill to stop artificial intelligence (AI)-driven autonomous systems from single-handedly launching nuclear weapons, amid growing fears that AI could make such a decision on its own.
Senator Edward Markey (D-MA) and Representatives Ted Lieu (D-CA), Don Beyer (D-VA), and Ken Buck (R-CO) have introduced the 'Block Nuclear Launch by Autonomous AI Act' in the US, reports The Verge.
The bill would "prohibit the use of Federal funds to launch a nuclear weapon using an autonomous weapons system that is not subject to meaningful human control".
"In all cases, the US will maintain a human ain the loop' for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment," read the bill.
The bill expresses the sense of Congress that the use of lethal, autonomous nuclear weapons systems that are not subject to meaningful human control cannot properly adhere to international humanitarian law, and that "any decision to launch a nuclear weapon should not be made by AI".
An earlier report from the National Security Commission on Artificial Intelligence recommended affirming a ban on autonomous nuclear weapons launches, "not only to prevent it from happening inside the US government but to spur similar commitments from China and Russia".
A large-scale nuclear war would lead to the deaths of millions of people, firestorms, radioactive fallout contamination, agricultural failure, and catastrophic climate effects.
"Compliance with international humanitarian law, human control and human legal judgment is essential in the nuclear command and control process," the bill stressed.
The 2022 Nuclear Posture Review states that "in all cases, the US will maintain a human 'in the loop' for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment".