Sadly, it isn’t very often in the 21st century that we find ourselves applauding the work of American politicians. Their jobs have become ever more self-focused of late; except at election time, when they train their attention on the faults of their opponents rather than on their own policies and agenda. That, in turn, deprives voters of the information they need to make the best decision at the ballot box.
But in the midst of this unbridled partisan hackery, we have found a genuine bit of common-sense cooperation to round out the week.
A group of three Democrats and one Republican have introduced a bill in the House aimed at preventing artificial intelligence systems from progressing to the point where they could autonomously launch a nuclear attack.
The bipartisan lawmakers’ measure would preemptively stymie any future Defense Department policy decisions that could lead to AI being capable of firing off nuclear weapons on its own.
“While U.S. military use of AI can be appropriate for enhancing national security purposes, use of AI for deploying nuclear weapons without a human chain of command and control is reckless, dangerous, and should be prohibited,” Rep. Ken Buck, R-Colo., said this week.
The forethought will be much appreciated.
Their bill would codify existing Pentagon policy that requires a human be “in the loop” for any decisions regarding the use of nuclear weapons.
AI is increasingly becoming a hot topic in Washington as its growing prevalence in society forces lawmakers and officials to reckon with both its positives and its dangers.
Buck indicated that Capitol Hill had some catching up to do in an interview on “America’s Newsroom” on Friday morning. “AI is here and unfortunately, Congress hasn’t been really looking at the issue for a long time,” he said.
The rise of artificial intelligence has been a worrisome one, with many experts calling for a pause on its development so that our legal and ethical systems can catch up with the emerging concerns.