New existential night terror unlocked.
In case you were existentially worried about AI-developed bioweapons, your fears may not be entirely unwarranted.
Earlier today, President Joe Biden unveiled his much-anticipated executive order on AI: a very long, though not particularly solution-driven, list of AI concerns and risk mitigation suggestions, organized into eight distinct areas of interest. The first of these categories, and perhaps the most pointed, is a section on "new standards for AI safety and security," which claims to direct the "most sweeping actions ever taken to protect Americans from the potential risks of AI systems."
To be sure, all of the delineated risks are serious: AI-abetted scams, misinformation threats, and cybersecurity concerns, to name a few. And maybe it's just the lasting trauma of the global pandemic talking, but one of these risks jumps off the page more than most: a warning that the US must protect against the use of AI to "engineer biological materials." In other words, bioweapons.
To counter the threat of dangerous AI-designed bioweapons, the White House explained in the order that it will develop "strong new standards for biological synthesis screening," adding that "agencies that fund life-science projects will establish these standards as a condition of federal funding, creating powerful incentives to ensure appropriate screening and manage risks potentially made worse by AI."
Though neither of these mitigation tactics is a bad idea, per se, both remain pretty vague, and arguably not anxiety-quelling. AI has widely been touted as a powerful agent of drug discovery, with the first-ever drug designed by AI already undergoing human clinical trials. But while some researchers are using AI to find gaps in our biological knowledge that could further the production of helpful medications and therapies, it's feasible that bad actors could use similar techniques to more quickly and efficiently develop complex and dangerous viruses, pathogens, and other harmful substances.
Biden isn't the first person to draw attention to the issue. Back in July, Anthropic CEO Dario Amodei warned members of the Senate Judiciary Committee that today's AI tools could already lower the barrier of expertise historically required to make such weapons. The topic is also slated for discussion at the UK's upcoming AI summit, The Guardian reports.
In short, the potential for bioterrorism by way of AI discovery is considered by many to be a very real risk. And though we're glad that various governments are discussing the threat, until Biden personally bestows a bioweapon-repelling talisman upon us, we'll be fighting the AI bioweapon night demons in our mind's eye.
More on Biden's AI order: Joe Biden's Executive Order on AI Is Expansive, But Vague