In a March 2022 paper in Nature Machine Intelligence, researchers at a US pharmaceutical company building artificial intelligence systems for virtual drug discovery issued a wake-up call to their colleagues: the same systems could easily be misused. The opinion article "When All Research Is Dual Use" takes up this warning in more detail, calling for dual-use risks to be considered as early as possible in the research process and for that consideration to be broadened to include social dimensions.
After years of working on a suite of models to improve toxicity prediction, the researchers were invited to an international security conference to present on how such models could be misused to create chemical and biological weapons, a possibility they had not previously considered even though they had worked with neurotoxins and Ebola. "The thought had never previously struck us," they wrote. By simply redirecting their models to search for molecules with greater toxicity rather than less, and running the trained algorithm for under six hours, the researchers generated 40,000 likely lethal molecules, including the nerve agent VX and many new molecules predicted to be even more potent than known chemical warfare agents. "We were naive in thinking about the potential misuse of our trade," the researchers wrote. "We are not trained to consider it."
In "When All Research Is Dual Use," Sam Weiss Evans argues that education and training alone are not enough to raise researchers' awareness of dual-use risks. What is needed is the recognition that science is a social system. In this complex social system, which questions get asked, such as whether an AI tool could be used for ill, depends as much on institutional culture, economics, politics, and ethics as on the science itself. The knowledge and technologies this system creates are products of the contexts in which they are made, and they will bring about new ways of harming as well as helping.