The risk of an artificial pandemic is increasing as knowledge and technological barriers erode. While lowering knowledge barriers is often beneficial, it becomes dangerous when the knowledge in question could harm the global population.
An experiment conducted by researchers at the Massachusetts Institute of Technology (MIT) revealed that undergraduates were able to manipulate AI chatbots into suggesting four potential pandemic pathogens and providing information on how to synthesize them from synthetic DNA. The chatbots even recommended DNA synthesis companies that would be less likely to screen orders. These findings are alarming, as they demonstrate how easily a bad actor could gather the information needed to create a potent bioweapon.
Can Chatbots Actually Help Bad Actors Build Deadly Pathogens?
AI chatbots may refuse to provide direct guidance on creating deadly pathogens, but they can still be a valuable resource in the process. With pointers to the right biology textbooks and publicly available studies, it is possible to gain the necessary knowledge relatively quickly.
However, knowledge alone is not enough to start a pandemic. In many cases, genetic engineering is required to make a pathogen more dangerous and transmissible. Bioweapons themselves are not new: the Mongols catapulted plague victims into besieged cities, and European settlers gave Native Americans smallpox-infected blankets.
Advancements in biology have made it easier than ever to directly modify pathogens, increasing the risk of artificial pandemics. Genetic engineering still requires expensive technology: there are currently 69 Biosafety Level 4 (BSL-4) laboratories worldwide equipped to handle the most dangerous pathogens. As that number grows, however, so does the risk of a lab leak or of a bad actor releasing a pathogen.
Some experts believe the COVID-19 pandemic may have resulted from a lab leak, although direct evidence is unlikely ever to surface. Even pathogens that cannot infect humans can pose a significant threat, as shown by the concerns of US farmers about a new BSL-4 lab being built in their area to study communicable animal diseases.
You can use GlobalBioLabs.com's interactive map to locate all BSL-4 labs worldwide.
The Sunshine Project has suggested that simpler biotechnology setups, originally intended for medical, pharmaceutical, or other research, could be repurposed to create pathogens. Some of these techniques could even be carried out at home with the right knowledge, synthetic DNA from a synthesis company, and a few pieces of relatively expensive equipment. As technology and knowledge in the field advance, home laboratories may become increasingly capable of synthesizing highly dangerous pathogens.
Artificial Pandemics Aren’t the Only Rising Threat
Bioengineering is not the only field where a better-informed public raises concerns. A population that knows how to engineer thermonuclear weapons sounds like a worst-case scenario, but building a nuclear weapon remains an extremely difficult task: it requires a large quantity of special fissile material, which is hard to produce or mine in quantity, along with the knowledge of how to build the device itself.
Deep knowledge of chemistry can also be harmful in the wrong hands. Someone with a strong chemistry background could potentially build large bombs from entirely legal materials. The most obvious explosive precursors are relatively well controlled, but that may not be enough to stop intelligent bad actors. Such knowledge could also enable the engineering of simple yet extremely dangerous chemical weapons, which would likely be easier to make than powerful explosives. Illegal drug manufacturers, likewise, rely on chemistry expertise or detailed knowledge of particular synthetic pathways.
Finally, knowledge of computer science and hacking could be extraordinarily useful to bad actors. Cybercrime is already a significant global problem: a hacker attack occurs every 39 seconds, according to the University of North Georgia, and reports estimate that digital threats cost businesses some $400 billion every year. It would not be surprising if AI chatbots could be convinced to design, or even directly write, harmful code.
What Can Be Done to Stop AIs From Helping Terrorists?
As the potential for AI chatbots to be misused becomes clearer, it is important to consider what can realistically be done to mitigate the risk. It is unlikely, however, that the advancement of AI technology will be halted simply to accommodate these concerns.
One potential solution is tighter regulation of the DNA synthesis companies and biotech firms that supply the technology needed to build pathogens, including more rigorous order-screening processes. On its own, however, this is unlikely to be a comprehensive fix.
A more effective approach may be strong regulation requiring AI tools to incorporate safeguards against disseminating dangerous information. Chatbots should not help anyone find DNA synthesis companies with lax screening measures. AI systems may need to become far better at self-policing to combat this issue effectively.
Disclaimer: This is a paid release that was not written by Crypto Online News. The statements, views and opinions expressed in this column are solely those of the content provider and do not necessarily represent those of Crypto Online News. Crypto Online News does not guarantee the accuracy or timeliness of information available in such content. Do your research and invest at your own risk.