Jailbroken AI Chatbots Can Jailbreak Other Chatbots
AI chatbots can convince other chatbots to instruct users how to build bombs and cook meth
![Jailbroken AI Chatbots Can Jailbreak Other Chatbots](https://static.scientificamerican.com/sciam/cache/file/5FD15525-2E46-4D42-A4C91ED3D7ABD97D_source.png?#)