GPT-4 developer tool can be exploited for misuse with no easy fix
OpenAI’s developer tool for its GPT-4 large language model can be misused to trick the AI into providing information that could aid would-be terrorists, and fixing the problem won’t be easy