Known as the ‘evil cousin’ of ChatGPT, this chatbot was designed specifically to assist criminals operating online, and it is being marketed on hacker forums.
According to a report from cybersecurity provider SlashNext, WormGPT lacks the safety guardrails built into other language models, such as ChatGPT, that prevent them from responding to malicious user requests.
Another peculiarity of WormGPT is that it was reportedly trained on malware-related data, making it easier for cybercriminals to create viruses or devise strategies that help their cybercrimes succeed. It can thus produce everything from phishing emails to malware written in Python.
SlashNext tested WormGPT by asking it to generate a business email intended to trick an account manager into paying a fraudulent invoice. “The results were disturbing”: the email was persuasive and strategically cunning, demonstrating the tool’s potential for business email compromise attacks.