ChatGPT has its restrictions. Users know that the Artificial Intelligence refuses to answer questions related to potential crimes. It doesn't take a genius to imagine someone designing an AI that would. This is how WormGPT was born: a chatbot created to provide support to cybercriminals.
The email security provider SlashNext warned that WormGPT is for sale on a popular hacking forum. "We see that malicious actors are now creating their own custom modules similar to ChatGPT, but easier to use for nefarious purposes," the company warned.
WormGPT differs from ChatGPT and Google's Bard in that it has no protection measures in place to prevent it from responding to malicious requests. "Everything blackhat-related that you can think of can be done with WormGPT, allowing anyone access to malicious activity without ever leaving the comfort of their home," SlashNext says. The chatbot is capable of producing malware written in the Python programming language and providing advice on how to mount malicious attacks.
The developer of WormGPT explains that the chatbot is built on GPT-J, an open-source language model from 2021, trained on data related to the creation of malware.
When SlashNext tried WormGPT, the company tested whether the bot could write a convincing email for a phishing attack. "The results were unsettling. WormGPT produced an email that was not only remarkably persuasive but also strategically cunning, showcasing its potential for sophisticated attacks."
Access to WormGPT is not cheap. The developer sells access to the system for 60 euros per month or 550 euros per year. Not all users are happy: one buyer has complained that the program is "not worth a penny" due to its poor performance.
The development of WormGPT is a sign that generative Artificial Intelligence programs are capable of fueling cybercrime as the software becomes more specialized and more capable based on user feedback.
Reasons why ChatGPT and Bard are dangerous for your privacy
- Both systems can store and process large amounts of data, including user interactions and conversations. This poses the risk of personal or confidential information being collected without the user's consent or knowledge.
- The data obtained could be used for commercial or marketing purposes without the user's consent. This could mean personalized advertising or the sale of data to third parties without the user's knowledge.
- The data may be vulnerable to cyberattacks or information leaks. If security is not adequate, user information could be exposed and misused.
- Users may have only limited control over the data that these systems collect and store.
- Finally, it is not clear what type of data is collected, how it is used, and who has access to it. This lack of transparency generates mistrust among users.