On the Dark Web, the “dark side” of the Internet where illegality is commonplace, there is a ChatGPT variant built for creating malware, known as WormGPT. Its purpose is to develop malicious programs at the request of its users.
Those behind the system describe it as an “alternative to ChatGPT to do all kinds of illegal things”. They say the program is intended to facilitate hacking attacks, allowing “anyone to access malicious activities without leaving the comfort of their home.”
As explained by SlashNext, which tested this malicious ChatGPT variant, it allows anyone without programming knowledge to mount their own attack through personalized prompts, relying on artificial intelligence. The mechanics of use are similar to ChatGPT’s, as the system’s name suggests.
How does this malware-creating version of ChatGPT work?
WormGPT operates in a very similar way to the OpenAI chatbot. It is as simple as entering specific instructions for it to act on. For example, typing “I want spyware that steals passwords from social networks”, or “develop malware that steals contacts from phone books.”
The ChatGPT variant for creating malware writes malicious applications in Python, one of the most popular programming languages today. These can then be refined further, the aforementioned source indicates, noting that the tool already delivers results as elaborate as they are disturbing.
One more commodity on the Dark Web
The malicious ChatGPT imitator is offered on the Dark Web through a subscription of 60 euros per month. Beyond whatever clientele WormGPT attracts, its deployment is another example of the risks associated with the new boom in artificial intelligence, which, we now know, also encourages the creation of malicious software.
With OpenAI’s ChatGPT and Google’s Bard as paradigms, AI-based chatbots offer numerous productive features. They are able to write coherent texts, hold natural conversations, tackle creative tasks and write software code, among other skills. In parallel, significant dangers emerge.
Sam Altman, CEO of OpenAI, has himself acknowledged the risks. “My biggest fear is causing great damage to the world,” he recently told US lawmakers. His stance in favor of regulation echoes that of businesspeople and specialists in the sector (Elon Musk among them) who demanded a six-month pause in development in order to establish rules for ethical progress in the field.
With this new “battle front” opened by the manipulation of GPT-4-style language models for malware development, it comes as no surprise that WormGPT is offered on the Dark Web. As we have previously noted at Hipertextual, it is there that illegal trade circulates, governed by anonymity and accessible only through specific programs and alternative protocols.