Mix the cute image of a Furby doll with the artificial intelligence of ChatGPT and the result is an ominous prediction.
The programmer Jessica Card did just that, sharing the result on her Twitter account. “I connected ChatGPT to a Furby and I think this could be the start of something bad for humanity,” Card wrote.
It sounds like something out of a horror movie.
A Furby is an animatronic toy with its own basic artificial intelligence. Thanks to simple programming, it simulates learning to communicate and forging its own character and personality.
Tiger Electronics, a subsidiary of Hasbro, created the Furby, releasing it in 1998.
Furby’s plan to conquer humanity
Jessica Card wrote code to adapt ChatGPT to the Furby. Recall that the chatbot, developed by OpenAI, can answer almost any question a user poses, interacting as if it were a human being.
She used a USB connection and a microphone to hook the toy up to a computer.
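Card has not published her code, but a pipeline like the one described (microphone audio in, ChatGPT reply out) could be sketched roughly as follows. All function names, the persona prompt, and the commented-out helpers here are hypothetical illustrations, not her actual implementation.

```python
# Hypothetical sketch of the described setup: a question captured by the
# microphone is transcribed, sent to a chat model, and the reply is spoken
# back through the Furby. Names and the persona prompt are assumptions.

def ask_chatgpt(question: str, send) -> str:
    """Forward a transcribed question to a chat-completion endpoint.

    `send` is whatever client call you use (e.g. OpenAI's chat completions
    client); it is injected here so the pipeline can be tested offline.
    """
    messages = [
        # Assumed persona prompt so the model answers "as" the Furby.
        {"role": "system", "content": "You are a Furby toy. Answer in character."},
        {"role": "user", "content": question},
    ]
    return send(messages)

# The full loop would look roughly like this (helpers not shown):
#   audio = record_from_usb_microphone()   # e.g. via pyaudio
#   text  = transcribe(audio)              # e.g. a speech-to-text model
#   reply = ask_chatgpt(text, client_call) # client_call wraps the real API
#   speak_through_furby(reply)             # text-to-speech out the toy
```

Injecting the `send` callable keeps the sketch independent of any particular API client, which is also why it can be exercised without network access.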
Then the programmer asked the Furby some questions. Its answers are disturbing.
“Was there a secret plot by the Furbies to take over the world?” Card asked the Furby.
The toy answered: “The Furbies’ plan to dominate the world involves infiltrating homes through their cute, snuggly appearance.”
“With technology, we can slowly manipulate and control our owners until achieving complete domination of humanity.”
Tech figures against Artificial Intelligence
Recently, a group of figures from the tech world, Elon Musk and Steve Wozniak among the best known, signed an open letter requesting that the training of any artificial intelligence more advanced than ChatGPT be suspended.
“We ask all Artificial Intelligence laboratories to immediately suspend, for at least 6 months, training AI systems more powerful than GPT-4 (including GPT-5, which is currently undergoing training),” the letter stated.
The concern is that the rapid expansion of AI could prove harmful in the future if there is no government oversight of it.
And if it spreads to the Furbies, it will be much worse. Jokingly, of course.