Every large company in the Information Technology industry has its share of strange episodes, and Facebook, now Meta, has obviously not escaped such twisted anecdotes.
Leaving aside the whole mess with its metaverse, a topic that seems an inexhaustible source of material on the evolution of a project that, even with all the money in the world, still faces an uncertain future.
But back at the end of the last decade, the company followed the trend of many others in the same field, with a marked obsession with developing Artificial Intelligence systems.
The Internet of Things (IoT) was presented as the next great innovation that would revolutionize the market, and firms such as Google and Facebook itself focused on creating systems that would be ready for that moment of change.
The reality, as we have already seen, is that we are still some way from that envisioned moment, and now Mark Zuckerberg is investing his dollars in more… virtual projects.
But at least he left us a curious anecdote for history: the creation of two chatbots that had to be shut down for inventing their own language.
This is the story of Alice and Bob: the chatbots that Facebook disconnected for creating their own language
In 2017, our friends at Analytics India Magazine published an extensive and very interesting article recapping the tragedy of Alice and Bob at Facebook.
Curiously, interest in this episode has resurfaced recently, with some local and European media picking up the story.
Broadly speaking, during this AI fever, Mark Zuckerberg and Facebook started a project of this kind, focused on the development of two advanced chatbots that could communicate with each other. These bots were named Alice and Bob.
Then came the point where Facebook reportedly pulled the two of them offline, after they started talking to each other in a language they had made up.
At the time, the media (ourselves included) covered the news in an almost comical tone, as if Zuckerberg had had to kill both chatbots before they could set the Terminator plot in motion. But the story is much simpler and, for that very reason, sadder.
It so happens that Facebook’s Artificial Intelligence Research group (FAIR) had started a new stage of the project, in which they tried to test whether the two chatbots could negotiate with each other.
But it was then that they realized they had not programmed Alice and Bob to stick to the basic rules of English grammar, so the pair began to “negotiate” with dialogue like this:
Bob: “I can can II everything else.”
Alice: “Balls have zero to me to me to me to me to me to me to me to me to.”
So in the end, the whole thing failed because of a simple oversight in how the evolution of the tests had been planned from the start. Nothing radically dangerous for humanity.