The departure in protest of one of Google’s artificial intelligence (AI) pioneers has caused quite a stir, reviving our old fears about technology.
What happens when you play God with science? That’s what Mary Shelley wondered in her novel “Frankenstein; or, The Modern Prometheus.” The protagonist is Victor Frankenstein, a young genius who creates a living being in his laboratory. But the experiment goes wrong and the monster slips out of his hands. Poor Victor doesn’t know how to control his creation and ends up losing his mind. It’s a horror story that makes you think about the limits of science. Will Geoffrey Hinton be the Doctor Frankenstein of this era?
Geoffrey Hinton has left his job at Google to warn of the risks posed by this technology. Hinton, 75, is known as a “father” of AI, as he developed the technology at the heart of chatbots like ChatGPT. However, he now regrets his contributions to the field, saying that if he hadn’t done the work, someone else would have. What happened?
TRUE. Geoffrey Hinton is an artificial intelligence (AI) expert who has left Google to warn of the dangers of this technology. In an interview with The New York Times, he said that AI is advancing very quickly and could cause problems for society: the internet could be flooded with falsehoods in the form of text, photos and videos; AI could take many people’s jobs; and it could even become smarter than we are. Of course, it all depends on what we teach the AI. If we feed it junk information, it will give us junk back. And if we don’t protect people, many could be left out of work.
Of course, Hinton isn’t the only one warning about the dangers of AI. Other experts and intellectuals have called for a moratorium on the development of this technology in order to reflect on its ethical and legal implications. Hinton believes research should be held back until it is well understood how to control AI and prevent it from falling into the hands of bad actors. In this case, we cannot shoot first and ask questions later. This technology is so powerful that regulation has to be established before, not after, something breaks.
Hinton is a leading authority on AI, which is why he was given the Princess of Asturias Award for Scientific and Technical Research last year, along with three colleagues. But now he has realized that AI can be very good or very bad, depending on how it is used. That’s why he has left Google and started warning everyone that we have to be careful with a science that can change everything.
Silicon Valley is the paradise of technological innovation, but also the hell of competition. The companies that operate there are in the habit of launching their products early, without waiting for them to be well tested and regulated. Their goal is to get there before their rivals and make big profits. But this attitude, although understandable, is extremely dangerous. What if the technology they release has bugs, risks, or negative consequences for society? Who is responsible? Regulators usually act too late, when the damage is already done. For this reason, Silicon Valley needs to be more prudent and responsible with its creations. Technology can be a blessing or a curse, depending on how it is used.
Daedalus was a genius who knew how to invent and build incredible things. One day, King Minos asked him to make a giant labyrinth to contain a fearsome creature with the head of a bull and the body of a man: the Minotaur. Daedalus built the labyrinth so well that neither he nor his son Icarus could get out of it.
Then Daedalus had another idea: to make wings of wax and feathers so they could fly out. He warned Icarus not to be reckless and to fly carefully: not too close to the sun, but not too close to the sea either. But Icarus got so excited about his new toy that he forgot what his father had told him. He flew so high that the sun melted his wings, and he plunged into the sea. Thus ended the adventure of Icarus, all for not paying attention. Is Silicon Valley the Icarus of our era?
Silicon Valley is the place where the wonders of technology are invented and developed. Things are created there that make our lives easier and more fun, such as mobile phones, social networks and artificial intelligence. But you also have to be very careful with what you do there. You can’t play lightly with technology without thinking about the consequences. You can’t fly that high without considering the risks. You cannot ignore the advice of the experts who warn us of the dangers. If Silicon Valley doesn’t listen, it could end up like Icarus: falling into the void.
AI technology is great, but let’s not lose our heads. This is not about throwing everything away and going back to the caves. But neither is it a good idea to swallow everything that comes out of Silicon Valley without chewing. We have forgotten that technology exists to make our lives easier, not harder or more unfair. We must not become puppets of technology. If we don’t keep that tiger under control, it will eat us up.
Geoffrey Hinton obviously knows what he’s talking about. He has left Google to alert us to the risks of this technology. He tells us that AI can be very good or very bad, depending on how it’s used. He asks us to be prudent and responsible with a science that can change the world. I hope he doesn’t become a Cassandra, the Trojan prophetess nobody believed. Hopefully we will listen to him and not regret it later. Hopefully it’s not too late.
Disclaimer: The information and/or opinions expressed in this article do not necessarily represent the views or editorial line of Cointelegraph. The information presented here should not be taken as financial advice or an investment recommendation. All investments and trades involve risk, and it is the responsibility of each person to do their own research before making an investment decision.