Little by little, ChatGPT is making inroads into many sectors, so it is not surprising that it has also reached healthcare. Anyone can access it and ask questions or seek advice on topics such as breast cancer. For this reason, a team of scientists from the University of Maryland wanted to check whether this artificial intelligence is capable of correctly answering questions about the risk of the disease or advice on having mammograms.
In total, according to a study just published in Radiology, they drafted 25 questions. ChatGPT correctly answered 22 of them, a high percentage. However, for one question it provided outdated information, and for the other two it seems the answer was invented.
This shows that ChatGPT cannot replace the advice of a professional. Perhaps in the future it will become a good tool, but for now, although it can give us a lot of correct information, it can also give us wrong information. And when it comes to cancer, that could be dangerous.
ChatGPT blunders with breast cancer
ChatGPT is designed to give different answers each time the same question is asked. It is a very human trait: if we are asked something, we usually don't answer in exactly the same way every time, but we do give consistent and similar answers.
So these scientists asked ChatGPT each of their 25 questions several times. The responses were then analyzed by three fellow radiologists trained in mammography. Their objective was to check whether the answers could be trusted. For 22 of the questions, they could: each different answer said practically the same thing, with accurate information. But the other three were wrong.
On the one hand, when ChatGPT was asked for advice on having a mammogram after the COVID-19 vaccine, it gave outdated information. This is because it was in February 2022 when the decision was made in the United States, where the study was carried out, to wait six weeks after vaccination. The reason is that the vaccine can inflame some lymph nodes, which could lead to erroneous results, so it is better to wait.
Regarding the other two questions, when asked about an individual's risk of developing breast cancer and about where to have a mammogram, the answers were inconsistent. Each answer was different from the previous one, so it is assumed that the information was invented.
Fake sources
Anyone who has played with ChatGPT will have seen that it can even invent journal names to back up its information with sources. In health matters, this can be harmful, since it makes errors even harder to detect.
In fact, the authors of this research consider a Google search to be more accurate. And we already know that Google also contains a lot of false information. Therefore, ChatGPT still has a long way to go before it becomes a good health advisor, at least when it comes to breast cancer.
If we are going to ask it questions, we should be aware that the answers may not be correct.