When it comes to consumer chatbots, though, there is still caution, even though the technology is already widely available - and better than many alternatives.

Many doctors believe AI-based medical tools should undergo an approval process similar to the FDA's regime for drugs, but that would be years away. It's unclear how such a regime might apply to general-purpose AIs like ChatGPT.

"There's no question we have issues with access to care, and whether or not it is a good idea to deploy ChatGPT to cover the holes or fill the gaps in access, it's going to happen and it's happening already," said Jain. "People have already discovered its utility. So, we need to understand the potential advantages and the pitfalls."

Bots with good bedside manner

The Emory study is not alone in ratifying the relative accuracy of the new generation of AI chatbots. A report published in Nature in early July by a group led by Google computer scientists said answers generated by Med-PaLM, an AI chatbot the company built specifically for medical use, "compare favorably with answers given by clinicians."

AI may also have better bedside manner. Another study, published in April by researchers from the University of California-San Diego and other institutions, even noted that health care professionals rated ChatGPT answers as more empathetic than responses from human doctors.

Indeed, a number of companies are exploring how chatbots could be used for mental health therapy, and some investors in the companies are betting that healthy people might also enjoy chatting and even bonding with an AI "friend." The company behind Replika, one of the most advanced of that genre, markets its chatbot as "The AI companion who cares."

"We need physicians to start realizing that these new tools are here to stay and they're offering new capabilities both to physicians and patients," said James Benoit, an AI consultant. While a postdoctoral fellow in nursing at the University of Alberta in Canada, Benoit published a study in February reporting that ChatGPT significantly outperformed online symptom checkers in evaluating a set of medical scenarios. "They are accurate enough at this point to start meriting some consideration," he said.

Still, even the researchers who have demonstrated ChatGPT's relative reliability are cautious about recommending that patients put their full trust in the current state of AI. For many medical professionals, AI chatbots are an invitation to trouble: They cite a host of issues relating to privacy, safety, bias, liability, transparency, and the current absence of regulatory oversight.

The proposition that AI should be embraced because it represents a marginal improvement over Dr. Google is unconvincing, these critics say. "That's a little bit of a disappointing bar to set, isn't it?" said Mason Marks, a professor and MD who specializes in health law at Florida State University. He recently wrote an opinion piece on AI chatbots and privacy in the Journal of the American Medical Association. "I don't know how helpful it is to say, 'Well, let's just throw this conversational AI on as a band-aid to make up for these deeper systemic issues,'" he told KFF Health News.

The biggest danger, in his view, is the likelihood that market incentives will result in AI interfaces designed to steer patients to particular drugs or medical services. "Companies might want to push a particular product over another," said Marks. "The potential for exploitation of people and the commercialization of data is unprecedented."

OpenAI, the company that developed ChatGPT, also urged caution. "OpenAI's models are not fine-tuned to provide medical information," a company spokesperson said. "You should never use our models to provide diagnostic or treatment services for serious medical conditions."

John Ayers, a computational epidemiologist who was the lead author of the UCSD study, said that as with other medical interventions, the focus should be on patient outcomes.