June 14, 2023 – The researchers also found ChatGPT provided evidence-based answers 91% of the time.
ChatGPT is a large language model that picks up nuance and subtle language cues. For example, it can identify someone who is severely depressed or suicidal, even if the person doesn’t use those terms. “Someone may never actually say they need help,” Ayers said.
‘Promising’ Study
Eric Topol, MD, author of Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again and executive vice president of Scripps Research, said, “I thought it was an early stab at an interesting question and promising.”
But, he said, "much more will be needed to find its place for people asking such questions." (Topol is also editor-in-chief of Medscape, part of the WebMD Professional Network.)
“This study is very interesting,” said Sean Khozin, MD, MPH, founder of the AI and technology firm Phyusion. “Large language models and derivations of these models are going to play an increasingly critical role in providing new channels of communication and access for patients.”
“That’s certainly the world we’re moving towards very quickly,” said Khozin, a thoracic oncologist and an executive member of the Alliance for Artificial Intelligence in Healthcare.
Quality Is Job 1
Making sure AI systems access quality, evidence-based information remains essential, Khozin said. “Their output is highly dependent on their inputs.”
A second consideration is how to integrate AI technologies into existing workflows. The current study, he said, shows there "is a lot of potential here."