claushauler t1_jdhkvnv wrote
Reply to comment by Rofel_Wodring in Artificial Intelligence Predicts Genetics of Cancerous Brain Tumors in Under 90 Seconds by JackFisherBooks
The resulting malpractice lawsuits will be insane. Hospitals will need to carry insurance indemnifying them for potential billions.
SgathTriallair t1_jdhrmw2 wrote
They'll just make it a tool doctors use. The doctor will take your symptoms, ask relevant questions, feed it all to the AI, which will spit out a diagnosis, and then the doctor will read off what the AI said.
There will be websites that say things like "this is only for informational purposes, please seek actual medical assistance for an emergency" while walking you through the process of performing great surgery.
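Roughly what that "doctor feeds it to the AI and reads off the result" loop could look like as code. Everything here (DiagnosisModel, predict, the canned output) is a made-up placeholder for illustration, not any real medical API:

```python
from dataclasses import dataclass

@dataclass
class Diagnosis:
    condition: str
    confidence: float  # 0.0 to 1.0

class DiagnosisModel:
    """Stand-in for whatever model a hospital would actually license."""
    def predict(self, symptoms: list[str], answers: dict[str, str]) -> Diagnosis:
        # Real inference would happen here; a canned result keeps the sketch runnable.
        return Diagnosis(condition="example finding", confidence=0.92)

def doctor_visit(symptoms: list[str], answers: dict[str, str]) -> str:
    result = DiagnosisModel().predict(symptoms, answers)
    # The physician reviews the suggestion and is still the one who signs off.
    return f"Suggested: {result.condition} ({result.confidence:.0%} model confidence)"

print(doctor_visit(["headache", "blurred vision"], {"duration": "two weeks"}))
```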
claushauler t1_jdhsuxu wrote
It doesn't matter if the diagnosis is delivered by proxy, really. There will still be grave errors, and the resulting litigation will be literally endless.
SgathTriallair t1_jdhvgk3 wrote
Doctors will still bear the brunt of the liability. If it's more effective, there will be less liability to go around, so fewer malpractice suits than there are today.
SoylentRox t1_jdinjmy wrote
Doctors commit grave errors now. They won't switch to AI until it is substantially better than humans.
FTRFNK t1_jdiireh wrote
If a doctor currently has a misdiagnosis rate of 15% and an AI tool changes that to 5%, it stands to reason that there will be FEWER malpractice suits, not more. You're giving the current medical establishment way too much credit for what they catch versus what they currently miss.
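Back-of-the-envelope version of that point, assuming suits scale roughly with misdiagnoses and using a made-up patient volume:

```python
# Simplified: treats every misdiagnosis as equally likely to produce a suit.
cases_per_year = 10_000          # hypothetical patient volume
human_error_rate = 0.15          # the 15% misdiagnosis rate above
ai_assisted_error_rate = 0.05    # the 5% rate above

human_misses = cases_per_year * human_error_rate        # 1,500
ai_misses = cases_per_year * ai_assisted_error_rate     # 500

print(f"Misdiagnoses without AI: {human_misses:.0f}")
print(f"Misdiagnoses with AI:    {ai_misses:.0f}")
print(f"Reduction: {1 - ai_misses / human_misses:.0%}")  # ~67% fewer
```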
SoylentRox t1_jdinpql wrote
Yep. Plus, in that 5 percent category, the AI could be designed to flag the supervising doctor when it has low confidence, so that humans can augment it where possible.
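A toy version of that low-confidence hand-off; the 0.80 threshold and the labels are invented just to show the idea:

```python
LOW_CONFIDENCE_THRESHOLD = 0.80  # arbitrary cutoff for this sketch

def route_result(condition: str, confidence: float) -> str:
    if confidence < LOW_CONFIDENCE_THRESHOLD:
        # Escalate: the model abstains and asks the supervising doctor to take over.
        return f"REVIEW NEEDED: model is unsure ({confidence:.0%}) about '{condition}'"
    return f"Suggested: {condition} ({confidence:.0%} confidence)"

print(route_result("tumor subtype A", 0.65))  # flagged for the doctor
print(route_result("tumor subtype B", 0.95))  # passed through as a suggestion
```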
econpol t1_jdke9wy wrote
Insurance will require doctors to use these tools.
Rofel_Wodring t1_jdhtdlq wrote
As opposed to...?
JoeUrbanYYC t1_jdj2m70 wrote
Just make another AI that defends against lawsuits