
claushauler t1_jdhkvnv wrote

The resulting malpractice lawsuits will be insane. Hospitals will need to carry insurance indemnifying them for potential billions.

−2

SgathTriallair t1_jdhrmw2 wrote

They'll just make it a tool doctors use. The doctor will take your symptoms, ask relevant questions, feed them to the AI, which will spit out a diagnosis, and then the doctor will read off what the AI said.

There will be websites that say things like "this is for informational purposes only, please seek actual medical assistance in an emergency" while walking you through the process of performing surgery.

10

claushauler t1_jdhsuxu wrote

It doesn't really matter if the diagnosis is delivered by proxy. There will be grave errors as a result, and the resulting litigation will be endless.

−7

SgathTriallair t1_jdhvgk3 wrote

Doctors will still bear the brunt of the liability. If the AI is more effective, there will be less liability to go around, and so fewer malpractice suits than there are today.

8

SoylentRox t1_jdinjmy wrote

Doctors commit grave errors now. They won't switch to AI until it is substantially better than humans.

3

FTRFNK t1_jdiireh wrote

If a doctor currently has a misdiagnosis rate of 15% and an AI tool drops that to 5%, it stands to reason that there will be FEWER malpractice suits, not more. You're giving the current medical establishment way too much credit for what they catch versus what they currently miss.

4

SoylentRox t1_jdinpql wrote

Yep. Plus, in that 5 percent category, the AI could be designed to inform the supervising doctor when it has low confidence, so that humans can augment it where possible.

3

econpol t1_jdke9wy wrote

Insurance will require doctors to use these tools.

2