Oh, no, see, they will blame the doctor who was treating them and who would normally get a radiologist's report, but instead got an AI report.
AI is actually pretty good at X-ray interpretation, but it does get it wrong, as do radiologists. The safe option is to have the radiologist review the AI output.
The problem is false negatives. Positive reports would still be reviewed before treatment.
AI already has fewer false negatives than humans. Both together is optimal, but at some point you need to prioritize. A doctor looking at scans could instead be treating a patient.
Until the hospital realizes they would be responsible for malpractice suits since there is no doctor to blame.
Yeah right they’ll just lobby the government to make them immune from liability by saying “the free market” will take care of any shortcomings.
Sigh… likely true.