I was watching the Paul Chang interview on AM today. He explains that in order for us to consume AI, we will have to develop the proper infrastructure. By which he means we need to think up ways for us human rads to interface with the machines. Only then can AI truly ASSIST us, he says.
He acknowledges that it will be hard. I'd go on a tangent and ask, Why bother?
Why do we keep looking at this as though we will be overseeing AI? Don't you think it'd be redundant? I mean, we'd be doing the same darn thing. Even worse, it could become a case of the idiot supervising the master. A more likely scenario is one where we do different jobs. Whoever does it better. For instance, AI does the screenings while we do the diagnostics, like Hospital-rad said in the other thread. Over time, though, AI will take on more roles. In essence, it will REPLACE us, little by little.
Finding ways for us to "interact" is an exercise in the superfluous. It's also unwieldy. The path of least resistance is to just let it take over the task. Now mind you, I'm not advocating for this. All I'm saying is that it'd be much easier than thinking up awkward ways to interact.
Doing so would be like thinking up creative ways to bring the ugly roommate along. The proper question isn't "how do we go about this?" It's "why bother?"
The year is 2060.
At the end of a dimly lit hallway, an old feeble man sits in a rocking chair.
Rocking back and forth.
Back and forth.
Mumbling through his edentulous gums:
"AI is going to replace us"
AI would be more worrisome if there were a uniform, widely accessible single database of medical information. But as we are all painfully aware, everything is in silos. And that problem ain't going away anytime soon. After all, patient data is the most valuable commodity "owned" by consolidated health enterprises, and they're not just giving it up to IBM and the big data players.
Not saying AI won't have an impact...it's an attractive business model for corporate gigs to separate the wheat from the chaff and triage exams, or to allow the introduction of midlevels into interpretive roles by clueless hospital admins.
But it is not the panacea it’s made out to be...
May turn out to be the catalyst for formalization of a two-tier health care system....if the Walmart/CVS/Amazon clinic run by AI and PE overlords finds some "problem," you get triaged to the second tier, financed by a combination of your employer and the federal government.
Learning to perform and interpret radiographs is "hard".
Learning to perform and interpret barium studies is "hard".
Learning to perform and interpret sonograms is "hard".
Learning to perform and interpret pneumoencephalograms was "hard".
Learning to perform and interpret angiograms is "hard".
Learning to perform angioplasty is "hard".
Learning to interpret CTs is "hard".
Learning to interpret MRIs is "hard".
Learning to interpret scintigrams is "hard".
Learning to interpret PET/CTs is "hard".
Learning to interpret PET/MRIs is "hard".
Learning to use PACS instead of film is "hard".
Learning to use EMRs is "hard".
Learning to use Speech Recognition is "hard".
The anatomy of our patients and their disease processes, pathology, and so on haven't particularly changed, but our armamentarium has grown over the 100+ years of Radiology's existence.
The abject terror and hostage-mentality inspired in SOME of us by AI is foolish and self-defeating. AI is a TOOL, nothing more, nothing less. A very powerful tool that we have to learn to use and adapt to help us best serve our patients.
AI has great potential, and it needs to be developed by those who actually know what the damn machine is looking at. Petulant tropes about how "AI is going to do what we do but do it better" are short-sighted and self-defeating.
It’s scaring the hello out of medical students
Decreasing the appeal of radiology
..... and creating considerable job security for us for the foreseeable future