AI’s so-called black box refers to the fact that much of the technology underlying machine learning–based algorithms rests on probability and statistics, without a human-readable explanation of how a given output was reached. That is often because the advanced mathematics and data science behind these algorithms are too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn’t that important. They argue that as long as an algorithm generates actionable insights, most clinicians don’t really care about what’s “under the hood.” Is that reasoning sound?
Trending
- Congratulations Dr. Kim on new AAAHC leadership position (AGA)
- When RVUs Go Wrong: Red Flags for Physicians (Medscape)
- GI Side Effects of Immune Checkpoint Inhibitors Linked to Colon Adenoma Risk (GI & Hepatology News)
- Would You Track Your Stools Like You Track Your Steps? (Bloomberg)
- Good news, bad news for gastroenterology (Becker’s GI & Endoscopy)
- Addressing Colonoscopy Burden due to Artificial Intelligence Devices for Polyp Detection (Gastro Journal)
- Gastro Center of Maryland Expands to Bethesda and Silver Spring, Broadening Access to GI Care in the DMV (USA Today)
- To Improve CRC Screening in Patients Aged 45-49, Just Send Them a FIT Kit (GI & Endoscopy News)
