AI’s so-called black box refers to the fact that much of the underlying technology behind machine learning–enhanced algorithms is probabilistic and statistical modeling that lacks a human-readable explanation. Often that is the case because the advanced mathematics or data science behind the algorithms is too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn’t that important. They argue that as long as an algorithm generates actionable insights, most clinicians don’t really care what’s “under the hood.” Is that reasoning sound?