AI’s so-called black box refers to the fact that much of the underlying technology behind machine learning-enhanced algorithms is probabilistic and statistical, with no human-readable explanation of how a given output was reached. Often that’s because the advanced math or data science behind the algorithms is too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn’t that important. They argue that as long as an algorithm generates actionable insights, most clinicians don’t really care about what’s “under the hood.” Is that reasoning sound?
Trending
- 7 HLTH Announcements You Don’t Want to Miss (MedCity News)
- Practical Tips for Contracting, Part 2 (GI & Endoscopy News)
- GRAIL PATHFINDER 2 Results Show Galleri® Multi-Cancer Early Detection Blood Test Increased Cancer Detection More Than Seven-Fold When Added to USPSTF A and B Recommended Screenings (GRAIL)
- Eliminating Cost Sharing Boosted Follow-Up Colonoscopy Rates (AJMC)
- St. Charles Health System Taps WovenX to Transform GI Access and Optimize Capacity (PR Newswire)
- United Digestive adds 5 gastroenterologists in 3 months (Becker’s GI & Endoscopy)
- Negotiated Prices for Large Insurers and Regional Differences in Employed Cardiology and GI Groups Revealed (Medscape)
- Health care in the USA: money has become the mission (The Lancet)
