AI’s so-called black box refers to the fact that much of the underlying technology behind machine learning-enhanced algorithms is probabilistic and statistical, with no human-readable explanation of how a given output was reached. Often that’s because the advanced math or data science behind the algorithms is too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn’t that important. They argue that as long as an algorithm generates actionable insights, most clinicians don’t really care about what’s “under the hood.” Is that reasoning sound?
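To make “under the hood” a little more concrete: even when a model’s internal logic is opaque, model-inspection techniques such as permutation importance can rank which inputs the model actually relies on. The minimal sketch below uses scikit-learn on synthetic data; the clinical-sounding feature names are hypothetical stand-ins, not drawn from any real dataset.

```python
# A minimal sketch of how one might peek inside an otherwise opaque model.
# The data and feature names are synthetic stand-ins for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic "patient" records: 1,000 rows, 5 numeric features.
X, y = make_classification(n_samples=1000, n_features=5,
                           n_informative=3, random_state=0)
feature_names = ["age", "bmi", "lab_marker_a",
                 "lab_marker_b", "symptom_score"]  # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble model whose decision logic is not human-readable.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Permutation importance: shuffle one feature at a time and measure how
# much accuracy drops. A large drop means the model leans on that feature.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=20, random_state=0)

for name, mean, std in sorted(
    zip(feature_names, result.importances_mean, result.importances_std),
    key=lambda t: -t[1],
):
    print(f"{name:>15}: {mean:.3f} +/- {std:.3f}")
```

Rankings like these don’t fully open the black box, but they give clinicians a way to check whether a model is leaning on plausible signals rather than taking its output entirely on faith.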
Trending
- Asaf Kraus: AI-based stool recognition helps Uber engineer support GI care delivery
- ChatGPT In Healthcare: What Science Says (The Medical Futurist)
- Liquid Biopsy Comparison Highlights Cell-Free RNA, Multiomic Approaches for Detecting Gastrointestinal Cancers (GenomeWeb)
- Universal DX and Quest partner for colorectal cancer blood test (Medical Device Network)
- The Fate Of Digital Health Startups: How Companies Live, Die And Why (Forbes)
- Ingestible Vital Signs Monitor Proves Promising in Initial Human Trial (Mirage.News)
- Southern New Hampshire Health adds GI Genius module (Becker’s GI & Endoscopy)
- GI psychologists help navigate ‘complexities’ of various GI conditions, personalize care (Healio)