AI’s so-called black box refers to the fact that much of the underlying technology behind machine learning–enhanced algorithms is probabilistic and statistical, with no human-readable explanation of how a given output was reached. Often that’s because the advanced mathematics and data science behind the algorithms are too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn’t that important. They argue that as long as an algorithm generates actionable insights, most clinicians don’t really care what’s “under the hood.” Is that reasoning sound?