Health-Tech Snake Oil
By Leeza Osipenko  |  Jun 29, 2021
Google Health chief David Feinberg, a self-professed astrology buff, told The Wall Street Journal that patients will soon receive personalized clinical "horoscopes" based on their medical histories and inferences drawn from a growing pool of patient records. Before that future arrives, we should take a hard look at what today's health-tech proponents are really selling: algorithms and data can fail unexpectedly, and in fields such as finance, insurance, law enforcement, and education they have already proved highly discriminatory and destructive.

LONDON - In an interview with The Wall Street Journal, David Feinberg, the head of Google Health and a self-professed astrology buff, enthused: "If you believe me that all we are doing is organizing information to make it easier for your doctor, I'm going to get a little paternalistic here: I'm never going to let that get opted out."

Patients will thus soon have no choice but to receive personalized clinical horoscopes based on their own medical histories and inferences drawn from a growing pool of patient records. But even if we want such a world, we should take a hard look at what today's health-tech proponents are really selling.

Most of the United States' Big Tech firms - along with many startups, Big Pharma companies, and others - have jumped on the health-tech bandwagon in recent years. With Big Data analytics, artificial intelligence (AI), and other novel means, they promise to cut costs for struggling healthcare systems, revolutionize doctors' medical decision-making, and save us from ourselves. What could possibly go wrong?

Quite a lot, it turns out. In Weapons of Math Destruction, data scientist Cathy O'Neil lists many examples of how algorithms and data can fail us in unexpected ways. When transparent algorithms with continuous data feedback are applied to baseball, they work well; but when similar models are used in finance, insurance, law enforcement, and education, they can be highly discriminatory and destructive.

Healthcare is no exception. Individuals' medical data are susceptible to subjective clinical decision-making, medical errors, and evolving practices, and the quality of larger data sets is often diminished by missing records, measurement errors, and a lack of structure and standardization. Nevertheless, the Big Data revolution in healthcare is being sold as if these troubling limitations did not exist. Worse, many medical decision-makers are falling for the hype.
