Machine Learning Meets Telehealth

The synergy between machine learning and telehealth will soon make the futuristic medicine practiced in the hit movie Doctor Strange seem like child’s play.

IBM Watson Health is leading the revolution in so-called “cognitive computing,” where machines develop the ability to make inferences and discoveries. For example, the Watson supercomputer can “read” 200 million pages of text in three seconds, then draw conclusions from what it analyzed. If there’s a medical journal in Singapore that provides the missing link to research in San Francisco, IBM Watson can find it. That’s a much-needed skill, since there are now an estimated 150 exabytes of healthcare data in the world. (That’s 150 billion gigabytes.)

In an article published recently in The Guardian, InTouch founder and chairman Dr. Yulun Wang notes that “artificial intelligence, such as machine learning, will soon be integrated cohesively into healthcare delivery through telehealth so that big data sets can be gathered and analyzed to improve global care. It will also improve individual care by matching the specifics of a patient’s diagnosis and treatment plan to millions of comparable cases.” Dr. Wang went on to predict that by 2050 most healthcare will be delivered virtually – and will be as commonplace as online banking is today.

That’s a realistic forecast, considering that IBM Watson researchers are now thinking in terms of “zettabytes” of data. A zettabyte is 2⁷⁰ bytes – a number that’s hard to get your head around. One zettabyte of data would fill a stack of DVDs stretching from Earth to Mars.

Thanks to advances in machine learning and processing power, it won’t be long before exam rooms become “smart spaces” where every word spoken in a telehealth encounter gets entered as text in a patient’s chart in real time. There may even come a day when Watson is the telehealth neurologist providing remote guidance to human clinicians.