
Many data types are used to diagnose and treat disease, but interpreting them together can be challenging. Scientists in the Eric and Wendy Schmidt Center and Data Sciences Platform built an AI framework, MODES (Multi-mOdal Disentangled Embedding Space), that separates the information shared across all modalities from the information that only one test can uniquely measure. They applied MODES to cardiovascular data, using ECG and cardiac MRI, and showed that it provides a better picture of an individual's health than previous models, which may help improve diagnostics.
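As a rough illustration of the disentanglement idea (not the authors' actual method, which is described in the paper), each modality can be mapped to two embeddings: a shared part that is aligned across modalities during training, and a unique part that is free to capture modality-specific signal. The toy sketch below uses randomly initialized linear encoders and a simple alignment loss on the shared embeddings; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w_shared, w_unique):
    # Map one modality's features to a shared embedding and a unique embedding
    return x @ w_shared, x @ w_unique

# Toy feature matrices standing in for two modalities (e.g. ECG and cardiac MRI)
ecg = rng.normal(size=(4, 8))   # 4 patients, 8 ECG-derived features
mri = rng.normal(size=(4, 10))  # 4 patients, 10 MRI-derived features

# Randomly initialized linear encoders (a real model would learn these weights)
w_ecg_s, w_ecg_u = rng.normal(size=(8, 3)), rng.normal(size=(8, 2))
w_mri_s, w_mri_u = rng.normal(size=(10, 3)), rng.normal(size=(10, 2))

ecg_shared, ecg_unique = encode(ecg, w_ecg_s, w_ecg_u)
mri_shared, mri_unique = encode(mri, w_mri_s, w_mri_u)

# Alignment loss: pulling the two shared embeddings together during training
# encourages the shared space to hold only information present in both tests,
# leaving modality-specific signal to the unique embeddings.
align_loss = float(np.mean((ecg_shared - mri_shared) ** 2))
print(ecg_shared.shape, ecg_unique.shape, align_loss >= 0.0)
```

The shared embeddings have the same dimensionality across modalities so they can be directly compared, while the unique embeddings may differ in size per modality.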
The authors of the paper are Schmidt Center postdoctoral fellow Sana Tonekaboni, Senior Group Leader and Principal Machine Learning Scientist Sam Freesun Friedman, former Schmidt Center PhD fellow Xinyi Zhang, Director of Machine Learning for Health Mahnaz Maddah, and Schmidt Center Director Caroline Uhler.
Read more in npj Digital Medicine.
