
MS research: “Our patients can’t wait” for conventional techniques

REPORTING FROM ACTRIMS FORUM 2019

– The time is right to bring big data and high-horsepower computation to the thorniest problems in multiple sclerosis (MS) research, said Jennifer Graves, MD, who cochaired the closing session at the meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis. The session focused on harnessing machine learning, deep learning, and the newest noninvasive observational techniques to move research and clinical care forward.

“We’ve reached a point in MS research where we’re hitting some stumbling blocks. And a lot of those stumbling blocks are related to how well and how precisely we can measure phenotype in MS. The reason that’s important is that our next frontier is treating progressive MS – and what that requires is finding things that let us know what’s happening at the biological level, so that we can screen drugs faster. We can’t afford to have 3- to 5-year clinical trials. ... Because our patients can’t wait,” said Dr. Graves, an associate professor of neuroscience at the University of California, San Diego.

“We can use all sorts of big data sources, whether it’s the rich imaging data we get on patients when they go into the MRI scanner, whether it’s wearable sensors,” or even newer technology, Dr. Graves said. “We can use technology to give us the sensitivity that we’ve been missing.”

Wearable technology, including accelerometers, can capture physical activity measures that correlate with outcomes in MS, she added. As the technological armamentarium grows, so will the data available for analysis and correlation.

However, the key to progress will be to focus on technology that measures change over time. “This is the key: sensitivity to change over time. A lot of things can be associated with disability,” said Dr. Graves, but what matters is tracking what changes within an individual patient as the disease progresses, “so that we can detect treatment effects or side effects.”
