MIT, MGH create VR system to advance physical therapy at home


MIT and Massachusetts General Hospital researchers have created a system that monitors motion and muscle engagement during unsupervised physical rehabilitation, which they say could aid injury recovery and improve mobility for older adults and athletes.


Many disabling conditions benefit from physical rehabilitation, but there aren’t enough physical therapists to go around.

To better enable data-driven unsupervised rehabilitation for athletes recovering from injury, patients currently in physical therapy and people with physically limiting ailments, researchers from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and MGH created MuscleRehab, a sensor-based wearable device paired with a virtual reality platform.

The system calculates muscle engagement and visualizes it on a virtual muscle-skeleton avatar. It measures how the muscles are engaging with an imaging technique called electrical impedance tomography (EIT), while a VR headset and tracking suit let patients watch themselves perform alongside a physical therapist.

The researchers, who are preparing to present their work for the first time, say studies of the system show that monitoring and visualizing muscle engagement during unsupervised physical rehab can improve therapeutic accuracy and post-rehab evaluations, and perhaps prevent re-injury. 

“By actively measuring deep muscle engagement, we can observe if the data is abnormal compared to a patient’s baseline, to provide insight into the potential muscle trajectory,” said Junyi Zhu, MIT CSAIL PhD student and lead author on a paper about MuscleRehab in today’s announcement. 

The system includes a training regimen with pre-recorded baseline standards, and streams the avatar with real-time muscle engagement.

Patients wear the tracking suit and VR headset to capture their 3D movement data, then perform exercises such as lunges, knee bends, deadlifts, leg raises, knee extensions, squats, fire hydrants and bridges that engage the quadriceps, sartorius, hamstrings and adductors.

The EIT sensing board is accompanied by two electrode-filled straps that slip onto a user’s upper thigh to capture 3D volumetric data. Combined with a motion capture system, the EIT sensing data highlights actively triggered muscles on the display, where muscles become darker with more engagement.
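The darker-with-engagement visualization described above could be sketched as follows. This is a purely illustrative Python snippet; the function name, base color and scaling factor are assumptions, not MuscleRehab’s actual rendering code:

```python
# Illustrative sketch: map a per-muscle engagement value (0.0-1.0)
# to a progressively darker shade on an avatar display.
# All names, colors and scale factors here are assumptions.

def engagement_to_shade(engagement: float, base_rgb=(200, 150, 150)) -> tuple:
    """Darken a muscle's base color in proportion to its engagement."""
    level = max(0.0, min(1.0, engagement))  # clamp to [0, 1]
    # Higher engagement -> smaller RGB values -> darker muscle.
    return tuple(int(c * (1.0 - 0.7 * level)) for c in base_rgb)

# Example engagement readings for the thigh muscles named in the article.
muscles = {"quadriceps": 0.9, "sartorius": 0.2, "hamstrings": 0.5, "adductors": 0.1}
shades = {name: engagement_to_shade(e) for name, e in muscles.items()}
```

A real renderer would drive per-vertex shading on the 3D muscle mesh, but the mapping idea is the same: engagement in, darkness out.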

The team compared exercise accuracy with and without the EIT wearable. In both cases, the patient’s avatar performs alongside a physical therapist.

A professional PT explained which muscle groups were supposed to be engaged during each exercise. The team then compared the two conditions: motion tracking data alone overlaid onto the patient avatar, versus motion tracking plus the EIT sensing straps, which add information and visualization of muscle engagement.

By visualizing both muscle engagement and motion data during these unsupervised exercises instead of just motion alone, the overall accuracy of exercises improved by 15% among the test subjects. 

The researchers also compared how much time during the exercises the correct muscle group got triggered with and without the wearable. 

By monitoring and recording the muscle engagement data, the PTs reported a much better understanding of the quality of each patient’s exercise, which helped them evaluate the current regimen and adjust exercises based on those statistics.

Zhu was interested in finding a better way than the electromyography used by some wearable devices to sense the engagement (blood flow, stretching, contracting) of different layers of the muscles. He was inspired by EIT, which measures the electrical conductivity of muscles and is usually used for monitoring lung function, detecting chest tumors and diagnosing pulmonary embolisms.
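EIT systems commonly acquire data with an adjacent-drive pattern: current is injected between each neighboring electrode pair in turn, while differential voltages are read from the remaining adjacent pairs. The minimal Python sketch below is a general illustration of that pattern, not MuscleRehab’s actual acquisition code:

```python
# Illustrative sketch of the adjacent-drive pattern commonly used in EIT.
# Electrodes are numbered 0..n-1 around the limb (or chest); each adjacent
# pair drives current in turn while the other adjacent pairs measure voltage.

def adjacent_pattern(n_electrodes: int):
    """Yield (drive_pair, measure_pairs) for one full EIT frame."""
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        measure = [
            (m, (m + 1) % n_electrodes)
            for m in range(n_electrodes)
            # skip pairs that share an electrode with the drive pair
            if m not in drive and (m + 1) % n_electrodes not in drive
        ]
        yield drive, measure
```

With a typical 16-electrode ring, each drive pair leaves 13 valid measurement pairs, giving 208 voltage readings per frame; a reconstruction algorithm then turns those readings into the conductivity image.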

Currently, MuscleRehab focuses on the upper thigh and the major muscle groups within it, but could expand to the glutes.

The paper’s co-authors include Hamid Ghaednia, instructor in the Department of Orthopaedic Surgery at Harvard Medical School and co-director of the Center for Physical Artificial Intelligence at Mass General Hospital; and Dr. Joseph Schwab, chief of the Orthopaedic Spine Center, director of spine oncology, co-director of the Stephan L. Harris Chordoma Center and associate professor of orthopedic surgery at Harvard Medical School.


There is a growing trend to use technologies like remote patient monitoring to care for patients and ease burdens on hospitals and providers.

Innovation is in its early stages, with clinicians and patients open and ready to adopt new solutions, Dr. Waqaas Al-Siddiq, chairman, CEO and founder of Biotricity, told Healthcare IT News in March.

“We can advance RPM by looking at diagnostic devices that currently exist for each condition, find which sensors can be integrated into wireless devices and create clinically-relevant, continuous solutions,” he said.

Just as RPM can significantly reduce hospital readmissions and emergency room visits, new sensor-based technologies can strengthen healthcare-at-home approaches, with the potential to improve outcomes and reduce in-person visits.


“This work advances EIT, a sensing approach conventionally used in clinical settings, with an ingenious and unique combination with virtual reality,” Yang Zhang, assistant professor in electrical and computer engineering at the UCLA Samueli School of Engineering, said in the announcement. 

“The enabled application that facilitates rehabilitation potentially has a wide impact across society to help patients conduct physical rehabilitation safely and effectively at home. Such tools to eliminate the need for clinical resources and personnel have long been needed for the lack of workforce in healthcare.”

Andrea Fox is senior editor of Healthcare IT News.


Healthcare IT News is a HIMSS publication.
