Authors: F. Alvarez, M. Popa, V. Solachidis, G. Hernandez, A. Belmonte-Hernandez, S. Asteriadis, N. Vretos, M. Quintana, D. Dotti, T. Theodoridis, P. Daras

Year: 2018

Venue: IEEE MultiMedia, vol. 25, no. 1, pp. 14-25, Jan.-Mar. 2018
The analysis of multimodal data collected by innovative imaging sensors, Internet of Things devices, and user interactions can provide smart, automatic, remote monitoring of Parkinson's and Alzheimer's patients and reveal valuable insights for early detection and/or prevention of health-related events. This article describes a novel system that involves data capturing and multimodal fusion to extract relevant features, analyze data, and provide useful recommendations. The system gathers signals from diverse sources in health-monitoring environments, understands user behavior and context, and triggers appropriate actions to improve the patients' quality of life. The system offers a multimodal, multi-patient, versatile approach not found in current developments. It also achieves comparable or improved results for the detection of abnormal behavior in daily motion. The system was implemented and tested for 10 weeks in real environments involving 18 patients.
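The abstract does not specify the fusion or detection method, so the following is only a minimal illustrative sketch of the general idea: feature-level fusion of two sensor modalities followed by a simple z-score flag for abnormal daily-motion windows. All names, feature shapes, and the threshold are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

def fuse_features(camera_feats, wearable_feats):
    """Feature-level (early) fusion: concatenate per-window feature
    vectors from two modalities. Both arrays: (n_windows, n_feats)."""
    return np.concatenate([camera_feats, wearable_feats], axis=1)

def fit_baseline(fused):
    """Estimate a per-feature baseline (mean, std) from windows
    assumed to contain normal daily motion."""
    return fused.mean(axis=0), fused.std(axis=0) + 1e-8

def flag_abnormal(fused, mean, std, z_thresh=3.0):
    """Flag a window as abnormal when any feature deviates more than
    z_thresh standard deviations from the baseline."""
    z = np.abs((fused - mean) / std)
    return (z > z_thresh).any(axis=1)

# Toy usage with random stand-ins for extracted motion features.
rng = np.random.default_rng(0)
camera = rng.normal(size=(200, 8))     # e.g., skeleton-based gait features
wearable = rng.normal(size=(200, 4))   # e.g., accelerometer statistics
fused = fuse_features(camera, wearable)
mean, std = fit_baseline(fused[:150])  # calibrate on an initial normal period
alerts = flag_abnormal(fused[150:], mean, std)
print(f"{alerts.sum()} abnormal windows out of {alerts.size}")
```

In a deployed system such as the one described, the per-feature baseline would be learned per patient and the flagged windows would feed the recommendation and alerting layer rather than a simple printout.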