The issue that the project will address
The goal of this project is to demonstrate and refine an integrated solution for the remote monitoring, diagnosis, and support of older adults living independently at home or in home-care institutions. The project aims to improve the speed and reliability of situational assessment in order to attain timely, personalized intervention and adaptation of care. We investigate the use of multiple sensors for the detection and management of changes in activities of daily living (ADLs) [22], lifestyle patterns, emotions, and vital signs, as well as the development of an intelligent system that translates multi-sensor inputs into an accurate situational assessment and supportive intervention for seniors at elevated risk of adverse health events.
Why this research is needed at the present time
In 2000, the older population (65 years or older) represented 12.4% of the US population, and this share is expected to grow to 19% by 2030 [23]. This increasing proportion of the population requires novel assistive methods that lead to a better quality of life. Our broad motivation is to allow aging adults to live at home for longer periods of their lives, improving their quality of life and avoiding costly and unwanted relocations into institutional care. To achieve this goal, a better understanding of real human activities is needed.
In this project, we aim to advance the understanding of how behavioral and cognitive cues correlate with meaningful fluctuations in health status. The project will address the detection and prediction, in real-life conditions, of daily living activity including: falls; infrequent activity (e.g., front-door loitering, immobility, irregular balance); functional activity (e.g., urinary frequency, medication intake, substance abuse); and long-term behavioral patterns (e.g., depression, apathy, restlessness).
The automatic detection of daily activities is a first step toward generating rich information that caregivers can use in the early diagnosis and assessment of an aging adult's condition and in the prediction of possible unwanted events. The output of this project aims to improve current techniques for establishing treatment baselines, optimizing intervention planning, monitoring the progress of a treatment, and predicting high-risk situations (e.g., injury produced by falls). All daily activities are important and have an impact on the overall health of an older adult. For example, the detection of high-risk situations is an immediate social necessity: approximately 32-40% of elderly patients (75 years or older) fall at least once a year [24], and this number only increases with age and frailty.
Outline of research questions, study design, and study method
Daily activity detection systems need to operate in real time and must be reliable and accurate, with high sensitivity and specificity. In some cases these properties can be reached in experimental environments [1, 1b, 5, 6, 7, 20, 21], but detection rates decrease when the systems are applied to real situations [11]. In general, such systems are designed as proofs of concept, using a small set of examples of simulated (i.e., acted) activity to build the computational activity detectors. We aim to reduce the gap between research prototypes and real-world applications by developing new machine-learning-based algorithms trained on long-term real data. These algorithms benefit from the variety of the data to automatically learn the important cues that define the structure of different human activities. In addition, long-term data will enable unprecedented studies of behavioral patterns (e.g., the functional evolution of an Alzheimer's patient over a period of one year).
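To make the idea of learning activity detectors from labeled sensor data concrete, the following is a minimal sketch, not the project's actual algorithm: it summarizes windows of a one-dimensional sensor stream into features and learns a nearest-centroid template per activity. All names, features, and toy data are illustrative assumptions.

```python
# Hypothetical sketch: learning activity "templates" from labeled sensor
# windows with a nearest-centroid classifier. All names and data are
# illustrative, not the project's actual pipeline.
import math

def features(window):
    """Summarize a window of 1-D sensor samples as (mean, std)."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return (mean, math.sqrt(var))

def train(labeled_windows):
    """labeled_windows: list of (label, samples). Returns per-label centroids."""
    sums = {}
    for label, samples in labeled_windows:
        f = features(samples)
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += f[0]; s[1] += f[1]; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def classify(centroids, window):
    """Assign the window to the activity with the nearest feature centroid."""
    f = features(window)
    return min(centroids, key=lambda lab: (f[0] - centroids[lab][0]) ** 2
                                        + (f[1] - centroids[lab][1]) ** 2)

# Toy data: "resting" windows have low mean/variance, "walking" higher.
train_data = [
    ("resting", [0.1, 0.0, 0.2, 0.1]),
    ("resting", [0.0, 0.1, 0.1, 0.0]),
    ("walking", [1.0, 2.5, 0.5, 2.0]),
    ("walking", [1.5, 2.0, 1.0, 2.5]),
]
model = train(train_data)
print(classify(model, [0.05, 0.1, 0.0, 0.15]))  # → resting
```

A real system would replace the hand-picked (mean, std) features and centroid rule with representations learned from long-term data, which is precisely what the proposed algorithms address.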
The use of real information will also allow us to understand best practices for usability and acceptance of the technology. Acceptance by older adults presents a particular challenge, since they may not be familiar with electronic devices; the way the system works is therefore of critical importance [17]. Previous studies encourage the use of technology by older adults, who show a positive attitude [18] toward embracing simple assistive daily technology (e.g., emergency buttons), but the extension to more advanced systems, such as ours, is still an open question.
The project will explore the combination of context-aware and wearable sensors. Combining both types of sensors can lead to more robust detection systems and opens new research paths in multi-sensor data understanding. On the one hand, context-aware approaches can detect human activity in an accurate and noninvasive manner [1b]. Context-aware sensors are a good option for detecting complex activities involving emotions and social interactions, but their operation is limited to the place where the sensors are located. On the other hand, wearable sensors allow accurate measurement of low-resolution human cues such as acceleration, which can be used to understand sleeping patterns and human motion; but they are invasive, and the resolution of the data limits them to simple activities.
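One simple way to combine the two sensor families is early fusion: concatenating per-window features from each stream into a single vector for a downstream detector. The sketch below assumes hypothetical interfaces (room-presence events and acceleration magnitudes) purely for illustration.

```python
# Illustrative early-fusion sketch (assumed interfaces, not the project's
# actual sensors): combine context-aware features (room occupancy) with
# wearable features (acceleration statistics) into one vector per window.
def context_features(room_events):
    """room_events: list of room names visited in a time window."""
    rooms = ("kitchen", "bedroom", "bathroom")  # hypothetical room set
    return [room_events.count(r) / max(len(room_events), 1) for r in rooms]

def wearable_features(accel):
    """accel: list of acceleration magnitudes in the same window."""
    mean = sum(accel) / len(accel)
    peak = max(accel)
    return [mean, peak]

def fused_vector(room_events, accel):
    # Early fusion: one concatenated vector per time window, fed to any
    # downstream activity classifier.
    return context_features(room_events) + wearable_features(accel)

v = fused_vector(["kitchen", "kitchen", "bedroom"], [0.9, 1.1, 2.3])
print(v)  # 3 context features followed by 2 wearable features
```

Later-stage fusion (combining per-sensor classifier outputs instead of raw features) is an equally valid design; which works best for a given activity is one of the open questions this project would study.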
The study is composed of three parts: 1) data collection, 2) system development, and 3) evaluation.
1) Data Collection: The collection of real-world data is of crucial importance for developing a failure-robust system. The system needs to be able to work in different scenarios with minimal configuration. Also, the type of sensors used to capture human activity cues will be a decisive factor in building the system's algorithmic pipeline. Senior volunteers are recruited, and the data collection spans a period of more than two months. The data is obtained through sensors located at the home-care institution. Participants will be instructed to maintain their daily routines. The data collection aims to capture a set of cues corresponding to household activities such as "preparing a meal", "sleeping patterns", "common walking patterns", etc. The target activities will be chosen to assess the person's independence and to understand the phenotypic structures of normal and abnormal behaviors.
2) System Development: The computational processing (i.e., detection of activities, analytics, etc.) occurs in place to ensure privacy. Once the computational model is developed, no information is recorded that could be used to identify the data's creator. The system will follow the privacy-by-design paradigm [14][17]. The raw data collected will be labeled with semantic interpretations, and computational representations of the activities will be learned by a set of methods [1b][25][26][27] developed by one of the project partners. Once trained, the system will be capable of automatically detecting the trained activities in unseen raw data.
3) Evaluation: The study concludes by quantitatively evaluating the effectiveness of the system and reporting the findings.
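The evaluation step rests on the standard detection metrics named earlier (sensitivity and specificity). As a concrete sketch, assuming per-window binary ground-truth annotations and detector outputs:

```python
# Sketch of the quantitative evaluation step: sensitivity and specificity
# of a binary activity detector against ground-truth annotations.
def sensitivity_specificity(truth, predicted):
    """truth/predicted: lists of 0/1 flags per time window (1 = activity)."""
    tp = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 1)
    sens = tp / (tp + fn) if tp + fn else 0.0  # true-positive rate
    spec = tn / (tn + fp) if tn + fp else 0.0  # true-negative rate
    return sens, spec

# Hypothetical annotations for eight time windows.
truth     = [1, 1, 1, 0, 0, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 0, 1]
print(sensitivity_specificity(truth, predicted))  # → (0.75, 0.75)
```

For safety-critical events such as falls, sensitivity (missed falls) and specificity (false alarms) trade off directly against caregiver trust, so both would be reported per activity type.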
Why an interdisciplinary effort is needed
Most previous studies addressing complex human activity detection fall short in applicability and deployment for real-world applications, producing results that are constrained to particular experimental setups. There is an increasing need for multidisciplinary teams that can produce end-to-end solutions to real-world problems. Our team is composed of partners with extensive experience in healthcare and technical fields: physicians, health-care professionals, and computational researchers. On the one hand, OnLok has a vast amount of resources to collect human data and to provide the medical knowledge needed to develop the proposed system.
On the other hand, the Stanford PAC partner has experience in developing automatic systems for understanding human activities from multiple sensors [1][25][26][27].
REFERENCES
[1] Zouba-Valentin, N.: Multisensor Fusion for Monitoring Elderly Activities at Home. PhD thesis, Nice-Sophia Antipolis University, January 2010.
[1b] Pusiol, G., Bremond, F., Thonnat, M.: Unsupervised Discovery and Recognition of Long-Term Activities. In: 8th International Conference on Computer Vision Systems (ICVS 2011).
[2] Mathuranath, P.S., et al.: Instrumental activities of daily living scale for dementia screening in elderly people. Int Psychogeriatr 2005, 17(3):461-474.
[3] http://www.pewinternet.org/2013/01/28/tracking-for-health/
[4] Lopes, I.C., Vaidya, B., Rodrigues, J.J.P.C.: Towards an autonomous fall detection and alerting system on a mobile and pervasive environment. Telecommunication Systems 2013, 52(4):2299-2310.
[5] Mastorakis, G., Makris, D.: Fall detection system using Kinect's infrared sensor. Journal of Real-Time Image Processing 2014, 9(4):635-646.
[6] Planinc, R., Kampel, M.: Introducing the use of depth data for fall detection. Pers Ubiquit Comput 2012.
[7] Kepski, M., Kwolek, B.: Fall Detection using Ceiling-Mounted 3D Depth Camera. In: Proc. Int. Conf. on Computer Vision Theory and Applications, Vol. 2, pp. 640-647, Lisbon, January 2014. SciTePress. ISBN 978-989-758-004-8.
[8] http://le2i.cnrs.fr/Fall-detection-Dataset?lang=fr
[9] Igual, R., Medrano, C., Plaza, I.: Challenges, issues and trends in fall detection systems. BioMedical Engineering OnLine 2013, 12:66. doi:10.1186/1475-925X-12-66.
[10] http://www.biomedical-engineering-online.com/content/12/1/66
[11] Noury, N., Fleury, A., Rumeau, P., Bourke, A.K., Laighin, G.O., Rialle, V., Lundy, J.E.: Fall detection - principles and methods. In: Proc. Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, 2007:1663-1666.
[12] Drost, C.: Privacy in context-aware systems. http://www.utwente.nl/ewi/asna/assignments/completed/drost.pdf
[13] Baldauf, M.: A survey on context-aware systems. Int J Ad Hoc and Ubiquitous Comput 2007, 2(4):263-277.
[14] Sensors and In-Home Collection of Health Data: A Privacy by Design Approach. http://www.ipc.on.ca/images/Resources/pbd-sensor-in-home.pdf
[15] Roupa, Z., Nikas, M., Gerasimou, E., Zafeiri, V., Giasyrani, L., Kazitori, E., Sotiropoulou, P.: The use of technology by the elderly. Health Sci J 2010, 4:118-126.
[16] Albert, M.V., Kording, K., Herrmann, M., Jayaraman, A.: Fall classification by machine learning using mobile phones. PLoS One 2012, 7:e36556.
[17] Kurniawan, S.: Older people and mobile phones: A multi-method investigation. Int J Human-Comput Stud 2008, 66:889-901.
[18] Plaza, I., Martin, L., Martin, S., Medrano, C.: Mobile applications in an aging society: Status and trends. J Syst Softw 2011, 84:1977-1988.
[19] http://www.ipc.on.ca/images/Resources/7foundationalprinciples.pdf
[20] Kangas, M., Vikman, I., Nyberg, L., Korpelainen, R., Lindblom, J., Jämsä, T.: Comparison of real-life accidental falls in older people with experimental falls in middle-aged test subjects. Gait Posture 2012, 35:500-505.
[21] Chen, J., Kwong, K., Chang, D., Luk, J., Bajcsy, R.: Wearable Sensors for Reliable Fall Detection. In: Proc. IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, 2005:1-4.
[22] Gokalp, H., Clarke, M.: Monitoring Activities of Daily Living of the Elderly and the Potential for Its Use in Telecare and Telehealth: A Review. Telemedicine & e-Health 2013, 19(12):910.
[23] US Department of Health and Human Services.
[24] World Health Organization: Global report on falls prevention in older age.
[25] Karpathy, A., Toderici, G., Shetty, S., Leung, T., Sukthankar, R., Fei-Fei, L.: Large-Scale Video Classification with Convolutional Neural Networks. CVPR 2014.
[26] Alahi, A., Ramanathan, V., Fei-Fei, L.: Socially-Aware Large-Scale Crowd Forecasting. CVPR 2014.
[27] Tang, K., Yao, B., Fei-Fei, L., Koller, D.: Combining the Right Features for Complex Event Recognition. ICCV 2013.