0LAUK0 - 2018/2019 - Q3 - group 2
Group members
| Name | Student ID |
| --- | --- |
| Koen Botermans | 0904507 |
| Ruben Hendrix | 1236095 |
| Jakob Limpens | 1019496 |
| Iza Linders | 0945517 |
| Eleonora Opstal | 0956340 |
Introduction
Problem Statement
We live in an ageing society: the number of elderly people is ever increasing (OECD, 2007) [1]. As a consequence, the pressure on caregivers is also rising. In the following eight weeks we will research possibilities to alleviate this pressure by means of a robotic platform. However, this cannot be just any platform, since emotional comfort is vital in medical environments (ANR, 2005) [2]. As a result, the main focus of our research will be the emotional response of elderly people in the presence of such a robotic agent. In addition to the emotional response, the dignity of this robotic platform will be assessed as well. Since our group is a multidisciplinary team, this problem will be approached from both a technical and a user-centred perspective.
Initial ideas
Objectives
To get a better overview of what needs to be done to complete the project, we set ourselves a couple of objectives:
System objectives:
- The system should be able to express emotions in a manner that is natural, unambiguous and clear.
- The system should behave like a human, without entering the uncanny valley.
Research objectives:
- We want to understand how elderly people react to different facial emotions/designs expressed by a robot.
- We want to optimize the way that robots show emotions to elderly users.
State of the Art
Different themes related to our project can be identified. They are listed here, together with the relevant papers that provide the stated information.
Robot emotion expression
Different papers provide an insight into the existing technology in the field of recognizing human emotion in HCI research.[3] Article 1 provides a discussion of existing literature on social robots paired with care for elderly people suffering from dementia. It also discusses the contributions and cautions bound to the use of these assistive robotic agents, and provides literature related to the care of elderly people with dementia.[4] Article 2 places its focus on the more general user, and contains a study about the effect of a robot's expressions and physical appearance on the user's perception of said robot.[5] Article 3 and Article 4 present a description of the development, testing and evaluation of a robotic system, and a framework for emotion interaction for service robots, respectively. Article 3 focuses on the development of EDDIE, a flexible low-cost emotion display with 23 degrees of freedom for expressing realistic emotions through a humanoid face, while Article 4 presents a framework for the recognition, analysis and generation of emotion based on touch, voice and dialogue.[6][7] The results of a different project, KOBIAN, are described in Article 5. Here, different ways of expressing emotions through the entire body of the robot are tested and evaluated.[8]
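To make the emotion-display idea above more concrete, below is a minimal sketch of how a small set of basic expressions could be mapped onto actuator targets for a robot face. The joint names, angle values and the blending helper are our own illustrative assumptions and are not taken from the EDDIE or KOBIAN papers.

```python
# Illustrative sketch: mapping basic emotions to servo targets for a simple
# robot face. Joint names and angles are hypothetical, not from the cited papers.
from dataclasses import dataclass

@dataclass
class FacePose:
    eyebrow_left: float   # degrees, positive = raised
    eyebrow_right: float
    mouth_corners: float  # positive = smile, negative = frown
    eyelids: float        # 0.0 = closed, 1.0 = fully open

# A few basic emotions expressed as static face poses.
EMOTION_POSES = {
    "happiness": FacePose(eyebrow_left=10, eyebrow_right=10, mouth_corners=25, eyelids=0.9),
    "sadness":   FacePose(eyebrow_left=-15, eyebrow_right=-15, mouth_corners=-20, eyelids=0.5),
    "surprise":  FacePose(eyebrow_left=25, eyebrow_right=25, mouth_corners=5, eyelids=1.0),
    "anger":     FacePose(eyebrow_left=-25, eyebrow_right=-25, mouth_corners=-10, eyelids=0.7),
    "neutral":   FacePose(eyebrow_left=0, eyebrow_right=0, mouth_corners=0, eyelids=0.8),
}

def blend(a: FacePose, b: FacePose, t: float) -> FacePose:
    """Linearly interpolate between two poses (t in [0, 1]) so a transition
    between expressions looks gradual rather than abrupt."""
    lerp = lambda x, y: x + (y - x) * t
    return FacePose(lerp(a.eyebrow_left, b.eyebrow_left),
                    lerp(a.eyebrow_right, b.eyebrow_right),
                    lerp(a.mouth_corners, b.mouth_corners),
                    lerp(a.eyelids, b.eyelids))

if __name__ == "__main__":
    # Example: move halfway from neutral towards happiness.
    print(blend(EMOTION_POSES["neutral"], EMOTION_POSES["happiness"], 0.5))
```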
Emotion recognition
In Article 6, titled "Affective computing for HCI", multiple broad areas related to HCI are addressed, discussing recent and ongoing work at the MIT Media Lab. Affective computing aims to reduce user frustration during interaction. It enables machines to recognize meaningful patterns of affective expression, and the article explains different types of communication (parallel vs. non-parallel).[9] Article 7 consists of a large elaboration on emotion, mood, and the effects of affect on performance and memory, along with their causes and measurements. It also discusses the interaction between affective design and HCI.[10] A study on emotion-specific autonomic nervous system activity in elderly people was done in Article 8. The elderly participants followed muscle-by-muscle instructions for constructing facial prototypes of emotional expressions and relived past emotional experiences.[11] Reporting on two studies, Article 9 shows how elderly people have difficulty differentiating anger and sadness when shown faces. The two studies provide methods for measuring the ability of emotion recognition in humans.[12] Lastly, a definition of several emotions and their relation to physical responses (breathing, facial expressions, body language, etc.) is needed. Article 10 describes how a computer can track and analyse a face to compute the emotion the subject is showing. This is useful as an insight into how input data can be gathered, analysed and used for the purpose of mimicking or mirroring by a robot.[13]
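As an illustration of what the input side of such a mirroring loop could look like, below is a minimal sketch that detects a face with OpenCV and hands the crop to an emotion classifier. The functions classify_emotion() and send_to_robot() are hypothetical placeholders for components we would still have to choose; the sketch does not reproduce the method of Article 10.

```python
# Minimal sketch of an emotion-mirroring input loop: detect a face with
# OpenCV, pass the face crop to an (unspecified) emotion classifier, and
# forward the label to the robot. classify_emotion() and send_to_robot()
# are hypothetical placeholders, not the approach of any cited article.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_img) -> str:
    """Placeholder: in a real system this would be a trained classifier
    (e.g. on facial action units); here it always returns 'neutral'."""
    return "neutral"

def send_to_robot(emotion: str) -> None:
    """Placeholder for whatever interface drives the robot's expression."""
    print(f"robot should now express: {emotion}")

def mirror_loop(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            send_to_robot(classify_emotion(gray[y:y + h, x:x + w]))
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
            break
    cap.release()

if __name__ == "__main__":
    mirror_loop()
```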
Assistive technology for elderly
This category focuses mostly on robotic care for elderly people living at home. Articles such as article 15 [14] and article 11 [15] explore the effects of robots on the social, psychological and physiological levels, providing a base for knowing what types of results certain actions yield.
Article 14 [16] explores six ethical issues related to deploying robots for elderly care, while article 13 [17] adds to this by comparing children and the elderly in their roles in relationships with interactive robots.
Article 12 [18] looks at the available technologies for tracking elderly people at home, keeping them safe and alerting others when needed.
Dementia and Alzheimer's
Closely related to the elderly are dementia and Alzheimer's disease. These diseases affect the brain of a patient, making them forgetful or unable to complete tasks in their day-to-day life without help. Article 16 [19] elaborates on this, explains the impacts, and proposes a possible framework to counter these diseases. Article 17 [20] discusses whether an entertainment robot is useful to deploy in care for elderly people with dementia.
Measuring tools
Here, papers are collected that provide aid for designing a robotic system. Article 18 [21] discusses some of the difficulties in measuring affect in Human-Computer Interaction (HCI). For designing a system that is capable of expressing emotion, article 19 [22] and article 20 [23] show recent developments in designing a 'face' or whole body to express emotion to a user.
Miscellaneous
In the study of emotions, models are often useful. Article 21 elaborates on social aspects of emotions, together with their implementation in human-computer interaction. A model is tested, and accordingly a system is designed with the goal of providing a means of affective input (for example: emoticons).[24]
Article 22 describes research on emotional comfort in patients who found themselves in a therapeutic setting. It provides an understanding of the role of personal control in recovery and of the aspects of the hospital environment that impact hospitalized patients' feelings of personal control.[25]
A more medical approach is taken in Article 23: evidence is provided for altered functional responses in brain regions subserving emotional behaviour in elderly subjects, compared to younger subjects, during the perceptual processing of angry and fearful facial expressions.[26]
Statistics about the greying of the global population provide insight into the challenges paired with this greying. Article 24 is a collection of evidence for the greying of the world's population, with causes and consequences, and states three challenges.[27] Article 25 indicates the importance of a healthy community for the efficiency and economic growth of said community. It points out the goals of improving the healthcare system and how these improvements can be made.[28]
Project setup
Approach
Planning
The planning for our group can be viewed here.
Milestones
Deliverables
At the end of this project period, these are the things we want to have completed and be able to present:
- This wiki page
- A study report
- A prototype
USE aspects
Users
Society
Enterprise
Conclusion
Discussion
References
Useful links
Netflix film 'Next gen' [29]
- ↑ https://www.oecd.org/newsroom/38528123.pdf OECD. (2007). Annual Report 2007. Paris: OECD Publishing.
- ↑ https://www.sciencedirect.com/science/article/pii/S0897189704000874 ANR. (2005). Applied Nursing Research 18 (2005) 22-28
- ↑ https://www.researchgate.net/publication/235328873_Dominance_and_valence_A_two-factor_model_for_emotion_in_HCI Dryer, Christopher. (1998). Dominance and valence: A two-factor model for emotion in HCI.
- ↑ https://www.ncbi.nlm.nih.gov/pubmed/23177981 Elaine Mordoch, Angela Osterreicher, Lorna Guse, Kerstin Roger, Genevieve Thompson (2013), Use of social commitment robots in the care of elderly people with dementia: A literature review, Maturitas, p 14-20
- ↑ http://www.cs.cmu.edu/~social/reading/breemen2004c.pdf
- ↑ https://ieeexplore.ieee.org/document/4058873 S. Sosnowski, A. Bittermann, K. Kuhnlenz and M. Buss (2006), Design and Evaluation of Emotion-Display EDDIE, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3113-3118.
- ↑ https://ieeexplore.ieee.org/document/4415108 D. Kwon et al. (2007), Emotion Interaction System for a Service Robot, RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication, pp. 351-356.
- ↑ https://ieeexplore.ieee.org/document/5326184 M. Zecca et al. (2009), Whole body emotion expressions for KOBIAN humanoid robot — preliminary experiments with different Emotional patterns —, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication, pp. 381-386.
- ↑ https://affect.media.mit.edu/pdfs/99.picard-hci.pdf R. W. Picard (2003), Affective computing for HCI
- ↑ https://www.researchgate.net/publication/242107189_Emotion_in_Human-Computer_Interaction Brave, Scott & Nass, Clifford. (2002). Emotion in Human–Computer Interaction. The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. 10.1201/b10368-6.
- ↑ https://www.ncbi.nlm.nih.gov/pubmed/2029364 Levenson, R. W., Carstensen, L. L., Friesen, W. V., & Ekman, P. (1991). Emotion, physiology, and expression in old age. Psychology and Aging, 6(1), 28-35.
- ↑ https://www.tandfonline.com/doi/abs/10.1080/00207450490270901 Susan Sullivan & Ted Ruffman (2004) Emotion recognition deficits in the elderly, International Journal of Neuroscience, 403-432.
- ↑ https://ieeexplore.ieee.org/document/911197 R. Cowie et al., "Emotion recognition in human-computer interaction," in IEEE Signal Processing Magazine, 32-80, Jan 2001.
- ↑ https://www.researchgate.net/publication/3450481_Living_With_Seal_Robots_-_Its_Sociopsychological_and_Physiological_Influences_on_the_Elderly_at_a_Care_House Wada, K & Shibata, Takanori. (2007). Living With Seal Robots - Its Sociopsychological and Physiological Influences on the Elderly at a Care House. Robotics, 972-980.
- ↑ https://www.researchgate.net/publication/229058790_Assistive_social_robots_in_elderly_care_A_review Broekens, Joost & Heerink, Marcel & Rosendal, Henk. (2009). Assistive social robots in elderly care: A review. Gerontechnology, 94-103.
- ↑ https://www.researchgate.net/publication/226452328_Granny_and_the_robots_Ethical_issues_in_robot_care_for_the_elderly Sharkey, Amanda & Sharkey, Noel. (2010). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology. 27-40.
- ↑ https://ieeexplore.ieee.org/document/5751987 A. Sharkey and N. Sharkey, "Children, the Elderly, and Interactive Robots," in IEEE Robotics & Automation Magazine, vol. 18, no. 1, pp. 32-38, March 2011. doi: 10.1109/MRA.2010.940151
- ↑ https://www.ncbi.nlm.nih.gov/pubmed/11742772 FG. Miskelly, Assistive technology in elderly care. Department of Medicine for the Elderly
- ↑ https://alzres.biomedcentral.com/articles/10.1186/alzrt143 M. Wortmann, Alzheimer's Research & Therapy (2012)
- ↑ https://academic.oup.com/biomedgerontology/article/59/1/M83/533605 The Journals of Gerontology: Series A, Volume 59, Issue 1, 1 January 2004, Pages M83–M85
- ↑ https://dl.acm.org/citation.cfm?id=1358952&dl=ACM&coll=DL N. Sadat Shami, Jeffrey T. Hancock, Christian Peter, Michael Muller, and Regan Mandryk. 2008. Measuring affect in hci: going beyond the individual. In CHI '08 Extended Abstracts on Human Factors in Computing Systems (CHI EA '08). ACM, New York, NY, USA, 3901-3904. DOI: https://doi.org/10.1145/1358628.1358952
- ↑ https://ieeexplore.ieee.org/document/1642261 H. Shibata, M. Kanoh, S. Kato and H. Itoh, "A system for converting robot 'emotion' into facial expressions," Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006. ICRA 2006., Orlando, FL, 2006, pp. 3660-3665. doi: 10.1109/ROBOT.2006.1642261
- ↑ https://ieeexplore.ieee.org/document/4755969 M. Zecca, N. Endo, S. Momoki, Kazuko Itoh and Atsuo Takanishi, "Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities-," Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots, Daejeon, 2008, pp. 487-492. doi: 10.1109/ICHR.2008.4755969
- ↑ https://www.researchgate.net/publication/235328873_Dominance_and_valence_A_two-factor_model_for_emotion_in_HCI Dryer, Christopher. (1998). Dominance and valence: A two-factor model for emotion in HCI.
- ↑ https://www.ncbi.nlm.nih.gov/pubmed/15812732 AM. Williams et al., 'Enhancing the therapeutic potential of hospital environments by increasing the personal control and emotional comfort of hospitalized patients.' (2005)
- ↑ https://www.ncbi.nlm.nih.gov/pubmed/15936178 A. Tessitore et al., 'Functional changes in the activity of brain regions underlying emotion processing in the elderly.' (2005)
- ↑ https://academic.oup.com/ppar/article-abstract/17/4/12/1456824?redirectedFrom=fulltext Adele M. Hayutin; Graying of the Global Population, Public Policy & Aging Report, Volume 17, Issue 4, 1 September 2007, Pages 12–17
- ↑ https://www.who.int/hrh/com-heeg/reports/en/ World Health Organisation, 'Working for health and growth: investing in the health workforce' (2016)
- ↑ https://www.youtube.com/watch?v=uf3ALGKgpGU