PRE2023 3 Group4
Group members
| Name | Student Number | Current Study Program | Role or Responsibility |
|---|---|---|---|
| Margit de Ruiter | 1627805 | BPT | Note-taker |
| Danique Klomp | 1575740 | BPT | Contact person |
| Emma Pagen | 1889907 | BAP | End responsible for wiki updates |
| Liandra Disse | 1529641 | BPT | Planner |
| Isha Rakhan | 1653997 | BPT | Contact person |
| Naomi Han | 0986672 | BCS | Programming responsible |
Introduction to the course and project
Problem statement
Modern media is filled with images of highly sophisticated robots that speak, move and behave like humans. The many movies, plays and books on the subject speculate that such robots will integrate into our daily lives in the near future. The idea of robots becoming increasingly humanlike is thus firmly embedded in our imagination. However, modern technology has not yet caught up with this futuristic idea of what an artificial agent, such as a robot, is able to do. This gap mainly stems from a lack of knowledge on how to replicate human behavior in the hardware and programming of artificial agents. One area of growing interest is the implementation of emotions in robots and other artificial agents. Human emotions are not easy to replicate, as they consist of many different factors. The research presented in this wiki also focuses on emotions, but looks specifically at how these emotions affect the acceptance of the robot. The question that will be answered is:
“Does a match between the displayed emotion of a social robot and the content of the robot’s spoken message influence the acceptance of this robot?”
Objectives
As a group, we outlined our objectives for this project. Our main objectives are to contribute to knowledge about the role of emotions in social robot interactions and to extend knowledge on the reliability of the acceptance measurement for young adults, as the Almere model has yet to be extensively tested on this group. To achieve these two main objectives, we set several smaller objectives that will guide us toward them. These concern conducting lab research and performing the statistical and qualitative data analyses associated with social and psychological research. In addition, as a multidisciplinary group, we aim to work together in such a manner that every group member can bring their own discipline to the table. Finally, properly programming and working with a robot is crucial to achieving our main objectives.
Users
The users in this research are young adults. They have specific needs and require certain characteristics of the social robot in order to have a pleasant social interaction. In general, these users would like robots to be authentic, imperfect, and active listeners. Active listening helps build trust between the human and the robot. Moreover, by listening and showing that it understands the conversation and the emotional state of the person, the robot can adapt its interactions accordingly, which leads to a more personalized and meaningful interaction. The users also require the robot to be easy to understand, with an intuitive interface. As mentioned above, users like robots that can understand and respond to human emotions in order to have a meaningful interaction.
Approach
In this study, the influence of the emotion a robot displays while delivering messages with different emotional content will be researched. The robot will tell a positive story with either a positive or a neutral emotional expression, and a negative story with either a negative or a neutral expression. This yields four different robot conditions whose influence on the participants will be studied and compared. For further research, two mismatch conditions, such as a positive message delivered with a negative emotion or vice versa, will also be used. The target group for participants is young adults. They will be invited to the TU/e to listen live to the messages of the robot. After hearing the message, the participants are asked to fill in a questionnaire, consisting partly of questions based on the Almere model and partly of open questions that give the participants the chance to explain their reactions to the robot more freely. These results will be analysed in both a quantitative and a qualitative manner to draw conclusions on the differences in acceptance of and trust in the robot across the conditions.
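To make the four-condition design concrete, below is a minimal sketch of a balanced random assignment of participants to conditions, written in Python; the condition labels, participant IDs and the fixed seed are illustrative assumptions, not part of the study protocol.

```python
import random

# Story valence x displayed emotion; the first element of each pair is the
# story, the second the robot's expressed emotion ("match" when they agree).
CONDITIONS = [
    ("positive", "positive"),  # match
    ("positive", "neutral"),
    ("negative", "negative"),  # match
    ("negative", "neutral"),
]

def assign_conditions(participant_ids, seed=42):
    """Randomly assign participants so group sizes differ by at most one."""
    rng = random.Random(seed)  # fixed seed: assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    # Cycle through the conditions over the shuffled participant list.
    return {pid: CONDITIONS[i % len(CONDITIONS)] for i, pid in enumerate(ids)}

if __name__ == "__main__":
    for pid, cond in sorted(assign_conditions(range(1, 21)).items()):
        print(pid, cond)
```

The two mismatch conditions mentioned above could later be added to CONDITIONS in the same way without changing the assignment logic.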
Planning
Each week, there will be a mentor meeting on Monday morning followed by a group meeting. Another group meeting will be held on Thursday afternoon, and by Sunday afternoon the wiki will be updated with the work done that week (weekly deliverable).
Week 1
- Introduction to the course and team
- Brainstorm to come up with ideas for the project and select one (inform course coordinator)
- Conduct literature review
- Specify problem statement, user group and requirements, objectives, approach, milestones, deliverables and planning for the project
Week 2
- Get confirmation for using a robot lab, and which robot
- Ask/get approval for conducting this study
- Create research proposal (methods section of research paper)
- If approval is already given, start creating survey, programming the robot or creating video of robot
Week 3
- If needed, discuss final study specifics, including planning the session for conducting the study
- If possible, finalize creating survey, programming the robot or creating video of robot
- Make consent form
- Start finding and informing participants
Week 4
- Final arrangements for study set-up (milestone 1)
- Try to start with conducting the study
Week 5
- Finish conducting the study (milestone 2)
Week 6
- Conduct data analysis
- Finalize methods section, including participant demographics, and incorporate feedback
- If possible, start writing results, discussion and conclusion sections
Week 7
- Finalize writing results, discussion and conclusion sections and incorporate feedback, all required research paper sections are written (milestone 3)
- Prepare final presentation
Week 8
- Give final presentation (milestone 4)
- Finalize wiki (final deliverable)
- Fill in peer review form (final deliverable)
Individual effort per week
Week 1
| Name | Total Hours | Break-down |
|---|---|---|
| Danique Klomp | 13.5 | Intro lecture (2h), group meeting (2h), group meeting (2h), literature search (4h), writing summary of literature search (2h), writing problem statement first draft (1.5h) |
| Liandra Disse | 12 | Intro lecture (2h), group meeting (2h), searching and reading literature (4h), writing summary (2h), group meeting (2h) |
| Emma Pagen | 12 | Intro lecture (2h), group meeting (2h), literature search (4h), writing a summary of the literature (2h), writing the approach for the project (1h), updating the wiki (1h) |
| Isha Rakhan | 11 | Intro lecture (2h), group meeting (2h), group meeting (2h), collecting literature and summarizing (5h) |
| Margit de Ruiter | 13 | Intro lecture (2h), group meeting (2h), literature research (4h), writing summary of literature (3h), group meeting (2h) |
| Naomi Han | 2 | Group meeting (2h) |
Literature review
State of the art
The use of social robots has increased rapidly over time. Social robots are developed specifically to interact with humans and other robots. They use artificial intelligence and are equipped with tools such as sensors, cameras and microphones, which enable the robot to interact with humans [1]. These robots come in many different shapes and sizes: for example, there are social robots such as Pepper, which looks more humanlike, and robots such as PARO, which is seal-like [2]. These types of robots are currently mostly used in service settings [3].
As human-robot interactions grow more important, more research is being done on aspects of these interactions such as acceptance, trust, responsibility and anthropomorphism. This can be done by investigating different properties of the robots, such as voice, appearance, and facial expressions. These properties can be programmed into the robot in such a way that people can recognize certain emotions in the robot [4]. For example, a study has been done with the robot Sophia, which was developed to eventually work as a service robot in, for example, healthcare and education. She was given different emotional expressions, and pictures of Sophia were posted on Instagram. The comments on these posts were then analyzed to examine people’s responses to emotions on robots [4]. This is only one of many similar studies.
While research on human reactions to social robots is still ongoing, many of these robots are already used in real-world settings. They are mainly used as companions and support tools for children, but they also provide services such as cleaning [1]. Two examples of social robots applied in the real world are Misty and Pepper. Misty is capable of many different facial expressions and can move its arms and head; moreover, it has face and speech recognition to remember people and recognize intents [1][5]. Pepper has a more humanoid appearance than Misty and is more advanced in its movements. Pepper is also capable of perceiving human emotions and adapting its behavior appropriately. The robot is mostly used in companies and schools, such as Palomar College, where it is used to highlight and promote programs and services at the college. Students can ask it questions such as “How do I get to my class?” [6].
How do students interact with robots?
Students are an important user group for robots, since robots can be helpful educational tools. They can help students grasp difficult concepts and are especially useful in language, science, or technology education. A robot can take on the role of a peer, a tool or a tutor in the learning activity [7]. Students can also learn a lot from interacting with robots: building teamwork and improving communication skills are just two of the many benefits of using robotics in education [8]. However, the implicit and multi-faceted impacts that this might bring into educational environments as a whole should be considered [9]. Another important aspect is that exposure to robots at a relatively young age prepares students for a future in which they are likely to encounter robots in many industries. By familiarizing themselves with robots at an early stage, they gain knowledge that will benefit their future careers.
Students are also an important target group for robots because they represent the future workforce and its innovators. Understanding the needs of students is therefore important, since it can help developers design robots that are engaging, user-friendly and educational. Teens, for example, want robots to be authentic, imperfect, and active listeners [10].
In previous research, a field study with qualitative interviews was conducted. The results showed a positive perception of the robot-supported learning environment, indicating a positive impact on learning outcomes. Most students saw added value in the presence of the robot compared to a traditional on-screen scenario or self-study, and the robot increased their motivation, concentration and attention [11].
Students also prefer robots that are easy to use and understand, with an intuitive interface and clear instructions. Apart from this, students like robots that can perform a wide range of activities and tasks, provide new challenges and opportunities over time, and are adaptable. Robots that can interact socially are also interesting to students: they like robots that can understand and respond to human emotions, speech and gestures, enabling meaningful interactions and relationships.
The importance of social robots being able to display emotions
In many sectors, such as healthcare and education, social robots must be able to communicate with people in ways that are natural and easily understood. In order to make this human-robot interaction (HRI) feel natural and enjoyable for humans, robots must make use of human social norms [12]. This requirement originates from humans anthropomorphizing robots, meaning that we attribute human characteristics to robots and engage and form relationships with them as if they were human [12][13]. We use this to make the robot’s behavior familiar, understandable and predictable to us, and to infer the robot’s mental state. However, for this to be a correct as well as intuitive inference, the robot’s behavior must be aligned with our social expectations and interpretations of mental states [13].
One very important integrated element in human communication is the use of nonverbal expressions of emotions, such as facial expressions, gaze, body posture, gestures, and actions [12][13]. In human-to-human interaction as well as human-robot interaction, these nonverbal cues support and add meaning to verbal communication, and expressions of emotions specifically help build deeper and more meaningful relations, facilitate engagement and co-create experiences [4]. Besides adding conversational content, it has also been shown that humans can unconsciously mimic the emotional expression of a conversational partner, known as emotional contagion, which helps us empathize with others by simulating their feelings [4][12]. Due to our tendency to anthropomorphize robots, it is possible that emotional contagion also occurs during HRI and can help users feel positive affect while interacting with a social robot [4].
Artificial emotions can be used in social robots to facilitate believable HRI, but also to provide feedback to the user about the robot’s internal state, goals and intentions [14]. Moreover, they can act as a control system through which we learn what drives the robot’s behavior and how it is affected by, and adapts to, different factors over time [14]. Finally, the ability of social robots to display emotions is crucial in forming long-term social relationships, which people will naturally seek due to the anthropomorphic nature of social robots [12].
Measurements in HCI research
In the past, HCI (human-computer interaction) and HTI (human-technology interaction) research focused on improving the technological aspects of the interaction, but in recent years interest in user experience has grown. User experience is still a broad area of research, yet in social robotics it has become increasingly relevant. It has many distinct aspects that all contribute to the overall experience, but its basis lies in a comfortable interaction with an agent. Making the interaction comfortable and likeable creates a sense of trust and eventually acceptance of the agent.
The three factors mentioned above (comfortable interaction, trust and acceptance) are all connected to each other; trust and acceptance in particular are linked. A meta-analysis of acceptance of service robots by Wagner et al. [15] described trust as a mediating factor between informational cues and acceptance of the service agent. However, many contextual factors also play a role in this relationship. For example, acceptance of and trust in agents seem to be lower when the user is in a group than when the user interacts with the robot individually [16].
To find these relationships and correlations between the many varied factors influencing the overall user experience, there need to be reliable measures of their presence, extent and underlying principles. Yet one of the main challenges in HCI research is creating reliable measurements. This challenge is present in most HCI domains, for example speech interfaces [17], user engagement [18] and online trust [19]. The main reason for the lack of reliable and valid measures in HCI research is that such measures are only needed in user experience research, which, as stated before, is a relatively new field.
Still, there have been several attempts to create reliable measures of artificial agent acceptance and trust. One of the better-known measures of acceptance is the Almere model, proposed by Heerink et al. [20]. This measure consists of a questionnaire that covers twelve basic constructs, ranging from induced anxiety to enjoyment and trust, with 41 questions in total. The model has an acceptable Cronbach’s alpha of ~0.7 when used with the older adult and elderly population [20][21]. When the measure is used with young adults, the reliability stays around the same [22]. Although this is the measurement proposed in the article that inspired this research, there are many more measurements of acceptability of and attitude towards robots [23].
Measuring trust is harder, as there is no single accepted method for it, but this does not mean it cannot be measured. A literature review of several papers measuring trust found that questionnaires are among the most used methods [24]. Unfortunately, the questionnaires used are not one standard set of questions but a wide variety, which creates the added problem that studies are hard to compare with each other. As stated before, trust is connected to acceptance, and trust is included as a basic factor in the Almere model discussed in the previous paragraph. However, it is measured there using only two statements: “I would trust the robot if it gave me advice” and “I would follow the advice the robot gives me”. When trust is measured on its own, this will need to be extended. Fortunately, some measures have been proposed. One of them comes from Madsen and Gregor [25], who reviewed and combined several measurements of trust and proposed a questionnaire consisting of five distinct factors with five questions each. The overall Cronbach’s alpha of this measurement was found to be 0.85, which is a good score.
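Since both the Almere model and the Madsen and Gregor questionnaire report their reliability as a Cronbach’s alpha, here is a minimal sketch of how that score can be computed from questionnaire responses; the response matrix is hypothetical and numpy is assumed to be available.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each question
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert answers: 4 participants x 3 questions.
responses = [[4, 5, 4],
             [2, 3, 2],
             [5, 5, 4],
             [3, 4, 3]]
print(cronbach_alpha(responses))  # high (~0.97): answers are very consistent
```

In practice, such a check would be run per construct (e.g., each of the five Madsen and Gregor factors separately) rather than over the full questionnaire at once.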
The constructs discussed above are not the only ones that can be measured in human-computer interactions, but they are the most prevalent in recent HCI and HTI research. They give a good overview of the central concepts in HCI research, as they contain most of the basic principles in their foundation, which makes these measures a good starting point for more in-depth research.
How to display emotions as a robot (body posture, facial expressions, pitch, etc.)
A robot can display emotions by combining bodily, facial and vocal expressions. The way such an emotional reaction is expressed depends strongly on the robot’s degree of anthropomorphism. For robots with a simple appearance, it may be sufficient to express emotions by means of, for example, lights or sounds. However, as the degree of anthropomorphism increases, it becomes necessary to match the robot’s behavior with its appearance to avoid falling into the uncanny valley [26].
The idea behind the uncanny valley is that as robots approach a more human-like appearance, people can experience a feeling of uneasiness or disturbance [27]. This also occurs when the user perceives a mismatch between the robot’s appearance and behavior. There are also differences in the way the uncanny valley is perceived across ages and cultures: people in Eastern countries and children are less likely to be disturbed by this phenomenon [26].
Developers of humanoid robots found that, next to body posture, hands also play a role in conveying emotions, as human hands contribute to our ability to express emotion. These developers created the emotion-expression humanoid robot WE-4RII, which integrates robot hands and can express emotion using facial expression and arm, hand, waist and neck motion. They also concluded that motion velocity is just as important as body posture: “WE-4RII quickly moves its body for surprise emotional expression, but it slowly moves its body for sadness emotional expression.” [28]
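The velocity finding translates to other platforms as well. As a minimal sketch, assuming a NAOqi-based robot such as Pepper and its Python SDK, the fraction of maximum joint speed can be scaled per emotion; the specific speed values and the gesture are hypothetical choices, not taken from the WE-4RII work.

```python
from naoqi import ALProxy  # NAOqi Python SDK (NAO/Pepper); assumed available

# Hypothetical mapping from emotion to fraction of maximum joint speed:
# fast motion for surprise, slow motion for sadness.
EMOTION_SPEED = {"surprise": 0.8, "neutral": 0.4, "sadness": 0.1}

def raise_arms(motion, emotion):
    """Raise both arms with an emotion-dependent motion velocity."""
    speed = EMOTION_SPEED.get(emotion, EMOTION_SPEED["neutral"])
    joints = ["LShoulderPitch", "RShoulderPitch"]
    angles = [-0.5, -0.5]  # radians: arms lifted in front of the robot
    motion.setAngles(joints, angles, speed)

if __name__ == "__main__":
    motion = ALProxy("ALMotion", "pepper.local", 9559)  # robot address assumed
    raise_arms(motion, "sadness")  # same gesture, but slow and subdued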
Besides body motion, vocal prosody also contributes to the quality of the displayed emotion. In human-to-human interaction, patterns of pitch, rhythm, intonation, timing and loudness contribute to our emotional expression. A sudden change in volume or pitch can signal excitement or emphasis, and when the pitch rises at the end of a sentence, it becomes clearer that the robot is asking a question, which can also indicate confusion or curiosity [29].
Studies have shown that humans interpret both linguistic and non-linguistic emotion-displaying sounds in an emotional way, but there is a preference for the linguistic type of robot, as research has shown that people prefer human-like voices. In the example of a virtual car passenger, the driver appeared to be more attentive and less involved in accidents when the virtual passenger’s speech matched the driver’s emotion. So it is not only beneficial to sound like a human being; the capability of matching the user’s emotions also contributes to the emotion-displaying quality of the robot [29].
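For a robot like Pepper, such prosody changes can be scripted. The sketch below assumes a NAOqi-style text-to-speech engine, where inline \vct\ (pitch) and \rspd\ (speaking rate) tags modulate the voice; the per-emotion values are hypothetical and would need tuning on the actual robot.

```python
from naoqi import ALProxy  # NAOqi Python SDK (NAO/Pepper); assumed available

# Hypothetical prosody per emotion: pitch (vct) and speaking rate (rspd),
# both as a percentage of the default voice.
EMOTION_PROSODY = {
    "positive": {"vct": 120, "rspd": 110},  # higher and faster
    "neutral":  {"vct": 100, "rspd": 100},
    "negative": {"vct": 85,  "rspd": 80},   # lower and slower
}

def say_with_emotion(tts, text, emotion):
    """Speak text with emotion-dependent pitch and rate via inline tags."""
    p = EMOTION_PROSODY.get(emotion, EMOTION_PROSODY["neutral"])
    tts.say("\\vct=%d\\ \\rspd=%d\\ %s" % (p["vct"], p["rspd"], text))

if __name__ == "__main__":
    tts = ALProxy("ALTextToSpeech", "pepper.local", 9559)  # address assumed
    say_with_emotion(tts, "I am so happy to see you!", "positive")
```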
Sources
1. Biba, J. (2023, March 10). What is a social robot? Retrieved from Built In: https://www.nature.com/articles/s41598-020-66982-y#citeas
2. Geva, N., Uzefovsky, F., & Levy-Tzedek, S. (2020, June 17). Touching the social robot PARO reduces pain perception and salivary oxytocin levels. Retrieved from Scientific Reports: https://www.nature.com/articles/s41598-020-66982-y#citeas
3. Borghi, M., & Mariani, M. (2022, September). The role of emotions in the consumer meaning-making of interactions with social robots. Retrieved from ScienceDirect: https://www.sciencedirect.com/science/article/pii/S0040162522003687
4. Chuah, S. H. W., & Yu, J. (2021). The future of service: The power of emotion in human-robot interaction. Journal of Retailing and Consumer Services, 61, 102551. https://doi.org/10.1016/J.JRETCONSER.2021.102551
5. Misty Robotics. (n.d.). Misty Robotics. Retrieved from: https://www.mistyrobotics.com/
6. Becerra, T. (2017, October 3). Palomar College welcomes Pepper the robot. Retrieved from The Telescope: https://www.palomar.edu/telescope/2017/10/03/palomar-robot-pepper-debut/
7. Mubin, O., Stevens, C. J., Shahid, S., Mahmud, A. A., & Dong, J. (2013). A review of the applicability of robots in education. Technology for Education and Learning, 1(1). https://doi.org/10.2316/journal.209.2013.1.209-0015
8. Center for Innovation and Learning. (2023, November 21). Explore the seven benefits of robotics in education for students. https://cie.spacefoundation.org/7-benefits-of-robotics-for-students/
9. Shin, N., & Kim, S. (2007). Learning about, from, and with robots: Students’ perspectives. IEEE Xplore. https://doi.org/10.1109/roman.2007.4415235
10. Björling, E. A., Thomas, K. A., Rose, E., & Çakmak, M. (2020). Exploring teens as robot operators, users and witnesses in the wild. Frontiers in Robotics and AI, 7. https://doi.org/10.3389/frobt.2020.00005
11. Donnermann, M., Schäper, P., & Lugrin, B. (2020). Integrating a social robot in higher education – a field study. IEEE Xplore. https://doi.org/10.1109/ro-man47096.2020.9223602
12. Kirby, R., Forlizzi, J., & Simmons, R. (2010). Affective social robots. Robotics and Autonomous Systems, 58(3), 322–332. https://doi.org/10.1016/J.ROBOT.2009.09.015
13. Breazeal, C. (2004). Designing Sociable Robots. https://doi.org/10.7551/MITPRESS/2376.001.0001
14. Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3–4), 143–166. https://doi.org/10.1016/S0921-8890(02)00372-X
15. Wagner Ladeira, M. G. P., & Santini, F. (2023). Acceptance of service robots: A meta-analysis in the hospitality and tourism industry. Journal of Hospitality Marketing & Management, 32(6), 694–716. https://doi.org/10.1080/19368623.2023.2202168
16. Martinez, J. E., VanLeeuwen, D., Stringam, B. B., & Fraune, M. R. (2023). Hey?! What did you think about that robot? Groups polarize users’ acceptance and trust of food delivery robots. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 417–427. https://doi.org/10.1145/3568162.3576984
17. Clark, L., Doyle, P., Garaialde, D., Gilmartin, E., Schlögl, S., Edlund, J., Aylett, M., Cabral, J., Munteanu, C., Edwards, J., & R Cowan, B. (2019). The state of speech in HCI: Trends, themes and challenges. Interacting with Computers, 31(4), 349–371. https://doi.org/10.1093/iwc/iwz016
18. Doherty, K., & Doherty, G. (2018). Engagement in HCI: Conception, theory and measurement. ACM Computing Surveys, 51(5). https://doi.org/10.1145/3234149
19. Kim, Y., & Peterson, R. A. (2017). A meta-analysis of online trust relationships in e-commerce. Journal of Interactive Marketing, 38(1), 44–54. https://doi.org/10.1016/j.intmar.2017.01.001
20. Heerink, M., Kröse, B., Evers, V., & Wielinga, B. (2010). Assessing acceptance of assistive social agent technology by older adults: The Almere model. International Journal of Social Robotics, 2(4), 361–375. https://doi.org/10.1007/s12369-010-0068-5
21. Heerink, M. (2011). Exploring the influence of age, gender, education and computer experience on robot acceptance by older adults. Proceedings of the 6th International Conference on Human-Robot Interaction, 147–148. https://doi.org/10.1145/1957656.1957704
22. Guner, H., & Acarturk, C. (2020). The use and acceptance of ICT by senior citizens: A comparison of the technology acceptance model (TAM) for elderly and young adults. Universal Access in the Information Society, 19(2), 311–330. https://doi.org/10.1007/s10209-018-0642-4
23. Krägeloh, C. U., Bharatharaj, J., Sasthan Kutty, S. K., Nirmala, P. R., & Huang, L. (2019). Questionnaires to measure acceptability of social robots: A critical review. Robotics, 8(4). https://doi.org/10.3390/robotics8040088
24. Bach, T. A., Khan, A., Hallock, H., Beltrão, G., & Sousa, S. (2022). A systematic literature review of user trust in AI-enabled systems: An HCI perspective. International Journal of Human–Computer Interaction, 1–16. https://doi.org/10.1080/10447318.2022.2138826
25. Madsen, M., & Gregor, S. D. (2000). Measuring human-computer trust. https://api.semanticscholar.org/CorpusID:18821611
26. Marcos-Pablos, S., & García-Peñalvo, F. J. (2021). Emotional intelligence in robotics: A scoping review. In Advances in Intelligent Systems and Computing (pp. 66–75). https://doi.org/10.1007/978-3-030-87687-6_7
27. Wang, S., Lilienfeld, S. O., & Rochat, P. (2015). The uncanny valley: Existence and explanations. Review of General Psychology, 19(4), 393–407. https://doi.org/10.1037/gpr0000056
28. Effective emotional expressions with expression humanoid robot WE-4RII: Integration of humanoid robot hand RCH-1. (2004). IEEE Conference Publication, IEEE Xplore. https://ieeexplore.ieee.org/abstract/document/1389736
29. Crumpton, J., & Bethel, C. L. (2015). A survey of using vocal prosody to convey emotion in robot speech. International Journal of Social Robotics, 8(2), 271–285. https://doi.org/10.1007/s12369-015-0329-4