PRE2023 3 Group4
Group members
| Name | Student number | Current study program | Role or responsibility |
|---|---|---|---|
| Margit de Ruiter | 1627805 | BPT | Note-taker |
| Danique Klomp | 1575740 | BPT | Contact person |
| Emma Pagen | 1889907 | BAP | End responsible for Wiki updates |
| Liandra Disse | 1529641 | BPT | Planner |
| Isha Rakhan | 1653997 | BPT | Contact person |
| Naomi Han | 0986672 | BCS | Programming responsible |
Introduction to the course and project
Problem statement
Modern media is filled with images of highly sophisticated robots that speak, move and behave like humans. The many movies, plays and books built around them speculate that such robots will be integrated into our daily lives in the near future. The idea of robots becoming increasingly human-like is thus firmly embedded in our collective imagination. However, modern technology has not yet caught up with this futuristic idea of what an artificial agent, such as a robot, is able to do. This gap mainly stems from a lack of knowledge on how to replicate human behavior in the hardware and programming of artificial agents. One area of growing interest is the implementation of emotions in robots and other artificial agents. Human emotions are not easy to replicate, as they consist of many different contributing factors. The research presented in this wiki also focuses on emotions, but it examines how these emotions affect the acceptance of the robot. The question that will be answered is:
“Does a match between the displayed emotion of a social robot and the content of the robot's spoken message influence the acceptance of this robot?”