PRE2019 3 Group2

From Control Systems Technology Group

Revision as of 11:46, 6 February 2020

Research on Idle Movements for Robots


Abstract

Group Members

Name Study Student ID
Stijn Eeltink Mechanical Engineering 1004290
Sebastiaan Beers Mechanical Engineering 1257692
Quinten Bisschop Mechanical Engineering 1257919
Daan van der Velden Mechanical Engineering 1322818
Max Cornielje Mechanical Engineering 1381989

Planning

Introduction

Problem statement

In an ideal world, and likely in the future, robots will interact with humans in a socially intelligent way: they will demonstrate humanlike social intelligence, and non-experts will no longer be able to distinguish robots from other human agents. To accomplish this, robots need to develop much further. Not only must their social intelligence increase, but so must the way they move. Robots currently do not move the way humans do. When reaching for an object, for example, humans tend to overshoot slightly, whereas a robot specifies the target and moves along the shortest path towards it. Humans also follow the path of least resistance and use their surroundings to reach a target, for instance leaning on a table to counteract gravity, and they use their joints more than robots do. Another major obstacle to making a robot's motion look human is idle movement. For humans, and every other living creature, it is physically impossible to stand perfectly still; a robot that is not in action, however, stands completely lifeless. This creates a problem for the interaction between the robot and the corresponding person: it is unclear whether the robot is turned on and able to respond, and it also feels unnatural. Moreover, humans are usually doing or holding something while having a conversation. To improve the interaction between humans and robots, we therefore need to examine human idle movements, determine which idle movements are most beneficial for human-robot interaction, and establish which idle movements are realistic and manageable for a robot to perform without looking too strange. In this research we address these questions by observing human idle movements, testing robot idle movements, and investigating which movements participants prefer.
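The "never perfectly still" idea above can be made concrete with a small sketch. A common cheap technique (in the spirit of the Perlin-noise approach cited in the State of the Art list) is to drive each joint with a smooth, slowly drifting random signal instead of holding it at a fixed angle. The function below is purely illustrative — the function name, parameters, and amplitude are our own assumptions, not part of any robot API:

```python
import math
import random

def idle_offset(t, seed=0, n_waves=3, amplitude=2.0):
    """Small pseudo-random joint-angle offset (degrees) at time t (seconds).

    Sums a few low-frequency sine waves with randomized frequencies and
    phases, a simple stand-in for Perlin noise: the output drifts smoothly
    and stays bounded, so a joint driven by it is never perfectly still,
    loosely mimicking human postural sway. Illustrative sketch only.
    """
    rng = random.Random(seed)  # fixed seed -> reproducible "personality"
    total = 0.0
    for _ in range(n_waves):
        freq = rng.uniform(0.1, 0.5)          # slow oscillations (Hz)
        phase = rng.uniform(0.0, 2 * math.pi)
        total += math.sin(2 * math.pi * freq * t + phase)
    return amplitude * total / n_waves        # bounded by +/- amplitude

# Example: sample a head-yaw offset trajectory at 10 Hz for 2 seconds.
trajectory = [idle_offset(t / 10.0, seed=42) for t in range(20)]
```

Such an offset could then be added on top of whatever pose the robot is holding, so that even an "inactive" robot visibly breathes with small, smooth motion.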

Objectives

It is still very hard to make a robot appear natural and humanlike. Because we want to create natural human-robot interaction, we want to further develop social robots on several levels. In this project we focus on the movements that make a robot appear more natural: the idle movements. The objective of our research is to find out which idle movements make a robot appear more natural. Our work will build on previous research on this subject, and we will also gather information from interviews and surveys to get the opinion of possible future users. In addition, we will conduct an experiment in which the NAO robot performs different idle movements. Because humanoid robots will keep improving, we will also give some expectations about which idle movements we think will become important as these robots improve. Altogether, we hope this information will give greater insight into the use of idle movements in humanoid robots, to be used in future research and projects on this subject.

Users

Approach, Milestones and Deliverables

State of the Art

  1. Torta, E. (2014). Approaching independent living with robots. Eindhoven: Technische Universiteit Eindhoven [1]
  2. Waldemar Karwowski (2007). Worker selection of safe speed and idle condition in simulated monitoring of two industrial robots [2]
  3. Raymond H. Cuijpers, Marco A. M. H. Knops (2015). Motions of Robots Matter! The Social Effects of Idle and Meaningful Motions [3]
  4. Toru Nakata, Tomomasa Sato and Taketoshi Mori (1998). Expression of Emotion and Intention by Robot Body Movement [4]
  5. Takayuki Kanda, Hiroshi Ishiguro, Michita Imai, and Tetsuo Ono (2003). Body Movement Analysis of Human-Robot Interaction [5]
  6. Thibault Asselborn, Wafa Johal and Pierre Dillenbourg (2017). Keep on moving! Exploring anthropomorphic effects of motion during idle moments [6]
  7. Cooney, M., Kanda, T., Alissandrakis, A., & Ishiguro, H. (2014). Designing enjoyable motion-based play interactions with a small humanoid robot. International Journal of Social Robotics, 6(2), 173-193. [7]
  8. Kocoń, M., & Emirsajłow, Z. (2012). Modelling the idle movements of human head in three-dimensional virtual environments. Pomiary Automatyka Kontrola, 58(12), 1121-1123. [8]
  9. Beck, A., Hiolle, A., & Canamero, L. (2013). Using perlin noise to generate emotional expressions in a robot. In Proceedings of the Annual Meeting of the Cognitive Science Society (Vol. 35, No. 35). [9]
  10. Satake, S., Kanda, T., Glas, D. F., Imai, M., Ishiguro, H., & Hagita, N. (2009, March). How to approach humans? Strategies for social robots to initiate interaction. In Proceedings of the 4th ACM/IEEE international conference on Human robot interaction (pp. 109-116). [10]
  11. Obaid, M., Sandoval, E. B., Złotowski, J., Moltchanova, E., Basedow, C. A., & Bartneck, C. (2016, August). Stop! That is close enough. How body postures influence human-robot proximity. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 354-361). IEEE. [11]
  12. Rosenthal-von der Pütten, A. M., Krämer, N. C., & Herrmann, J. (2018). The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5), 569-582. https://link.springer.com/article/10.1007/s12369-018-0466-7 [12]
  13. Srinivasan, V., Murphy, R. R., & Bethel, C. L. (2015). A reference architecture for social head gaze generation in social robotics. International Journal of Social Robotics, 7(5), 601-616. https://link.springer.com/article/10.1007/s12369-015-0315-x [13]
  14. Jung, J., Kanda, T., & Kim, M. S. (2013). Guidelines for contextual motion design of a humanoid robot. International Journal of Social Robotics, 5(2), 153-169. https://link.springer.com/article/10.1007/s12369-012-0175-6 [14]
  15. Straub, I. (2016). ‘It looks like a human!’ The interrelation of social presence, interaction and agency ascription: a case study about the effects of an android robot on social agency ascription. AI & Society, 31(4), 553-571. [15]
  16. Song, H., Kim, M. J., Jeong, S.-H., Suk, H.-J., & Kwon, D.-S. (2009). Design of idle motions for service robot via video ethnography. In: Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2009), pp. 195-199. [16]

References