PRE2015 4 Groep1

From Control Systems Technology Group



Revision as of 22:40, 8 May 2016


Group members

  • Laurens van der Leden - 0908982
  • Thijs van der Linden - 0782979
  • Jelle Wemmenhove - 0910403
  • Joshwa Michels - 0888603
  • Ilmar van Iwaarden - 0818260

Short Project Description

The aim of this project is to create an anthropomorphic robot that can be used to hug others when separated by a large distance. This robot will copy or shadow a hugging movement performed by a human using motion-capture sensors. To realize this goal, the robot AMIGO will (if allowed and possible) be used to perform the hugs, while the commands are generated using Kinect sensors that capture the movements of a human.

Planning & Executed tasks

Week 1

Idea in one sentence: a robot that can mimic arm movements from a distance, with our main focus on a hugging robot (i.e. a robot that can give hugs remotely).

Type of robot: AMIGO hugging robot

Sub-questions/challenges: 1. How is the force that the AMIGO/robot applies to the hugged person fed back to the hugging person (pressure sensors, real time?)?

2. How do you register the movements of the hugging person (Kinect)?


Action items:

1. Email about the exact contents of next Monday's presentation (25-4-2016) - Present the specific idea - USE aspects already? - Present the idea for the end product - Literature already?

2. Look up literature/state of the art - Laurens and Thijs

3. Scenario/situation sketch - Jelle

4. Minutes/brainstorming session - Joshwa

5. Set up the wiki page - Joshwa

Week 2

We continued to discuss our idea of a robot capable of shadowing a hugging motion using Kinect. This week we contacted someone from TechUnited and asked if it would be possible to use their robot AMIGO for our project.

Week 3

During our presentation on Monday 2-5-2016, the teachers indicated that our plan still lacked clarity and that our USE aspects were missing. We then discussed this within the group for a few hours and divided the tasks.

After we sent a second email to the people from TechUnited on Monday, we were invited to the robotics lab in the "Gemini-Noord" building on Tuesday evening to discuss our idea. While the initial plan was to work with AMIGO from the start if possible, our plan changed slightly. The people from TechUnited strongly advised creating a virtual simulation first (something that is done there a lot to test scenarios before applying them to AMIGO itself) before considering the real AMIGO. If we get the simulation to work properly, we can consider trying the real AMIGO. The people from the lab told us which software and systems should be used to make such a simulation.

Jelle, Joshwa and Laurens have discussed the matter, since they will be working on the simulation. The three have installed some of the software and made a start on it.

Thijs has processed the USE aspects in order to clarify what can be done to take USE into account for our robot. (If I have forgotten anything, please add what is missing.)

Ilmar has started on the literature research regarding the state of the art, the user requirements of elderly people, and ways of creating a human-like presence.

Week 4

Jelle, Joshwa and Laurens will work through tutorials to get to know the programming software used to make an AMIGO-simulation.

Week 5

Jelle, Joshwa and Laurens will finish the tutorials this week and hope to lay the groundwork for the AMIGO-simulation.

Week 6

Jelle, Joshwa and Laurens will work on the AMIGO-simulation this week.

Week 7

Jelle, Joshwa and Laurens hope to finish the AMIGO-simulation this week. If possible, they will apply it to the real AMIGO.

Week 8

Jelle, Joshwa and Laurens will, if possible, run tests with the real AMIGO using the AMIGO-simulation, and prepare the final demonstration with either the AMIGO-simulation or AMIGO itself.

Ilmar will work on and finish the slides for the final presentation.

Week 9

The Wiki will receive its final update this week. According to the course planning, the final presentation takes place this week. Depending on the exact date, this week will also serve as a buffer to run some final tests with either AMIGO or the AMIGO-simulation.

Milestones project

Concerning the robot building/modifying process

1. Get robot skeleton
We have to acquire a robot mainframe that we can modify in order to build a robot with the functions we want. Building an entire robot from scratch is not possible in eight weeks. If the owners allow it, we can use the robot AMIGO for this project.

2. Learn to work with its control system
Once we have the "template robot", we have to get used to its control system and programming system. We must know how to edit and modify things in order to change its actions.

3. Get all the required materials for the project
A list has to be made that includes everything we need to order or get elsewhere to execute the project. Then everything has to be ordered and collected.

4. Write a script/code to make the AMIGO do what you want
We will have to program the robot or edit the existing script of the robot to make it do what we want. This includes four stages:

4a Make it perform certain actions by giving certain commands
We must learn to edit the code to make sure the robot executes certain actions when a command is entered directly.
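As an illustration of this stage, a minimal command layer might look like the following sketch. The command names and joint layout are our own assumptions for illustration, not AMIGO's actual interface:

```python
# Hypothetical command layer: maps named commands to joint-angle targets.
# The command names and joint values below are illustrative assumptions,
# not AMIGO's real interface.

ACTIONS = {
    "arms_neutral": {"left_shoulder": 0.0, "right_shoulder": 0.0,
                     "left_elbow": 0.0, "right_elbow": 0.0},
    "arms_open":    {"left_shoulder": 1.2, "right_shoulder": 1.2,
                     "left_elbow": 0.2, "right_elbow": 0.2},
    "hug_close":    {"left_shoulder": 0.8, "right_shoulder": 0.8,
                     "left_elbow": 1.4, "right_elbow": 1.4},
}

def execute(command, robot_state):
    """Apply a named action to the robot's joint-target dictionary."""
    if command not in ACTIONS:
        raise ValueError(f"unknown command: {command}")
    robot_state.update(ACTIONS[command])
    return robot_state

state = dict.fromkeys(ACTIONS["arms_neutral"], 0.0)
execute("arms_open", state)
print(state["left_elbow"])  # 0.2
```

With such a layer in place, stage 4b reduces to having the Kinect side emit these same command names.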

4b Make sure these commands are linked to Kinect
Once the robot reacts properly to our entered commands, we have to link these commands to Kinect. We must ensure that the robot executes the action as a result of our own movements.
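Kinect reports skeleton joints as 3-D positions, while a robot arm is driven by joint angles, so each captured frame the positions must be converted to angles. A minimal sketch of that conversion (the joint names and geometry are generic, not tied to a specific Kinect SDK):

```python
# Sketch: deriving an elbow angle from three skeleton joints
# (shoulder, elbow, wrist) given as 3-D positions, as a Kinect-style
# skeleton tracker would report them each frame.
import math

def angle_between(a, b, c):
    """Angle at point b (radians) formed by segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.acos(dot / (n1 * n2))

# Example frame: a fully stretched arm gives an elbow angle of ~pi radians.
shoulder, elbow, wrist = (0, 0, 0), (0.3, 0, 0), (0.6, 0, 0)
print(angle_between(shoulder, elbow, wrist))  # ~3.1416
```

Running this per frame for each tracked joint yields the stream of angles the robot-side command layer would consume.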

4c Include a talk function that gives the robot a telephone function
The robot must be equipped with a function that reproduces the words spoken by the person controlling it, much like a telephone.

4d Make sure the robot is fully able to hug at will (is presentable to the public)
After the robot is Kinect-driven, we must modify it so that it works fully according to plan. In this case it must be able to perform hugs exactly as we want, as a real shadow of ourselves.

Concerning the Wiki

5. Have a complete wiki page of what was done
This milestone means that we simply have to possess a wiki page which describes our project well.

Concerning literature

6. State of the art
Find useful articles about existing shadow robotics and hugging robots.

Abstract societal problem analysis

One of the consequences of globalisation is that work becomes more specialized. The job that fits you may no longer be found in your local area, your city or even your country. Work will separate people from their homes and their relatives. Providing for your family might mean not being able to bring your children to bed. People will become ever more lonely due to a lack of physical interaction with loved ones.

We hope that our robot can satisfy people's longing for physical contact with loved ones by enabling its users to hug others over a long distance. The receiver of the hug will have an anthropomorphic robot in their home which acts as an avatar for the user sending the hug. The sender's hugging motion is captured via cameras, and that data is used to make the robot emulate the motion. A microphone and speakers let the users communicate spoken words via the robot. Virtual reality can be used to give the sender a more immersive hugging experience.

Scenario sketch usage

A common usage scenario of our robot might be as follows: let us consider a man named Stuart and his mother Margaret. Stuart is a busy man: he has a 9-to-5 job, children to take care of, and he also tries to free up some time for meeting friends and for his hobbies. He barely has time to visit his mother Margaret, who lives in a nursing home.

After a long day of work and having cleared the dinner table, Stuart calls his mother. She is glad that he called. It has been three weeks now since he last visited her and she misses him dearly. Stuart notices that his mother is a bit sad. His mother's nursing home had recently bought some of those new hugging robots. This robot allows the inhabitants of the nursing home to have some physical contact with faraway relatives. Stuart asks his mother if she would like to try the hugging robot again. Margaret agrees, although she is a tad skeptical. She still does not believe that a cold machine could replace the warm embrace of her son. Margaret tells her son that she will go to one of the designated 'hugging rooms' where the robots are located and contact him there.

One hugging room is situated not too far from Margaret's room. It consists of smaller compartments that allow for private conversations. There are five robots in this room, two of which are already in use. Margaret approaches one of the unused ones. She turns on the monitor placed next to it and logs in with her account. It is good that Stuart taught her to use Skype before, as the system is quite similar. She hears a pinging noise, indicating that Stuart has come online.

Stuart had strapped himself into one of those virtual-reality sets, including glasses and hand controllers. He had been skeptical of VR technology when it was first introduced in 2016, thinking the sets were just another eccentric gadget for the rich. However, when prices dropped, they soon took their place in every home.

Stuart notices that his mother is online and proceeds to call her. She picks up the phone and they chat a little. His mother switches on the robot and Stuart suddenly sees his mother from the robot's perspective. Before hugging his mother, he does some arm movements to test whether he is actually in control. The robot's arms still seem strange to him. They are soft, almost made of pillows, and decorated like one of those horribly ugly armchairs found in old houses. His tests complete, Stuart tells his mother that he is ready, his voice now coming from the robot. Margaret steps closer to the robot and Stuart hugs her. Slowly at first, afraid to crush his mother's fragile bones, but the machine's pressure sensors prevent such a catastrophe from happening anyway. Margaret feels satisfied, happy to finally be able to hug her son once again. After some time, they stop hugging and continue to talk some more, Stuart still in his robot form.

Luckily, Margaret did not panic like last time, when she had felt uncomfortable being hugged by the robot, afraid that it might hurt her, and had to use the kill switch to terminate the hug. The robot had slowly released her from its grasp, and she and Stuart had picked up their usual way of calling once Margaret got back to her room.


State of the Art / Literature Research

State of the art:

- Telenoid: The Telenoid is a small, white, human-like doll used as a communication robot. The idea is that random people and caregivers can talk to elderly people from a remote location over the internet, brightening their day by giving them some company. A person can control the face and head of the robot using a webcam, in order to give the impression of a human-like presence.

- Paro: Paro is a simple seal robot, able to react to its surroundings, using minimal movement and showing different emotional states. It is used in nursing homes to improve the mood of elderly people and thereby reduce the workload of the nursing staff. Paro not only proved that a robot can improve the mood of elderly people, but also that a robot can encourage more interaction and more conversations.

- Telepresence Robot for Interpersonal Communication (TRIC): TRIC is going to be used in a nursing home for the elderly. The goal of TRIC is to allow elderly people to maintain a higher level of communication with loved ones and caregivers than via traditional methods. The paper further describes the decisions made during the robot's development process.

Interesting findings regarding the literature research:

- Ways to create a human-like presence:

 * Using a soft skin made of materials such as silicone and soft vinyl
 * Using humans to talk (teleoperation) instead of a chat program, so the conversations are real and feel real
 * Generating unconscious motions such as breathing and blinking automatically, to give the sense that the android is alive
 * Using a minimal human design, so that the android can be any kind of person the user imagines it to be (male/female, young/old, known/unknown person)
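The "unconscious motions" point above can be sketched as a simple idle-motion generator that runs regardless of user input. The rates and amplitudes below are our own illustrative guesses, not values from the literature:

```python
# Sketch of automatically generated "unconscious" motions: a slow sinusoidal
# breathing offset plus periodic blinks. All constants are assumed values.
import math

BREATH_PERIOD = 4.0   # seconds per breath cycle (assumed)
BLINK_EVERY = 5.0     # seconds between blinks (assumed)
BLINK_LEN = 0.15      # seconds a blink lasts (assumed)

def idle_motion(t):
    """Return (chest_offset_in_metres, eyes_open) for time t in seconds."""
    chest = 0.01 * math.sin(2 * math.pi * t / BREATH_PERIOD)
    eyes_open = (t % BLINK_EVERY) > BLINK_LEN
    return chest, eyes_open

print(idle_motion(1.0))   # chest fully raised, eyes open
print(idle_motion(5.05))  # mid-blink: eyes closed
```

A real android would add noise and variation so the rhythm does not look mechanical, but even this fixed pattern conveys the "alive" impression the literature describes.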

Stakeholders (USE)

STILL TO BE ADDED

Challenges Design realization

Creating a robot that serves as a kind of avatar that moves exactly like you is an ambitious idea. It will not be possible to realize this over a very long distance, as that would require a very strong signal with a large range. Since we will focus on a prototype that can copy just one (or a few) basic action(s) within close range, it might be possible to realize this.

To make this work there are a few components that have to be taken into account:

  • The robot must be controlled from a (close) distance
  • The robot must be able to recognize some human movements or gestures
  • The robot must be able to translate these percepts into action

The first of these three components can be realized by using an Arduino with a Bluetooth module linked to a device, most likely a portable device or laptop. For the second, there is Kinect: a sensor with accompanying software that uses a depth camera to register movement. The Xbox uses Kinect for certain games, allowing people to play using gestures and their own movements rather than a controller. Kinect software is also available for Windows.
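A sketch of how the PC-to-Arduino link could work: the Kinect side encodes each recognized gesture as a small byte frame, which would then be written to the Bluetooth serial port (e.g. with a serial library). The frame layout and gesture names are our own assumptions for illustration:

```python
# Sketch of a simple wire protocol between the PC (running the Kinect
# software) and an Arduino over Bluetooth. Gestures become 4-byte frames:
# start marker, gesture id, speed, checksum. Layout is an assumption.
import struct

GESTURES = {"open_arms": 1, "close_arms": 2, "stop": 3}

def encode_frame(gesture, speed):
    """Pack a gesture and a 0-100 speed into a 4-byte frame."""
    gid = GESTURES[gesture]
    checksum = (gid + speed) % 256
    return struct.pack("BBBB", 0xFF, gid, speed, checksum)

def decode_frame(frame):
    """Parse a frame as the Arduino side would, validating the checksum."""
    start, gid, speed, checksum = struct.unpack("BBBB", frame)
    if start != 0xFF or (gid + speed) % 256 != checksum:
        raise ValueError("corrupt frame")
    name = {v: k for k, v in GESTURES.items()}[gid]
    return name, speed

frame = encode_frame("close_arms", 50)
print(decode_frame(frame))  # ('close_arms', 50)
```

On the Arduino, the same parsing would be written in its C-based sketch language, reading the bytes from the Bluetooth serial stream.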

A few tutorials and examples can be found on the internet of people using the Kinect software on a computer to control a device with gestures and arm movements. There is an example of a man making a robot hand copy his hand gestures using this software. Another example features a man making a robot kart move with his own gestures (using Kinect and Bluetooth).

As building an entire robot body could prove difficult, we hope to borrow a robot body or prototype from the "right people" and equip it with the tools mentioned above. As some people have proven Kinect-controlled robots to be possible, it should be possible to make a hugging robot.

Links to the tutorials and clips of the examples mentioned above can be found at the end of this wiki under the name "Links subsection challenges design realization".

Requirements AMIGO

Must-have

  • The AMIGO must be able to process the arm movements of the hugging person in reasonable time (ideally in real time, though that is probably unrealistic) and mimic them credibly and reasonably fluently for the person 'to be hugged'.
  • The arms of the AMIGO must be able to embrace the person 'to be hugged'. More specifically, the AMIGO must be able to make an embracing movement with its arms.
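The "reasonably fluent" requirement suggests filtering the raw Kinect angles before sending them to the robot. A simple exponential smoother, for instance, trades a little added latency for much smoother motion; the smoothing factor below is an assumed tuning value:

```python
# Sketch: exponential smoothing of a stream of joint angles, so the robot
# does not jitter along with Kinect measurement noise. alpha is an assumed
# tuning value: lower alpha = smoother but laggier motion.

def smooth(samples, alpha=0.3):
    """Exponentially smooth a sequence of joint-angle samples."""
    out, prev = [], samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

noisy = [0.0, 1.0, 0.2, 0.9, 0.3]
print(smooth(noisy))  # same length, spikes damped
```

In a real control loop this would run per joint, per frame, with alpha chosen to balance fluency against the real-time requirement.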

Should-have

  • There should be a force-stop function in the AMIGO so that the person 'to be hugged' can stop the hug at any time if he/she desires (for example because he/she feels uncomfortable).
  • The AMIGO should have a feedback function indicating whether and how strongly its arms are touching a person (pressure sensors).
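These two should-haves could be combined in the control loop roughly as follows. The pressure values and thresholds are assumed illustrative numbers, not measured safety limits:

```python
# Sketch: one cycle of hug-control logic combining pressure-sensor feedback
# with a force-stop. All thresholds are assumed values for illustration.

MAX_PRESSURE = 20.0  # newtons; assumed safety limit

def hug_step(pressure, kill_switch):
    """Decide the next arm action for one control cycle."""
    if kill_switch or pressure > MAX_PRESSURE:
        return "release"
    if pressure < 5.0:
        return "close_further"  # barely touching: tighten gently
    return "hold"               # comfortable contact: keep position

print(hug_step(pressure=3.0, kill_switch=False))   # close_further
print(hug_step(pressure=25.0, kill_switch=False))  # release
print(hug_step(pressure=10.0, kill_switch=True))   # release
```

Note that the safety checks come first, so a pressed kill switch or an over-limit reading always releases the hug regardless of the hug-giver's commands.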

Could-have

  • The AMIGO could receive a message from the 'hug-giver', the person in another place wanting to give a hug.
  • The AMIGO could inform the 'hug-receiver' that a hug has been 'sent' to him/her and ask if he/she wants to 'receive' the hug now.
  • The AMIGO could receive a message from the 'hug-giver' that the hug has ended.


Possible alternative usages of this technology (looking ahead)

The technology of telerobotics will be useful for more applications than social ones. Especially when working under circumstances that are dangerous for the human body, telerobotics can be of assistance: for example when working under the high pressure of the seabed, when handling radioactive material, or in the extreme circumstances of outer space.

Used Literature/Further reading

Links subsection "challenges design realization"

Advances in Telerobotics, AU: Manuel Ferre, Martin Buss, Rafael Aracil, Claudio Melchiorri, Carlos Balaguer ISBN: 978-3-540-71363-0 (Print) 978-3-540-71364-7 (Online) http://link.springer.com.dianus.libr.tue.nl/book/10.1007%2F978-3-540-71364-7

Telerobotics, AU: T.B. Sheridan † http://www.sciencedirect.com.dianus.libr.tue.nl/science/article/pii/0005109889900939

An Intelligent Simulator for Telerobotics Training, AU: Khaled Belghith et al. http://ieeexplore.ieee.org.dianus.libr.tue.nl/xpl/abstractAuthors.jsp?arnumber=5744073&tag=1

Telerobotic Pointing Gestures Shape Human Spatial Cognition, AU: John-John Cabibihan, Wing-Chee So, Sujin Saj, Zhengchen Zhang http://link.springer.com.dianus.libr.tue.nl/article/10.1007%2Fs12369-012-0148-9

Haptics in telerobotics: Current and future research and applications, AU: Carsten Preusche , Gerd Hirzinger http://link.springer.com.dianus.libr.tue.nl/article/10.1007%2Fs00371-007-0101-3

Remaining used literature

https://www.youtube.com/watch?v=KnwN1cGE5Ug

https://www.youtube.com/watch?v=AZPBhhjiUfQ

http://kelvinelectronicprojects.blogspot.nl/2013/08/kinect-driven-arduino-powered-hand.html

http://www.intorobotics.com/7-tutorials-start-working-kinect-arduino/

B. Anand, S. Harishankar, T.V. Hariskrishna, U. Vignesh, P. Sivraj, "Digital human action copying robot", 2013

http://singularityhub.com/2010/12/20/robot-hand-copies-your-movements-mimics-your-gestures-video/

http://www.telegraph.co.uk/news/1559760/Dancing-robot-copies-human-moves.html

http://www.emeraldinsight.com/doi/abs/10.1108/01439910310457715

http://www.shadowrobot.com/downloads/dextrous_hand_final.pdf

https://www.shadowrobot.com/products/air-muscles/

MORE TO FOLLOW