PRE2015 4 Groep1
Group members
- Laurens Van Der Leden - 0908982
- Thijs Van Der Linden - 0782979
- Jelle Wemmenhove - 0910403
- Joshwa Michels - 0888603
- Ilmar van Iwaarden - 0818260
Project Description
The aim of this project is to create an anthropomorphic robot that can be used to hug others when separated by a large distance. This robot will copy or shadow a hugging movement performed by a human, using motion-capture sensors. To realize this goal, the robot AMIGO will (if allowed and possible) be used to perform the hugs, while the commands are generated using Kinect sensors that capture the movements of a human.
USE Aspects
Before designing the hugging robot it is important to analyze the benefits and needs of the users, society, and enterprises. What might drive them to invest in the technology, and what are their needs and wishes?
Who are the USE?
- Primary users: As the hugging robot's intended use is to connect people who are separated from each other, the main primary users will be people separated from their loved ones for a longer period. As such, the primary users will mainly be elderly people, distant relatives or friends, and children or students.
- Secondary users: As the primary users want to use the hugging robot, the secondary users will be institutions where many of the primary users can be found. As such, hugging robots will be used by nursing or care homes and hospitals, private or boarding schools, and educational institutions that host many international students, such as universities.
- Tertiary users: The hugging robot will probably be in high demand and be used many times, possibly by different people. There will therefore be a demand for maintenance, making the maintenance staff the tertiary user.
- Society: As the hugging robot will be placed in many public institutions, national and local government will be the ones that distribute the technology.
- Enterprise: The enterprises that will benefit from the hugging robot are the companies that help produce it. As such, virtual-reality enterprises and robot-producing companies stand to benefit from this technology.
What are the needs of the USE?
- Primary user needs: As a large part of the primary users might find new technology complicated or intimidating, the hugging robot has to be safe, both physically and psychologically, and easy to use. The fact that people will be hugging the robot requires that it is comfortable to touch.
- Secondary user needs: As the secondary users are likely to have more than one robot, they would prefer a relatively low price. As they probably cannot afford to train people to become experts with the robot, the robot has to be easy to install and use. An educational institution could reserve a room for the hugging robot, but in a hospital or nursing home the patient might not be able to move. In that case it must be possible to move the robot to the patient, so the robot cannot be too big or too heavy. The fact that multiple people will make use of one robot might give rise to the wish that the appearance of the robot is adaptable.
- Tertiary user needs: As the robot has to be relatively cheap, its maintenance cannot be very intensive. This requires that the robot is easy to clean and that broken hardware and software are easy to access and replace.
- Society needs: The hugging robot will be a device to connect people over large distances, in a better way than modern communication devices can. As such it will fight against loneliness and help strengthen family values.
- Enterprise needs: For the companies it is vital that the hugging robot makes a profit; to achieve this, the robot must be cheap to produce.
How can we incorporate these needs into the project?
- Safety: To avoid harming the primary users, the robot has to have pressure sensors to make sure the hug is comfortable and not painful. As an approaching robot might be frightening, the robot cannot give a hug until the user allows it. And by giving the robot an easy-to-reach kill switch, the user will not be trapped in case the robot malfunctions.
- Comfortable: To make sure the user enjoys the hug, the robot has to have a soft skin, which might be made of cushions, and cannot be cold to the touch. By giving the robot a tablet, which might show a photo of the relative, and a voice similar to that of the relative, we hope to put the user more at ease when alone with the robot. Dressing the robot in clothes and playing background music or sounds can add to that effect. Giving the robot's interface two separate buttons, one for the phone function and one for movement activation, gives the user the choice whether or not to let the robot hug him and gives the user the sense that he is in control.
- Easy to use: As most people are already familiar with telephone functions, we want to design an interface that is as simple as that.
- Adaptable appearance: The robot can have a set of clothes and/or different skins to adapt to different situations.
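The safety requirements above (pressure sensing, explicit consent, and a kill switch) can be sketched in code. The following is a minimal sketch in plain Python; the sensor values, threshold, and function names are hypothetical illustrations, not part of any actual AMIGO control software:

```python
# Minimal sketch of the safety logic described above.
# Sensor values, thresholds and function names are hypothetical,
# not part of AMIGO's actual control software.

MAX_PRESSURE = 3.0  # hypothetical comfort limit (arbitrary units)

def hug_step_allowed(pressure_readings, kill_switch_pressed, user_consented):
    """Decide whether the robot may continue closing its arms."""
    if kill_switch_pressed:     # the user can always abort the hug
        return False
    if not user_consented:      # the robot never hugs without permission
        return False
    # stop tightening as soon as any pressure sensor exceeds the limit
    return max(pressure_readings) < MAX_PRESSURE

# Example: comfortable hug in progress
print(hug_step_allowed([1.2, 0.8, 2.1], kill_switch_pressed=False, user_consented=True))  # True
# Example: kill switch pressed, so the motion stops
print(hug_step_allowed([1.2, 0.8, 2.1], kill_switch_pressed=True, user_consented=True))   # False
```

A real implementation would run such a check inside the robot's control loop, before every incremental arm movement.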
Planning
Week 1
Idea in one sentence: A robot that can imitate arm movements at a distance, with our main emphasis on a hugging robot (i.e. a robot that can give hugs remotely).
Type of robot: AMIGO hugging robot
Sub-questions/challenges: 1. How is the force that the AMIGO/robot exerts on the hugged person fed back to the hugging person (pressure sensors, real time)?
2. How do you register the movement of the hugging person (Kinect)?
Action items:
1. Email about the exact contents of next Monday's presentation (25-4-2016) - Present the specific idea - Include USE aspects already? - Present the idea for the end product - Include literature already?
2. Look up literature/state of the art - Laurens and Thijs
3. Scenario/situation sketch - Jelle
4. Take minutes/organize brainstorming session - Joshwa
5. Set up the wiki page - Joshwa
Week 2
We continued to discuss our idea of a robot capable of shadowing a hugging motion using Kinect. This week we contacted someone from TechUnited and asked whether it would be possible to use their robot AMIGO for our project.
Week 3
During our presentation on Monday 2-5-2016 the teachers indicated that our plan still lacked clarity and that our USE aspects were missing. Afterwards we discussed this with the group for a few hours and then divided the tasks.
After we sent a second email to the people from TechUnited on Monday, we were invited to the robotics lab in the building "Gemini-Noord" on Tuesday evening to discuss our idea. While the initial plan was to work with AMIGO from the start if possible, our plan changed slightly. The people from TechUnited strongly advised us to create a virtual simulation first (something that is done there a lot to test scenarios before applying them to AMIGO itself) before considering the real AMIGO. If we can get the simulation to work properly, we can consider trying the real AMIGO. The people from the lab told us which software and systems are used to make such a simulation.
Jelle, Joshwa and Laurens have discussed the matter, since they will be working on the simulation. The three have installed some software and made a start on it.
Thijs has processed the USE aspects to clarify what can be done to take USE into account for our robot. (If I have forgotten anything, please add what is missing.)
Ilmar has started on the literature research regarding the state of the art, the user requirements of elderly people, and ways of creating a human-like presence.
Week 4
Jelle, Joshwa and Laurens will work through tutorials to get to know the programming software used to make an AMIGO simulation.
Week 5
Jelle, Joshwa and Laurens will finish the tutorials this week and hope to lay the groundwork for the AMIGO simulation.
Week 6
Jelle, Joshwa and Laurens will work on the AMIGO simulation this week.
Week 7
Jelle, Joshwa and Laurens hope to finish the AMIGO simulation this week. If possible, they can apply it to the real AMIGO.
Week 8
Jelle, Joshwa and Laurens will, if possible, run tests with the real AMIGO using the AMIGO simulation, and prepare the final demonstration with either the AMIGO simulation or AMIGO itself.
Ilmar will work on and finish the slides for the final presentation.
Week 9
The wiki will receive its final update this week. According to the course schedule, the final presentation takes place this week. Depending on the exact date, this week will serve as a buffer to run some final tests with either AMIGO or the AMIGO simulation.
Milestones project
Robot building/modifying process
1. Get robot skeleton
We have to acquire a robot platform that we can modify so that it has the functions we want. Building an entire robot from scratch is not possible in eight weeks. If the owners allow it, we can use the robot AMIGO for this project.
2. Learn to work with its control system
Once we have the "template robot" we have to get used to its control and programming system. We must know how to edit and modify things in order to change its actions.
3. Get all the required materials for the project
A list has to be made that includes everything we need to order or get elsewhere to execute the project. Then everything has to be ordered and collected.
4. Write a script/code to make the AMIGO do what you want
We will have to program the robot or edit the existing script of the robot to make it do what we want. This includes four stages:
4a Make it perform certain actions by giving certain commands
We must learn to edit the code to make sure the robot executes certain actions by entering a command directly.
4b Make sure these commands are linked to Kinect
Once we have the robot reacting properly to our entered commands, we have to make sure these commands are linked to Kinect. We must ensure that the robot executes the action as a result of our own movements.
4c Include a talk function that gives the robot a telephone function
The robot must be equipped with a function that reproduces the words spoken by the person controlling it, much like a telephone.
4d Make sure the robot is fully able to hug at will (is presentable to the public)
After the robot is Kinect-driven, we must modify it so that it works fully according to plan. In this case it must be able to perform hugs exactly as we want: as a real shadow of ourselves.
Wiki
5. Have a complete wiki page of what was done
This milestone means that we simply have to possess a wiki page which describes our project well.
Literature
6. State of the art
Find useful articles about existing shadow robotics and hugging robots.
Technical Aspects
Challenges Design realization
Creating a robot that serves as a kind of avatar that moves exactly like you is an ambitious idea. It will not be possible to realize this over a very long distance, as that would require a very strong signal with a large range. Since we will focus on a prototype that can copy just one (or a few) basic action(s) within a close range, it should be feasible.
To make this work there are a few components that have to be taken into account:
- The robot must be controlled from a (close) distance
- The robot must be able to recognize some human movements or gestures
- The robot must be able to translate these percepts into action
The first of these three components can be realized by using an Arduino with a Bluetooth module linked to a device, most likely a portable device or laptop. For the second, there is Microsoft's Kinect: a motion-sensing device that uses a depth sensor to register movement. The Xbox uses Kinect for certain games, allowing people to play using gestures and their own movements rather than a controller. A Kinect SDK is available for Windows.
A few tutorials and examples can be found on the internet of people using the Kinect software on a computer to control a device with gestures and arm movements. There is an example of a man who can make a robot hand copy his hand gestures using this software. Another example features a man making a robot kart move with his own gestures (using Kinect and Bluetooth).
As building an entire robot body could prove difficult, we hope to borrow a robot body or prototype from the "right people" and equip it with the tools mentioned above. As some people have proven a Kinect-controlled robot to be possible, it should be possible to make a hugging robot.
Links to the tutorials and clips of the examples mentioned above can be found at the end of this wiki under "Links subsection challenges design realization".
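The core of this control chain is turning Kinect skeleton data into joint commands for the robot. Below is a small sketch in plain Python of that idea, computing the angle at a joint (e.g. the elbow) from three 3D joint positions and clamping it to a servo range. The joint coordinates and the command mapping are hypothetical; a real setup would read skeleton data from the Kinect SDK and send the angle to an Arduino over Bluetooth:

```python
import math

# Sketch of mapping Kinect-style joint positions to a servo command.
# Coordinates and the command format are hypothetical illustrations.

def angle_at_joint(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def to_servo_command(angle, lo=0, hi=180):
    """Clamp the joint angle to the servo's range and round it."""
    return int(round(min(max(angle, lo), hi)))

# Example: a right angle at the elbow
shoulder, elbow, wrist = (0, 1, 0), (0, 0, 0), (1, 0, 0)
print(to_servo_command(angle_at_joint(shoulder, elbow, wrist)))  # 90
```

The resulting integer could then be written to the Arduino over a Bluetooth serial link, where the Arduino drives the servo.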
Requirements AMIGO
Exact Usage Scenario
The aim of this section is to provide an exact description of a hug that the AMIGO robot needs to perform during the final demonstration.
Assumptions
- The AMIGO robot’s shoulders are lower than the hug-receiver’s shoulders.
- The hug-sender has a clear view of the AMIGO robot and the hug-receiver without any cameras.
- The hug-sender can see what the AMIGO’s main camera sees using a display.
Hug description
- The hug-sender and the hug-receiver have already established a communication session via telephone.
- The hug-receiver turns the AMIGO robot on.
- The hug-sender turns the KINECT system on.
- The hug-sender performs several test movements, taking several poses focused on the hug-sender's arms and checking whether the AMIGO robot's arms take on the same poses.
- The hug-sender spreads their arms to indicate they are ready to give the hug. The AMIGO robot also spreads its arms.
- Both the hug-sender and the hug-receiver are notified that a hug can now be given. This can be done, for example, by changing the AMIGO's color or by having it pronounce a certain message.
- The hug-receiver approaches the AMIGO robot.
- The hug-receiver begins to hug the AMIGO robot (a so-called ‘bear’-hug).
- The hug-receiver tells the hug-sender that they are ready to receive a hug from the AMIGO.
- The hug-sender makes a hugging movement by closing their arms.
- The AMIGO robot takes over after the hug-sender’s arms have reached a certain point. This is because the hug-sender cannot see the hug-receiver and the AMIGO’s arms clearly enough to give the hug-receiver a comfortable hug.
- By measuring the resistance through the AMIGO’s actuators, the AMIGO can estimate the amount of pressure being exerted on the hug-receiver. The AMIGO starts to slowly close its arms around the hug-receiver, starting with its upper arms and ending with the hands.
- (optional) By moving its arms closer together or farther apart, the hug-sender can make the AMIGO robot hug tighter or looser.
- The hug-sender or the hug-receiver indicates that they would like to end the hug.
- The AMIGO robot slowly spreads its arms outwards.
- The hug-receiver stops hugging the robot and walks away.
- The AMIGO robot and the KINECT system are turned off.
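The scenario above can be summarized as a simple state machine on the AMIGO side. The following plain-Python sketch uses our own hypothetical state and event labels, not actual AMIGO software states:

```python
# Sketch of the hug scenario above as a state machine.
# State and event names are our own labels, not AMIGO software states.

TRANSITIONS = {
    ("idle",      "sender_spreads_arms"):  "ready",
    ("ready",     "receiver_in_position"): "hugging",
    ("hugging",   "pressure_limit"):       "holding",    # AMIGO takes over closing its arms
    ("holding",   "end_requested"):        "releasing",  # either party may end the hug
    ("releasing", "arms_open"):            "idle",
}

def next_state(state, event):
    """Return the next state, or stay put on an unknown event."""
    return TRANSITIONS.get((state, event), state)

# Walk through one full hug
state = "idle"
for event in ["sender_spreads_arms", "receiver_in_position",
              "pressure_limit", "end_requested", "arms_open"]:
    state = next_state(state, event)
print(state)  # idle
```

Ignoring unknown events keeps the robot in a well-defined state even if, for example, the hug-receiver walks away mid-scenario.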
Must-have
- The AMIGO must be able to process the arm movements of the hugging person within a reasonable delay (ideally in real time, though that is probably unrealistic) and mimic them credibly and reasonably fluently for the person 'to be hugged'.
- The arms of the AMIGO must be able to embrace the person 'to be hugged'. More specifically: the AMIGO must be able to make an embracing movement with its arms.
Should-have
- There should be a force-stop function in the AMIGO so that the person ‘to be hugged’ can stop the hug anytime if he/she desires (for example because he/she feels uncomfortable).
- The AMIGO should have a feedback function indicating whether and how firmly its arms are touching a person (pressure sensors).
Could-have
- The AMIGO could get a message from the ‘hug-giver’, the person in another place wanting to give a hug.
- The AMIGO could inform the 'hug-receiver' that a hug has been 'sent' to him/her and ask if he/she wants to 'receive' the hug now.
- The AMIGO could receive a message from the ‘hug-giver’ that the hug has ended.
Software
Progress Report
The software running the AMIGO simulation (and hopefully the AMIGO robot itself) is an essential part of this project. After the first consultation with a member of the Tech United team (Janno …), it became clear that ROS, the software framework used to control the AMIGO robot, has quite a steep learning curve. Because of this, we decided to dedicate three team members (Laurens, Joshwa and Jelle) to the task of getting to know ROS over the course of the coming two weeks. The following is a report on the progress of the programming of the AMIGO simulation and the obstacles encountered.
Week 4
The ROS version used for controlling the AMIGO runs on the Ubuntu 14.04 operating system. In the weekend preceding this week, we had tried to install Ubuntu on a USB stick. We deemed this to be a safer alternative than partitioning the hard drive of our TU/e notebooks. At the start of this week, we set to installing ROS and going through several tutorials. These tutorials were aimed at creating an environment for writing ROS programs and getting familiar with the terminal and ROS commands. The latter was quite confusing to us, as we had never used Linux and therefore the terminal before.
Halfway through the week, we encountered a problem. It appeared that we had installed a try-out version of Ubuntu, which does not save any changes at all. Luckily, we found a tutorial that allows the user to reserve about 4 GB on the USB stick for personal storage. After installing Ubuntu properly this time, we ran through the installation process and the tutorials again.
Despite the setbacks this week, we obtained a better understanding of ROS. When Janno showed us how several things worked in ROS, we were mostly confused, seeing all those commands for the first time. We have now gained some insight into what he was getting at. However, some issues bothered us: we were still in the dark about how to actually program the AMIGO robot, and we had spent our time doing the same tasks.
Week 5
This week Joshwa and Jelle spent more time on completing the beginner ROS tutorials. These tutorials cover how two ROS programs communicate using a so-called ROS topic. They conclude by having you program two communicating ROS programs in Python using rospy, the ROS client library for Python. Laurens focused on getting the TU/e's own ROS programs and adaptations running, especially trying to start the ROS simulation. Whilst doing so, he encountered a big problem: the 4 GB reserved earlier for personal storage proved insufficient for the extra software. He then tried to resolve this by installing the Ubuntu software in a different way. After installation on a USB 2.0 stick, which is unsuitable for direct installation of an operating system, Laurens got the simulation running, though he has not figured out how it works yet.
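For readers without a ROS installation, the topic mechanism covered by these tutorials can be illustrated in plain Python. This is a sketch of the publish/subscribe idea only; real ROS code would use rospy.Publisher and rospy.Subscriber communicating through a roscore, not an in-process bus like this:

```python
# Plain-Python illustration of the ROS topic idea from the tutorials:
# publishers send messages to a named topic, subscribers register
# callbacks on it. This in-process bus stands in for roscore.

class TopicBus:
    def __init__(self):
        self._subscribers = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # deliver the message to every callback registered on this topic
        for callback in self._subscribers.get(topic, []):
            callback(message)

bus = TopicBus()
received = []

# "Listener" node: registers a callback on the /chatter topic
bus.subscribe("/chatter", lambda msg: received.append(msg))

# "Talker" node: publishes messages to the same topic
bus.publish("/chatter", "hello AMIGO")
bus.publish("/chatter", "hug incoming")

print(received)  # ['hello AMIGO', 'hug incoming']
```

The decoupling shown here is the point of the tutorials: the talker and listener only share the topic name, never a direct reference to each other.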
Research
State of the art:
- Telenoid: The Telenoid robot is a small white human-like doll used as a communication robot. The idea is that random people and caregivers can talk to the elderly people from a remote location using the internet, and brighten the day of the elderly people by giving them some company. A person can control the face and head of the robot using a webcam, in order to give people the idea of a human-like presence. [1 Telenoid]
- Paro: Paro is a simple seal robot, able to react to its surroundings using minimal movement and showing different emotional states. It is used in nursing homes to improve the mood of elderly people and thereby reduce the workload of the nursing staff. Paro proved not only that a robot is able to improve the mood of elderly people, but also that a robot is able to encourage more interaction and more conversations.
- Telepresence Robot for interpersonal Communication (TRIC): TRIC is going to be used in an elderly nursing home. The goal of TRIC is to allow elderly people to maintain a higher level of communication with loved ones and caregivers than via traditional methods. The robot is small and lightweight so it is easy to use for the elderly people, it uses a webcam and a LCD screen.
Interesting findings from the literature research:
- Ways to create a human-like presence:
- Using a soft skin made of materials such as silicone and soft vinyl.
- Using humans to talk (teleoperation) instead of using a chat program, so the conversations are real and feel real.
- Unconscious motions such as breathing and blinking are generated automatically to give a sense that the android is alive.
- Minimal human design, so the user's imagination can make it any kind of person. (Male/female, young/old, known person/unknown person)
- From the participant’s view, the basic requirement for interpersonal communication using telepresence is that the participants must realize whom the telepresence robot represents. The two main options are using an LCD screen or to create mechanical facial expressions. (Mechanical facial expressions increase humanoid characteristics and therefore encourages more communication.)
- User requirements for the elderly:
- Affordable
- Easy to use:
- Lightweight
- Easy / Simple Interface
- Automatically Recharge
- Loud speakers (capable of 85 dB), because elderly people prefer louder sounds for hearing speech.
- Maximum speed of 1.2 m/s (Average walking speed)
- First of all, it is important to know that whenever someone has a negative attitude towards robots, the robot will feel less human-like and the experienced social distance between humans and embodied agents will increase. Secondly, a proactive robot in this study was seen as less machine-like and more dependable when interaction was complemented with physical contact between the human and the agent. Whenever people have a positive attitude towards robots and the robot is proactive, the experienced social distance between humans and agents will decrease.
- Both the robots Paro and Telenoid proved that elderly people are able to accept robots. (9/10 of the people who used Telenoid accepted it, and thought the robot was cute)
- Both the robots Paro and Telenoid proved that robots are able to improve the mood of elderly people, by encouraging them to have more conversations.
Articles
Telenoid
Telenoid 1 https://www.ituaj.jp/wp-content/uploads/2015/10/nb27-4_web_05_ROBOTS_usingandroids.pdf Telenoid 2 https://www.researchgate.net/profile/Shuichi_Nishio/publication/235821988_Teleoperated_Android_as_an_Embodied_Communication_Medium_A_Case_Study_with_Demented_Elderlies_in_a_Care_Facility/links/0f317534b8cdd1fc17000000.pdf
This article is about the Telenoid robot. Here they mention that an ageing society with increasing loneliness is becoming a problem, “These days, the social ties of family, neighbors and work colleagues do not bond people together as closely as they used to and as a result, the elderly are becoming increasingly isolated from the rest of society. When elderly people become more isolated, they can lose their sense of purpose, become more susceptible to crime, and may even end up dying alone. Preventing isolation is essential if we are to create a safe and secure environment in the super-ageing society that Japan is having to confront ahead of any other country.”
In order to confront this isolation problem they developed the robot Telenoid, a communication robot. The idea is that random people and caregivers can talk to the elderly people from a remote location using the internet, and brighten the day of the elderly people by giving them some company. A person can control the face and head of the robot using a webcam, in order to give people the idea of a human-like presence.
These articles were useful because they give some interesting USE aspects we could implement in our hugging robot, such as things that make a robot more human-like. They also showed us that elderly people indeed react positively to human-robot interaction. This might not sound interesting, but one of my biggest fears was that the robot would not be accepted by elderly people.
Paro
Paro is a simple seal robot, able to react to its surroundings using minimal movement and showing different emotional states. It is used in nursing homes to improve the mood of elderly people and thereby reduce the workload of the nursing staff. Paro proved not only that a robot is able to improve the mood of elderly people, but also that a robot is able to encourage more interaction and more conversations.
This article was not really useful; it just confirmed that robots can have a positive effect on people's mood and that they encourage conversations and interaction between elderly people. I had expected this article to be useful because every article or report about social robots cites it and mentions the robot "Paro".
Telepresence Robot for interpersonal Communication
Used Literature/Further reading
Links subsection "challenges design realization"
Advances in Telerobotics, AU: Manuel Ferre, Martin Buss, Rafael Aracil, Claudio Melchiorri, Carlos Balaguer ISBN: 978-3-540-71363-0 (Print) 978-3-540-71364-7 (Online) http://link.springer.com.dianus.libr.tue.nl/book/10.1007%2F978-3-540-71364-7
Telerobotics, AU: T.B. Sheridan † http://www.sciencedirect.com.dianus.libr.tue.nl/science/article/pii/0005109889900939
An Intelligent Simulator for Telerobotics Training, AU: Khaled Belghith et al. http://ieeexplore.ieee.org.dianus.libr.tue.nl/xpl/abstractAuthors.jsp?arnumber=5744073&tag=1
Telerobotic Pointing Gestures Shape Human Spatial Cognition, AU: John-John Cabibihan, Wing-Chee So, Sujin Saj, Zhengchen Zhang http://link.springer.com.dianus.libr.tue.nl/article/10.1007%2Fs12369-012-0148-9
Haptics in telerobotics: Current and future research and applications, AU: Carsten Preusche , Gerd Hirzinger http://link.springer.com.dianus.libr.tue.nl/article/10.1007%2Fs00371-007-0101-3
Remaining used literature
https://www.youtube.com/watch?v=KnwN1cGE5Ug
https://www.youtube.com/watch?v=AZPBhhjiUfQ
http://kelvinelectronicprojects.blogspot.nl/2013/08/kinect-driven-arduino-powered-hand.html
http://www.intorobotics.com/7-tutorials-start-working-kinect-arduino/
Anand B., Harishankar S., Hariskrishna T.V., Vignesh U., Sivraj P., Digital Human Action Copying Robot, 2013
http://singularityhub.com/2010/12/20/robot-hand-copies-your-movements-mimics-your-gestures-video/
http://www.telegraph.co.uk/news/1559760/Dancing-robot-copies-human-moves.html
http://www.emeraldinsight.com/doi/abs/10.1108/01439910310457715
http://www.shadowrobot.com/downloads/dextrous_hand_final.pdf
https://www.shadowrobot.com/products/air-muscles/
MORE TO FOLLOW