PRE2015 4 Groep1
Group members
- Laurens Van Der Leden - 0908982
- Thijs Van Der Linden - 0782979
- Jelle Wemmenhove - 0910403
- Joshwa Michels - 0888603
- Ilmar van Iwaarden
Short Project Description
The aim of this project is to create an anthropomorphic robot that can be used to hug others when separated by a large distance. This robot will copy or shadow a hugging movement performed by a human, using motion capture sensors. In order to realize this goal, the robot AMIGO will (if allowed and possible) be used to perform the hugs, while the commands are generated using Kinect sensors that capture the movements made by a human.
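To make the idea concrete, the sketch below shows the general shape of such a shadowing loop in Python. It is only a minimal illustration under our own assumptions: get_skeleton(), to_joint_angles() and send_to_robot() are hypothetical stand-ins, not existing Kinect or AMIGO interfaces.

```python
# Minimal sketch of the intended pipeline: capture the sender's pose with
# Kinect, map it to robot joint angles, and command the robot. All three
# helper functions are hypothetical stand-ins for the real interfaces.
import time

def get_skeleton():
    """Stand-in for a Kinect skeleton-tracking call (positions in metres)."""
    return {"shoulder": (0.0, 0.4, 2.0), "elbow": (0.0, 0.1, 2.0)}

def to_joint_angles(skeleton):
    """Stand-in mapping from captured joint positions to robot joint angles."""
    return [0.0, 0.5]

def send_to_robot(angles):
    """Stand-in for the robot command interface."""
    print("command:", angles)

for _ in range(300):            # run the loop for ~10 s in this sketch
    send_to_robot(to_joint_angles(get_skeleton()))
    time.sleep(1 / 30)          # Kinect delivers roughly 30 frames per second
```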
Planning & Executed tasks
Week 1
Idea in one sentence: A robot that can mimic arm movements from a distance, with our main emphasis on a hugging robot (i.e. a robot that can give hugs remotely).
Type of robot: AMIGO hugging robot
Sub-questions/challenges: 1. How is the force that the AMIGO/robot applies to the hugged person fed back to the hugging person (pressure sensors, real time?)?
2. How do you register the movements of the hugging person (Kinect)?
Action items:
1. Email about the precise contents of the presentation next Monday (25-4-2016): present the specific idea; include USE aspects already? sketch the intended end product; include literature already?
2. Look up literature/state of the art (Laurens and Thijs)
3. Scenario/situation sketch (Jelle)
4. Write up the minutes/brainstorm session (Joshwa)
5. Set up the wiki page (Joshwa)
Week 2
We continued to discuss our idea of a robot capable of shadowing a hugging motion using Kinect. This week we contacted someone from TechUnited and asked if it would be possible to use their robot AMIGO for our project.
Week 3
During our presentation on Monday 2-5-2016 the teachers indicated that our plan still lacked clarity and that our USE aspects were missing. Afterwards we discussed this with the group for a few hours and then divided the tasks.
After we sent a second email to the people from TechUnited on Monday, we were invited to the robotics lab in the building "Gemini-Noord" on Tuesday evening to discuss our idea. While the initial plan was to work with AMIGO from the start if possible, our plan changed a little. The people from TechUnited strongly advised us to create a virtual simulation first, something that is done there a lot to test scenarios before applying them to AMIGO itself, before considering the real robot. If we get the simulation to work properly, we can consider trying the real AMIGO. The people from the lab told us which software and systems should be used to make such a simulation.
Jelle, Joshwa and Laurens have discussed the matter, since they will be working on the simulation. The three have installed some software and made a first start on it.
Thijs has processed the USE aspects in order to finally clarify what can be done to take USE into account for our robot. (if I have forgotten anything, please add what is missing)
Ilmar has worked on.... (I do not remember exactly what; please add)
Week 4
Jelle, Joshwa and Laurens will work through tutorials to get to know the programming software used to make an AMIGO-simulation.
Week 5
Jelle, Joshwa and Laurens will finish the tutorials this week and hope to lay the groundwork for the AMIGO-simulation.
Week 6
Jelle, Joshwa and Laurens will work on the AMIGO-simulation this week.
Week 7
Jelle, Joshwa and Laurens hope to finish the AMIGO-simulation this week. If possible, they will apply it to the real AMIGO.
Week 8
Jelle, Joshwa and Laurens will, if possible, run tests with the real AMIGO using the AMIGO-simulation, and prepare the final demonstration with either the AMIGO-simulation or AMIGO itself.
Ilmar will work on and finish the slides for the final presentation.
Week 9
The Wiki will receive its final update this week. According to the course schedule, the final presentation takes place this week. Depending on the exact date, this week will also serve as a buffer to run some final tests with either AMIGO or the AMIGO-simulation.
Project milestones
Concerning the robot building/modifying process
1. Get robot skeleton
We have to acquire a robot mainframe that we can modify in order to build a robot with the functions we want it to have. Building an entire robot from scratch is not possible in eight weeks. If the owners allow us, we can use the robot AMIGO for this project.
2. Learn to work with its control system
Once we have the “template robot” we have to get used to its control system and programming system. We must know how to edit and modify things in order to change its actions.
3. Get all the required materials for the project
A list has to be made that includes everything we need to order or get elsewhere to execute the project. Then everything has to be ordered and collected.
4. Write a script/code to make the AMIGO do what you want
We will have to program the robot or edit the existing script of the robot to make it do what we want. This includes four stages:
4a Make it perform certain actions by giving certain commands
We must learn to edit the code to make sure the robot executes certain actions when a command is entered directly.
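As an illustration of what such a directly entered command could look like, the sketch below publishes a single arm pose on a ROS topic, assuming a ROS-based setup like the one used for the AMIGO simulation. The topic name and joint names are our own assumptions, not AMIGO's documented interface.

```python
# Hedged sketch of milestone 4a: command a (simulated) robot arm to a pose.
# The topic and joint names below are assumptions, not AMIGO's real interface.
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

rospy.init_node("hug_command_sender")
pub = rospy.Publisher("/amigo/left_arm/controller/command",   # assumed topic
                      JointTrajectory, queue_size=1)
rospy.sleep(1.0)                       # give the publisher time to connect

msg = JointTrajectory()
msg.joint_names = ["shoulder_pitch", "elbow_flex"]   # assumed joint names
point = JointTrajectoryPoint()
point.positions = [0.8, 1.2]                 # target angles in radians
point.time_from_start = rospy.Duration(2.0)  # reach the pose within 2 s
msg.points.append(point)
pub.publish(msg)
```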
4b Make sure these commands are linked to Kinect
Once we have the robot reacting properly to our entered commands, we have to make sure these commands are linked to Kinect. We must ensure that the robot executes the action as a result of our own movements.
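The core of this Kinect link is turning tracked joint positions into robot joint angles. The sketch below computes an elbow angle from three Kinect joints with plain vector math; the sample coordinates are invented values, not real Kinect output.

```python
# Sketch of milestone 4b: derive a robot joint angle from Kinect joint
# positions. The sample coordinates are invented; only numpy is used.
import numpy as np

def angle_between(u, v):
    """Angle in radians between two 3D vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Elbow flexion follows from the shoulder, elbow and hand positions:
shoulder = np.array([0.00, 0.40, 2.0])   # metres, Kinect camera frame
elbow    = np.array([0.00, 0.10, 2.0])
hand     = np.array([0.25, 0.10, 2.0])
flexion = angle_between(shoulder - elbow, hand - elbow)
print(np.degrees(flexion))               # 90.0 for these sample points
```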
4c Include a talk function that gives the robot a telephone function
The robot must be equipped with a function that reproduces the words spoken by the person controlling it, much like a telephone.
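A first version of this telephone function could be prototyped as a microphone-to-speaker loop, as sketched below with PyAudio. A real version would stream the audio over a network link between the sender and the robot; that transport layer is omitted here.

```python
# Sketch of milestone 4c: loop microphone input straight to the speakers.
# In the real setup the audio would travel over a network to the robot.
import pyaudio

CHUNK, RATE = 1024, 16000
pa = pyaudio.PyAudio()
mic = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
              input=True, frames_per_buffer=CHUNK)
speaker = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE, output=True)

try:
    for _ in range(int(RATE / CHUNK * 10)):   # run for roughly 10 seconds
        speaker.write(mic.read(CHUNK))        # play back what the mic hears
finally:
    mic.close()
    speaker.close()
    pa.terminate()
```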
4d Make sure the robot is fully able to hug at will (is presentable to the public)
After the robot is Kinect-driven we must modify it so that it works fully according to plan. In this case it must be able to perform hugs exactly as we want it to, as a real shadow of ourselves.
Concerning the Wiki
5. Have a complete wiki page of what was done
This milestone means that we simply have to possess a wiki page which describes our project well.
Concerning literature
6. State of the art
Find useful articles about existing shadow robotics and hugging robots.
Scenario: Problem Sketch and solution
One of the consequences of globalisation is that work becomes more specialized. The job that fits you may no longer be found in your local area, in your city or even in your country. Work will separate people from their homes and their relatives. Providing for your family might mean not being able to put your children to bed. People will become ever more lonely due to a lack of physical interaction with loved ones.
We hope that our robot can satisfy people's longing for physical contact with loved ones by enabling its users to hug others over a long distance. The receiver of the hug will have an anthropomorphic robot in their home which acts as an avatar for the user sending the hug. The sender's hugging motion is captured via cameras, and that data is used to make the robot emulate the motion. A microphone and speakers are used to let the users communicate spoken words via the robot. Virtual reality can be used to give the sender a more immersive hugging experience.
State of the Art / Literature research
STILL TO BE ADDED
Stakeholders (USE)
STILL TO BE MADE
Requirements AMIGO
Must-have
- The AMIGO must be able to process the arm movements of the hugging person within a reasonable time (ideally in real time, though that is probably unrealistic) and mimic them credibly and reasonably fluently for the person ‘to be hugged’.
- The arms of the AMIGO must be able to embrace the person ‘to be hugged’. More specifically, the AMIGO must be able to make an embracing movement with its arms.
Should-have
- There should be a force-stop function in the AMIGO so that the person ‘to be hugged’ can stop the hug at any time if he/she desires (for example because he/she feels uncomfortable); see the sketch at the end of these requirements.
- The AMIGO should have a feedback function indicating whether and how firmly its arms are touching a person (pressure sensors).
Could-have
- The AMIGO could get a message from the ‘hug-giver’, the person in another place wanting to give a hug.
- The AMIGO could inform the ‘hug-receiver’ that a hug has been ‘sent’ to him/her and ask if he/she wants to ‘receive’ the hug now.
- The AMIGO could receive a message from the ‘hug-giver’ that the hug has ended.
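The force-stop and pressure-feedback requirements above can be combined into one safety loop, sketched below. The sensor and actuator functions are hypothetical stand-ins (AMIGO's real interfaces will differ), and the pressure threshold is an assumed value.

```python
# Sketch of the force-stop requirement: release the hug as soon as the
# receiver asks to stop or the measured arm pressure exceeds a threshold.
# All sensor/actuator functions are hypothetical stand-ins.
import time

PRESSURE_LIMIT = 5.0       # newtons; an assumed comfort threshold

def stop_requested():
    return False           # stand-in for a physical stop button

def arm_pressure():
    return 1.2             # stand-in for a pressure sensor reading (N)

def open_arms():
    print("releasing hug")

def hug_loop(max_duration=10.0):
    start = time.time()
    while time.time() - start < max_duration:
        if stop_requested() or arm_pressure() > PRESSURE_LIMIT:
            open_arms()    # force-stop: release immediately
            return
        time.sleep(0.05)   # poll the sensors at ~20 Hz
    open_arms()            # always release when the hug times out

hug_loop()
```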
Possible alternative usages of this technology (looking ahead)
The technology of telerobotics will be useful for more applications than social ones. Telerobotics can be of assistance especially in circumstances that are dangerous for the human body, for example when working under the high pressure of the seabed, when handling radioactive material, or in the extreme conditions of outer space.
Used Literature/Further reading
https://www.youtube.com/watch?v=KnwN1cGE5Ug
https://www.youtube.com/watch?v=AZPBhhjiUfQ
http://kelvinelectronicprojects.blogspot.nl/2013/08/kinect-driven-arduino-powered-hand.html
http://www.intorobotics.com/7-tutorials-start-working-kinect-arduino/
Anand B., Harishankar S., Hariskrishna T.V., Vignesh U., Sivraj P., “Digital human action copying robot”, 2013.
http://singularityhub.com/2010/12/20/robot-hand-copies-your-movements-mimics-your-gestures-video/
http://www.telegraph.co.uk/news/1559760/Dancing-robot-copies-human-moves.html
http://www.emeraldinsight.com/doi/abs/10.1108/01439910310457715
http://www.shadowrobot.com/downloads/dextrous_hand_final.pdf
https://www.shadowrobot.com/products/air-muscles/
MORE TO FOLLOW