PRE2020 3 Group9
=Deliverable 1: Local navigation model=


This chapter describes how the path-planning model was made and the steps that were taken to achieve local navigation. Unfortunately the local navigation goal was not met, but the path-planning model was created successfully.
 
'''Why should Rubby move?'''
 
Rubby has to move so that it always faces the child. When the parents are working and want to check that their child is safe, they can see on their monitor what the robot is filming, so the robot should always point its camera at the child. Movement is also needed because games that include physical training will be added in the future, for example playing with a ball (by pushing against a ball the robot makes it roll and the child can play with it). It will also be important once facial expression recognition is included: when the robot faces the child, it can interact with the child when certain emotions are detected. Local navigation is needed to avoid obstacles while achieving these goals.


'''The difference between navigation, localization and path planning'''
'''Roadmap path planner'''


When the map has been inflated by the dimensions of the robot, the mobileRobotPRM object is used as a roadmap path planner. The object uses the map to generate a roadmap, which is a network graph of possible paths in the map based on free and occupied spaces. To find an obstacle-free path from a start to an end location, the number of nodes and the connection distance are tuned to the complexity of the map. After the map is defined, the mobileRobotPRM path planner generates the specified number of nodes throughout the free spaces in the map. A connection between nodes is made when a line between two nodes contains no obstacles and is within the specified connection distance. This can be seen in the figure PRM Algorithm.
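To make the steps above concrete, a minimal Matlab sketch of this workflow is given below (Robotics System Toolbox). It follows the standard Mathworks PRM example; the map, robot radius, number of nodes, connection distance and start/end locations are example values, not the ones used in our model.

<syntaxhighlight lang="matlab">
% Minimal sketch of the roadmap path planner workflow (example values only).
load exampleMaps.mat                       % ships with the toolbox, contains simpleMap
map = binaryOccupancyMap(simpleMap);       % occupancy map of the environment

robotRadius = 0.2;                         % assumed robot dimension [m]
mapInflated = copy(map);
inflate(mapInflated, robotRadius);         % inflate obstacles by the robot dimension

prm = mobileRobotPRM;                      % probabilistic roadmap planner
prm.Map = mapInflated;
prm.NumNodes = 50;                         % number of nodes placed in free space
prm.ConnectionDistance = 5;                % maximum distance to connect two nodes [m]

startLocation = [2 1];                     % example start and end locations [x y]
endLocation   = [12 10];
pathPoints = findpath(prm, startLocation, endLocation);

show(prm)                                  % plot the roadmap and the found path
</syntaxhighlight>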


'''Controller'''


== Progress on Local Navigation ==
[[File:Simscape_only_movement.PNG|400px|thumb|right|Overview of the Simscape model]]
A start was made on developing local navigation for the robot. To start off, a model of Rubby was made in the Unified Robot Description Format (URDF). This model was kinematically similar to the actual robot (it moved in exactly the same way), but the shape was simplified to make simulation easier. The goal was to load this model into a simulation environment and test the obstacle avoidance capabilities of the robot. A LiDAR sensor was to be used to identify obstacles, and a program to avoid them was to be written in Matlab. Two different simulation environments were tested: the first was the Simulink simulation environment and the second was Gazebo in co-simulation with Matlab Simulink.
 
'''Simulink Simulation'''


The first step was achieved by analytically determining the motion of the robot for given wheel positions. The exact derivation of the motion of the robot will not be elaborated on here, but the conclusion was that for a given velocity of both wheels, the robot would move in the desired way. An overview of the Simscape model that defined the motion of the robot can be seen in the figure.
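This analysis boils down to the standard forward kinematics of a differential-drive robot. For reference, a short Matlab sketch of that relation is shown below; the wheel radius, wheel separation and wheel speeds are placeholder values and not the actual dimensions of Rubby.

<syntaxhighlight lang="matlab">
% Forward kinematics of a differential-drive robot (illustrative values only).
r = 0.05;                 % wheel radius [m] (placeholder)
L = 0.30;                 % distance between the two wheels [m] (placeholder)

wLeft  = 2.0;             % angular velocity of the left wheel  [rad/s]
wRight = 3.0;             % angular velocity of the right wheel [rad/s]

v     = r*(wRight + wLeft)/2;   % forward velocity of the robot body [m/s]
omega = r*(wRight - wLeft)/L;   % yaw rate of the robot body [rad/s]

% Equal wheel speeds give a straight line (omega = 0);
% opposite wheel speeds make the robot rotate on the spot (v = 0).
</syntaxhighlight>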


The second step was achieved by using a test environment used in a webinar by Mathworks (MathWorks Robotics and Autonomous Systems Team, 2021). A video of the robot moving through this environment can be seen at https://youtu.be/NxY3lH0EniM
[[File:rubbyingazebo.jpg|320px|thumb|right|Rubby in the Gazebo environment]]


Adding sensors in the Simulink environment has to be done entirely manually, meaning that for a LiDAR sensor with 180 rays, all 180 rays have to be defined and implemented by hand. This is of course not a completely straightforward task, and after some further consideration it was decided to use the Gazebo simulation environment instead. This environment automatically generates all of the sensor data, which is a far more efficient way of doing things. Gazebo can also perform co-simulations with Matlab Simulink, which allowed us to use our pre-existing knowledge of this program to develop a control algorithm.


'''Gazebo Simulation'''
[[File:Lidar_data.jpg|320px|thumb|right|LiDAR data from the Gazebo simulation]]
Since Gazebo has limited support on Windows, it had to be used on a Linux system (Ubuntu in this case). Implementing the robot in Gazebo had some other challenges as well. To launch a world in Gazebo with the robot in it, ROS (Robot Operating System) packages had to be used. A package was made for both the robot and the world that was designed within Gazebo. The robot and a LiDAR sensor could then be seen in the world, as shown in the figure on the right. To test the connection to Matlab, the sensor data from this LiDAR was plotted in Matlab; an example of the resulting plot can also be seen on the right. When this LiDAR data corresponded to the data that was expected, an attempt was made at adding such a LiDAR sensor to the robot itself. Since the world file and the robot file were in two different file formats, it was not just a case of copy-pasting the code used to make the sensor from the world file and adding it to the robot. The support for sensors in the .urdf file format is limited at best, and significant knowledge of ROS and Gazebo is required to properly implement this. This knowledge was not present within the group, however, and there was not enough time to acquire it within the time frame of this project.
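To give an idea of how this connection works, a minimal Matlab sketch for receiving and plotting one LiDAR scan from the Gazebo simulation over ROS is shown below (ROS Toolbox). The master address and the /scan topic name are assumptions that depend on how the ROS packages and the Gazebo world are set up.

<syntaxhighlight lang="matlab">
% Sketch: receive one LiDAR scan published by Gazebo and plot it in Matlab.
% The ROS master URI and topic name below are assumed values.
rosinit('http://192.168.1.10:11311')   % connect to the ROS master on the Linux machine

scanSub = rossubscriber('/scan');      % subscribe to the LiDAR topic (assumed name)
scanMsg = receive(scanSub, 10);        % wait up to 10 seconds for a scan message

xy = readCartesian(scanMsg);           % convert ranges and angles to x-y points
plot(xy(:,1), xy(:,2), '.')
axis equal, xlabel('x [m]'), ylabel('y [m]')
title('LiDAR data received from Gazebo')

rosshutdown                            % disconnect from the ROS network
</syntaxhighlight>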


'''Continuation'''




In [[#Appendix F: Designs in Blender|Appendix F]] more figures of the preliminary designs in Blender can be found.


==Final design==
It is important that all stakeholders are involved in the development plan from beginning to end. The primary, secondary and tertiary user needs are therefore key elements in the future plan.
The following stakeholders are involved in the development of Rubby:
*Children: These are the primary users, so their needs form the most important factor in the future plan. They need to actually benefit from Rubby: the robot entertains them in an interactive way and is not only a toy but also a 'buddy'. The goal is that the child develops its educational and general skills.
*Parents: These are the secondary users. They benefit from Rubby because they can work from home more easily without being disturbed, and because their children are happier and better supported in their learning development. Therefore, their involvement in the future plan is of great importance.
*Teachers: They can be seen as the tertiary users. When children are ill, they are still able to learn at home. The development of basic skills can be supported by the robot, so more time can be spent on more profound subjects. Hence, teachers' experiences and needs should also be taken into account.
*Schools: These are important stakeholders, since there is a shortage of teachers and Rubby can therefore be of added value in education.
*Ministry of education: The same holds for this stakeholder. Furthermore, the implementation of Rubby can help to give each child equal chances to learn. Hence it might be interesting for the ministry to invest in the robot.
*Teaching method companies: These stakeholders benefit from the robot, as their educational programs can then also be used in homes and not only in school settings. The experience of teaching method companies is therefore essential to the future plan.




The robot needs to be made commercially available and also attractive. In order to achieve this, the implementation of Rubby starts at a small scale. It will be used in a small group; in this stage the users will be questioned about their experience and about required improvements. In this way, an evaluation of the preliminary state of the robot is made. These experiences, and the improvements made to the robot based on the recommendations, will be used to convince the stakeholders (parents and schools, but also teaching method companies and the education ministry) to invest in the robot.


Concerning the financial aspect, since the robot may not become too expensive for parents (as becomes clear from the results of the survey) nor for schools that want to invest in it, a loan system is introduced by which schools can invest in Rubby. This can be paid from the corona education subsidy that was introduced by the Dutch government to help pupils catch up. In the long term, this will result in an increased general willingness to invest in the robot.


For parents who work at home for a company (due to the corona restrictions), their investment in the robot will be supported by that company. These companies will partially pay for the robot. After all, this will also benefit them, because parents will then be able to work better from home and thus be more productive.


Ultimately, when the robot is used on a larger scale, production costs will also drop, and hence it will become increasingly interesting for more users.


==Appendix E: Sketches external design==
With respect to the earlier determined requirements, several sketches and outlines are made to visualize possible design solutions. In this appendix, all these sketches of the robot are shown:
[[File:Rubby1.png|300px|thumb|left|Sketch 1, front view]]                    [[File:Rubby2.png|110px|thumb|center|Sketch 1, side view]]
[[File:Rubby3.png|300px|thumb|left|Sketch 2, front view]]                    [[File:Rubby4.png|200px|thumb|center|Sketch 2, side view]]
[[File:Rubby5.png|300px|thumb|left|Sketch 3, front view]]                    [[File:Rubby6.png|220px|thumb|center|Sketch 3, side view]]
 
[[File:Rubby7.jpeg|500px|thumb|left|Sketch of arm, side and front view]]   
[[File:Rubby8.jpeg|500px|thumb|center|Sketch 4, front and side view]]     
[[File:Rubby10.jpeg|500px|thumb|left|Sketch 5, front and side view]]       
[[File:Rubby9.jpeg|300px|thumb|center|Sketch 6, front view]]     
             
<br/>
==Appendix F: Designs in Blender==
The figures in this appendix show several views of the preliminary designs, created in Blender.
[[File:MicrosoftTeams-image.png|380px|left|thumb|Preliminary design 1, front view]]          [[File:MicrosoftTeams-image_(5).png|390px|right|thumb|Preliminary design 2, front view]]
[[File:MicrosoftTeams-image_(1).png|380px|left|thumb|Preliminary design 1, side view]]        [[File:MicrosoftTeams-image_(3).png|390px|right|thumb|Preliminary design 2, side view]]
[[File:MicrosoftTeams-image_(2).png|380px|left|thumb|Preliminary design 1, iso view]]        [[File:MicrosoftTeams-image_(4).png|390px|right|thumb|Preliminary design 2, iso view]]
<br/>








==Appendix G: Visualization of some design aspects==
In this section, a few design aspects of the final design are shown.
[[File:arm1.png|400px|thumb|left|Design aspect: Rotation of inner arm along red axis]] 
[[File:arm2.png|355px|thumb|center|Design aspect: Rotation of outer arm along red axis]]
[[File:cover1.png|335px|thumb|left|Design aspect: Covers around tracks for safety]]
[[File:cover2.png|400px|thumb|center|Design aspect: Improved and sleek covers around tracks]]




==Appendix H: Progress on simulating Rubby in Simscape==
 
After looking some more into the robotics toolbox in Matlab, I found out that it is quite hard to model a mobile robot with this toolbox. This is due to the fact that this toolbox generates a RigidBodyTree model of a robot. This model basically looks like a tree: it starts with a fixed (!) base, and separate limbs can move with respect to that base. This works great for robot arms that stay in one place, but I was not able to get it to work with our robot.


I therefore went back to Simscape in Matlab. Simscape can be used to model moving robots, so I started off with that. I checked the kinds of files that can be imported by Simscape and ended up making a .urdf file of a rough version of Rubby. It looks nothing like our robot, but it has similar kinematics, and that is what matters for now. The .urdf file that was generated looks as follows:
[[File:URDF_snip.PNG|500px]]    [[File:rubby_urdf_output.PNG|500px|right]]
 


This little snippet of code shows the two most important things in a .urdf file: links and joints. Links specify the different solid bodies in the model and joints connect them. The actual file is about 120 lines long, so there is no use in showing all of it here, but when importing this file into Matlab, the result can be seen in the figure above.
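For readers who cannot see the figure, a purely illustrative example of what such links and joints look like is shown below; the names and dimensions are made up and are not taken from the actual Rubby file.

<syntaxhighlight lang="xml">
<!-- Illustrative URDF fragment: one body and one wheel connected by a joint. -->
<robot name="rubby_example">
  <link name="base_link">
    <visual>
      <geometry><box size="0.3 0.3 0.5"/></geometry>
    </visual>
  </link>
  <link name="left_wheel">
    <visual>
      <geometry><cylinder radius="0.05" length="0.02"/></geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <origin xyz="0 0.15 -0.25" rpy="1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
</syntaxhighlight>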
After this, it was time to import the robot into Simscape and make it move. The first part was making the generated .urdf model move the way we wanted it to. The Simscape model for this looks as follows:


[[File:Simscape_only_movement.PNG|400px]]


Here the two constants are the inputs for the angular velocity of the right and left wheel. This model resulted in the robot driving around happily. A robot moving through an empty world does not have any use for us however, so the next step is to implement it in an environment and try to incorporate path planning and obstacle avoidance.
To implement the robot in a simulation environment, we have to make another file format, in this case a '''VRML''' file. This file can easily be exported from Blender, so we can use the real robot model made by Paul and Jos. This can then be used in combination with a living room setting that was taken from the Mathworks website. Now our robot looks as follows in the living room environment:


[[File:Rubby_in_room.PNG|400px]]


This is still with an ugly Rubby model that I made myself, but this can be replaced with the actual model pretty easily. Now we have to make the robot actually move in this living room environment. This was done by using a VR-sink block in Simulink and providing it with inputs. The result of implementing this looks similar to the picture of the living room, only now Rubby is able to move.




The next step is to incorporate an obstacle avoidance algorithm. This is something I tried but could not get to work yet. For this, virtual sensors have to be implemented, either in the Simscape model or in the .urdf file, and this has to be coupled to the environment.
 


==Appendix I: Picture user needs==
==Appendix I: Picture user needs==
|-
| Emma Allemekinders
| 12 hours
| Wiki on path planning, make the powerpoint for the presentation with script.
|-
| Jos Stolk
| 12 hours
| Specifying and motivating of determined RPCs, creating a future plan, final work on wiki.
|-
| Emma Höngens
| 11 hours
| Improved the complete wiki and created a future plan
|-
| Hidde Huitema
| 11 hours
| Write section on local navigation. Write guidelines for parents.
|}


==Appendix Z: Planning==
{|border=1 style="border-collapse: collapse;"
|-
! '''Week''' !! Main tasks
|-
| Week 1
| Form groups, choose subject, problem statement, start research
|-
| Week 2
| Research, come up with solutions for problem statement, choose direction to go to
|-
| Week 3
| Elaborate research on user group, form personas, create a survey for parents, start research into possible technical options
|-
| Week 4
| Sending out survey, start information gathering for path planning and navigation, do research into technical options
|-
| Week 5
| Data analysis of survey, start with path planning and navigation, start with preliminary design of mechanical design, complete user needs, start with final RPC list, describe robot, write scenarios
|-
| Week 6
| Complete RPC list, path planning, navigation, finalize mechanical design and bill of materials, make a software plan
|-
| Week 7
| Complete path planning, complete navigation, make a future plan, describe guidelines for parents, finalize wiki
|-
| Week 8
| Presentation, handing in wiki
|}
 
----




Groupmembers

{|border=1 style="border-collapse: collapse;"
|-
! Name !! Student number !! Department !! Responsibility
|-
| Emma Allemekinders || 1317873 || Mechanical engineering || Model of path planning and software plan
|-
| Emma Höngens || 1375946 || Industrial Engineering & Innovation Sciences || Future plan, software plan, and wiki
|-
| Paul Hermens || 1319043 || Mechanical engineering || List of materials and mechanical design
|-
| Hidde Huitema || 1373005 || Mechanical engineering || Model of path planning and guidelines for parents
|-
| Jos Stolk || 1443666 || Mechanical engineering || Mechanical design and future plan
|}

Introduction

Problem statement

Due to the COVID-19 pandemic, a lot of people have to work from home instead of at the office. This means that their working environment has changed and in many cases is not optimal. However, the working environment is not the largest problem of working from home; for many parents, their children are the main challenge. Because of several corona restrictions, children are not always able to go to school, to their grandparents or to after-school care. This results in two problems, namely boredom and distraction.

First of all, boredom. When children are at home with their working parent(s) instead of at school, after-school care or their grandparents' place, they get bored easily. They should be silent and not distract their parents, but what should they do then? Besides, when they cannot go to school, children miss educational content as well. This problem should be solved.

Secondly, when children are at home while the parent is working, the parent cannot concentrate well. Children need attention, and when they get bored, the first ones they go to are their parents. This means that parents with children at home are not able to concentrate, have meetings and make their deadlines, so it is hard for parents to combine their private life with work. This should be solved. Therefore, a child-friendly robot should be developed that entertains children and also teaches them new things.

Execution

This robot solution is created for the course Project Robots Everywhere (0LAUK0). In this project, group 9 will mainly focus on a first outline of the mechanical design and the navigation of the robot, and on what should be done in the future to develop a working robot.

During the project, the group will first have a look at the state of the art via the literature. Then, the user group is investigated via literature and a survey among parents. Together with the state of the art, the robot can be specified and RPC's can be stated. The last step is to create and deliver the end products.

Deliverables

At the end of the project Robots Everywhere, the following products will be delivered by group 9:

* '''Local navigation model''': The model of path planning of the robot and the local navigation to move toward its target.
* '''Mechanical design''': Design of a child-friendly robot based on literature and a survey.
* '''Bill of materials''': List of the materials and their costs, to get a final selling price and insight into what is needed to create the robot.
* '''Software plan''': The end product of the software plan will be described. This plan is needed for future development of the robot.
* '''Future plan''': Description of the involved stakeholders and the development steps needed to take the robot to the commercial market.
* '''Guidelines for parents''': Since the robot is used by younger children, a description for parents is needed on how to use it in a safe way.

State of the Art

In this chapter, literature about existing robots and their features is discussed. As the robot in this project is focused on keeping children busy when they, for example, cannot go to school, educational and entertainment functions will be discussed.

This is IROBI, a robot that is studied in an educational context.
This is QRIO, a robot that is studied in an educational context.
This is Robovie, a robot that is studied several times in an educational context.

Entertainment and Education

Entertainment

Entertainment of children through a robot instead of "normal" toys can be done by making use of an interactive interface. This interface can be a tablet or screen, voice commands, movements, or music (like singing).

Research about the use of tablets as entertainment material for children shows that children can be entertained rather easily by a tablet (Oliemat, Ihmeideh & Alkhawaldeh, 2018). Voice capabilities could be extra entertaining and make a more natural interaction between robot and child possible (Meyns, van der Spand, Capiau, de Cock, van Steirterghem, van der Looven & van Waelvelde, 2019). Also, besides stimulating musical hearing, music stimulates natural exercise.

Attention

Apart from the entertainment function itself, the robot should draw the attention of the child that uses it. As children have a short attention span, this should be taken into account during the development process: the robot should check whether the child is still paying attention. This awareness can be used to switch up the current activity. If, for example, children playing games lose interest, the robot can detect this from the child's expressions and body language, and the activity can be switched to another one. The robot can then ask if the child would like to do something else.

Children are able to use a tablet independently from a very young age (Flewitt, Messer & Kucirkove, 2014). The child will therefore be able to control the tablet by themselves and choose their activity independently and according to their own attention span. When the robot senses that the child is fully distracted from the tablet, however, it will prompt the child to do another activity with the robot. It is stated by Torta, van Heumen, Cuijpers and Juola (2012) that this is most easily achieved by sound cues. Waving gestures proved to be the second most effective means of attracting attention; this can be achieved by moving the arms attached to the robot.

Education

There are two sides of the educational aspects that should be discussed, namely the educational purpose and the working of educational robots.

The potential benefits of using robots for educational purposes have been shown in research by Ruiz-del-Solar (2010), Kanda, Hirano, Eaton and Ishiguro (2004), and Tanaka, Cicourel and Movellan (2007). It is therefore desirable that a robot nanny can also carry out educational tasks. The following tasks will be carried out by Rubby:

* Talking to the child. It is shown in research of Weisleder and Fernald (2013) that speech directly targeted at children plays a large role in their future development. This is a task that can easily be carried out by the robot. It can read books and tell stories to the child to stimulate the development of their vocabulary.
* Games for the development of memory and numerical skills. Passolunghi and Costa (2014) have shown that playing specific games can greatly improve a child's working memory and numerical skills. These games can be implemented on a tablet, for example, and can be played by the child.
* Checking writing. It is argued by Amorim, Jeon, Abel, Felisberto, Barbosa and Dias (2020) that playing games on a platform similar to Squla significantly increased the reading and writing skills of preschoolers. This is a task that can also be carried out by the robot. A tablet can be implemented on which the Squla games can be played, and the robot can give positive reinforcement to stimulate playing these games.

The educational aspects of robots have been investigated a lot, and this knowledge can help us in the development of our robot. In several studies, robots with such educational functions were implemented in a children's environment to investigate the impact of the robot on the children. Examples of these existing robots are Robovie, QRIO and IROBI (Tanaka, Cicourel & Movellan, 2007; Han, Jo, Park & Kim, 2005). Two Robovie robots have been implemented at a primary school in order to investigate whether they could form relationships with children and whether these children might learn from them. Results from this study, along with those of the studies performed with QRIO and IROBI, showed that robots are actually able to teach kids the English language (Kanda et al., 2004; Ishiguro, Ono, Imai, Maeda, Kanda & Nakatsu, 2001), can achieve strong social bonds with toddlers for a long period of time (Tanaka et al., 2007) and can increase learning interest and academic achievement (Kory & Breazeal, 2014).


A few important conditions are needed to achieve this. First of all, the more predictable the robot is, the worse the quality of interaction. This means a robot should 'grow' with the child in order to increase the effectiveness of learning and communication. Secondly, it is preferred that a robot is adapted very well to the personal characteristics of the user, since each child has a different way and pace of learning. Furthermore, appearance, behavior and content of the robot are the three main factors that have an impact on the robot's effectiveness. Finally, an important note is that robots should not replace parents, tutors or teachers; they are only for supplementing them in what they already do (Kory & Breazeal, 2014).

In conclusion, robots with an educational function have a significant added value. Children's language skills can be improved and especially their motivation to learn increases. Verbal interaction with the child to learn words appears to be a relevant and feasible function, and playing memory games is also a feature of robots that already exist (Kory & Breazeal, 2014). The extra educational function of checking writing increases the diversity of learning areas; writing is a more active process, whereas language learning is a more unconscious one.

Practical functions

Another important feature of a robot that is used by children, is safety. There are several options that could be used in the design of a robot to keep a child safe.

* Motion control for the robot, to keep an eye on the child when it is moving around. Research of Yoshimi, Nishiyama, Sonoura, Nakamoto, Tokura, Sato, Ozaki, Matshuhira and Mizoguchi (2006) shows that following behavior can be used to keep the child in view and therefore ensure proper functioning of the robot at all times.
* It should be possible to move slowly so as not to be dangerous. This, in combination with a soft material, will drastically decrease the chance of dangerous operation. Research by Taufatofua, Heath, Ramirez-Brinez, Sommer, Durantin, Kong, Wiles and Pounds (2018) shows that faster movement on its own is not as bad; it is a higher mass combined with fast movement that creates a worse impact in contact with a child. If the drive mechanism is moved towards the core of the robot, the weight of the arms can be reduced, which requires less torque and is therefore safer.
* Observe dangerous objects in close range of the child and notify caregivers. This is done to increase the safety of young children. As children are very curious, they can interact with sharp, hot or other kinds of dangerous objects without knowing it. If the parent is busy with other activities, they wouldn't see it without a warning from the robot.
* Monitoring the child is also important. Parents should always be able to know whether their child is safe. A logical solution for this would be a camera, but this could bring privacy issues. Biosignals to detect whether a child is falling, or voice technology, could also be an option.

User

In this chapter, first the target group will be defined and the specifications of these users will be given. As a result, a table of user needs is given. The primary user group consists of Dutch children of 4 and 5 years old that go to kindergarten. The needs of the secondary user group, the parents of these children, also come forward in this table.

User specifications

The primary users are Dutch children of 4 and 5 years old that go to kindergarten. The educational and entertainment content depends on the nationality and age group. This group is the initial group for the development of this robot, but in further development after this project other countries and age categories could also be included. The secondary users are the parents of the primary users.

Persona's

To get a better feeling for and ability to identify with the user group, personas are described based on data from Dutch research institutes. Below, David and Ava are described. In Appendix A you can find the persona descriptions of Melissa and Noah, a single mother with a 5-year-old son.

The majority of Dutch households with children are led by two parents (Volksgezondheidenzorg.info, 2019). To get a feeling for the users of the robot that is developed in this project, the following personas give an example of a two-parent household.

David and Ava

David

David is a man of 44 years old. He is married to Karin, a general practitioner of 39 years old. Together, they have two children: James, a boy of 8 years old, and Ava, a girl of 4 years old. They live in a family home in a village surrounded by meadows.

David works from home, at his home office, four days a week. He is a logistics consultant and works with companies from all over the world. He has a lot of conference calls in a week and there is also a lot of paperwork to be done. Ava, his little daughter, is at home on Fridays and Wednesday afternoons. She can be disruptive, because she wants attention from her father David and she is not able to keep herself busy with hobbies like James is. Kindergarten is not possible for Ava as this only takes place a few villages away. David hopes that there will be a technology that supports him with caring for Ava, so he can continue working.

Ava

Ava is a girl of 4 years old. She lives in a family home with her brother James and her parents David and Karin. She goes to school 4 days a week. On Friday she is at home with her father, because she and the other toddlers of her school are free that day. Ava likes to play with dolls and cars, to paint, and to dance with her family members.

When she is at home on Wednesday afternoon and Friday, she feels bored. She does not know what to do on these days. James goes to his friends, to the hockey club or is at school, so she cannot play with him. On these days she wants to play with daddy, but he has to work all day. When she goes to her father, he is talking to other people on the screen and has no time for her. Sometimes he gets angry when she chats with him during work hours.

On these days, she has to keep herself busy the whole day. David, her dad, puts her in the living room or in his office with her toys. She plays by herself with puzzles, reads her books and plays with toy trains. After a while, she asks her father if she can watch television, and he always says yes so she does not disturb him. However, watching television for a long time is boring, she thinks. So she hopes that there is someone or something that can play with her when she is at home alone with her father.

Development children

In this section the development of children from 3 to 6 is described per year following guidelines for parents written by Engelhart, Win, Vinke and de Win (2010). The development of children is categorized into physical, intellectual, attachment, social, language, and play development.

This picture shows an illustration of a typical Dutch kindergarten playground.

Development 3-4 year

When children enter kindergarten they are 4 years old. In the year before, they go through some important developments. To create some feeling for the state of development of children at the moment they can start using the robot, this age group is described as well.

First of all, physical development: children of 3-4 years are able to jump 40-60 cm far, walk on stairs on their own, open and close a door on their own, cycle with training wheels, and climb on a climbing frame; they cannot suddenly turn or stop a movement, and they become more skilful and independent. Secondly, intellectual development: children can make harder puzzles, lay pictures in the right order, concentrate longer, have an interest in shapes and series, start asking more questions, play role games with other children, their fantasy grows, and they start to like drawing, making things, music, and counting. In third place, attachment development: children have enough language knowledge to understand their parents, are not fearful of their parents leaving, and can regulate tension. Then, social development: children gain insight into what they can do well and what not, and are able to build confidence with positive stimulation. In fifth place, language development: children like learning new words and trying them out, ask many 'why' questions and overuse grammatical rules. Last but not least, play development: children can cut with scissors, pour a drink without spilling, think about what they want to draw, draw more shape-figures instead of scribbling, and can imitate drawing a circle.

Development 4-5 year

In 'groep 1' (grade 1) children are 4 and sometimes 5 years old.

First of all, physical development: children of 4-5 years have good general body movement skills, change position often (running, climbing, jumping, balancing), can work with a computer mouse, and can catch a ball. Secondly, intellectual development: children go to grade 1, can talk, know colors and start writing their own name. In third place, attachment development: children are already quite independent at that age. Then, social development: children get to know different rules at school than they used to know at home, the socializing process starts, and they can estimate what they are able to do. In fifth place, language development: children make longer sentences and their pronunciation becomes good. Last but not least, play development: children can play on their own outside, can play memory, start drawing people and shapes, and do role playing.

Development 5-6 year

In 'groep 2' (grade 2) children are 5 and sometimes 6 years old.

First of all, physical development: children of 5-6 years can throw and catch a ball, become competitive, can jump rope, start practicing a sport, and can imitate movements well. Secondly, intellectual development: children go to grade 2, are interested in their history and their future, have insight into traffic, have higher self-esteem, and are able to spell words with 3 to 4 letters. Then, social development: children want to be independent. In fourth place, language development: children are quite good with words and can have a proper conversation with someone else without too many mistakes. Last but not least, play development: children can work multiple days on a drawing.

Content school

This picture shows an illustration of what a Dutch kindergarten looks like.

In Appendix D, an elaborate overview of what is learned in years 1 and 2 of primary school in the Netherlands can be found. Many aspects that are (usually) learned in kindergarten are based on social and emotional interaction. Existing technologies are not yet able to completely support humans in these interactions, and this is especially the case for younger children. The table below shows which components of the primary school content the robot can build on.

{|border=1 style="border-collapse: collapse;"
|-
! Discipline !! Skill !! Learning material
|-
| Dutch
|
* Vocabulary
* Listening/reading
|
* Learning new words
* Understanding texts (and adapt strategy)
|-
| English
|
* Vocabulary
* Listening
|
* Learning new words by stories (passive vocabulary) and using new words by games (active vocabulary)
* Listening to stories
|-
| Mathematics
|
* Numbers: concepts
* Numbers: calculations
* Measuring
|
* Counting in order, with steps, from different starting points, counting amount of things, estimate amount
* Subtracting and adding (visually)
* Measuring instruments for mass, money, comparing sizes of things
|-
| Orientation to yourself and the world
|
* Space around you
* Plants, animals and humans
* Weather, universe, agriculture and logistics
|
* Recognition of plants and animals, human body, phases in reproduction cycle
|-
| Artistic orientation
|
* Music
* Cultural heritage
|
* Listening to music, musical instruments
* Telling stories about important cultural heritage
|-
| Physical activity
|
* Moving on music
|
* Dancing, rhythm of music
|}

Interaction with children

In parent-to-child interaction there is a natural balance between the control of the parent and the child's own demand for independence (Lundberg, 2007). This means that even parents won't always be able to guide their children to what the parents want, for example playing with blocks. This balance depends on the age.

Young children have far less control over their activity and independence than older children. If the robot is used, the same effect will occur: the robot will not have constant control over the activity of the child. If a child seems to want different interactions, the robot can use this information. The child could want to interact with the room or environment, or maybe with the robot through another application. As young children have an attention span of 10-15 minutes, it is important to change things up after such a time span (Torta, 2012).

This is important to increase the effectiveness of the parents in working or doing something else, without having to spend a lot of focus on watching their child.

Besides the focus benefit for the parents, Rubby can also strengthen educational areas using games. Research (Herodotou, 2018) showed that children who regularly played Angry Birds in that study showed a positive impact on their maths comprehension.

Parents' opinion

Parents are not the direct users of the robot, but they are indirectly involved. First of all, the robot is developed with the aim to give them more rest. Secondly, parents know their children best and have the responsibility over them. Therefore, a survey was sent to parents with questions about a robot for children with educational and entertainment features. In Appendix C the informed consent, the questions including answer options and an elaborate result section can be found. Below we will discuss which questions were asked, what the main conclusions were per question category and which points will be used in the development of the robot.

For the survey, LimeSurvey was used. This picture shows the first question.

First of all, the robot 'Rubby' was introduced to the participants of the survey. The introduction read as follows:

Rubby is a robot that is made to support parents in focusing on their work at home when their young, school-going children (4- and 5-years old) are around. When a parent has a meeting or another important activity that requires a peaceful work environment, Rubby can keep a child busy during these moments.

Rubby moves on wheels, has arms and a body on which a tablet is mounted. It is able to detect the motion of the child. The tablet, motion detection technologies and interaction technologies make it possible for Rubby to entertain the child with games, teach new things (broadening vocabulary, learning to write, etc.) and do physical activities like dancing. Every activity has a span of 10 minutes (to meet the limited attention span of 4- and 5-year-old children), and when it is finished, Rubby asks the child if it wants to continue or do another activity.

In this way, a child has various activities to do during the time a parent is not available, and parents will be able to work without too many disturbances.


The following introductory questions were asked.

* Do you think that Rubby would be able to support you as a parent with working from home during the COVID-19 pandemic lockdowns? Why/why not?
* What are the most important features of a supporting robot like Rubby do you think? Why?
* Do you think that your child(ren) would (have) like(d) to play with Rubby? Why?
* How long is your child able to concentrate on something?
* Do you think that your child will be able to keep him/herself busy with the variety of activities of Rubby when you are working?

A significant majority of the participants think that this robot could be of support while working from home during COVID-19 lockdowns, and a majority is also positive about it. However, parents are concerned about the interaction between the child and a non-human object, about whether the child will listen to it, and about the fragility of the robot. The majority of the parents think their child would like the robot and would be able to keep themselves busy with it.


Secondly, parents were asked what they think about safety measures like a camera on the robot. Because of privacy reasons, we also investigated parents' opinions about measures other than cameras. The following questions were asked in the section on safety improvement(s) of the robot:

* Do you want to have your child in another room than you are with a robot like Rubby?
* Do you think adding a camera to Rubby is a good idea? Why?
* What would you add to Rubby for safety when parent(s) and the child are in a different room? Why?

As a result, it appeared that the majority of the parents think it is a good idea to add a camera to the robot to provide safety, but voice technology also got support as a safety measure.


In the last content-related section the participants were asked to give input towards the development of the robot by the following questions:

* Do you have any concerns towards Rubby? Please write them down below.
* Do you have any tips/recommendations for Rubby? Please write them down below.

The main worries of parents about a robot like this are safety of the child and the robot, safety of the data and the interaction between child and robot.


Lastly, questions about participants' children and other demographics were asked to get a good overview of what the sample looks like.

Parents prefer a robot like this that is clearly a toy and is not too human-like.


Conclusion

The following points were learnt from this survey and will be used in the development of the robot:

* Make sure the robot is hard to break and make sure it does not fall.
* Do not use it for long periods, or include breaks.
* A variety of activities should be included so that many parents will make use of it.
* The robot should renew itself, so it offers new activities and the toddler maintains its interest.
* We should include a camera on the robot, but only film the child when the camera is activated by the parent. Also, including voice technology as an extra check can be handy to monitor the child's behavior.
* The data of the robot should be safe.
* Good guidelines for the room are needed to keep the room's belongings safe.
* The robot should not look like a human, but more like a toy.
* There should be guidelines for the parents so it is clear which responsibilities the robot has and which the parents have.

User Needs

The user needs are based on the user specifications described above and on the results of the survey that was done among parents. The overview that was used for the user needs in the presentation can be seen in Appendix I.

{|border=1 style="border-collapse: collapse;"
|-
! User need description !! Primary users !! Secondary users
|-
| The user shall not get bored. || X ||
|-
| The user shall be supported in his/her development. || X ||
|-
| The user shall have a good time when other human beings are not able to play with him/her. || X ||
|-
| The user shall be able to choose between different activities. || X ||
|-
| The user shall not be harmed by the robot when it is used in the right way. || X ||
|-
| The user needs a system that prevents them from having to catch up on work. || || X
|-
| The user shall be able to work and have online meetings in peace, without disturbances from children. || || X
|-
| The user shall have control over his/her children (turning on the robot, safety). || || X
|-
| The user wants his/her children to have a good time. || || X
|-
| The user wants his/her children to learn new things. || || X
|-
| The user shall be able to keep an eye on his/her children. || || X
|}

Rubby

The robot that will partially be developed in this project, is called 'Rubby', a combination of 'buddy' and 'robot'. In this chapter, Rubby will be defined.

The results of the survey and the background literature have led to an idea for a robot. First the requirements, preferences, and constraints are discussed. Then, the robot itself, Rubby, is described. Last but not least, a scenario is given to illustrate the usage of Rubby in daily life.


Specifications robot

In this section of the chapter the RPC's for robot Rubby are given and the technical requirements are listed. These RPC's are motivated and accompanied by a description of how they will be applied to the robot. The overview that was used for the RPC's in the presentation can be seen in Appendix J.

RPC's for this project

Requirement / Preference / Constraint | Rationale | Application to Rubby
Requirements
The Robot shall be able to move itself at a speed of 4 km/h while preventing collisions with objects in the room | This ensures the safety of the child | Rotational motors will be used to drive the tracks at moderate speed
The Robot shall have two arms which are able to rotate about 180 degrees and move spherically | This enhances the physical interaction | Servo motors will be used in order to meet this requirement
The Robot shall be wireless (running on a battery, not on a wire) | Being wireless increases the practical ease of use | The battery will be located within the robot and can be charged
The Robot shall have a child-friendly casing and shape | A child-friendly casing ensures both the child's attraction to the robot and safety | The robot's appearance will be child-friendly (based on literature and the survey)
The Robot's interface shall be placed on the robot where the child can reach it well | This is required since the child has to physically make choices on the robot | The interface will be put on the front of the robot's main body
Guidelines for parents: since the robot is a toy, it should have a description for safe use by a child | This ensures safety when using the robot | Guidelines will be made concerning the robot, the surroundings and the child
The Robot shall include an emergency button located on the Robot to stop the robot's functions | In case the robot's functioning fails or becomes harmful, it should be interrupted | An emergency button will be placed at the back of the robot
The Robot shall have a voice-recognition sensor on itself so that voice can be recognized at all times | This requirement is an addition in order to increase the interaction with the user | A voice-recognition sensor will be placed on the robot
Preferences
The robot shall be as cheap as possible | This increases the affordability and therefore the willingness to use the robot | This preference is taken into account in the design (bill of materials)
Constraints
The Robot shall not be taller than a child of 4/5 years old | This ensures the child is attracted to the robot and does not feel dominated | This is taken into consideration in the measurements of the final design


Introduction to Rubby

In this section of the chapter Rubby will be described. Rubby is the translation of the RPC's stated above to a 'real' robot.

Description

Rubby is a robot solution that helps parents focus on their work at home. The robot Rubby keeps the child busy by functioning as a playmate while taking the desires and attention span of the child into account. This means that parents do not have to focus constantly on their children and give them intensive attention during working hours. Especially during the lockdowns of the COVID-19 pandemic, this robot is a great help in daily life.

Rubby is a robot of around half a meter high and is able to move around by means of wheels. It is able to detect the motion of the child and, if necessary, the robot can move slowly as well. Rubby has arms and also a body on which a tablet is mounted. This tablet is used as an interface on which the child can play (educational and entertaining) games. Rubby is able to interact with the child in several ways.

First of all, it can respond to the child's speech.

Secondly, it can speak to the child, which is for example used for telling stories and improving the child’s vocabulary.

Finally, the interaction is enhanced by the movements of Rubby, as it can move its arms and roll back and forth. The tablet on the robot’s body can be used in an entertaining as well as an educational way. The variation in these kinds of interaction enhances the social connection between Rubby and the child. The robot takes the child's loss of attention into account by asking whether or not it should switch to a different game.

Scenario

To get a better feeling of how Rubby would be used in daily life, scenarios of the personas described in the User section are made. Below, a scenario with David and Ava is described. In Appendix B you can find a scenario with Melissa and Noah.


Scenario David and Ava

This picture shows David working from home in peace while Ava plays with Rubby.

It is Friday morning, around 8.30 AM. Ava wakes up because her father David is opening the curtains in her room. ‘Good morning sweetheart’ he says to her. David lets her sleep in later than on school days, so he can already do some work. Today he has some important meetings and a deadline in the afternoon, so it is going to be a busy day. But first he has breakfast with Ava and goes to the supermarket to do some shopping for the day and the coming weekend.

At 10.00 AM, Ava and David are back from the shop. Now, the day can really begin. David opens up his laptop and starts working from the table in the living room. Ava sits on the carpet in the living room and starts playing with her dolls. After half an hour, she starts to get bored. David gives her some fruit and tells her that he has a meeting the next hour. ‘Yes, then I can play with Rubby’ she says, and she wonders what the robot has in mind for her this time. Had David announced a meeting 3 months ago, Ava would have been disappointed, as she had to be silent all the time and got bored. Nowadays, when her father should not be disturbed, Ava can hang out with Rubby, a robot that acts as a playmate for toddlers so their parents can work or meet without being disturbed.

After Ava has taken her last little piece of fruit and sip of her drink, David gets Rubby out of the charger in the corner of the room. He puts the robot on the carpet, turns it on and sets up the robot for an hour to an hour and a half. “See you in an hour darling” he says to Ava, sits down at the table and puts on his noise-cancelling headphones.

“Hi Ava, I am Rubby. I am not a human like you and your father, but a robot” Rubby says to Ava. “Hi Rubby, what are we gonna do today?” Ava asks the robot. “We start with a counting game” and the tablet on his belly lights up. After 10 minutes he asks Ava “Do you want to do something else?”. “Yes” she answers. She has learned that she can only use short commands for the robot and that she has to pronounce them clearly. When you talk with humans you can have real conversations, so she knows from this that Rubby is a robot.

“Now, we will do a puzzle.” The game on the screen on his belly changes and Ava starts putting pieces in the right spots. After 10 minutes Rubby asks again if Ava wants to do something else; “no” she answers and continues with the puzzle. After 5 minutes he asks again and Ava indicates she wants to do something else. “Let’s do some exercise Ava!” Rubby says and he starts to explain what they will do. With his arms the robot indicates what Ava has to do with her arms and with his wheels he shows the direction of the steps Ava has to take. After 10 minutes Rubby says “Let’s do a color game” and the screen on the belly lights up again. This goes on for the next 40 minutes, and then David’s meeting is finished. David finds it nice that his daughter does not get bored while he has a meeting and that he is not distracted. It feels good to know that Rubby gives Ava a variety of activities to do, alternating entertainment, learning and physical tasks.

After long meetings he always takes a walk, so today he takes a walk with Ava. After their half-hour walk, they have lunch together. After lunch Ava should take some rest, so she reads some picture books for half an hour. Then she can watch television for half an hour. After this she can play with her dolls or something else. This afternoon, David has to finish something important and he also has a meeting. Fortunately, Rubby supports him by keeping Ava busy for an hour and a half.

Thanks to Rubby, David has an effective working day and Ava has a nice day with a lot of fun activities. Before they had Rubby as a support, David had to work in his free time to catch up with his work. This weekend, when the whole family is home, they can play a board game or watch television together with David.


Deliverable 1: Local navigation model

This chapter describes how a model of the path planning is made. It will also describe the steps that have been taken to try to achieve local navigation. Unfortunately the local navigation goal was not met, but a model of the path planning was created successfully.

Why should Rubby move?

Rubby has to move so that it always faces the child. When the parents are working and want to see if their child is safe, they can see on their monitor what the robot is filming, so the robot's camera should always face the child. Movement is also needed because games that include physical training will be made in the future, for example playing with a ball (by moving against a ball, the robot makes it roll and the child will play with it). It will also be important in the future when facial expression recognition is included: when the robot faces the child, it can interact with it when certain emotions are detected. Local navigation is needed for avoiding obstacles while achieving these goals.

The difference between navigation, localization and path planning

Before we explain how the models work, the difference between navigation, localization and path planning should be explained. Localization only describes where the robot is in an environment; it does not know what the environment looks like. Path planning uses algorithms to figure out what the environment looks like; often it uses an existing map and defines an obstacle-free path. When that path is made, a controller is made to follow the defined path. Local navigation is a combination of localization and path planning. The robot moves through an environment and, by making use of sensors, it can determine whether or not there is an obstacle in its way. Then it will use algorithms to drive past that obstacle towards its target. So the main difference is that local navigation does not make use of an already existing map; it figures out the map by moving through the environment.

Path planning model

The Binary Occupancy Grid
PRM Algorithm

This model demonstrates how to compute an obstacle-free path between two locations on a given map using the Probabilistic Roadmap (PRM) path planner. PRM path planner constructs a roadmap in the free space of a given map using randomly sampled nodes in the free space and connecting them with each other. Once the roadmap has been constructed, you can query for a path from a given start location to a given end location on the map. This model has been made in MATLAB.

Binary Occupancy grid

In this model, the map is represented as an occupancy grid map using imported data. A binary occupancy grid of a room has been made. In the figure on the right you can see a sofa, a round table and a rectangular table. Two random objects have also been placed in the room, which you could see as toys that a child left somewhere in the room. When sampling nodes in the free space of a map, PRM uses this binary occupancy grid representation to deduce free space.
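As an illustration, a minimal Matlab sketch of how such an occupancy grid could be set up is given below. The room size, resolution and object coordinates are placeholders and not the values used in the actual model; the binaryOccupancyMap functions require the Robotics System Toolbox or Navigation Toolbox.

 % Minimal sketch: build a binary occupancy grid of a room (all values are placeholders)
 map = binaryOccupancyMap(10, 10, 10);          % 10 m x 10 m room, 10 cells per meter
 
 % Mark furniture as occupied; the coordinates below are illustrative only
 [sofaX, sofaY] = meshgrid(1:0.1:3, 8:0.1:9);
 setOccupancy(map, [sofaX(:) sofaY(:)], 1);     % sofa against the back wall
 [tableX, tableY] = meshgrid(5:0.1:6.5, 4:0.1:5);
 setOccupancy(map, [tableX(:) tableY(:)], 1);   % rectangular table
 
 show(map)                                      % visualize the grid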

Inflated map

Furthermore, PRM does not take the robot dimensions into account while computing an obstacle-free path on a map. Hence, you should inflate the map by the dimension of the robot, in order to allow computation of an obstacle-free path that accounts for the robot's size and ensures collision avoidance for the actual robot. The difference between the inflated and non-inflated map can be seen to the right: in the first picture the obstacle outlines are thin, while in the second picture the black shapes have been thickened.
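Continuing the sketch above, inflating the map is a single call; the robot radius of 0.3 m is an assumed value, not a measurement of Rubby.

 % Minimal sketch: inflate a copy of the map by the robot radius (0.3 m is assumed)
 robotRadius = 0.3;              % [m] approximate radius of the robot, placeholder
 mapInflated = copy(map);        % keep the original map intact
 inflate(mapInflated, robotRadius);
 show(mapInflated)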

Roadmap path planner

When the map has been inflated by the dimension of the robot, the mobileRobotPRM object is used as a roadmap path planner. The object uses the map to generate a roadmap, which is a network graph of possible paths in the map based on free and occupied spaces. To find an obstacle-free path from a start to an end location, the number of nodes and the connection distance are tuned to fit the complexity of the map. After the map is defined, the mobileRobotPRM path planner generates the specified number of nodes throughout the free spaces in the map. A connection between nodes is made when a line between two nodes contains no obstacles and is within the specified connection distance. You can see this in the figure PRM Algorithm.
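A minimal Matlab sketch of this step could look as follows; the number of nodes, the connection distance and the start and end locations are tuning values and placeholders, not the settings used in the actual model.

 % Minimal sketch: build a PRM roadmap on the inflated map and query a path
 prm = mobileRobotPRM(mapInflated);
 prm.NumNodes = 100;               % more nodes for a more complex map (tuning value)
 prm.ConnectionDistance = 5;       % [m] maximum distance between connected nodes (tuning value)
 
 startLocation = [1 1];            % placeholder start position [x y]
 endLocation   = [8 8];            % placeholder goal position  [x y]
 path = findpath(prm, startLocation, endLocation);   % [x y] waypoints, or [] if no path is found
 show(prm)                         % visualize the roadmap and path, as in the PRM Algorithm figure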

Controller

When the obstacle-free path from a start to an end location has been made, a controller can be made that makes sure the robot follows the defined path. First the robot model is initialized. The simulated robot has kinematic equations for the motion of a two-wheeled differential drive robot; its inputs are linear and angular velocities. Based on the path defined above and the robot motion model, a path following controller is needed to drive the robot along the path. The path following controller provides input control signals for the robot, which the robot uses to drive itself along the desired path. The settings of the controller are the desired waypoints, the linear velocity and the angular velocity. The result can be seen in the video:

Video of path planning: https://www.youtube.com/watch?v=_Ffpt758Jns&ab_channel=RobotsEverywhere
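For reference, a minimal Matlab sketch of such a path following setup is given below, using a differential drive kinematic model and a pure pursuit controller. The track width, velocities and goal tolerance are placeholders and may differ from the values used in the actual model.

 % Minimal sketch: differential-drive robot following the PRM path with pure pursuit
 robot = differentialDriveKinematics("TrackWidth", 0.4, ...
     "VehicleInputs", "VehicleSpeedHeadingRate");      % track width is a placeholder
 
 controller = controllerPurePursuit;
 controller.Waypoints = path;                          % waypoints found by the PRM planner
 controller.DesiredLinearVelocity = 0.3;               % [m/s]
 controller.MaxAngularVelocity   = 1.0;                % [rad/s]
 controller.LookaheadDistance    = 0.5;                % [m]
 
 pose = [path(1,:) 0]';     % initial pose [x; y; theta]
 goal = path(end,:)';
 dt   = 0.1;                % integration time step [s]
 while norm(pose(1:2) - goal) > 0.2
     [v, omega] = controller(pose);                    % control inputs from the current pose
     vel  = derivative(robot, pose, [v omega]);        % kinematic model of the robot
     pose = pose + vel*dt;                             % simple Euler integration
 end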

Progress on Local Navigation

Overview of the Simscape model

A start was made on developing local navigation for the robot. To start off, a model of Rubby was made in the Unified Robot Description Format (urdf). This model was kinematically similar to the actual robot (it moved in exactly the same way), but the shape was simplified to make simulation easier. The goal was to load this model into a simulation environment and test the obstacle avoidance capabilities of the robot. A LiDAR sensor was to be used to identify obstacles and, to avoid them, a program was to be written in Matlab. Two different simulation environments were tested: the first one was the Simulink simulation environment and the second one was Gazebo in cooperation with Matlab Simulink.

Simulink Simulation

Simulink was chosen first due to the fact that it is a Matlab extension that the group members all had some experience with. To simulate a mobile robot in Simulink, the following steps have to be taken:

  1. The kinematics of the robot have to be defined
  2. A map of the test environment has to be made
  3. Virtual sensors have to be added to the robot
  4. The robot has to be placed inside the simulation environment and the control algorithm has to be tested.

The first step was achieved by analytically determining the motion of the robot for given wheel positions. The exact derivation of the motion of the robot will not be elaborated on here, but the conclusion was that for a given velocity of both wheels, the robot would move in the desired way. An overview of the Simscape model that defines the motion of the robot can be seen in the figure.
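For reference, the standard kinematic relations for a two-wheeled differential drive are recapped below, with wheel radius r, track width L and wheel angular velocities ω_L and ω_R; these symbols are chosen here for illustration and are not taken from the Simscape model itself.

 \[ v = \frac{r\,(\omega_R + \omega_L)}{2}, \qquad \omega = \frac{r\,(\omega_R - \omega_L)}{L} \]
 \[ \dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad \dot{\theta} = \omega \]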

The second step was achieved by using a test environment used in a webinar by Mathworks (MathWorks Robotics and Autonomous Systems Team, 2021). A video of the robot moving through this environment can be seen on: https://youtu.be/NxY3lH0EniM

Rubby in the gazebo environment

Adding sensors in the Simulink environment has to be done entirely manually, meaning that for a LiDAR sensor with 180 rays, all 180 rays have to be defined and implemented by hand. This is of course not a straightforward task, and after some further consideration it was decided to use the Gazebo simulation environment instead. This environment automatically generates all of the sensor data, which is a far more efficient way of doing things. Gazebo can also perform co-simulations with Matlab Simulink, which allowed us to use the pre-existing knowledge of this program to develop a control algorithm.

Gazebo Simulation

LiDAR data from the Gazebo simulation

Since Gazebo has limited support on Windows, it had to be used on a Linux system (Ubuntu in this case). Implementing the robot in Gazebo had some other challenges as well. To launch a world in Gazebo with the robot in it, ROS (Robot Operating System) packages had to be used. A package was made for both the robot and the world that was designed within Gazebo. The robot and a LiDAR sensor could then be seen in the world, as can be seen in the figure on the right. To test the connection to Matlab, the sensor data from this LiDAR was plotted in Matlab, an example of which can be seen in the figure. When this LiDAR data corresponded to the expected data (see the figure on the right), an attempt was made at adding such a LiDAR sensor to the robot. Since the world file and the robot file were in two different file formats, it was not just a case of copy-pasting the code used to make the sensor from the world file and adding it to the robot. The support for sensors in the urdf file format is limited at best, and significant knowledge of ROS and the Gazebo program is required to properly implement this. This knowledge was not present within the group, however, and there was not enough time to properly acquire it within the time frame of this project.
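A minimal Matlab sketch of reading such a scan over ROS is given below. It assumes the ROS Toolbox is available, that the Gazebo machine publishes the scan on a /scan topic and that its IP address is known; the address and topic name below are placeholders, not values from the project files.

 % Minimal sketch: read the Gazebo LiDAR scan in Matlab over ROS
 rosinit('192.168.1.10');              % connect to the ROS master on the Linux machine (placeholder IP)
 scanSub = rossubscriber('/scan');     % subscribe to the LiDAR topic (assumed topic name)
 scanMsg = receive(scanSub, 10);       % wait up to 10 s for a scan message
 
 xy = readCartesian(scanMsg);          % convert ranges and angles to Cartesian points
 plot(xy(:,1), xy(:,2), '.')           % plot the scan, as in the figure on the right
 axis equal
 
 rosshutdown                           % close the ROS connection when done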

Continuation

Despite the fact that there was no tangible result for this part of the project, some important steps in developing and verifying a local navigation system were definitely taken. A test environment was defined which can easily be launched in Gazebo; this test environment contains a living room as well as a model of the robot. A Simulink file was created that can connect to the Gazebo environment and retrieve LiDAR data from it. This can also be used to actuate the wheels of the robot and make it move. The steps that still have to be taken are as follows:

  1. A sensor has to be added to the robot model in the urdf file format.
  2. A control algorithm has to be developed that can enable the robot to avoid obstacles in its way
  3. The control algorithm has to be tested in the simulation environment.

The sensor can be added to the model by extending the existing file defining Rubby; plugins have to be added that allow Gazebo to read sensors from the urdf file format. Incorporating these plugins requires fairly in-depth knowledge of the urdf file format and the ROS system. Since that was lacking, there was not enough time to properly finish this task.

A control algorithm can be written in a similar way to (Peng et al., 2015). This can be implemented in a Matlab script, which can generate an input torque for the wheels, which can then be sent to Gazebo to perform the co-simulation.
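To illustrate what such a script could look like, a minimal reactive avoidance sketch is given below. This is not the method of (Peng et al., 2015), only a simple rule that turns LiDAR ranges into wheel speed commands; all thresholds and speeds are placeholders, and how the commands are passed to Gazebo depends on the chosen co-simulation interface.

 % Minimal sketch: simple reactive obstacle avoidance based on LiDAR ranges
 function [vLeft, vRight] = avoidObstacles(ranges, angles)
     vNominal = 0.3;                    % [m/s] cruise speed, placeholder
     safeDist = 0.5;                    % [m] distance at which avoidance starts, placeholder
 
     front = abs(angles) < pi/6;        % rays within +/- 30 degrees of the heading
     if min(ranges(front)) > safeDist
         vLeft = vNominal; vRight = vNominal;      % path is clear: drive straight
     else
         % turn away from the side with the least free space
         leftClear  = mean(ranges(angles > 0 & angles <  pi/2));
         rightClear = mean(ranges(angles < 0 & angles > -pi/2));
         if leftClear > rightClear
             vLeft = 0.05; vRight = vNominal;      % turn left
         else
             vLeft = vNominal; vRight = 0.05;      % turn right
         end
     end
 end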

Deliverable 2: Mechanical design

In this chapter the end product of the mechanical design is described and shared. The steps that were taken during the project to achieve this are posted in this part of the wiki.

Design concepting

In order to design a robot, and especially its exterior appearance, several aspects are to be taken into account. Studies have shown the functions and shapes of a robot are crucial for the effectiveness of interaction between the robot and its user. This is the starting point of the design of Rubby. Sketches are made of different components of the robot as well as of the robot as a whole. Here the RPCs are taken into account in order to comply with the determined requirements.

Additionally, the following findings were important with regards to the external design of Rubby:

  • Sound effects: The addition of sound effects is the most effective and feasible way to keep the user’s attraction to the robot (Torta, van Heumen, Cuijpers & Juola, 2012).
  • Eye-contact: Eye-contact does not necessarily improve the attention of the participants. The robot’s gazing behaviour may be suited in cases where the visual attention is already on the robot (Torta et al., 2012).
  • Essence of shape: The shape and interactional functions of a robot are important for the development of personal robots (Goetz, Kiesler & Powers, 2003).
  • The importance of human-likeness: Robots need a moderate level of human-likeness; however, robots still need to be easily distinguishable from human beings (Tung, 2016).
  • Anthropomorphic appearance: A moderate level of anthropomorphic appearance is required combined with appropriate social cues (Tung, 2016).
  • Children tend to attribute intelligence, biological function and mental states to a robot (Tung, 2016).
  • Humanoid features: A few visual human-like features can achieve the elicitation of children’s preferences (Tung, 2016).
  • General appearance: A humanoid robot appearance is predominantly preferred over a robot with pure mechanical appearance (Walters, Syrdal, Dautenhahn, te Boekhorst & Koay, 2017).
  • Impression of a robot: The initial impression and evaluation of a robot is essential (Walters et al., 2017).


Based on this, the following design choices regarding the external design of Rubby were concluded:

  • Basic shape of the robot should be similar to that of a human being.
  • The addition of sound effects: music sounds and speech.
  • The addition of arms. These arms can turn in order to increase the level of human-likeness and to increase the attention attraction.
  • The addition of a head. To ensure the robot does not look too much like a human being, the incorporation of a moving mouth when speaking is left out.
  • Concerning the eye contact, ‘fake’ eyes can be added; however, physically rolling and blinking eyes are left out, since this might be counterproductive.
  • The addition of wheels. This emphasizes the robotic nature instead of a human one. The movable aspect is important to improve the emotional response of the child to the robot.

Taking these aspects into account, together with the RPCs made, several sketches and outlines are made to visualize possible solutions. The most relevant sketches are shown below:


The design concept of the robot with wheels.
The design concept of the robot with wheels and wheel covers.
The design concept of the robot with tracks.

In Appendix E, an overview of all sketches can be found together with more detailed sketches.

Preliminary design

Subsequently, the 2D sketches as presented in appendix H are further elaborated and optimized. The sketches are combined and put into 3D with a computer program. This results in the preliminary designs.
Below, these 3D designs can be seen. In Appendix F, some more figures of these 3D models can be seen. Two main preliminary designs are the result: one with normal wheels, two attached to the main body and one wheel at the rear of the robot in order to ensure stability. The second design is one that uses tracks. With this, the robot also maintains stability and will be able to drive more easily on non-smooth surfaces. Both models form the base for the final design, which will be described in the following section.

The robot design with wheels.
The robot design with tracks.


In Appendix F more figures of the preliminary designs in Blender can be found.

Final design

In this phase, conclusions are made on the design and design choices are discussed. The several specifications of the final design are considered. A bill of materials (BoM) is presented based on the design choices. This BoM is part of deliverable 3, where a price indication for the robot is also presented.

The robot's main body will contain a battery and processor. The system will contain three motors. The first motor will rotate the inner arm around its own axis (red arrow in figure 1 in Appendix G) and can be mounted on the inside of the main body. The second motor will rotate the outer arm around the red axis in the second figure in Appendix G. The easiest option is to mount the outer motor at the end of the inner arm; however, this will increase the mass of the arm a bit. This second motor could also be positioned inside the main body and transfer the rotation via a belt or some alternative. The third motor is used to drive the tracks in order to ensure the movement of the robot.

The weight of our robot can be compared with that of a Nao robot, which has about the same size and material. The total weight of a Nao robot is 5 kg. This gives an estimate for an entire arm of around 400 grams.

If the arms are then 20 cm long, the required torque can be calculated. The outer motor puts a force on the COG (center of gravity) of the arm, 10 cm away from the motor; using the gravitational force this gives T = r*Fz = 10*0.200 = 2 kg cm. Online, a servo can be found with 4.1 kg cm of torque (Robotshop, n.d.). This servo weighs 46 grams and is relatively cheap. This motor can also be used for the inner arm.

Concerning the motor needed for the tracks, the total mass of the robot is estimated at a maximum of 5 kg and the radius of the upper wheel at 5 cm. The torque needed for this motor equals T = radius * mass of robot = 5*5 = 25 kg cm. A motor for this can be found (Amazon, n.d.), so this can be used to drive the tracks.

Concerning the battery: if we assume a laptop-capacity battery of about 5 Amp hours, and the DC motors of the tracks drawing 0.8 A each, or 1.6 A in total, then the rated time of operation before charging would be 5 divided by 1.6, which is about 3 hours. This calculation can be done because the tracks will be the main power consumer.
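For clarity, the same estimates can be written out in SI units, assuming g ≈ 9.81 m/s²:

 \[ T_{\text{arm}} = m_{\text{arm}}\,g\,r = 0.200\,\mathrm{kg} \cdot 9.81\,\mathrm{m/s^2} \cdot 0.10\,\mathrm{m} \approx 0.20\,\mathrm{N\,m} \;(\approx 2\,\mathrm{kg\,cm}) \]
 \[ T_{\text{track}} = m_{\text{robot}}\,g\,r_{\text{wheel}} = 5\,\mathrm{kg} \cdot 9.81\,\mathrm{m/s^2} \cdot 0.05\,\mathrm{m} \approx 2.5\,\mathrm{N\,m} \;(\approx 25\,\mathrm{kg\,cm}) \]
 \[ t_{\text{run}} \approx \frac{C_{\text{battery}}}{I_{\text{tracks}}} = \frac{5\,\mathrm{Ah}}{2 \cdot 0.8\,\mathrm{A}} \approx 3.1\,\mathrm{h} \]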

The head will be a fixed part, with a face which is static. The robot’s appearance is colourful, with use of bright colours for the track covers and main body. This increases the attraction level of the child to the robot.

The tracks will be protected with a cover. This ensures the safety for the child, so that it does not become entangled with the tracks. See figure 3 and 4 in Appendix G.

The tablet and the processor (a Raspberry Pi) will be linked but bought separately. The tablet will contain one main app built for this robot which includes the proposed functionalities: entertainment and education. This tablet will be connected to the Raspberry Pi in order to control the servos attached to the arms and tracks. The Raspberry Pi also processes the data of the camera sensor and distance sensors and runs the path planning on this data.
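As an illustration of this connection, a minimal Matlab sketch of driving one arm servo through the Raspberry Pi is given below. It assumes the MATLAB Support Package for Raspberry Pi Hardware; the GPIO pin number and the sweep angles are placeholders and not the project's actual wiring, and the exact angle units of writePosition should be checked against the support package documentation.

 % Minimal sketch: drive one arm servo via the Raspberry Pi (placeholder pin and angles)
 rpi = raspi();                        % connect to the Raspberry Pi on the network
 armServo = servo(rpi, 17);            % servo signal wire on GPIO pin 17 (assumed)
 
 % Wave the arm: sweep from 0 to 90 degrees and back
 for angle = [0:10:90, 90:-10:0]
     writePosition(armServo, angle);   % angle in degrees for this support package (verify in the docs)
     pause(0.1);
 end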

See the figures below for the final design. The camera (black element) is put above the face for the greatest visibility. The distance sensors (in red) will be mounted one at each side, just above the tracks, except at the front where two distance sensors are used, as this will be the main movement direction. More information about the area in front of the robot is helpful for mapping and therefore for path planning.

The final design seen from the right side.
The final design seen from the front with descriptions.
The final design seen from the left side.


In Appendix G you can find a further visualization of some design aspects.

Deliverable 3: Bill of materials

Based on the mechanical requirements of the final design, as mentioned in deliverable 2, a bill of materials is presented, together with the corresponding costs.

  • Of the [Hitec HS-422 Servo Motor], 4 would be used for the arms, coming down to approximately 4*12 = 48 euro.
  • 2 [DC Motor]s will be used to drive the tracks; these will cost 2*60 = 120 euro.
  • 5 [distance sensor]s are used for the path planning, costing 5*6.50 = 32.50 euro. Also, a [720p camera] will be added, which will cost 13 euro.
  • As described, a battery of about 5 Ah is used. The price of such a battery would be about 30 euro.
  • The price of the plastic body is estimated at 10 euro. Six of the [track gears] that drive the tracks are used, which boils down to a price of 6*15 = 90 euro.
  • The Raspberry Pi is used to control the motors and to process the data of the camera and distance sensors. This processor will cost 30 euro.
  • The tablet including the app providing the entertainment and education functions, is estimated to cost 300 euro.


Material | Price | Quantity | Subtotal price
Servo motor arms | €12,- | 4 | €48,-
DC motor tracks | €60,- | 2 | €120,-
Plastic body | €10,- | 1 | €10,-
Tablet | €300,- | 1 | €300,-
Gear for tracks | €15,- | 6 | €90,-
Battery | €30,- | 1 | €30,-
Processor (Raspberry Pi) | €30,- | 1 | €30,-
Camera 720p | €13,- | 1 | €13,-
Distance sensor | €6.50 | 5 | €32.50
Total | | | €673.50

Deliverable 4: Software plan

RPC's for the software engineer

In this chapter the end product of the software plan will be described. This plan is needed for future development of the robot.

In the following table, the requirements, preferences and constraints the software engineer should take into account when developing the software for the robot are described.

Requirements
  • The Robot shall be able to charge itself when the battery percentage is 5 percent, and it shall go back to its charger when the Robot is turned “off”.
  • The Robot shall have some sort of interface on which entertaining and educational games can be played by the child.
  • The Robot shall be able to make sounds, including pre-programmed talks like “Well done” and music.
  • The Robot shall be able to provide physical games, entertainment games and educational games.
  • The Robot shall be able to send a signal to the parents when something is wrong with the child.
  • The Robot shall include an emergency button located on the Robot to stop the robot's functions.
  • The Robot shall be able to detect when the child is bored or cannot keep its concentration.
  • The Robot shall have voice recognition for its pre-programmed games.
Preferences
  • The Robot is able to detect the emotions of the child.
Constraints
  • The Robot shall not tell the child what to do or not to do.
  • The Robot shall be designed in such a way that the data is safe and no user data can leak.

The layout of Squla is simple and full of colors; this is a good example of how the app could look.


Explanation of what should be implemented

The software of Rubby should be developed as for a tablet. Apart from being mounted on the robot, the tablet will not be used separately, so everything should be contained in one app. This app should contain educational and entertainment games. The content of the educational games should smoothly fit into the Dutch school program, as for example Squla does. The entertainment part consists partially of entertaining games that fit into the world of children’s perception, musical games and physical games.

These physical games are not only with the tablet itself, but also with the arms of the robot. This means that the software engineer should create a connection between the arms and the tablet, so they can cooperate to give the child some physical activity.

Besides the games, the robot should play sounds and talk. This means that the app should also contain speech technology. Furthermore, the robot should be able to understand what the child is saying as well as possible, in order to have interaction with children. This means that speech recognition technology should be implemented.


Deliverable 5: Future plan

A future plan is needed in order to give a clear description of the stakeholders involved, which steps need to be taken to get the robot to the commercial market, and how it should be developed in the future.

The future plan therefore consists of the following topics:

  • A description of the possible involved stakeholders
  • How the robot will be implemented into the commercial market and all corresponding steps
  • A plan on further development of the robot


Stakeholders and their interest in Rubby

It is important that all stakeholders are involved in the development plan from beginning to end. The primary, secondary and tertiary user needs are therefore key elements in the future plan. The following stakeholders are involved in the development of Rubby:

  • Children: These are the primary users. Therefore, their needs form the most important factor in the future plan. They need to actually benefit from Rubby: the robot entertains them in an interactional way and is not only a toy, but also a 'buddy'. The goal is that the child will develop its educational and general skills.
  • Parents: These are the secondary users. They benefit from Rubby in such a way that they can work from home more easily without being disturbed and also by having children that are happier and are more supported in their learning development. Therefore, their involvement in the future plan is of great importance.
  • Teachers: They can be seen as the tertiary users. When children are ill they are still able to learn things at home. Development of basic skills can be supported by the robot so more time can be spent on more profound subjects. Hence, teachers' experiences and needs should also be taken into account.
  • Schools: These are important stakeholders, since there is a shortage of teachers and Rubby can therefore be of added value in education.
  • Ministry of education: The same holds for this stakeholder. Furthermore, the implementation of Rubby can help to give each child equal chances to learn. Hence it might be interesting for the ministry to invest in the robot.
  • Teaching method companies: These stakeholders benefit from the robot, as their educational programs can then also be used at home and not only in school settings. The experience of teaching method companies is therefore essential to the future plan.


Financial and commercial implementation plan

The robot needs to be made commercially available and also attractive. In order to achieve this, the implementation of Rubby starts at small-scale. It will be used in a small group; in this stage the users will be questioned about their experience and about required improvements. In this way, an evaluation of the preliminary state of the robot is made. These experiences and improvements on the robot based on the recommendations will be used to convince the stakeholders (parents and schools, but also teaching method companies and the education ministry) to invest in the robot.

Concerning the financial aspect: since the robot may not become too expensive for parents (as becomes clear from the results of the survey), nor for schools who want to invest in it, a loan system is introduced through which schools can invest in Rubby. This can be paid from the corona education subsidy that was introduced by the Dutch government to help pupils catch up. In the long term, this will result in an increased general willingness to invest in the robot.

For parents that work at home for a company (due to the corona restrictions), their investment in the robot will be supported by that company. These companies will partially pay for the robot. After all, this will also benefit them, because parents will then be able to work better from home and thus be more productive.

Ultimately, when the robot is used on a larger scale, production costs will also drop, and hence it will become more and more interesting for more users.


Future development plan

The development of the robot is a continuous process in which the robot will be increasingly optimized to fulfill the user needs and requirements. This should be done in cooperation with all actors involved:

  • Educationalists & psychologists: Especially since Rubby will become mainly an education tool rather than only a toy keeping children busy, educationalists are one of the main actors. They know which educational aspects are important to include and what the best variety of these is. This expertise also manifests itself in the fact that they understand the development of young children. Cooperation with these experts is key to a successful development of the robot.
  • Teachers: Teachers also have an important contribution in the future development. They can indicate what should be included and what not, also with respect to the social placing of the robot; eventually the robot is an aid, not a replacement of the teacher.
  • Parents: They should also be involved in the steps of the development of Rubby in such a way that they indicate what works for them and what not. They can specify requirements in order to achieve the most efficient and effective outcome.
  • Children: For children, it is important that their needs will be fulfilled. In the future development, this means it needs to be taken into consideration what they like to see and what is effective or not.
  • Schools: Schools may contribute to this development when it comes to practical implementation; What will be the best way to deploy the robot in classrooms or other educational settings?
  • Software engineers: These are involved in further development when it comes to improvements and additions to the apps and the software-wise connection with the components of the robot. Also, in the future, this software might be improved by making its structure simpler.
  • Mechanical engineers: This concerns the mechanical improvements that are made in the future on the robot. The design might need improvements when it comes to safety or physical possibilities of the robot.

Deliverable 6: Guidelines for parents

General Guidelines

Parents know their children best and have the ultimate responsibility over them. Therefore, keep in mind that the robot's interaction with the child may have an impact on the child's decision-making, way of thinking and behaviour. Always attend to the child in order to supervise it and maintain control over the child and its activity. Although the robot is created for children, taking into account their relatively short attention span, the supervisor still needs to pay attention in case something goes wrong with the robot and an unsafe situation is created for the child.

Do not let the child play with the robot for too long. The maximum period of time before the robot needs to be charged is estimated at more than 3 hours. Be aware of the fact that the robot is an addition to the other toys of the kid, rather than a replacement thereof.

The camera on top of the robot only films when it is activated by the parent. This is done by a button on its back. Note that an activated camera is not indispensable for a proper functioning of the robot. The camera can be turned off at all times whenever this is preferred.

The robot is created to be used when a parent is in the same room and in a relatively safe environment. This ensures a safer operation of the robot around the child. The robot could, for any kind of reason, not behave properly; the worst case would be the motors running at their highest torque. In such a case the stop button at the back can always be used to shut off the power from the battery to the motors. Even if all motors run at their maximum torque, the design was chosen in such a way that the motors only need a low torque to operate properly. This means that the robot will not be able to crush anything fragile, such as a child's arm or finger. The tracks are driven from the top gear, which means it is very hard to come into direct contact with them. The tracks also have guards to prevent accidental contact.

Guidelines for the environment

Rubby will not be able to physically restrain a child in its care and can therefore not prevent it from harming itself or damaging anything in its surroundings. A safe environment has to be realized before the child can be left in the care of Rubby for any amount of time. Creating a safe environment can be broken down into the following tasks:

  1. Selecting the right environment
  2. Removing dangerous objects
  3. Make sure Rubby can be used safely

Selecting the right environment

Selecting the right environment for using Rubby is the only way to guarantee the safety of the child. A list of features that should be accounted for in the chosen environment can be seen below. This list does not cover all conceivable dangerous features in a room, and the responsibility for creating a safe environment always lies with the user.

  • Avoid anything that can cause severe harm if a child were to fall from it (stairs, balconies, ladders, open windows, etc.).
  • Make sure all heavy objects are secured or outside the reach of the child (Make sure closets, shelves and other items have no chance of falling on the child).
  • Avoid features that could burn the child, such as fireplaces, electric heaters, burning stoves, hot ovens, etc.
  • Select a room in which all sharp corners are covered.
  • Cover up all electrical outlets to avoid the child giving itself an electrical shock.

Removing dangerous objects

After the right environment for using the robot has been chosen, all other objects that can cause harm to a child have to be removed from this environment. These objects include:

  • Objects that are a choking hazard, such as very small toys, (plastic) bags, balloons, small holiday decorations, etc.
  • Objects with which a child can cut itself, such as scissors or knives.
  • Objects with which a child can burn itself, such as lighters, candles or matches

Make sure Rubby can be used safely

There are some instances in which the safety features of Rubby cannot work properly; this could lead to Rubby hitting a child, or tipping over onto a child. To avoid this, some safety measures have to be taken.

  • Avoid pets in the environment of Rubby; pets might hinder its ability to properly track the child, which could lead to collisions.
  • Avoid slopes with an angle larger than 20 degrees; the robot might tip over on steep slopes and fall on top of the child.
  • The robot uses a sensor that cannot sense glass. It is therefore important to cover glass surfaces with another object to make sure the robot does not run into it.

Note: This section is not a complete list of possible dangers, users (or parents in case of use by children) should always carefully check the environment for safety and the responsibility for a safe environment always lies with them.

Appendix

Appendix A: Persona's

More than 10% of the households with children are led by one parent, and 80% of these parents are women (Volksgezondheidenzorg.info, 2019). In one-parent households, the majority has one child (Nederlands Jeugdinstituut, 2020).

Melissa and Noah

Melissa

Melissa is a woman of 31 years old. She is not married and has a son, Noah, of 3 years old. They live together in an apartment at the edge of a town. She likes to walk with her son to the playground near their home, play with him and watch animation videos with him. Once a month, she meets with her friends in a restaurant or a bar. Melissa’s mother babysits Noah.

Melissa works five days a week as a secretary in an advertising agency. Before the COVID-19 pandemic, she worked four days at the office and on Friday from home. Nowadays, she works from home five days a week. As Noah is not allowed to go to kindergarten every day, he is also at home several days a week. She finds it hard to concentrate with Noah at home too: he attracts a lot of attention and makes a lot of noise. Because she cannot get much of her job done during the day, Melissa works until late in the evenings to complete her work.

Noah

Noah is a boy of 3 years old. He lives with his mother Melissa in an apartment at the edge of town. He likes to hear stories read by his mom, play with his toy cars and build towers from building blocks.

Noah does not go to school as he is too young. He went to kindergarten four times a week before the COVID-19 pandemic, but nowadays he is at home for multiple days during the week. He tries to talk with his mom Melissa, but she is grumpy when he does, as she has to work. She parks him in the same room as she is in, with a box full of toys. He plays with his color puzzles and other toys, but after ten minutes he gets bored and starts another activity. He makes sounds while playing and his mother gets annoyed by this. A few toys later, he starts asking his mom questions, but she does not answer and talks to other people on the screen. He tries to play with his toys again, but after a while he tries again to attract the attention of his mother. It does not work; she gets angry and puts him in a chair so he can watch television and she is not disturbed by him.

Appendix B: Scenario

Scenario Melissa and Noah

This week, Melissa and Noah are both at home. Usually, Melissa works at home and Noah goes to school. But this week, Noah is at home as well because his teacher got infected with the corona virus and so Noah’s complete class must be quarantined.

Melissa has a full work week with a few deadlines, two meetings and a lot of administration to do. So it is unfortunate that Noah is at home the whole week, as he often distracts her when she is working. Fortunately, Noah’s school lent parents of quarantined children a robot that keeps them busy. Rubby playfully teaches toddlers about calculating, language and motor control, the school told Melissa when she picked up the robot and Noah yesterday. She hopes it works, so she can continue working and Noah can keep on learning.

Melissa starts working at 7 AM, one hour before she wakes up Noah. In this way, she has some more time for Noah. At 8 AM Melissa walks to Noah’s bedroom and wakes him up. She gets him dressed and they have some breakfast. Melissa tells Noah that she has a lot of work to do, so he should keep himself busy. Noah looks a little sad at his mom. Then she tells him he will get help with this from a robot. “A robot?” asks Noah, surprised. “Yes, a robot. But first we will brush your teeth and then you can choose to draw or to play with your cars”.

Melissa works until 10.30 AM. Sometimes Noah comes to her and then she gives him another toy to play with. Now they make some fruit and drinks for Noah and herself. She feels that Noah is starting to get a little bored with keeping himself busy, so after they finish their coffee break, Melissa gets Rubby and activates the robot. She leaves Noah with the robot and walks back to her desk to work. She hears Rubby talk to Noah and sees he is all ears. Melissa puts her headphones on and continues working. Once in a while she takes a look at Noah and sees him thinking, laughing and talking to the robot. After more than an hour of effective working, Melissa decides that it is time for lunch. She sits at the table with Noah and while they are eating she asks what he thinks of the robot. He immediately starts talking enthusiastically about the games he played, the dancing he did and the things he learned from Rubby.

Melissa is happy to hear this. Beforehand she felt guilty about keeping her child busy with a robot, but it appears that Noah enjoys it and also learns from it. Melissa thinks Rubby is a good support for her when Noah is at home and she has to work: he has a nice time and does not distract her. It is not possible to use it for the complete day, but it can fill up the gaps when Noah does not know what to do. So, Melissa can work much better than during the first lockdown.

Appendix C: Survey

Consent form survey

Information form for participants

This questionnaire being part of a design course is performed by Emma Allemekinders, Paul Hermens, Emma Höngens, Hidde Huitema and Jos Stolk, students under the supervision of Elena Torta of the Control System Technology group at Eindhoven University of Technology. 

Before participating, you should understand the procedure followed in this study, and give your informed consent for voluntary participation. Please read this page carefully.

About this study

This study has the goal to examine what the requirements of parents would be for a robot that assists parents in combining work from home and children. You will fill in this survey with questions about the functions of such a robot.

This study will take 5 minutes to complete and does not involve any risks.

Voluntary Participation

Your participation is completely voluntary. You can stop participation at any time. You can also withdraw your permission to use your data up to after completing this survey.

Confidentiality and use, storage, and sharing of data

This study has been approved by Elena Torta, teacher of the bachelor course Robots Everywhere of Eindhoven University of Technology. In this study experimental data will be stored. The anonymized dataset that, to the best of our knowledge and ability will not contain information that can identify you, will be used in this research and stored on a TU/e OneDrive account.

Further information

If you want more information about this study, the study design, or the results, you can contact Emma Höngens (contact email: e.hongens@student.tue.nl). You can report irregularities related to scientific integrity to confidential advisors of the TU/e, whose contact information can be found on www.tue.nl.

Certificate of consent

By starting this study, I indicate that I have read and understood the study procedure, and I agree to voluntarily participate.

Questions Survey

Introduction robot

Rubby is a robot that is made to support parents in focusing on their work at home when their young, school-going children (4 and 5 years old) are around. When a parent has a meeting or another important activity that requires a peaceful work environment, Rubby can keep a child busy during these moments. Rubby moves by wheels, has arms and a body on which a tablet is mounted. It is able to detect the motion of the child. The tablet, motion detection technologies and interaction technologies make it possible for Rubby to entertain the child with games, teach new things (broadening vocabulary, learning to write, etc.) and do physical activities like dancing. Every activity has a span of 10 minutes (to meet the limited attention span of 4- and 5-year-old children) and when it is finished, Rubby asks the child if it wants to continue or do another activity. In this way, a child has various activities to do in the time a parent does not have time, and parents will be able to work without too many disturbances.

  • Do you think that Rubby would be able to support you as a parent with working from home during the COVID-19 pandemic lockdowns?
    • Not at all
    • Slightly
    • Somewhat
    • Really well
  • Why/why not?
  • What are the most important features of a supporting robot like Rubby do you think?
    • Interaction of the robot with the child
    • Entertainment of the child by a robot
    • Assisting the child in learning new things
    • Motivate the child to do physical activities, using the robot’s movements
    • It does not matter how the robot keeps the child busy, as long as parents can work without too many disturbances
    • Different, namely ….
  • Why?

Only for parents that have/had at least one child during one or two of the lockdowns of 4 or 5 years old.

  • Do you think that your child(ren) would (have) like(d) to play with Rubby?
    • Yes
    • Sometimes
    • No
  • Why?
  • How long is your child able to concentrate on something?
  • Do you think that your child will be able to keep him/herself busy with the variety of activities of Rubby when you are working?
    • Not at all
    • Slightly
    • Really well

Questions about safety improvement(s) of the robot

This first prototype of Rubby is focused on keeping children busy so their parents can work. Rubby and the child are in the same room as the working parent is responsible for the safety of the child.

  • Do you want to have your child in another room than you are with a robot like Rubby?
    • Yes, that’s fine for me
    • Yes, but only if there are extra safety measures taken (like a camera in the room or guidelines for a safer room)
    • No, never

There are several technologies that can keep an eye on a child when it is interacting with a robot like Rubby. The most common is adding a camera to the robot so parents can look at their child. This has advantages like good monitoring, but a camera also brings disadvantages with it, for example regarding the privacy of the child and data security.

  • Do you think adding a camera to Rubby is a good idea?
    • Yes
    • Yes, but the camera should only be on when the parent activates it
    • No
  • Why?

There are also other ways to monitor your child when using Rubby, but these technologies are still in the development phase and do not work optimally. Examples are an emotion bracelet and voice recognition. When the child wears a bracelet, emotions can be measured on the basis of bio signals. Voice technology is also an option: when the child cries, a signal can be sent to the parent.

  • What would you add to Rubby for safety when parent(s) and the child are in a different room?
    • A camera
    • A bracelet with bio signal technology
    • Voice technology
    • Nothing
    • Different, namely:
  • Why?

Last questions about the robot

  • Do you have any concerns towards Rubby? Please write them down below.
  • Do you have any tips/recommendations for Rubby? Please write them down below.

Questions about children

  • How many children do you have?
  • How many of your children are boys?
  • How many of your children are girls?
  • In which age range are your children? (Multiple answers possible)
    • 0-3
    • 4-5
    • 6-10
    • 11-15
    • 16 years or older

Demographics

  • How old are you?
  • To which gender identity do you identify most?
    • Female
    • Male
    • Not listed
    • Prefer not to answer
  • What is your highest obtained education level?
    • Primary school
    • Secondary school
    • MBO
    • HBO
    • University
    • Not listed, ...
  • Where do you come from?

End Survey

Thank you for participating in this survey about the supporting robot Rubby. Is there anything you want to add or let us know?

Results

Age children

Sample description

The participants of the survey were adults with children. The participants were recruited online: the tutor and the group members sent the survey to parents they know. The average age of the participants was 38.8 and ranged between 31 and 50. There were 12 participants: 11 females and 1 male. The children of the participants are mostly in the age range of 0 to 3 years old (9 out of 12). 25% of the parents have children in the age range of 4 to 5 years old (3 out of 12). 2 parents out of 12 have children aged 6 to 10 and do not have children between 4 and 5 years old. A third of the participants have children that are 11 years or older. The number of children ranges between 1 and 4 and the average number of children is 2.25. Participants have 0 to 3 boys with an average of 1.25 and 0 to 3 girls with an average of 1.1.

Data analysis

Support of Rubby
Remarks support

Support of robot during COVID-19 pandemic lockdowns

A significant majority of the participants (91%) think that Rubby could have supported them when working from home during the COVID-19 pandemic lockdowns, and a majority is positive about it (70%). The concerns people have towards the use of Rubby are mostly about the interaction between the child and a non-human, whether the child will listen, and the fragility of the robot.

Opinion of parents about the robot's features
Remarks of the features of the robot

Features of robot

There is no clear answer as to what parents think is the most important feature of Rubby. From this we can conclude that the variety of activities of Rubby fits most parents' views. Parents think that the variety of tasks would be good and that the interaction with the robot leads to engagement among children.





Children would like the robot

A majority of the parents (8/11) think that their children will like Rubby. The main reasons for this are its novelty and the opportunity to explore new things.




Children will keep themselves busy with the robot

The majority (8/10) think that their children will more or less keep themselves busy with it.



Parents' opinion on leaving their children alone with the robot in a room





Children alone with Rubby in a room

50% of the parents indicate that they are only slightly willing to have their children in another room than they are in. From this we can conclude that an extra safety measure is needed to make this possible.

Parents' opinion about use of a camera
Remarks about camera use
Parents' opinion about safety measures
Remarks about these safety measures

Safety technology

The majority of the participants (9/12) think it is a good idea to add a camera to the robot. However, most of them (8/12) think it is best to only turn the camera on when the parent activates it. Some parents think privacy is important, others think that safety comes before privacy, and yet other parents think that safety should not be left to a robot.


Half of the participants think it would be best to add a camera to the robot to provide safety. Others think that voice technology would be a better idea (3/12), and some would add nothing at all. Voice technology is liked for its recognition capabilities, and one participant thinks it would be good to combine this with a camera. Another remark was that a parent was worried about the safety of the house.

Recommendations and concerns

Concerns parents have towards a robot like this one
Recommendations parents have for this robot

Participants were asked to give some tips and to share their concerns about the robot; these can be found in the table. Some of them have been translated because they were given in Dutch. Parents are mainly worried about the safety of the child and the robot, the safety of the data, and the child interacting with the robot instead of with its parents. There were not that many recommendations. A good tip was to make the robot clearly distinguishable from human beings, and in particular from the parents. It was also mentioned that boundaries should be set for the responsibilities of the robot.

Conclusion

There are several things we learnt from the survey among parents that we will use in the design of Rubby.

  • Make sure the robot is hard to break and make sure it does not fall over.
  • Do not let the child use it for long periods, or include breaks.
  • A variety of activities should be included, so that many parents will make use of it.
  • The robot should renew its content regularly, so it offers new activities and the toddler maintains its interest.
  • We should include a camera on the robot, but only film the child when the camera is activated by the parent. Including voice technology as an extra check can also be handy to monitor the child's behavior.
  • The data of the robot should be kept safe.
  • Good guidelines for the room are needed to keep the room's belongings safe.
  • The robot should not look like a human, but more like a toy.
  • There should be guidelines for the parents, so it is clear which responsibilities the robot has and which the parents have.

Appendix D: School content

In the Netherlands, education is organized by means of core objectives (SLO, 2020a) per discipline. These have to be attained by children by the time they finish primary school. How schools organize their education is up to them, as long as the core objectives are attained. This means that schools follow different learning processes and it cannot be said exactly what is learned in which year.

However, TULE, an organization consisting of education experts, describes which steps in the learning process have to be taken to attain the core objectives. These sub-objectives are divided into periods of 2 years. This project focuses on children that go to kindergarten, so the information about ‘group 1-2’ (year 1 and 2) is of interest. The objectives cover Dutch, English, mathematics, orientation to yourself and the world, artistic orientation and physical education (TULE, n.d.). Apart from the steps per school year, TULE and SLO also made so-called ‘content cards’. These documents describe, per discipline, what should be focused on during the first years of primary school. Below you can find an overview that describes each discipline, its core objectives and the execution of these for year 1 and 2 of primary school. The information in the tables is all from SLO (2020b).

'''Dutch''' Language education focuses on proficiency in Dutch. This proficiency is divided into four skills: copying, describing, structuring and evaluating. Dutch is used in writing, reading and conversations (TULE, n.d.).

DutchTable.jpg

'''English''' Education in English in primary school focuses on speaking and reading simple texts (TULE, n.d.).

EnglishTable.jpg

'''Mathematics''' During mathematics lessons in primary school, children gradually become familiar with numbers, sizes, shapes, structures and the associated relations and operations (TULE, n.d.).

MathTable.jpg

'''Orientation to yourself and the world''' This discipline focuses on the children and their interaction with people and the world. It is divided into four sections: humans and society, nature and technology, space, and time (TULE, n.d.).

OrientTable.jpg

'''Artistic orientation''' This discipline focuses on children’s appreciation of cultural and artistic expression and on expressing themselves in this domain (TULE, n.d.).

Art1Table.jpg

Art2Table.jpg

'''Physical education''' Physical education focuses on the motor skills and social skills of children. Children should experience physical activity as fun (TULE, n.d.).

Skill categories:
  • Balancing
  • Swinging
  • Jumping
  • Climbing
  • Tumbling
  • Aiming
  • Juggling
  • Playing with a goal
  • Playing tag
  • Wrestling games
  • Moving to music

Appendix E: Sketches external design

With respect to the requirements determined earlier, several sketches and outlines were made to visualize possible design solutions. In this appendix, all these sketches of the robot are shown:

Sketch 1, front view
Sketch 1, side view


Sketch 2, front view
Sketch 2, side view


Sketch 3, front view
Sketch 3, side view


Sketch of arm, side and front view
Sketch 4, front and side view
Sketch 5, front and side view
Sketch 6, front view

<br />

Appendix F: Designs in Blender

The figures in this appendix show several views of the preliminary designs, created in Blender.

Preliminary design 1, front view
Preliminary design 2, front view
Preliminary design 1, side view
Preliminary design 2, side view
Preliminary design 1, iso view
Preliminary design 2, iso view


<br />
Appendix G: Visualization of some design aspects

In this section, a few design aspects of the final design are shown.

Design aspect: Rotation of the inner arm about the red axis
Design aspect: Rotation of the outer arm about the red axis
Design aspect: Covers around tracks for safety
Design aspect: Improved and sleek covers around tracks


Appendix H: Progress on simulating Rubby in Simscape

URDF snip.PNG
Rubby urdf output.PNG

After looking some more into the robotics toolbox in Matlab, I found out that it is quite hard to model a mobile robot with this toolbox. This is because the toolbox generates a RigidBodyTree model of a robot. This model is structured like a tree: it starts with a fixed (!) base and the separate limbs can move with respect to that base. This works great for robot arms that stay in one place, but I was not able to get it to work for our mobile robot.
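The sketch below illustrates this limitation; it assumes the rough Rubby description is stored in a file called rubby.urdf, which is an assumed file name and not the actual one.

<syntaxhighlight lang="matlab">
% Hedged sketch: importing a URDF with the Robotics System Toolbox yields a
% rigidBodyTree, whose base link is always fixed, so the robot as a whole
% cannot drive around in this representation.
rbt = importrobot('rubby.urdf');   % assumed file name of the rough Rubby model
showdetails(rbt);                  % lists all bodies and the joints between them
disp(rbt.BaseName);                % the fixed base link of the tree
show(rbt);                         % visualizes the (static) robot
</syntaxhighlight>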

I therefore went back to Simscape in Matlab, which can be used to model moving robots. I checked which kinds of files can be imported by Simscape and ended up making a .urdf file of a rough version of Rubby. It looks nothing like our robot, but it has similar kinematics, and that is what matters for now. The generated .urdf file looks as follows:


This little snippet of code shows the two most important elements of a .urdf file: links and joints. Links specify the different solid bodies in the model and joints connect them. The actual file is about 120 lines long, so there is no use in showing all of it here, but when importing this file into Matlab, the result can be seen in the figure above.
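To still give an impression of that structure here, below is a minimal sketch of the same link/joint idea, written from Matlab so the file can be imported again afterwards. The link names, joint placement and dimensions are assumptions and not the contents of the actual Rubby file.

<syntaxhighlight lang="matlab">
% Minimal, assumed link/joint structure; not the actual 120-line Rubby file.
lines = { ...
    '<?xml version="1.0"?>'
    '<robot name="rubby_simple">'
    '  <link name="base_link"/>'                % one solid body
    '  <link name="left_wheel"/>'               % another solid body
    '  <joint name="left_wheel_joint" type="continuous">'
    '    <parent link="base_link"/>'            % the joint connects the two links ...
    '    <child link="left_wheel"/>'
    '    <origin xyz="0 0.15 -0.20"/>'          % ... at this pose relative to the parent
    '    <axis xyz="0 1 0"/>'                   % and lets the wheel spin about this axis
    '  </joint>'
    '</robot>'};
fid = fopen('rubby_simple.urdf', 'w');
fprintf(fid, '%s\n', lines{:});
fclose(fid);
</syntaxhighlight>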

After this, it was time to import the robot into Simscape and to make it move. The first part was making the generated urdf file move the way we wanted it to. The Simscape model for this looks as follows:

Simscape only movement.PNG

Here, the two constants are the inputs for the angular velocities of the right and left wheel. This model resulted in the robot happily driving around. A robot moving through an empty world is of no use to us, however, so the next step is to implement it in an environment and try to incorporate path planning and obstacle avoidance.
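For reference, the mapping that this Simscape model effectively reproduces is the standard differential-drive kinematics. The sketch below uses assumed values for the wheel radius and wheel separation, not Rubby's real dimensions.

<syntaxhighlight lang="matlab">
% Differential-drive kinematics sketch; r and L are assumptions, not Rubby's
% real dimensions.
r      = 0.05;    % wheel radius [m] (assumed)
L      = 0.30;    % distance between the wheels [m] (assumed)
wLeft  = 2.0;     % angular velocity of the left wheel  [rad/s]
wRight = 3.0;     % angular velocity of the right wheel [rad/s]

v     = r * (wRight + wLeft) / 2;   % forward speed of the robot body [m/s]
omega = r * (wRight - wLeft) / L;   % yaw rate of the robot body [rad/s]

fprintf('v = %.3f m/s, omega = %.3f rad/s\n', v, omega);
</syntaxhighlight>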

Rubby in room.PNG

To place the robot in a simulation environment, we have to use another file format, in this case a VRML file. Such a file can easily be exported from Blender, so we can use the real robot model made by Paul and Jos. This can then be combined with a living-room setting that was taken from the MathWorks website. Our robot now looks as follows in the living-room environment:


This is still with an ugly Rubby model that I made myself, but it can be replaced with the actual model fairly easily. Next, we have to make the robot actually move in this living-room environment. This was done by using a VR Sink block in Simulink and providing it with inputs. The result looks similar to the picture of the living room, only now Rubby is able to move.
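As a rough sketch of what happens behind the VR Sink block, the same world can also be opened and manipulated directly from Matlab with the Simulink 3D Animation toolbox. The file name and the node name used below are assumptions, not the names in our actual VRML file.

<syntaxhighlight lang="matlab">
% Hedged sketch: opening the exported VRML scene and moving the robot node.
w = vrworld('living_room_with_rubby.wrl');  % assumed file name of the scene
open(w);
fig = vrfigure(w);                          % opens a 3D viewer of the scene

rubby = vrnode(w, 'Rubby');                 % assumed name of the robot's Transform node
rubby.translation = [1.0 0 0.5];            % move the robot to a new position in the room
vrdrawnow;                                  % push the change to the viewer
</syntaxhighlight>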


The next step is to incorporate an obstacle avoidance algorithm. This is something I tried but could not get to work yet. For this, virtual sensors have to be implemented, either in the Simscape model or in the .urdf file, and these have to be coupled to the environment.
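A possible starting point for this, once the lidar ranges are available, is the vector field histogram controller from the Navigation Toolbox. The sketch below is only an illustration with made-up numbers and a simulated scan, not something that is part of the current model.

<syntaxhighlight lang="matlab">
% Hedged sketch: VFH-based obstacle avoidance on a simulated lidar scan.
vfh = controllerVFH;
vfh.DistanceLimits = [0.1 3.0];   % only react to obstacles in this range [m]
vfh.RobotRadius    = 0.25;        % assumed radius of Rubby [m]
vfh.SafetyDistance = 0.10;        % extra clearance around obstacles [m]

angles = linspace(-pi/2, pi/2, 181);   % simulated lidar field of view [rad]
ranges = 3.0 * ones(size(angles));     % mostly free space ...
ranges(80:100) = 0.8;                  % ... with an obstacle straight ahead

targetDir   = 0;                                % we would like to keep driving straight
steeringDir = vfh(ranges, angles, targetDir);   % obstacle-free direction to steer to
fprintf('Steer %.2f rad to avoid the obstacle.\n', steeringDir);
</syntaxhighlight>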

Appendix I: Picture user needs

The user needs as they were shown in the presentation.

The user needs overview of the presentation

Appendix J: Picture RPC's

The RPC's as they were shown in the presentation.

The RPC's overview of the presentation



Appendix Y: Logbook

Week 1

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 9 hours || Looking into possible topics, writing about features of the robot
|-
| Emma Allemekinders || 15 hours || Brainstorm three ideas, problem statement, planning, 5 papers about child-robot interaction, getting to know the Wiki
|-
| Jos Stolk || 8 hours || Think of project topics, discuss possibilities, searching for literature (about ethics and SotA)
|-
| Emma Höngens || 8 hours || Project ideas, brainstorm meeting, literature research about privacy
|-
| Hidde Huitema || 8 hours || Think of different topics for the project, find literature about the state of the art and the ethics of a robot nanny
|}

Week 2

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 8 hours || Check planning, added survey questions to SurveyMonkey, setting requirements and resulting functions necessary for the robot
|-
| Emma Allemekinders || 10 hours || New planning, survey ideas, RPC's, persona's for 5-year-olds, table with an overview of the three child development states
|-
| Jos Stolk || 7 hours || Research into educational functions and how these should be implemented; robot description
|-
| Emma Höngens || 12 hours || Research into children's abilities, persona's, scenario, survey
|-
| Hidde Huitema || 8 hours || Work on requirements and functions for the robot, find papers about the possible benefits of educational functions and physical activities
|}

Week 3

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 11 hours || Research and writing on decision making and understanding of children at a young age, wiki updates and adding parts to the wiki
|-
| Emma Allemekinders || 11 hours || Think of solutions to the RPC's, looking into the robot system toolbox, divide RPC's + improve them, make a list of the deliverables, work on the wiki page
|-
| Jos Stolk || 11 hours || Research on mechanical design aspects, investigate key factors of robot appearance, define design choices, start conception of the external design, adding parts to the wiki
|-
| Emma Höngens || 12 hours || Putting the survey in the program, improving the survey with feedback, research into the Dutch education system, structuring and completing the wiki
|-
| Hidde Huitema || 9 hours || Figuring out the potential uses of URDF and simcenter for this project, learning the basics of URDF, working on the list of deliverables
|}

Week 4

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 7 hours || Look into the design and recreate it in 3D; first look at the mechanical design
|-
| Emma Allemekinders || 8 hours || Figure out the path planning model + add it to the wiki
|-
| Jos Stolk || 7 hours || Mechanical design: create a 3D model based on the 2D sketches
|-
| Emma Höngens || 8 hours || Survey, data analysis, description of the results
|-
| Hidde Huitema || 15 hours || Start working on simulating Rubby in a living room setting
|}

Week 5

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 9 hours || Refining some of the 3D model and looking for a solution to create the whole robot, including a bill of materials
|-
| Emma Allemekinders || 10 hours || Trying to convert a file from Blender to URDF format, wiki
|-
| Jos Stolk || 9 hours || Refine the preliminary designs into a concrete final design, add technical specifications, bill of materials, work on the wiki
|-
| Emma Höngens || 12 hours || Wiki, improving cohesion, scenario, user needs
|-
| Hidde Huitema || 15 hours || Trying to get co-simulation between Matlab and Gazebo working
|}

Week 6

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 10 hours || Added some parts of the design to the model, for example the camera, tablet example and distance sensors; started on the parent guidelines
|-
| Emma Allemekinders || 12 hours || Making a binary occupancy grid for path planning, using the PRM algorithm, making a controller, making a movie of the working path planning, putting it on the wiki
|-
| Jos Stolk || 11 hours || Clearing up the wiki, specifications on the (final) design (figures), starting on deliverable 6
|-
| Emma Höngens || 11 hours || Improved the complete wiki and created a software plan
|-
| Hidde Huitema || 15 hours || Worked on the Simulink/Gazebo simulation
|}

Week 7

{| class="wikitable"
! Name !! Total time !! Tasks
|-
| Paul Hermens || 14 hours || Implemented the mechanical design, bill of materials and software plan in the presentation; typed the script for the presentation and recorded the presentation
|-
| Emma Allemekinders || 12 hours || Wiki on path planning, making the PowerPoint for the presentation, with script
|-
| Jos Stolk || 12 hours || Specifying and motivating the determined RPCs, creating a future plan, final work on the wiki
|-
| Emma Höngens || 11 hours || Improved the complete wiki and created a future plan
|-
| Hidde Huitema || 11 hours || Write the section on local navigation, write guidelines for parents
|}

Appendix Z: Planning


{| class="wikitable"
! Week !! Main tasks
|-
| Week 1 || Form groups, choose subject, problem statement, start research
|-
| Week 2 || Research, come up with solutions for the problem statement, choose a direction to go in
|-
| Week 3 || Elaborate research on the user group, form persona's, create a survey for parents, start research into possible technical options
|-
| Week 4 || Send out the survey, start gathering information for path planning and navigation, do research into technical options
|-
| Week 5 || Data analysis of the survey, start with path planning and navigation, start with the preliminary mechanical design, complete the user needs, start with the final RPC list, describe the robot, write scenario's
|-
| Week 6 || Complete the RPC list, path planning, navigation, finalize the mechanical design and bill of materials, make a software plan
|-
| Week 7 || Complete path planning, complete navigation, make a future plan, describe guidelines for parents, finalize the wiki
|-
| Week 8 || Presentation, handing in the wiki
|}

References

  • Amorim, A. N., Jeon, L., Abel, Y., Felisberto, E. F., Barbosa, L. N. F., & Dias, N. M. (2020). Using Escribo Play Video Games to Improve Phonological Awareness, Early Reading, and Writing in Preschool. Educational Researcher, 49(3), 188–197. https://doi.org/10.3102/0013189x20909824
  • Engelhart, E., Win, H., Vinke, J. G., & de Win, H. (2010). Ontwikkelmeter jeugd (2nd ed.). CPS Onderwijsontwikkeling en advies.
  • Flewitt, R., Messer, D., & Kucirkova, N. (2014). New directions for early literacy in a digital age: The iPad. Journal of Early Childhood Literacy, 15(3), 289–310. https://doi.org/10.1177/1468798414533560
  • Goetz, J., Kiesler, S., & Powers, A. (2003). Matching robot appearance and behavior to tasks to improve human-robot cooperation. Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2003). https://doi.org/10.1109/roman.2003.1251796
  • Han, J., Jo, M., Park, S., & Kim, S. (2005). The educational use of home robots for children. ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, 2005., 378–383. https://doi.org/10.1109/roman.2005.1513808
  • Herodotou, C. (2018), Mobile games and science learning: A comparative study of 4 and 5 years old playing the game Angry Birds. Br J Educ Technol, 49: 6-16. https://doi.org/10.1111/bjet.12546
  • Ishiguro, H., Ono, T., Imai, M., Maeda, T., Kanda, T., & Nakatsu, R. (2001). Robovie: an interactive humanoid robot. Industrial Robot: An International Journal, 28(6), 498–504. https://doi.org/10.1108/01439910110410051
  • Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial. Human-Computer Interaction, 19(1), 61–84. https://doi.org/10.1207/s15327051hci1901&2_4
  • Kory, J., & Breazeal, C. (2014). Storytelling with robots: Learning companions for preschool children's language development. The 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, 643–648. https://doi.org/10.1109/ROMAN.2014.6926325
  • Lundberg, S., Romich, J., & Tsang, K. P. (2007). Decision-making by children. IZA Discussion Papers 2952, Institute of Labor Economics (IZA).
  • Meyns, P., van der Spank, J., Capiau, H., De Cock, L., Van Steirteghem, E., Van der Looven, R., & Van Waelvelde, H. (2019). Do a humanoid robot and music increase the motivation to perform physical activity? A quasi-experimental cohort in typical developing children and preliminary findings in hospitalized children in neutropenia. International Journal of Human-Computer Studies, 122, 90–102. https://doi.org/10.1016/j.ijhcs.2018.07.010
  • Oliemat, E., Ihmeideh, F., & Alkhawaldeh, M. (2018). The use of touch-screen tablets in early childhood: Children’s knowledge, skills, and attitudes towards tablet technology. Children and Youth Services Review, 88, 591–597. https://doi.org/10.1016/j.childyouth.2018.03.028
  • Peng, Y., Qu, D., Zhong, Y., Xie, S., Luo, J., & Gu, J. (2015, August). The obstacle detection and obstacle avoidance algorithm based on 2-D lidar. 2015 IEEE International Conference on Information and Automation. https://doi.org/10.1109/icinfa.2015.7279550
  • Tanaka, F., Cicourel, A., & Movellan, J. R. (2007). Socialization between toddlers and robots at an early childhood education center. Proceedings of the National Academy of Sciences, 104(46), 17954–17958. https://doi.org/10.1073/pnas.0707769104
  • Taufatofua, J., Heath, S., Ramirez-Brinez, C. A., Sommer, K., Durantin, G., Kong, W., Wiles, J., & Pounds, P. (2018). Designing for Robust Movement in a Child-Friendly Robot. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7667–7674. https://doi.org/10.1109/iros.2018.8593414
  • Timmons, B. W., Naylor, P.-J., & Pfeiffer, K. A. (2007). Physical activity for preschool children — how much and how? Applied Physiology, Nutrition, and Metabolism, 32(S2E), S122–S134. https://doi.org/10.1139/h07-112
  • Torta, E., van Heumen, J., Cuijpers, R. H., & Juola, J. F. (2012). How Can a Robot Attract the Attention of Its Human Partner? A Comparative Study over Different Modalities for Attracting Attention. Social Robotics, 288–297. https://doi.org/10.1007/978-3-642-34103-8_29
  • Trost, S. G., Sirard, J. R., Dowda, M., Pfeiffer, K. A., & Pate, R. R. (2003). Physical activity in overweight and nonoverweight preschool children. International Journal of Obesity, 27(7), 834–839. https://doi.org/10.1038/sj.ijo.0802311
  • Walters, M. L., Syrdal, D. S., Dautenhahn, K., te Boekhorst, R., & Koay, K. L. (2007). Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Autonomous Robots, 24(2), 159–178. https://doi.org/10.1007/s10514-007-9058-3
  • Yoshimi, T., Nishiyama, M., Sonoura, T., Nakamoto, H., Tokura, S., Sato, H., Ozaki, F., Matsuhira, N., & Mizoguchi, H. (2006). Development of a Person Following Robot with Vision Based Target Detection. 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 5286–5291. https://doi.org/10.1109/iros.2006.282029