Mobile Robot Control 2020 Group 3
Team members
M.N. de Boer (Martijn)
G.J.L. Creugers (Gijs)
P. Leonavicius (Pijus)
A.L. Nooren (Anna Lisa)
M.K. Salahuddin (Mohamed Kaleemuddin)
S. Narla (Shashank)
Design Document
Requirements:
The following requirements regarding task performance, safety and software should be satisfied by the simulated PICO robot:
- Task performance
- The robot must be able to recognize target cabinets
- The robot must be capable of planning a path and adapting it to unexpected circumstances, for instance a closed door.
- PICO must be able to rotate in place, in order to re-position itself when in front of a cabinet.
- The robot must be able to announce the completion of the current objective.
- The robot should not be inactive for more than 25 seconds.
- The robot has to be able to detect static and dynamic objects and represent them in the world model.
- Safety
- The robot avoids bumping into walls and doors
- The robot avoids collisions with static and dynamic obstacles
- PICO must obey the limits on translation and rotation velocity
- PICO should maintain a stopping distance of at least 5 cm from obstacles (see the sketch after this list).
- Software
- The software is started by a single executable
- The software can be easily updated.
- User interaction should be minimal and user-friendly.
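The sketch below illustrates how the safety requirements above (velocity limits and the 5 cm stopping distance) could be enforced in a single check before a velocity command is sent to the robot base. It is a minimal sketch, assuming hypothetical LaserScan and VelocityCommand structs and placeholder limit values; the actual PICO/emc-framework types and course limits may differ.

// Minimal sketch of the safety checks: clamp velocities and stop when an
// obstacle is closer than the stopping distance. Struct names and the limit
// values below are assumptions for illustration only.
#include <algorithm>
#include <vector>

struct LaserScan {
    std::vector<float> ranges;   // measured distances [m]
};

struct VelocityCommand {
    double vx;   // translational velocity [m/s]
    double va;   // rotational velocity [rad/s]
};

// Assumed limits; substitute the actual course limits here.
constexpr double MAX_TRANS_VEL = 0.5;   // [m/s]
constexpr double MAX_ROT_VEL   = 1.2;   // [rad/s]
constexpr double STOP_DISTANCE = 0.05;  // 5 cm stopping distance [m]
constexpr double ROBOT_RADIUS  = 0.20;  // assumed footprint radius [m]

VelocityCommand applySafetyLimits(VelocityCommand cmd, const LaserScan& scan)
{
    // Obey the limits on translation and rotation velocity.
    cmd.vx = std::clamp(cmd.vx, -MAX_TRANS_VEL, MAX_TRANS_VEL);
    cmd.va = std::clamp(cmd.va, -MAX_ROT_VEL,   MAX_ROT_VEL);

    // Stop translating if any reading is within the stopping distance
    // of the robot footprint.
    for (float r : scan.ranges) {
        if (r > 0.0f && r < ROBOT_RADIUS + STOP_DISTANCE) {
            cmd.vx = 0.0;
            break;
        }
    }
    return cmd;
}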
Functions:
Input data processing
- Laser range finder interpretation
Inputs: distance and angle from LaserData. Interprets the data generated by the laser range finder using a 2D SLAM algorithm (see the sketch after this list).
- Odometer interpretation
Inputs: OdometryData. Calculates the speed of the mobile robot from consecutive position values and relays the data to the SLAM algorithm.
- Sensor Fusion
Combining sensory information from multiple sensors introduces uncertainty. This module correlates the data streams so that a reliable, combined information flow is passed on to the world model.
- Vector map data interpretation
A function for structuring the data obtained from the provided map of the testing area, to be used as input for the position estimation and path planning functions.
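The sketch below illustrates the first step of the laser range finder interpretation described above: converting the raw distance/angle readings into Cartesian points in the robot frame, which can then be fed to the SLAM and world-model functions. The LaserScan and Point structs and their field names are assumptions for illustration; the emc-framework's LaserData type may be organised differently.

// Minimal sketch: convert valid laser readings (distance and angle) into
// points in the robot-centred frame. Types are assumptions, not the real API.
#include <cmath>
#include <vector>

struct LaserScan {
    double angle_min;             // angle of the first beam [rad]
    double angle_increment;       // angular step between beams [rad]
    double range_min, range_max;  // valid range interval [m]
    std::vector<float> ranges;    // measured distances [m]
};

struct Point {
    double x;
    double y;
};

std::vector<Point> scanToPoints(const LaserScan& scan)
{
    std::vector<Point> points;
    points.reserve(scan.ranges.size());
    for (std::size_t i = 0; i < scan.ranges.size(); ++i) {
        const double r = scan.ranges[i];
        if (r < scan.range_min || r > scan.range_max) continue;  // drop invalid beams
        const double angle = scan.angle_min + i * scan.angle_increment;
        points.push_back({r * std::cos(angle), r * std::sin(angle)});
    }
    return points;
}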
Mapping world model
- Surroundings detection:
Compares the expected surroundings based on the vector map with the output of the LRF interpretation function.
- Obstacle recognition:
Given the detected surroundings, the robot has to decide whether they are known walls or unknown obstacles and mark them accordingly (see the sketch after this list).
- Position estimation:
Estimates the robot's position by comparing the expected surroundings from the provided vector map with the outputs of the LRF and odometry interpretation functions.
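The sketch below illustrates the obstacle recognition idea described above: each measured point (already transformed into the map frame using the current pose estimate) is compared against the wall segments of the vector map, and points that lie further than a tolerance from every known wall are marked as unknown obstacles. The Point and Segment types and the tolerance value are assumptions for illustration, not the group's actual implementation.

// Minimal sketch: classify measured points as "known wall" or "obstacle"
// by their distance to the wall segments of the vector map.
#include <algorithm>
#include <cmath>
#include <vector>

struct Point { double x, y; };
struct Segment { Point a, b; };   // one wall from the vector map

// Distance from point p to the line segment [a, b].
double distanceToSegment(const Point& p, const Segment& s)
{
    const double dx = s.b.x - s.a.x, dy = s.b.y - s.a.y;
    const double len2 = dx * dx + dy * dy;
    double t = len2 > 0.0 ? ((p.x - s.a.x) * dx + (p.y - s.a.y) * dy) / len2 : 0.0;
    t = std::clamp(t, 0.0, 1.0);
    const double px = s.a.x + t * dx - p.x;
    const double py = s.a.y + t * dy - p.y;
    return std::sqrt(px * px + py * py);
}

// Return the measured points that do not match any known wall.
std::vector<Point> findObstaclePoints(const std::vector<Point>& measured,
                                      const std::vector<Segment>& walls,
                                      double wall_tolerance = 0.10 /* assumed [m] */)
{
    std::vector<Point> obstacles;
    for (const Point& p : measured) {
        bool near_wall = false;
        for (const Segment& w : walls) {
            if (distanceToSegment(p, w) < wall_tolerance) { near_wall = true; break; }
        }
        if (!near_wall) obstacles.push_back(p);
    }
    return obstacles;
}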