PRE2022 3 Group5


Group members

Name Student id Role
Vincent van Haaren 1 Human Interaction Specialist
Jelmer Lap 1569570 LIDAR & Environment mapping Specialist
Wouter Litjens 1751808 Chassis & Drivetrain Specialist
Boril Minkov 1564889 Data Processing Specialist
Jelmer Schuttert 1480731 Robotic Motion Tracking Specialist
Joaquim Zweers 1734504 Actuation and Locomotive Systems Specialist

Project Idea

The project idea we settled on is designing a crawler robot that autonomously creates 3D maps of difficult-to-traverse environments, so that humans can plan routes through small, unknown spaces.

Project planning

Week Description
1 Group formation
2 Prototype design plans done; Bill of Materials created & components ordered
break Carnaval break
3 Monday: split into sub-teams; work started on prototypes for LIDAR, locomotion and navigation
4 Thursday: start of integration of all prototypes into the robot demonstrator
5 Thursday: first iteration of robot prototype done [MILESTONE]
6 Buffer week - expected trouble with integration
7 Environment & user testing started [MILESTONE]
8 Iteration on the design based on test results
9 Monday: final prototype done [MILESTONE] & presentation
10 Examination

State of the Art

Literature Research

Overview
Paper Title Reference Reader
Modelling an accelerometer for robot position estimation [1] Jelmer
An introduction to inertial navigation [2] Jelmer
Position estimation for mobile robot using in-plane 3-axis IMU and active beacon [3] Jelmer
Mapping and localization module in a mobile robot for insulating building crawl spaces [4] Jelmer Lap

Modelling an accelerometer for robot position estimation

The paper discusses the need for high-precision models of location and rotation sensors in specific robot and imaging use-cases, specifically highlighting SLAM (Simultaneous Localization and Mapping) systems.

It highlights sensors that we may also need: "In this system the orientation data rely on inertial sensors. Magnetometer, accelerometer and gyroscope placed on a single board are used to determine the actual rotation of an object."

It mentions that, in order to derive position data from acceleration, the signal needs to be integrated twice, which tends to yield large inaccuracies.

Drawback: the robot needs to stop after a short time (to re-calibrate) when using double integration, to minimize error accumulation: "Double integration of an acceleration error of 0.1g would mean a position error of more than 350 m at the end of the test".
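
As a sanity check of that quote: a constant accelerometer bias b double-integrates into a position error of 0.5·b·t², so the error grows quadratically. A minimal Python sketch (the test durations below are illustrative; the paper's exact test length is our back-calculation, not stated here):

  # Position error from double-integrating a constant accelerometer bias:
  # error(t) = 0.5 * b * t^2, with b = 0.1 g as in the paper's example.
  g = 9.81        # gravitational acceleration [m/s^2]
  bias = 0.1 * g  # accelerometer bias of 0.1 g [m/s^2]

  for t in (10, 20, 27, 60):        # elapsed time [s]
      error = 0.5 * bias * t ** 2   # accumulated position error [m]
      print(f"t = {t:3d} s -> position error = {error:7.1f} m")

  # Around t = 27 s the error already exceeds 350 m, consistent with the
  # paper's "more than 350 m at the end of the test".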

An issue in modelling the sensors is that orientation is measured relative to gravity, which is not influenced by, for example, yaw, and becomes more complicated under linear acceleration. The paper models acceleration and rotation using various lengthy equations and matrices, and applies noise and other real-world modifiers to the generated data.

It notably uses Cartesian and homogeneous coordinates to separate and combine the different components of the final model, such as rotation and translation. These components are shown in matrix form and are derived from the specifications of real-world sensors, known and common effects, and mathematical derivations of the former two.
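
To illustrate the homogeneous-coordinate idea (a generic Python sketch with made-up numbers, not the paper's exact model): a 3x3 rotation R and a translation t are packed into one 4x4 matrix, so that chained transforms reduce to matrix products:

  import numpy as np

  def homogeneous(R, t):
      """Pack a 3x3 rotation R and a translation t into one 4x4 transform."""
      T = np.eye(4)
      T[:3, :3] = R
      T[:3, 3] = t
      return T

  # Example: rotate 90 degrees about z, then translate 1 m along x.
  theta = np.pi / 2
  Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0,            0.0,           1.0]])
  T = homogeneous(Rz, np.array([1.0, 0.0, 0.0]))

  p = np.array([1.0, 0.0, 0.0, 1.0])  # the point (1, 0, 0), homogeneous form
  print(T @ p)                        # -> [1, 1, 0, 1]: rotated, then translated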

The proposed model can be used to test code for our robot's position computations.

An introduction to inertial navigation

This paper (in the form of a technical report) is meant as a guide to determining position and other navigation data from inertia-based sensors such as gyroscopes, accelerometers and IMUs in general.

It starts by explaining the inner workings of a general IMU, and gives an overview of an algorithm used to determine position from the sensors' readings using integration, showing what the intermediate values represent using pictograms.

It then proceeds to discuss various types of gyroscopes, their ways of measuring rotation (such as light interference), and the resulting effects on measurements, which are neatly summarized in equations and tables. It takes a similar approach for linear acceleration measurement devices.

In the latter half of the paper, concepts and methods relevant to processing the introduced signals are explained; most importantly, it is discussed how to partially account for some of the errors of such sensors. It starts by explaining how to characterize noise using Allan variance, and shows how this affects the values from a gyroscope.
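
A minimal Python sketch of the non-overlapping Allan variance on simulated gyroscope data (the white-noise model and the noise level are illustrative assumptions, not values from the report):

  import numpy as np

  def allan_variance(rate, m):
      """Non-overlapping Allan variance for clusters of m samples."""
      n = len(rate) // m
      clusters = rate[:n * m].reshape(n, m).mean(axis=1)  # cluster averages
      return 0.5 * np.mean(np.diff(clusters) ** 2)

  fs = 100.0                              # sample rate [Hz]
  gyro = 0.02 * np.random.randn(200_000)  # simulated white-noise gyro [deg/s]

  for m in (10, 100, 1000):
      ad = np.sqrt(allan_variance(gyro, m))
      print(f"tau = {m / fs:6.2f} s  Allan deviation = {ad:.5f} deg/s")
  # For pure white noise the Allan deviation falls off as 1/sqrt(tau).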

Next, the paper introduces the theory behind tracking orientation, velocity and position. It discusses how errors in previous steps propagate through the process, resulting in the infamous accumulation of inaccuracy that plagues such systems.
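
The core of that tracking loop can be sketched as a strapdown integration step (a simplified Python illustration assuming small time steps and a known initial pose, not the report's exact algorithm):

  import numpy as np

  def skew(w):
      """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
      return np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])

  def strapdown_step(R, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
      """One update: integrate the gyro into orientation, rotate the body-frame
      acceleration into the world frame, remove gravity, integrate twice."""
      R = R @ (np.eye(3) + skew(gyro) * dt)  # first-order orientation update
      a_world = R @ accel + g                # body accel -> world, minus gravity
      v = v + a_world * dt
      p = p + v * dt
      return R, v, p

  # Usage: a perfectly stationary IMU (sensing only gravity) stays put;
  # any bias on the gyro or accelerometer would make p drift without bound.
  R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
  for _ in range(100):
      R, v, p = strapdown_step(R, v, p, gyro=np.zeros(3),
                               accel=np.array([0.0, 0.0, 9.81]), dt=0.01)
  print(p)  # -> [0, 0, 0]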

Lastly, it shows how to simulate data from the sensors discussed earlier. Note, however, that the previous paper (which builds on this report) already presents a more recent and more accurate algorithm for this.

Position estimation for mobile robot using in-plane 3-axis IMU and active beacon

The paper highlights two types of position determination: absolute (does not depend on the previous location) and relative (does depend on the previous location). It goes on to highlight the advantages and disadvantages of several location determination systems, and then proposes a navigation system that mitigates these flaws as much as possible.

The paper continues by describing the sensors used to construct the in-plane 3-axis IMU: an x/y accelerometer and a z-axis gyroscope.

Then the active beacon system (ABS) is described. It consists of 4 beacons mounted to the ceiling and 2 ultrasonic sensors attached to the robot; the technique essentially uses radio-frequency triangulation to determine the absolute position of the robot. The last sensor described is an odometer, which needs no further explanation.
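
The absolute fix from such a beacon system boils down to trilateration: given ranges to beacons at known positions, solve for the robot's position. A minimal 2D least-squares sketch in Python (the beacon layout and ranges are made up for illustration):

  import numpy as np

  def trilaterate_2d(beacons, ranges):
      """Least-squares 2D position from ranges to >= 3 beacons at known spots.
      Subtracting the first range equation from the rest makes it linear."""
      b0, r0 = beacons[0], ranges[0]
      A = 2.0 * (beacons[1:] - b0)
      rhs = (r0**2 - ranges[1:]**2
             + np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
      pos, *_ = np.linalg.lstsq(A, rhs, rcond=None)
      return pos

  beacons = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
  true_pos = np.array([1.0, 2.0])
  ranges = np.linalg.norm(beacons - true_pos, axis=1)  # ideal measurements
  print(trilaterate_2d(beacons, ranges))               # -> ~[1.0, 2.0]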

The paper then discusses the model used to represent the system in code. Notably, the system is somewhat easier to understand because the in-plane measurements restrict much of the complexity of the robot's position to two dimensions. The paper also discusses the filtering and processing techniques used to combat noise and drift, such as a Kalman filter. The final processing pipeline discussed is immensely complex due to the inclusion of bounce, collision and beacon-failure handling.
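
A minimal 1D constant-velocity Kalman filter sketch in Python, showing the predict/update structure such a pipeline is built on; the state layout and noise levels here are illustrative assumptions, not the paper's values:

  import numpy as np

  dt = 0.1
  F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
  H = np.array([[1.0, 0.0]])             # beacon measures position only
  Q = 0.01 * np.eye(2)                   # process noise (odometry/IMU drift)
  R_meas = np.array([[0.25]])            # measurement noise (beacon ranging)

  def kalman_step(x, P, z):
      # Predict with the motion model.
      x = F @ x
      P = F @ P @ F.T + Q
      # Update with the noisy absolute beacon measurement z.
      S = H @ P @ H.T + R_meas
      K = P @ H.T @ np.linalg.inv(S)
      x = x + K @ (z - H @ x)
      P = (np.eye(2) - K @ H) @ P
      return x, P

  x, P = np.zeros(2), np.eye(2)          # state: [position, velocity]
  for k in range(50):                    # robot moving at 1 m/s
      z = np.array([0.1 * k + 0.5 * np.random.randn()])
      x, P = kalman_step(x, P, z)
  print(x)                               # -> roughly [5.0, 1.0]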

Lastly, the paper discusses the results of their tests of the system's accuracy, which show a very accurate system, even when the beacon signal is lost.

Mapping and localization module in a mobile robot for insulating building crawl spaces

This paper describes a possible use case of the system we are trying to develop. According to studies referenced by the authors, the crawl spaces in many European buildings can be a key factor in the heat loss of houses. A good solution would therefore be to insulate below the floor to increase the energy efficiency of these buildings. However, this is a daunting task, since it requires opening up the entire floor and applying rolls of insulation. The authors therefore propose a robotic vehicle that can autonomously drive around the voids between floors and spray on foam insulation. Human-operated forms of this product already exist, but the authors suggest that an autonomous vehicle can save time and costs. According to the authors, a big problem with simultaneous localization and mapping (SLAM) in underfloor environments is the presence of dust, sand, poor illumination and shadows, which makes the mapping very complex.

A proposed way to solve the complex mapping problem is to combine camera and laser vision to create accurate maps of the environment. The authors also describe the three reference frames of the robot: the robot frame, the laser frame and the camera frame. The laser provides a distance, and with known angles 3D points can be created, which can then be transformed into the robot frame. The paper also describes a way of mapping the color data of the camera onto the points.
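
That laser-to-robot-frame step can be sketched as follows: a range plus two known scan angles gives a 3D point in the laser frame (spherical to Cartesian), and a fixed extrinsic transform moves it into the robot frame. The extrinsic values in this Python sketch are placeholders:

  import numpy as np

  def laser_point(r, azimuth, elevation):
      """Range + known scan angles -> 3D point in the laser frame."""
      return np.array([r * np.cos(elevation) * np.cos(azimuth),
                       r * np.cos(elevation) * np.sin(azimuth),
                       r * np.sin(elevation)])

  # Placeholder extrinsics: laser mounted 0.2 m above the robot origin,
  # with no rotation between the two frames.
  T_robot_laser = np.eye(4)
  T_robot_laser[:3, 3] = [0.0, 0.0, 0.2]

  p_laser = laser_point(r=2.0, azimuth=np.deg2rad(30), elevation=np.deg2rad(10))
  p_robot = (T_robot_laser @ np.append(p_laser, 1.0))[:3]  # into the robot frame
  print(p_robot)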

The authors continue by explaining how the point clouds generated from different locations can be fitted together into a single point cloud with an iterative closest point (ICP) algorithm. The point clouds generated by the laser are too dense for good performance of the ICP algorithm, so the algorithm is divided into three steps: point selection, registration and validation. During point selection the number of points is drastically reduced by downsampling and by removing the floor and ceiling. Registration is done by running an existing ICP algorithm on different rotations of the environment; this algorithm returns a transformation matrix relating two poses, and the one that maximizes an optimization function is considered the best. The validation step checks whether the proposed alignment of the clouds is good enough. Finally, the pose is calculated depending on the results of the previous three steps.
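
A minimal point-to-point ICP sketch in Python, in the spirit of the selection and registration steps above (downsampling by striding, SVD-based rigid alignment); the paper's full pipeline adds the rotation search and the validation step:

  import numpy as np
  from scipy.spatial import cKDTree

  def icp(source, target, iters=20, stride=5):
      """Point-to-point ICP: returns R, t aligning source onto target."""
      src = source[::stride]          # crude point selection / downsampling
      tree = cKDTree(target)          # for nearest-neighbour correspondences
      R, t = np.eye(3), np.zeros(3)
      for _ in range(iters):
          moved = src @ R.T + t
          _, idx = tree.query(moved)
          matched = target[idx]
          # Best rigid transform from moved to matched via SVD (Kabsch).
          mu_s, mu_m = moved.mean(axis=0), matched.mean(axis=0)
          U, _, Vt = np.linalg.svd((moved - mu_s).T @ (matched - mu_m))
          dR = Vt.T @ U.T
          if np.linalg.det(dR) < 0:   # guard against reflections
              Vt[-1] *= -1
              dR = Vt.T @ U.T
          dt = mu_m - dR @ mu_s
          R, t = dR @ R, dR @ t + dt  # compose with the running estimate
      return R, t

  # Usage sketch: recover a known 10-degree rotation plus a small shift.
  rng = np.random.default_rng(0)
  target = rng.uniform(-1, 1, (500, 3))
  a = np.deg2rad(10)
  Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]])
  source = target @ Rz.T + np.array([0.1, -0.2, 0.0])
  R, t = icp(source, target)
  print(np.round(R @ Rz, 3))          # ~identity: R inverts the applied Rz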

Lastly, the paper discusses the results of some experiments, which show very promising results in building a map of the environment.

  1. Z. Kowalczuk and T. Merta, "Modelling an accelerometer for robot position estimation," 2014 19th International Conference on Methods and Models in Automation and Robotics (MMAR), Miedzyzdroje, Poland, 2014, pp. 909-914, doi: 10.1109/MMAR.2014.6957478.
  2. Woodman, O. J. (2007). An introduction to inertial navigation (No. UCAM-CL-TR-696). University of Cambridge, Computer Laboratory.
  3. T. Lee, J. Shin and D. Cho, "Position estimation for mobile robot using in-plane 3-axis IMU and active beacon," 2009 IEEE International Symposium on Industrial Electronics, Seoul, Korea (South), 2009, pp. 1956-1961, doi: 10.1109/ISIE.2009.5214363.
  4. Mapping and localization module in a mobile robot for insulating building crawl spaces. (n.d.). ScienceDirect. https://www.sciencedirect.com/science/article/pii/S0926580517306726