Firefly Eindhoven - Remaining Sensors
IMU
The Inertial Measurement Unit (IMU) is a sensor board on the drone which is used to measure its orientation. The IMU consists of three sensors:
- Accelerometer
- Gyroscope
- Magnetometer
The accelerometer measures the acceleration of the drone, from which its orientation can be estimated using the direction of the gravity vector. The gyroscope measures the angular velocity, from which the angular displacements can be calculated by integration. Although these are standard sensors for UAVs, each has its limitations. The accelerometer is very sensitive to external forces: any force other than gravity degrades the accuracy of the orientation estimate. The gyroscope measurements are usually quite accurate after calibration, but as time progresses the angular displacement estimate becomes increasingly inaccurate, because the integration accumulates measurement errors and drift.
These limitations can, however, be mitigated by the use of sensor fusion algorithms.
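A classic example of such a fusion algorithm is a complementary filter, which blends the drift-free but noisy tilt estimate from the accelerometer with the smooth but drifting integrated gyroscope rate. The sketch below is purely illustrative and not necessarily the algorithm running on the drone; the sample time and blending gain are assumed values.

```python
import numpy as np

def complementary_filter(gyro_rate, accel, roll_prev, dt=0.01, alpha=0.98):
    """Illustrative single-axis complementary filter (roll).

    gyro_rate : measured roll rate [rad/s]
    accel     : (ay, az) accelerometer components [m/s^2]
    roll_prev : previous roll estimate [rad]
    dt, alpha : assumed sample time and blending gain
    """
    # Gyroscope path: integrate the angular rate (accurate short-term, drifts long-term).
    roll_gyro = roll_prev + gyro_rate * dt
    # Accelerometer path: tilt from the gravity direction (noisy short-term, no drift).
    ay, az = accel
    roll_accel = np.arctan2(ay, az)
    # Blend: high-pass the gyro path, low-pass the accelerometer path.
    return alpha * roll_gyro + (1.0 - alpha) * roll_accel
```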
Lidar
The lidar is used for height estimation. LIDAR is an abbreviation of 'Laser Imaging, Detection And Ranging'. The sensor is located on the bottom of the drone and emits light perpendicular to the plane of the drone. This light travels down, reflects off the ground and travels back to the sensor. The sensor measures the time between sending and receiving the light signal; since the speed of light is known, the distance to the ground follows directly. Note that the LIDAR does not take the roll and pitch of the drone into account: if the drone is tilted slightly, the distance the light travels before it hits the ground is larger, and the height estimate will be too high. With some simple trigonometry, one can compensate for this discrepancy.
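A minimal sketch of this trigonometric compensation is given below, assuming the beam points along the body z-axis and that the roll and pitch angles are available from the IMU (the function name is illustrative).

```python
import math

def lidar_height(measured_range, roll, pitch):
    """Correct the raw LIDAR range for roll and pitch (angles in radians).

    With the beam fixed along the body z-axis, the vertical component of a
    beam of length `measured_range` is measured_range * cos(roll) * cos(pitch).
    """
    return measured_range * math.cos(roll) * math.cos(pitch)

# Example: a 1.00 m reading at 10 degrees of pitch corresponds to ~0.985 m of height.
print(lidar_height(1.00, 0.0, math.radians(10)))
```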
Optical flow
Optical flow refers to the estimation of the apparent velocities of objects in an image. It is a 2D vector field in which each vector is a displacement vector showing the movement of points from one frame to the next. By estimating the flow of points between frames, the velocity of the moving camera can be calculated. This velocity estimate allows the use of more advanced control schemes for the drone, such as LQR: simple controllers such as PID differentiate the position to damp the system, which amplifies noise within the control loop, whereas a technique like LQR uses the velocities directly and does not have this problem.
Working Principle
If [math]\displaystyle{ I(x,y,t) }[/math] is the intensity of a pixel in an image, then after a time [math]\displaystyle{ dt }[/math], during which the pixel moves by [math]\displaystyle{ dx }[/math] and [math]\displaystyle{ dy }[/math], the assumption that the pixel intensity stays constant gives:
[math]\displaystyle{ I(x,y,t) = I(x+dx, y+dy, t+dt) }[/math]
Expanding the right-hand side in a Taylor series, neglecting the higher-order terms and dividing by [math]\displaystyle{ dt }[/math], one obtains
[math]\displaystyle{ \frac{\partial I}{\partial x} \frac{\partial x}{\partial t} + \frac{\partial I}{\partial y} \frac{\partial y}{\partial t} + \frac{\partial I}{\partial t} = 0 }[/math]
In the above equation, the spatial derivatives of the intensity are referred to as image gradients, and the equation itself is termed the optical flow equation. The time derivatives of the pixel position [math]\displaystyle{ x, y }[/math] are the unknowns that determine the optical flow. Since this is a single equation in two unknowns, additional constraints are needed to solve it.
There are multiple algorithms for solving this problem, the most popular ones being:
- Lucas-Kanade method (a brief sketch is given after this list)
- Phase correlation
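As a minimal illustration of the Lucas-Kanade idea, the sketch below assumes the flow is constant within a small window around a pixel and solves the resulting overdetermined set of optical flow equations by least squares. It is a bare numpy sketch under those assumptions (window size, gradient computation), not the implementation of any particular library.

```python
import numpy as np

def lucas_kanade_window(frame0, frame1, x, y, win=7):
    """Estimate the flow (vx, vy) in a (win x win) window centred at pixel (x, y).

    frame0, frame1 : consecutive grayscale frames as float arrays
    """
    # Spatial gradients from the first frame, temporal gradient from the difference.
    Iy, Ix = np.gradient(frame0)          # np.gradient returns d/drow, d/dcol
    It = frame1 - frame0
    r = win // 2
    sl = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
    # Stack the optical flow equation Ix*vx + Iy*vy = -It for every pixel in the window.
    A = np.column_stack((Ix[sl].ravel(), Iy[sl].ravel()))
    b = -It[sl].ravel()
    # Least-squares solution of the overdetermined system.
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                              # (vx, vy) in pixels per frame
```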
Open MV Camera
In order to perform optical flow estimation, the team decided to use the Open MV Camera. The Open MV Camera can be programmed via the Open MV IDE. The language used in the IDE is Python with some additional libraries that are meant only for Open MV. In addition, Open MV provides multiple examples which can directly be used to solve the problem.
Optical flow on the Open MV can be obtained with the 'displacement' function in the Open MV IDE. This function uses the phase correlation method to estimate the optical flow. However, due to the limited computational capacity of the on-chip processor, the algorithm first reduces the resolution of the captured images and then performs the velocity estimation.
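For reference, the phase correlation principle that the 'displacement' function is based on can be sketched in a few lines of numpy: a translation between two frames appears as a phase difference in the frequency domain, and the peak of the inverse transform of the normalised cross-power spectrum gives the shift. This is only an illustration of the principle, not the Open MV implementation.

```python
import numpy as np

def phase_correlation(frame0, frame1):
    """Estimate the integer-pixel shift of frame1 relative to frame0."""
    F0 = np.fft.fft2(frame0)
    F1 = np.fft.fft2(frame1)
    # Cross-power spectrum: magnitudes cancel, only the phase difference remains.
    cross = np.conj(F0) * F1
    cross /= np.abs(cross) + 1e-12          # avoid division by zero
    corr = np.abs(np.fft.ifft2(cross))
    # A peak at index (dy, dx) means frame1 is shifted by (dx, dy) pixels w.r.t. frame0.
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper/right half back to negative shifts.
    if dy > frame0.shape[0] // 2:
        dy -= frame0.shape[0]
    if dx > frame0.shape[1] // 2:
        dx -= frame0.shape[1]
    return dx, dy                           # shift in pixels per frame
```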
In order to estimate the velocities of the drone, it was decided that a few LEDs would be placed on the ground; by measuring the optical flow of the LEDs, the velocities of the drone could be determined. However, as the camera quality of the Open MV was very limited, and this was deteriorated further by the algorithm, we decided to modify the existing code.
In order to reduce the computational load, we used blob detection to detect the LEDs and performed the phase correlation only on the detected region, limiting the number of pixels over which the phase correlation was performed. A rough sketch of this idea is given below.
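The fragment below is a rough, illustrative MicroPython-style sketch of that idea; for simplicity it tracks the average centroid of the detected LED blobs between frames instead of running phase correlation on a cropped region, so it is not the team's actual code. The sensor setup and find_blobs call follow the pattern of the standard Open MV examples, and the brightness threshold is a placeholder.

```python
import sensor, time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)       # small frames to keep the load low
sensor.skip_frames(time=2000)

LED_THRESHOLD = [(220, 255)]             # assumed grayscale range for bright LEDs
clock = time.clock()
prev_cx, prev_cy = None, None

while True:
    clock.tick()
    img = sensor.snapshot()
    blobs = img.find_blobs(LED_THRESHOLD, pixels_threshold=4, area_threshold=4)
    if blobs:
        # Average centroid of the detected LEDs in this frame.
        cx = sum(b.cx() for b in blobs) / len(blobs)
        cy = sum(b.cy() for b in blobs) / len(blobs)
        if prev_cx is not None:
            # Displacement in pixels per frame; multiply by clock.fps() for pixels per second.
            dx, dy = cx - prev_cx, cy - prev_cy
            print("flow (px/frame):", dx, dy)
        prev_cx, prev_cy = cx, cy
```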
However, after multiple tests and trials it was determined that the hardware of the Open MV camera would only make optical flow possible under very specific and static conditions, whereas the conditions during the show are very dynamic. The team therefore decided to drop the idea for the moment.
Sensor fusion
In order to overcome the limitations of each individual sensor, sensor fusion algorithms can be utilized. As the orientation of the drone is already estimated in the low-level software by Avular, it was decided to estimate only the position and velocity of the drone in the top-level software, using the low-level estimate of the orientation.
Proposed filter
The proposed observer estimates the position and velocity of the drone along the three axes of the world frame, i.e. the state is chosen as
[math]\displaystyle{ \vec{x} = (x, \dot{x}, y, \dot{y}, z, \dot{z})^T }[/math]
It is an instationary, single-stage Kalman filter: all information gets fused in a single update step, but the Kalman gain is not predefined and is instead calculated from the current covariances. This is needed because the rates of the different sensors are not the same and the covariance of the sensors is a function of the current state; thus the covariance of the measurements and of the current state estimate are not constant, which introduces the need for a dynamic Kalman gain.
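As an illustration of what the dynamic Kalman gain amounts to, the sketch below shows a generic linear predict and update step in which the gain is recomputed from the current covariances every time a measurement arrives. The notation is the standard Kalman filter notation; it is not taken from the actual filter code.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Propagate the state estimate and its covariance with the process model."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Fuse a measurement z with model z = H x + noise of covariance R."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain, recomputed every step
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```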
Process model
The chosen process model is a constant velocity model, as the accelerations of the drone are small enough that the velocity can be regarded as constant between two consecutive time steps of the filter. The covariance of the process model is a white noise acceleration model, which accounts for the unmodelled accelerations of the drone.
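For one axis (position and velocity), a constant velocity model with a white noise acceleration covariance takes the form below; the full six-state model is block diagonal with one such block per axis. The time step and noise intensity used here are placeholder tuning values, not the values used in the actual filter.

```python
import numpy as np

def cv_model_1axis(T, q):
    """Constant velocity process model for one axis, state (position, velocity).

    T : filter time step [s]
    q : white noise acceleration intensity (tuning parameter)
    """
    F = np.array([[1.0, T],
                  [0.0, 1.0]])
    # Discrete white noise acceleration covariance.
    Q = q * np.array([[T**3 / 3.0, T**2 / 2.0],
                      [T**2 / 2.0, T]])
    return F, Q

# Full 6-state model (x, x_dot, y, y_dot, z, z_dot): one block per axis.
F1, Q1 = cv_model_1axis(T=0.02, q=1.0)
F = np.kron(np.eye(3), F1)
Q = np.kron(np.eye(3), Q1)
```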
Sensors
The following sensors are used in the filter:
- UWB system: this system currently provides the x and y position of the drone based on the measured ranges to the stationary beacons. The system was first extended to also provide the two solutions for the z position by adding another equation to the closed form triangulation algorithm. The covariance of this sensor is taken to be the variance of the measured position during a static measurement.
- LIDAR: the LIDAR measures the distance to the ground along the direction the sensor is pointing. Therefore, the orientation of the drone is needed to predict this measurement. The LIDAR has a resolution of 1 cm, i.e. the discrete character of the sensor is noticeable during flight. The noise level of the LIDAR lies within this 1 cm, so the covariance cannot be obtained from a static measurement. For the first iteration of the filter, the sensor is treated as if it were continuous (i.e. just a linear prediction model) and the 3[math]\displaystyle{ \sigma }[/math] bounds are taken to be 1 cm, as no noise is observed during a static measurement. This indicates that all possible noise lies within the 1 cm resolution, so setting the 99.7% confidence interval to 1 cm is a reasonable first-order approximation.
- Barometer: the barometer measures the height of the drone, but it is extremely noisy and suffers from drift, especially when the height of the drone exceeds approximately 50 cm. To this end, the filter dynamically estimates the offset of the barometer and hardly uses the barometer above 50 cm.
- Ultrasound and top camera: the filter offers the possibility to include estimates from the top camera and ultrasound systems by dynamically building the measurement covariance matrices and innovation vector based on flags that indicate whether a new measurement is available for each sensor (a sketch of this flag-based stacking is given after this list). However, these sensors are currently not used in the filter: they measure the position of the drone, but might have a position-dependent offset with respect to the UWB. Simply incorporating these measurements together with the UWB would cause these offsets to be labelled as noise, decreasing the quality of the position estimate. In order to properly include these measurements, the offsets should be investigated and included in the prediction model of each sensor.
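To illustrate the flag-based construction of the measurement update mentioned above, the sketch below stacks the measurement vectors, models and covariances of only those sensors that delivered new data in the current step. The sensor bookkeeping (a list of dictionaries) is an illustrative assumption, not the structure of the actual filter code.

```python
import numpy as np

def build_measurement(sensors):
    """Stack z, H and R for all sensors that have a new measurement this step.

    sensors : list of dicts with keys 'new' (bool), 'z', 'H', 'R'
    """
    active = [s for s in sensors if s['new']]
    if not active:
        return None                      # no new data: skip the update step entirely
    z = np.concatenate([s['z'] for s in active])
    H = np.vstack([s['H'] for s in active])
    # Measurements of different sensors are assumed independent, so R is block diagonal.
    sizes = [s['R'].shape[0] for s in active]
    R = np.zeros((sum(sizes), sum(sizes)))
    i = 0
    for s, n in zip(active, sizes):
        R[i:i+n, i:i+n] = s['R']
        i += n
    return z, H, R
```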
Results
The filter is currently being implemented and tested, so no results can be shown at this point in time.