Firefly Eindhoven - Remaining Sensors
IMU
Lidar
Optical flow
Optical flow refers to the estimation of the apparent velocities of objects in an image. The optical flow between two consecutive frames is a 2D vector field in which each vector is a displacement vector describing the movement of a point from the first frame to the second. By estimating the flow of points between frames, the velocity of the moving camera itself can be calculated. A direct velocity estimate allows the use of more advanced control schemes for the drone, such as LQR: a simple controller such as a PID differentiates the position to estimate the velocity, which is inaccurate, whereas an LQR controller can use the measured velocities directly and provide much better control.
Working Principle
If [math]\displaystyle{ I(x,y,t) }[/math] is the intensity of a pixel in an image, and after a time [math]\displaystyle{ dt }[/math] that pixel has moved a distance [math]\displaystyle{ dx }[/math] and [math]\displaystyle{ dy }[/math], then, since the intensity of the pixel is assumed to stay constant, it can be said that
[math]\displaystyle{ I(x,y,t) = I(x+dx, y+dy, t+dt) }[/math]
Using a first-order Taylor series expansion, it is possible to write
[math]\displaystyle{ \frac{\partial I}{\partial x} \frac{\partial x}{\partial t} + \frac{\partial I}{\partial y} \frac{\partial y}{\partial t} + \frac{\partial I}{\partial t} = 0 }[/math]
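Written out, the first-order expansion of the right-hand side is
[math]\displaystyle{ I(x+dx, y+dy, t+dt) \approx I(x,y,t) + \frac{\partial I}{\partial x}dx + \frac{\partial I}{\partial y}dy + \frac{\partial I}{\partial t}dt }[/math]
and cancelling [math]\displaystyle{ I(x,y,t) }[/math] on both sides and dividing by [math]\displaystyle{ dt }[/math] gives the equation above.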
In the above equation, the spatial derivatives of the intensity are referred to as the image gradients, and the equation itself is termed the optical flow equation. The time derivatives of the pixel position [math]\displaystyle{ x, y }[/math] are the unknowns that determine the optical flow.
Since this single equation cannot determine both unknowns on its own, additional assumptions are needed. There are multiple algorithms to solve this problem, the most popular ones being:
- the Lucas-Kanade method
- phase correlation (a sketch is given below)
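As an illustration of the second approach, the following is a minimal sketch of phase correlation between two grayscale frames using NumPy. It is an offline illustration of the principle only, not the code that runs on the drone; the frame arrays are placeholders.
<syntaxhighlight lang="python">
import numpy as np

def phase_correlation(frame_a, frame_b):
    """Estimate the (dy, dx) shift of frame_b relative to frame_a.

    Both inputs are 2D NumPy arrays of the same shape. The normalised
    cross-power spectrum keeps only the phase difference between the two
    frames; its inverse FFT is a sharp peak at the displacement.
    """
    F_a = np.fft.fft2(frame_a)
    F_b = np.fft.fft2(frame_b)
    cross_power = F_b * np.conj(F_a)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Shifts larger than half the frame size wrap around; map them to negative values.
    dims = np.array(frame_a.shape)
    shift = np.array(peak, dtype=float)
    shift[shift > dims / 2] -= dims[shift > dims / 2]
    return shift   # (dy, dx) in pixels
</syntaxhighlight>
Dividing the resulting pixel shift by the time between frames, and scaling it with the height above the ground and the camera optics, yields a velocity estimate.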
Open MV Camera
To perform the optical flow estimation, the team decided to use the Open MV camera. The camera is programmed via the Open MV IDE; the language used is Python (MicroPython) with some additional libraries that are specific to Open MV. In addition, Open MV provides multiple examples that can be used directly as a starting point.
Optical flow on the Open MV can be obtained with the 'displacement' function available in the Open MV IDE examples. This function makes use of the phase correlation method to estimate the optical flow. However, due to the limited computational capacity of the on-chip processor, the algorithm first reduces the resolution of the captured images and then performs the displacement estimation.
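A minimal sketch of such a displacement-based flow loop on the Open MV is given below, based on the find_displacement() phase-correlation call of the Open MV image library. The resolution, the response threshold and the conversion from pixels to a physical velocity are assumptions that would have to be tuned for the actual setup.
<syntaxhighlight lang="python">
import sensor, image, time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)     # phase correlation works on grayscale images
sensor.set_framesize(sensor.B64X64)        # low resolution to fit the on-chip processor
sensor.skip_frames(time=2000)
clock = time.clock()

# Keep the previous frame in an extra frame buffer as the reference image.
prev_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)
prev_fb.replace(sensor.snapshot())

while True:
    clock.tick()
    img = sensor.snapshot()
    # Phase correlation between the current frame and the previous one.
    d = img.find_displacement(prev_fb)
    if d.response() > 0.1:                 # a low response means the match is unreliable
        dx = d.x_translation()             # displacement in pixels per frame
        dy = d.y_translation()
        fps = clock.fps()
        # Pixel velocity; converting to m/s requires the flight height and camera intrinsics.
        print("vx: %0.2f px/s  vy: %0.2f px/s" % (dx * fps, dy * fps))
    prev_fb.replace(img)
</syntaxhighlight>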
To estimate the velocities of the drone, it was decided to place a few LEDs on the ground; by measuring the optical flow of the LEDs, the velocities of the drone could be determined. However, as the image quality of the Open MV camera was already limited and was degraded further by the algorithm's downscaling, we decided to modify the existing code.
To reduce the computational load, we added a blob detection step that first detects the LEDs and then performs the phase correlation only on the region around them, which limits the number of pixels over which the phase correlation has to be performed. A sketch of this idea is given below.
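The following sketch illustrates the combination of blob detection with a region-restricted phase correlation on the Open MV. The LED brightness threshold, the region size and the find_blobs parameters are placeholder values, not the exact ones used on the drone.
<syntaxhighlight lang="python">
import sensor, image, time

LED_THRESHOLD = [(220, 255)]   # hypothetical grayscale threshold for bright LEDs
ROI_SIZE = 32                  # side length of the square region around the LED

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.B128X128)
sensor.skip_frames(time=2000)

prev_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)
prev_fb.replace(sensor.snapshot())

while True:
    img = sensor.snapshot()
    # Find bright blobs (the LEDs) so the correlation can be restricted to them.
    blobs = img.find_blobs(LED_THRESHOLD, pixels_threshold=4, area_threshold=4)
    if blobs:
        b = max(blobs, key=lambda blob: blob.pixels())
        # Square region of interest centred on the largest blob, clamped to the frame.
        x = max(0, min(b.cx() - ROI_SIZE // 2, img.width() - ROI_SIZE))
        y = max(0, min(b.cy() - ROI_SIZE // 2, img.height() - ROI_SIZE))
        roi = (x, y, ROI_SIZE, ROI_SIZE)
        # Phase correlation only over the ROI instead of the full frame.
        d = img.find_displacement(prev_fb, roi=roi, template_roi=roi)
        if d.response() > 0.1:
            print("dx: %0.2f px  dy: %0.2f px" % (d.x_translation(), d.y_translation()))
    prev_fb.replace(img)
</syntaxhighlight>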