__TOC__
=Introduction=
[[File:CF_on_laptop.jpg|right|300px|]]
This wiki is the documentation of the work delivered by the AutoRef Honors students from the High Tech Systems track in the academic year 2019/2020.

The goal of this project is to develop an autonomous robot capable of refereeing a robot football match. Such a system can be of benefit by eliminating human error and by using factual data from more sources to make better-evaluated decisions. For this project we decided to use a drone (quadcopter) as the robot. Most drones are fast and can change altitude easily; this agility enables a drone to quickly move to a position with a good view, which is a large advantage for a referee. Robot soccer is played on fields of different sizes, and a system using a drone scales with the field, whereas, for instance, a camera on a rail next to the field would not. A drone is also easy to carry around and is a small object to work on.

We made the project goal more specific into the following:

''''' "Autonomously assist a football referee in a 2 versus 2 robot soccer match using a drone by enforcing three main rules: out of bounds, free kick, and goal."'''''

Since hardware testing was no longer possible after the university closed, we had to adapt and move the system to a simulation environment. The hardware work done up to that point and the simulation are both documented below.
|
=Thought Process=

* Research and selection of a drone
* Decision to use a relative positioning system instead of an absolute one
* Control test and hover mode using the Flow deck
* Motion commands through a Python script
* Ball detection with Python and OpenCV, tested using a digital phone camera and a Wi-Fi connection
* Test with two analog FPV cameras and radio transmission: image flicker due to noise made the color-based ball detection unreliable
* Decision to use a Raspberry Pi with a digital camera for image capture and processing, with a Wi-Fi connection to a PC running the Bitcraze client
* Drone upscale: issues with ESCs and controller tuning
* Transfer to CoppeliaSim due to quarantine: drone control plus ball and line detection implemented and tested in a real game scenario
|
== Hardware ==

The hardware prototype was built around the Bitcraze Crazyflie platform and the Crazyflie Bolt, with the Flow deck providing the relative position measurements used for the hover and control tests. Upscaling the drone to a larger frame required re-tuning of the PID controllers and ran into issues with the ESCs; this work was not finished before the switch to simulation.
|
|
= Software Architecture =

Since hardware testing was no longer an option, we had to adapt how we executed the project. Because we still want to be able to reuse the work done, the system was built in a modular fashion. This way many system components can be kept unchanged when we go back and implement it with real hardware: only the parts that model the quadcopter and the video stream of the camera have to be swapped for the real drone and camera. This is one of the reasons we chose to implement the communication of our system using ROS (the Robot Operating System). ROS helps to make a system modular in the sense that each (sub)component of the system can be kept and run as its own 'node'. When two nodes want to communicate with each other, e.g. a sensor and an actuator, the actuator ''subscribes'' to a 'topic' to which the sensor ''publishes'' its data. Using this approach, when one component, or rather one node, must be updated, the rest of the system is left unchanged. Additionally, ROS offers huge open-source libraries that can be used to quickly prototype an implementation, so you don't have to spend a lot of time ''reinventing the wheel''.
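As a minimal sketch of this publish/subscribe pattern (node, topic, and message names here are purely illustrative, not the ones used in the final system), a sensor-side and an actuator-side node written with rospy could look as follows. Each node would normally live in its own script and process.

<syntaxhighlight lang="python">
#!/usr/bin/env python
# Minimal ROS publish/subscribe sketch; topic and node names are illustrative only.
import rospy
from geometry_msgs.msg import Point


def sensor_node():
    """Publishes a (dummy) measurement on the 'sensor_data' topic at 10 Hz."""
    rospy.init_node('sensor')
    pub = rospy.Publisher('sensor_data', Point, queue_size=10)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        pub.publish(Point(x=1.0, y=2.0, z=0.0))  # a real sensor would publish its reading here
        rate.sleep()


def actuator_callback(msg):
    """Runs every time the sensor publishes; the actuator reacts to the new data."""
    rospy.loginfo("received position: (%.1f, %.1f)", msg.x, msg.y)


def actuator_node():
    rospy.init_node('actuator')
    rospy.Subscriber('sensor_data', Point, actuator_callback)
    rospy.spin()  # keep the node alive and process incoming messages
</syntaxhighlight>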
The software architecture of the designed system is shown in the figure below.

<center>[[File:Honors architecture sim.png|750 px|system]]</center>
As can be seen in the figure above, the system consists of four main subsystems: the drone itself, the camera on the drone, the external ROS nodes running on a separate PC, and a referee monitoring system.

The drone starts by hovering in the air. It does this using its inertial measurement unit (IMU) and some additional sensors (for the real-life implementation these include an optical flow sensor and a barometer; more about this can be found in the section about the hardware implementation). These sensors continuously send information about the drone's pose to the flight controller. Initially, the flight controller only uses this information to stabilize the drone into hovering mode by sending thrust commands to the four independent motors of the drone.

The camera that is attached to the drone transmits a video stream. This video stream is picked up by the nodes running on the external PC.

On the PC the video stream is processed so that ball and line positions can be extracted. The position and radius of the ball within the video stream (i.e. relative to the drone) are sent together with the detected lines to the referee's system.

On the referee's system, this information is presented as a processed video stream accompanied by an interpretation of what the stream displays, e.g. the ball is at a specific location and is crossing a line.

The ball position and radius with respect to the camera are also sent back to the drone. There they are received by the node that determines, based on the ball position, how to orient the drone. It does this by interpreting the information and deciding how to move the drone to get a better perspective of what is happening. This algorithm then gives velocity setpoints (to start rolling, pitching, yawing, or a combination thereof) to the flight controller, which takes care of executing them.
= Vision =

== Ball tracking ==

To track the ball from the drone camera, an existing Python script was found and adapted. To track the ball, a threshold in HSV color space is needed; this threshold represents the range of colors that the ball can show on camera under different lighting conditions.

The code first grabs a frame from the video, resizes it to a standard size, and applies a blur filter to create a less detailed image. Then every part of the image that does not have the color of the ball is made black, and more filters are applied to remove small regions in the image with the same color as the ball. A few functions of the cv2 package are then used to encircle the largest remaining region with the color of the ball, after which the center and radius of this circle are extracted.
A problem we ran into was that the ball was sometimes detected at an incorrect position due to color flickering. We therefore added a feature that only looks for the ball in the region of the image where it was detected in the previous frame.

More on the original code can be found at https://www.pyimagesearch.com/2015/09/14/ball-tracking-with-opencv/ (Rosebrock, 2015).
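The core of this pipeline can be sketched as follows. The HSV bounds, frame size, and video source are placeholders that would have to be tuned for the actual ball and camera; the real script additionally restricts the search to the region around the previous detection.

<syntaxhighlight lang="python">
# Sketch of the color-based ball detector; HSV bounds and video source are placeholders.
import cv2
import numpy as np

LOWER_HSV = np.array([20, 100, 100])   # lower bound for a yellow ball (tune for lighting)
UPPER_HSV = np.array([35, 255, 255])   # upper bound


def detect_ball(frame):
    """Return (x, y, radius) in pixels of the largest ball-colored region, or None."""
    frame = cv2.resize(frame, (600, 450))             # bring the frame to a standard size
    blurred = cv2.GaussianBlur(frame, (11, 11), 0)    # blur to reduce detail and noise
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)     # everything that is not ball-colored -> black
    mask = cv2.erode(mask, None, iterations=2)        # remove small ball-colored specks
    mask = cv2.dilate(mask, None, iterations=2)
    cnts = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnts = cnts[0] if len(cnts) == 2 else cnts[1]     # OpenCV 3/4 compatibility
    if not cnts:
        return None
    largest = max(cnts, key=cv2.contourArea)          # largest region with the ball color
    (x, y), radius = cv2.minEnclosingCircle(largest)
    return x, y, radius


cap = cv2.VideoCapture(0)  # placeholder source: a webcam instead of the drone camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = detect_ball(frame)
    if result is not None:
        print("ball at (%.0f, %.0f), radius %.0f px" % result)
</syntaxhighlight>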
== Line detection ==

The field lines are detected in the same camera frames using Hough transforms; the node that performs this in the simulation (''line_detector'') is described in the Simulation section below.
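The sketch below shows a typical OpenCV implementation of this step: Canny edge detection followed by a probabilistic Hough transform. The threshold and length parameters are illustrative, as the exact values used in the project are not documented here.

<syntaxhighlight lang="python">
# Sketch of Hough-transform line detection; parameter values are illustrative.
import cv2
import numpy as np


def detect_lines(frame):
    """Return a list of line segments (x1, y1, x2, y2) found in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                  # binary edge image
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=50, minLineLength=40, maxLineGap=10)
    if segments is None:
        return []
    return [tuple(seg[0]) for seg in segments]
</syntaxhighlight>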
== Referee ==

The referee component makes decisions and presents information based on the output of the vision part, e.g. the ball is out of bounds, the robots are bumping into each other, or a goal has been scored.
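As an illustration of the kind of rule logic this component could contain, a hypothetical out-of-bounds check is sketched below. This is not the implemented referee; it assumes the ball position has already been converted from image coordinates to field coordinates, and the field dimensions are made up.

<syntaxhighlight lang="python">
# Hypothetical sketch of a single referee rule; field dimensions are made-up values.

FIELD_X = (-6.0, 6.0)   # assumed field boundaries in metres
FIELD_Y = (-4.0, 4.0)


def ball_out_of_bounds(ball_x, ball_y):
    """Return True if the estimated ball position lies outside the field."""
    return not (FIELD_X[0] <= ball_x <= FIELD_X[1] and
                FIELD_Y[0] <= ball_y <= FIELD_Y[1])


if ball_out_of_bounds(6.3, 1.2):
    print("Referee call: ball out of bounds")
</syntaxhighlight>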
= Simulation =

The simulation environment chosen is CoppeliaSim (previously known as V-REP). This environment has an intuitive API and works well together with ROS. In CoppeliaSim each object (e.g. a drone or a camera) can have its own (child) script, which can communicate with ROS by subscribing and publishing to topics. The overall architecture can be displayed with the ROS command ''rqt_graph'' while the system is running; the output of this command is shown in the following figure.
== Implementation ==

<center>[[File:Honors drone rqt graph.png|750 px|system]]</center>

The following text briefly explains what everything in the figure means in terms of the software architecture. ''sim_ros_interface'' is the node that is created by the simulator itself and serves as the communication path between the simulator scripts and the rest of the system. What cannot be seen in this figure are the individual object scripts within the simulator.
A script that belongs to a camera object publishes camera footage from the camera mounted on the drone to the topic ''rawDroneCam''. The ''ball_detector'' node, which runs outside the simulator in a separate Python file, subscribes to ''rawDroneCam'' and thereby receives the image from the simulator. It then extracts the relative ball position and size (in pixels) from each frame and stores them in a message object. This message is published to the ''ballPos'' topic. At the same time, the ''line_detector'' node also subscribes to the drone's camera feed and finds the lines in the image using Hough transforms. The simulator node (actually a script belonging to the flight controller object) subscribes to the ball position topic, obtains the position of the ball relative to the drone, and decides what to do with this information, i.e. move in an appropriate manner. Both the line and ball detection also provide an image of their findings to the node ''merge_cam''. Within this node, these two images are merged into one image that displays both the lines found and the ball. This loop runs at approximately 30 Hz.
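A stripped-down sketch of such an external detector node is shown below. It assumes the image topic is ''rawDroneCam'' and that the ''detect_ball'' function from the ball tracking sketch is importable; the message type used for ''ballPos'' here (geometry_msgs/Point with x, y as the pixel position and z as the radius) is a placeholder, not necessarily the type used in the actual project files.

<syntaxhighlight lang="python">
#!/usr/bin/env python
# Sketch of an external ball_detector node: subscribe to the simulator camera,
# run the OpenCV detector, publish the relative ball position.
# The ballPos message type (geometry_msgs/Point) is a placeholder choice.
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import Point
from cv_bridge import CvBridge

from ball_tracking import detect_ball  # hypothetical module holding the detector sketched above

bridge = CvBridge()
ball_pub = None


def image_callback(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')  # ROS image -> OpenCV image
    result = detect_ball(frame)
    if result is not None:
        x, y, radius = result
        ball_pub.publish(Point(x=x, y=y, z=radius))


if __name__ == '__main__':
    rospy.init_node('ball_detector')
    ball_pub = rospy.Publisher('ballPos', Point, queue_size=1)
    rospy.Subscriber('rawDroneCam', Image, image_callback)
    rospy.spin()
</syntaxhighlight>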
== Control of Drone in Simulator ==

The drone used in the simulation is CoppeliaSim's built-in model 'Quadcopter.ttm', which can be found under ''robots->mobile''. This model comes with a script that takes care of stabilization and drift. The object script takes the drone's absolute pose and the absolute pose of a 'target' object and tries to make the drone follow the target using a PID control loop that actuates the four motors of the drone. In our actual hardware system we do not want to use an absolute pose system to follow an object, so we do not use the absolute pose of the drone for path planning in the simulation either. However, this ability of the model is used to stabilize the drone, since on the real hardware an optical flow sensor performs this task.
When the drone receives a message from the /ballPos topic (containing the position of the ball relative to the drone), it actuates in the following simple way. The y position of the ball in the 2D camera image determines whether the ball is too far away or too close; in that case the two front or rear motors spin harder until the ball is in the middle of the screen again, which is a pitching motion. Whenever the x position of the ball is too high or too low, two options are considered to get the ball back in the middle: the drone could either roll (increase thrust for the lateral motors) or yaw (increase thrust for the diagonal motors). A combination of these is used, so that the drone both rolls and yaws. With proper tuning of the amount of roll and yaw, the drone behaves somewhat like a human referee in the sense that it tries to minimize the distance to fly while keeping the camera stable.
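This mapping from the ball position in the image to motion commands can be sketched as a simple proportional controller. The gains, image size, and roll/yaw mix below are placeholder values, not the tuning used in the simulator script.

<syntaxhighlight lang="python">
# Proportional sketch of the ball-following behaviour; gains and image size are placeholders.

IMG_W, IMG_H = 600, 450   # assumed camera resolution
K_PITCH = 0.002           # gain: vertical image error -> pitch setpoint
K_ROLL = 0.001            # gain: horizontal image error -> roll setpoint
K_YAW = 0.003             # gain: horizontal image error -> yaw setpoint


def ball_to_setpoints(ball_x, ball_y):
    """Convert the ball position in the image to (pitch, roll, yaw) velocity setpoints."""
    err_x = ball_x - IMG_W / 2.0   # > 0: ball is to the right of the image centre
    err_y = ball_y - IMG_H / 2.0   # > 0: ball is below the image centre, i.e. too close
    pitch = -K_PITCH * err_y       # pitch forwards/backwards to centre the ball vertically
    roll = K_ROLL * err_x          # translate sideways...
    yaw = K_YAW * err_x            # ...combined with yawing towards the ball
    return pitch, roll, yaw


print(ball_to_setpoints(450, 225))  # ball right of centre -> roll and yaw to the right
</syntaxhighlight>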
== Real game ball simulation ==

In order to check whether the drone would be able to follow the ball during a real game situation, the ball trajectory in the simulation was set to replicate the ball movement during the first 3 minutes of the final game of RoboCup 2019 in Sydney. The position of the ball was logged during the game by one of the Tech United turtles and was downsampled before importing it into the simulator. Although the trajectory of the ball in the simulation resembles that of the ball during the real game, certain quick ball movements cannot be seen in the simulation due to the downsampling. Nevertheless, it is a good measure of the functionality of the drone control in a real game situation.
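The downsampling itself is straightforward; the sketch below shows the idea, with the log file format and the downsampling factor as assumptions.

<syntaxhighlight lang="python">
# Sketch of downsampling a logged ball trajectory; file format and factor are assumptions.
import numpy as np

log = np.loadtxt('ball_log.csv', delimiter=',')   # rows of [t, x, y] (hypothetical format)
downsampled = log[::10]                           # keep every 10th sample
np.savetxt('ball_path_sim.csv', downsampled, delimiter=',')
</syntaxhighlight>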
|
== Tutorial Simulation ==

After following this tutorial the reader should be able to run a basic simulation of a drone following a yellow object (ball) on a soccer field. This tutorial assumes the reader is on a Linux machine, has installed ROS, and is familiar with its basic functionality. The next figure gives a sense of what is achieved after this tutorial (it can be clicked to show a video).

<center>[[File:Test_gif_drone.gif|center|780px|link=https://drive.google.com/file/d/1Xcl-WHoeJfAQmn44iASLafJr1F9Hzdv-/view?usp=sharing]]</center>

*Download and extract the downloadable zip file at the bottom of the page.
*Download CoppeliaSim (Edu): https://www.coppeliarobotics.com/downloads
*Install it and place the installation folder in a directory, e.g. /home/CoppeliaSim/
*Follow the ROS setup for the simulator: https://www.coppeliarobotics.com/helpFiles/en/ros1Tutorial.htm. It does not have to be followed completely, as long as ROS is configured for CoppeliaSim on your machine.
*Initialize ROS by opening a terminal (ctrl+alt+t) and typing ''roscore''.
*Open CoppeliaSim by opening a terminal in its folder (e.g. ''cd /home/CoppeliaSim'') and typing ''./coppeliaSim.sh''.
*In the simulator, go to ''file->open scene...'' and locate the ''follow_ball_on_path.ttt'' scene file from the extracted zip.
*Open the main script (the orange paper icon under 'scene hierarchy'), find the line ''camp = sim.launchExecutable('/PATH/TO/FILE/ball_detector.py')'' and fill in the path to the extracted zip. Do the same for the ''line_detector.py'' and ''merge_cam.py'' files.
*All is set; press the play button to start the simulation.
*A window should pop up with the raw camera feed as well as the processed camera feed, and the quadcopter should start to follow the ball!
|
=Future Work=

A large obstacle we faced during the project was that the university had to close due to the Covid-19 outbreak from March 2020 until the end of the project year. As a result, the team was not able to test hardware at the university or to work together physically. Considering these changes, the team decided to move the system to a simulation environment, and the work on the hardware has not been finalized. The main piece of future work is therefore to finish the hardware implementation, i.e. to port the detection and control that now run against the simulator back to the real drone and camera.
|
=Team=

This project was made by the following Honors students in the academic year 2019/2020:
*Alvaro Gonzalez
*Jake Rap
*Wolff Voss
|
=References=

Rosebrock, A. (2015). Ball Tracking with OpenCV - PyImageSearch. Retrieved 24 May 2020, from https://www.pyimagesearch.com/2015/09/14/ball-tracking-with-opencv/
|
=Downloads=

*Simulation Files: [[File:Honors_drone_v-rep_1105.zip]]
*Empty Robot Soccer Field for CoppeliaSim (V-REP): [[File:V-rep_soccerfield.zip]]