Drone Referee - MSD 2017/18

Introduction

Abstract

Being a billion-euro industry, the game of football is constantly evolving with the use of advancing technologies that improve not only the game but also the fan experience. Most football stadiums are outfitted with state-of-the-art camera technologies that provide previously unseen vantage points to audiences worldwide. However, football matches are still refereed by humans who make decisions based on their visual information alone. This can lead to incorrect decisions, which may strongly affect the outcome of a game. There is a need for supporting technologies that can improve the accuracy of referee decisions. Through this project, TU Eindhoven hopes to develop a system with intelligent technology that can monitor the game in real time and make fair decisions based on observed events. This project is a first step towards that goal.

In this project, a drone is used to monitor a football match, detect events and provide recommendations to a remote referee, who then makes decisions based on these recommendations. The match is played by the university's RoboCup robots, and the drone referee is developed for this environment as a proof of concept.

This project focuses on the design and development of a high-level system architecture and corresponding software modules on an existing quadrotor (drone). It builds upon the data and recommendations of the first two generations of Mechatronics Systems Design trainees, with the purpose of providing a proof-of-concept drone referee for a 2-against-2 robot-soccer match.

Background and Context

The Drone Referee project was introduced to the PDEng Mechatronics Systems Design team of 2015. That team was successful in demonstrating a proof-of-concept architecture, and the PDEng team of 2016 developed this further on an off-the-shelf drone. The challenge presented to the team of 2017 was to use the lessons of the previous teams to develop a drone referee using a new custom-made quadrotor. This drone was built and configured by a master's student, whose thesis was used as the baseline for this project.

The MSD 2017 team is made up of seven people with different technical and academic backgrounds. One project manager and two team leaders were appointed, and the remaining four team members were divided between the two team leaders. The team is organized as below:

Name                    Role             Contact
Siddharth Khalate       Project Manager  s.r.khalate@tue.nl
Mohamed Abdel-Alim      Team 1 Leader    m.a.a.h.alosta@tue.nl
Aditya Kamath           Team 1           a.kamath@tue.nl
Bahareh Aboutalebian    Team 1           b.aboutalebian@tue.nl
Sabyasachi Neogi        Team 2 Leader    s.neogi@tue.nl
Sahar Etedali           Team 2           s.etedalidehkordi@tue.nl
Mohammad Reza Homayoun  Team 2           m.r.homayoun@tue.nl


Problem Description

As mentioned above, the drone referee project was also carried out by the previous two generations of the PDEng MSD program. The ideas the previous generations implemented were:

  • Detection of the ball going out of the pitch
  • Detection of collisions between robots

However, these were demonstrated in a controlled simulation environment as proofs of concept rather than during a live game. This year, the stakeholders expect the drone to monitor a five-minute 2-against-2 robot-soccer match. A human referee at a remote location receives the drone's view of the game through a user interface, together with a set of recommendations for rule enforcement. The referee reviews the recommendations and the replays, and makes a final decision via the user interface. This decision is then displayed on a screen for the audience near the robot-soccer field; the LEDs on the drone also change color to indicate whether a rule is being enforced and, if so, which one. Finally, the game restarts from the center after the rule is enforced. The rules to be detected and enforced were:

  • Rule A: Free throw. When the ball goes out of the pitch, i.e. crosses one of the four lines delimiting the field, a free throw is awarded to the team that did not last touch the ball (a minimal boundary check for this rule is sketched below).
  • Rule B: Collision. When two robots on the pitch touch each other, it is considered a foul.
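
As an illustration of Rule A, once the ball position is known in a field-centred coordinate frame, the out-of-pitch test reduces to a boundary check. The sketch below uses placeholder field dimensions; the real values would come from the measured field geometry.

    # Assumed half-dimensions of the field in metres; placeholders, not
    # the measured geometry of the actual pitch.
    FIELD_HALF_LENGTH = 6.0
    FIELD_HALF_WIDTH = 4.0

    def ball_out_of_pitch(ball_x: float, ball_y: float) -> bool:
        """Rule A trigger: the ball has crossed one of the four lines
        delimiting the field (field-centred coordinates)."""
        return abs(ball_x) > FIELD_HALF_LENGTH or abs(ball_y) > FIELD_HALF_WIDTH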

System Objectives and Requirements

Project Scope

The scope of this project was refined during the design phase due to considerable hardware issues and time constraints. The focus shifted to proofs of concept for the individual components rather than end-to-end development of a final product, and the scope was narrowed down to the following deliverables.

  • System Architecture
    • Task-Skill-World model
    • DSM
  • Flight and Control
    • Manual and Semi-Autonomous Flight
    • Autonomous Flight
    • Hardware and Software Interfaces
  • Event Detection
    • Ball-out-of-pitch Detection
    • Collision detection
    • Hardware and Software Interfaces
  • Rule Enforcement and HMI
    • Supervisory Control
    • Referee/Audience GUI
    • Hardware and Software Interfaces
  • Demonstration Video and Final Presentation
  • Wiki page

System Architecture

Architecture Description and Methodology

Implemented System Architecture

Implementation

Flight and Control

Manual Flight

Drone Localization

Autonomous Flight

Trajectory Planning and Control

Event Detection and Enforcement

Ball Out Of Pitch

Collision Detection

Collisions between robots can be detected from the distance between them and the speeds at which they are traveling. The collision detection algorithm assumes prior knowledge of the position of each player robot. In this project, two methods of tracking the robot positions were studied and trialed: ArUco markers and the use of TechUnited's world model.

ArUco Markers

ArUco is a library for augmented reality (AR) applications based on OpenCV. The library provides dictionaries of uniquely numbered AR tags that can be printed and attached to any surface. With a sufficient number of tags whose positions are known, the library can determine the 3D position of the camera viewing them. In this application, the ArUco tags were attached to the top of each player robot and the camera was mounted on the drone. The ArUco algorithm was reversed using perspective projections: the known position of the drone (from localization) was used to determine the positions of the AR tags. However, two drawbacks of this solution were identified.
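
For reference, a minimal sketch of marker detection and pose estimation with OpenCV's ArUco module is given below. The camera intrinsics, marker size and dictionary choice are placeholder assumptions; a real setup would use the fish-eye camera's calibrated parameters, and the resulting camera-frame poses would still have to be composed with the drone's known pose to obtain tag positions on the field.

    import cv2
    import numpy as np

    # Placeholder intrinsics; a real setup would use the calibrated
    # camera matrix and distortion coefficients of the drone's camera.
    camera_matrix = np.array([[600.0,   0.0, 320.0],
                              [  0.0, 600.0, 240.0],
                              [  0.0,   0.0,   1.0]])
    dist_coeffs = np.zeros(5)

    MARKER_LENGTH = 0.15  # marker side length in metres (assumed)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

    def detect_robot_markers(frame):
        """Return {marker id: (x, y, z) position in the camera frame}."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
        poses = {}
        if ids is not None:
            # One rotation/translation vector pair per detected marker.
            rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
                corners, MARKER_LENGTH, camera_matrix, dist_coeffs)
            for marker_id, tvec in zip(ids.flatten(), tvecs):
                poses[int(marker_id)] = tuple(tvec.squeeze())
        return poses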

  1. The strategy for trajectory planning is to keep the ball visible at all times. Since priority is given to the ball, it is not guaranteed that all player robots are visible to the camera at all times. In certain circumstances, a collision may also occur away from the ball, and hence outside the camera's field of view.
  2. In this application, the drone and the player robots are moving at all times. At high relative speeds, the camera fails to detect the AR tags, or detects them with high latency. During trials, it was also observed that the usable field of view of the fish-eye camera for tracking these AR tags is heavily limited.

Due to these drawbacks, and considering that the drone referee will be showcased using the TechUnited robots, it was decided to use TechUnited’s infrastructure to track the player robots.

TechUnited World Model

<TODO>

Collision Detection

Once the robot positions are known, collisions between them can be detected using a function of two variables:

  1. The distance between the robots
  2. The relative velocity between the robots

These variables can be computed from the positions, headings and velocities of the robots. The algorithm to detect collisions is described by the following pseudo-code.

<TODO PSEUDO_CODE>

The marked segments in the above pseudo-code are explained below.

Store State Values: <TODO>

Distance Check: <TODO>

Velocity Check: <TODO>

Determine Guilty Player: <TODO>
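
As an illustration of the segments listed above, a minimal sketch of the distance check, velocity check and guilty-player step is given below. The thresholds and the faster-robot-at-fault heuristic are assumptions for illustration only, not the project's tuned values.

    import math

    COLLISION_DISTANCE = 0.6  # assumed contact distance (robot diameter plus margin), in metres
    CLOSING_SPEED_MIN = 0.1   # assumed minimum closing speed for a foul, in m/s

    def detect_collision(pos_a, vel_a, pos_b, vel_b):
        """Return (collision, guilty) for two robots given their 2D
        positions and velocities."""
        # Distance check: separation between the two robots.
        dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
        distance = math.hypot(dx, dy)
        if distance >= COLLISION_DISTANCE:
            return False, None

        # Velocity check: component of the relative velocity along the
        # separation vector; positive means the robots are approaching.
        rvx, rvy = vel_b[0] - vel_a[0], vel_b[1] - vel_a[1]
        closing_speed = -(rvx * dx + rvy * dy) / max(distance, 1e-6)
        if closing_speed <= CLOSING_SPEED_MIN:
            return False, None

        # Determine guilty player: blame the faster-moving robot
        # (an assumed heuristic, not the project's documented rule).
        guilty = "A" if math.hypot(*vel_a) > math.hypot(*vel_b) else "B"
        return True, guilty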

Interfaces

Human-Machine Interface

Supervisory Control

Graphical User Interface

Integration

Conclusions and Recommendations

Conclusions

Recommendations

Additional Resources

References