AutoRef Honors 2019 - Conclusion


Conclusion

The project goal was to autonomously assist a football referee in a 2 versus 2 robot soccer match using a drone by enforcing three main rules: out of bounds, free kick and goal. We have not been able to reach this goal; none of the three rules can be enforced yet. The drone can keep the moving ball in its field of view in the simulation, and with some testing this should work in reality as well. The field lines can be detected in simulation, but further work is needed to make this code detect when the ball is out of bounds or in the goal. Recognition of the robots and distance computation have not been worked on yet, so the free kick rule will take longer to implement.

Future Work

All the results obtained in this project can be downloaded from this site and built upon. There are a few things to consider developing and testing moving forward:


The line detection code needs to be extended to be able to differentiate between the field sidelines, the half-court line and the goal lines. This could be done by fitting the detected lines to a model of the field, which can then serve as a basis to determine which line is in the field of view of the drone camera, and therefore also the approximate position of the drone on the field.
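As a rough illustration of this idea, the sketch below matches detected line segments to a simple field model by projecting them onto the field plane and picking the nearest model line. The field dimensions, the line names and the image-to-field homography H are all assumptions made for illustration (in practice H would come from the camera calibration and a coarse estimate of the drone pose); this is not the project's actual code.

import numpy as np

# Hypothetical field model (metres): sidelines along the long sides,
# goal lines along the short sides, half-court line through the centre.
# These dimensions are illustrative, not the official field values.
FIELD_MODEL = {
    "top sideline":    ((-6.0,  4.0), ( 6.0,  4.0)),
    "bottom sideline": ((-6.0, -4.0), ( 6.0, -4.0)),
    "half-court line": (( 0.0, -4.0), ( 0.0,  4.0)),
    "left goal line":  ((-6.0, -4.0), (-6.0,  4.0)),
    "right goal line": (( 6.0, -4.0), ( 6.0,  4.0)),
}

def project_to_field(points_px, H):
    """Map pixel coordinates to field coordinates using homography H (3x3)."""
    pts = np.hstack([np.asarray(points_px, float), np.ones((len(points_px), 1))])
    mapped = (H @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]

def distance_to_line(p, a, b):
    """Distance from point p to the model segment (a, b), in field coordinates."""
    a, b, p = map(np.asarray, (a, b, p))
    t = np.clip(np.dot(p - a, b - a) / np.dot(b - a, b - a), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * (b - a))))

def classify_segment(segment_px, H):
    """Label one detected image segment with the closest field-model line."""
    p1, p2 = project_to_field(segment_px, H)
    midpoint = (p1 + p2) / 2.0
    return min(FIELD_MODEL,
               key=lambda name: distance_to_line(midpoint, *FIELD_MODEL[name]))

For example, classify_segment([(320, 480), (340, 120)], H) would return the name of the model line whose projection lies closest to that detected segment.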


Once the lines are correctly classified, game rules such as out of bounds and goals can be implemented to make the drone an "autonomous referee". Additionally, code to detect robots and their collisions could be developed in order to detect and call out fouls, adding to the capabilities of the referee.
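To make this concrete, below is a minimal sketch of how the out-of-bounds and goal calls, plus a very crude collision check, could look once the ball and robot positions are available in field coordinates. The dimensions and the collision threshold are illustrative assumptions, not official values.

# Field and goal dimensions here are illustrative assumptions.
FIELD_LENGTH = 12.0   # metres, goal line to goal line (x direction)
FIELD_WIDTH = 8.0     # metres, sideline to sideline (y direction)
GOAL_WIDTH = 2.4      # metres, opening of the goal along y
FOUL_DISTANCE = 0.1   # metres, robot-robot distance treated as a collision

def referee_call(ball_x, ball_y):
    """Return the call for a ball at field coordinates (ball_x, ball_y)."""
    half_l, half_w, half_g = FIELD_LENGTH / 2, FIELD_WIDTH / 2, GOAL_WIDTH / 2
    if abs(ball_x) > half_l and abs(ball_y) <= half_g:
        return "goal"           # ball crossed a goal line inside the goal mouth
    if abs(ball_x) > half_l or abs(ball_y) > half_w:
        return "out of bounds"  # ball is outside the field lines
    return "in play"

def collision_detected(robot_positions):
    """Flag a possible foul when any two robots come closer than FOUL_DISTANCE."""
    for i, (xa, ya) in enumerate(robot_positions):
        for xb, yb in robot_positions[i + 1:]:
            if ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 < FOUL_DISTANCE:
                return True
    return False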


Once the pandemic shutdown is over, the results obtained in the simulation environment should be tested with the new upscaled drone and a ball in the Tech United field, since both the motion command code and the vision code will very likely have to be adapted to work in a non-ideal environment. Once the system works in the Tech United field, its scalability can be tested on fields of different dimensions.
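One small suggestion for the scalability test: if the field dimensions that the vision and rule code depend on are kept in a single configuration object, as in the sketch below, switching to a field of a different size mostly becomes a matter of loading a different configuration. The names and values here are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class FieldConfig:
    """All field dimensions (metres) the vision and rule code depend on."""
    length: float       # goal line to goal line
    width: float        # sideline to sideline
    goal_width: float   # goal mouth opening

# Illustrative values only; real dimensions would be measured on site.
TECH_UNITED_FIELD = FieldConfig(length=12.0, width=8.0, goal_width=2.4)
SMALL_TEST_FIELD  = FieldConfig(length=6.0,  width=4.0, goal_width=1.2)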
