PRE2018 3 Group9


===Initial Concepts===
After discussing various topics, we came up with this final list of projects that seemed interesting to us:
*Drone interception
*A tunnel-digging robot
*A firefighting drone for finding people
*Delivery UAV (blood in Africa, parcels, medicine, etc.)
*Voice-controlled robot (a general technique that has many applications)
*A spider robot that can be used to get to hard-to-reach places




===Introduction===
According to the most recent industry forecast studies, the unmanned aerial systems (UAS) market is expected to reach 4.7 million units by 2020.<ref name="rise">Allianz Global Corporate & Specialty (2016). [https://www.agcs.allianz.com/assets/PDFs/Reports/AGCS_Rise_of_the_drones_report.pdf Rise of the Drones: Managing the Unique Risks Associated with Unmanned Aircraft Systems]</ref> Nevertheless, regulations and technical challenges need to be addressed before such unmanned aircraft become as common and accepted by the public as their manned counterparts. The impact of an air collision between a UAS and a manned aircraft is a concern to both the public and government officials at all levels. All around the world, the primary goal of enforcing rules for UAS operations in the national airspace is to assure an appropriate level of safety. Therefore, research is needed to determine airborne hazard impact thresholds for collisions between unmanned and manned aircraft, or even collisions with people on the ground, as an FAA study already shows.<ref name="ato">Federal Aviation Administration (FAA) (2017). [http://www.assureuas.org/projects/deliverables/a3/Volume%20I%20-%20UAS%20Airborne%20Collision%20Severity%20Evaluation%20-%20Structural%20Evaluation.pdf UAS Airborne Collision Severity Evaluation] Air Traffic Organization, Washington, DC 20591</ref>


With the recent development of small and cheap electronics, unmanned aerial vehicles (UAVs) are becoming more affordable for the public and we are seeing an increase in the number of drones flying in the sky. This has started to pose several potential risks which may jeopardize not only our daily lives but also the security of various high-value assets such as airports, stadiums or similar protected airspaces. The latest incident involving a drone which invaded the airspace of an airport took place in December 2018, when Gatwick Airport had to be closed and hundreds of flights were cancelled following reports of drone sightings close to the runway. The incident caused major disruption, affecting about 140,000 passengers and over 1,000 flights. This was the biggest disruption since ash from an Icelandic volcano shut down all traffic across Europe in 2010.<ref name="gatwick">Wikipedia (2018). [https://en.wikipedia.org/wiki/Gatwick_Airport_drone_incident Gatwick Airport drone incident]</ref>


Tests performed at the University of Dayton Research Institute show that even a small drone can cause major damage to an airliner’s wing if they meet at more than 300 kilometers per hour.<ref name="dayton">Pamela Gregg (2018). [https://www.udayton.edu/blogs/udri/18-09-13-risk-in-the-sky.php Risk in the Sky?] University of Dayton Research Institute</ref>


The Alliance for System Safety of UAS through Research Excellence (ASSURE), which is the FAA's Center of Excellence for UAS Research, also conducted a study<ref name="assure_11">ASSURE (2017). [http://www.assureuas.org/projects/deliverables/sUASAirborneCollisionReport.php ASSURE] Alliance for System Safety of UAS through Research Excellence (ASSURE)</ref> regarding the collision severity of unmanned aerial systems and evaluated the impact that these might have on passenger airplanes. This is very interesting, as it shows how much damage these small drones or even radio-controlled airplanes can inflict on large airplanes, which poses a huge safety threat to planes worldwide. As one can imagine, airplanes are most vulnerable to these types of collisions when taking off or landing, therefore protecting the airspace of an airport is of utmost importance.

This project will mostly focus on the importance of interceptor drones for an airport’s security system and the impact of rogue drones on such a system. However, a discussion on drone terrorism, privacy violation and drone spying will also be given, and the impact that these drones can have on users, society and enterprises will be analyzed. As this topic has become widely debated worldwide over the past years, we shall provide an overview of current regulations concerning drones and the restrictions that apply when flying them in certain airspaces. With the research that is going to be carried out for this project, together with the various deliverables that will be produced, we hope to shed some light on the importance of having systems such as interceptor drones in place for protecting the airspace of the future.


===Problem Statement===


===Objectives===
*Determine the best UAS that can intercept a UAV in airborne situations
*Improve the chosen concept
*Create a design for the improved concept, including software and hardware

===Approach===


The aim of our project is to deliver a prototype and a model of how an interceptor drone can be implemented. The approach to reach this goal consists of multiple steps.


Firstly, we will go through research papers and other sources which describe the state of the art of such drones and their respective components. This allows our group to get a grasp of the current technology of such a system and introduces us to the new developments in this field. It also helps to create a foundation for the project, which we can build upon. The state of the art also gives valuable insight into possible solutions we can think of and whether their implementation is feasible given the knowledge we possess and the limited time. The SotA research will be carried out by studying the literature, recent reports from research institutes and the media, and by analyzing patents which are strongly connected to our project.


Furthermore, we will continue to analyze the problem from a USE – user, society, enterprise – perspective. An important source for this analysis is the state-of-the-art research, where the impact of these drone systems on different stakeholders is discussed. The USE aspects will be of utmost importance for our project, as every engineer should strive to develop new technologies that help not only the users but also society as a whole, and to avoid the possible negative consequences of the system they develop. This analysis will finally lead to a list of requirements for our design. Moreover, we will discuss the impact of these solutions on the categories listed above.


Finally, we hope to develop a prototype for an interceptor drone. We do not plan on making a physical prototype, as the project timeframe does not allow for this. Instead, we plan on creating a 3D model of the drone and detailing how such a system would be implemented at an airport. To show the functioning of the tracking capabilities, we are building a demo tracker; a minimal sketch of such a tracker is given below. To complement this, we also plan on building an application which serves as a dashboard for the drone, tracking different parameters from the drones such as position and overall status. Next to this, a list of hardware components will be researched which fits our project and results in a cost-effective product. Concerning the software, a UML diagram will be created first, to represent the system which will be implemented later. Together with the wiki page, these will be our final deliverables for the project.
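To make the tracking demo more concrete, the sketch below detects an ArUco marker in a camera frame and computes the offset of the target from the image centre, which the interceptor's controller could use as a steering error. This is only a minimal sketch under our own assumptions (OpenCV with the contrib ArUco module, a plain webcam standing in for the drone camera, and marker detection standing in for the YOLO-based detection mentioned in the planning); the exact ArUco API differs slightly between OpenCV versions.

<syntaxhighlight lang="python">
import cv2

# Assumption: opencv-contrib-python with the classic cv2.aruco functions
# (newer OpenCV versions expose the same functionality via cv2.aruco.ArucoDetector).
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def target_offset(frame):
    """Return the (dx, dy) pixel offset of the first detected marker
    from the image centre, or None if no marker is visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, ARUCO_DICT)
    if ids is None:
        return None
    cx, cy = corners[0][0].mean(axis=0)   # centre of the first marker's four corners
    h, w = gray.shape
    return cx - w / 2, cy - h / 2

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)             # webcam stands in for the drone camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        offset = target_offset(frame)
        if offset is not None:
            # In the real system this error would be fed to the drone's
            # yaw/pitch controller; here we only print it.
            print("target offset (px): %.1f, %.1f" % offset)
        cv2.imshow("tracker demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
</syntaxhighlight>

The same loop structure would apply when the marker detector is swapped for a neural-network detector; only the function that returns the target's pixel coordinates changes.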


Below we summarize the main steps in our approach to the project.
*Choose the Hardware and create the UML diagram
*Work on the prototype (3D model and mobile application)
*Create a demo of the tracking functionality
*Evaluate the prototype




*After week 2, the best UAS is chosen, options for improving this system are identified, and there is a clear vision of the user. This means that it is known who the users are and what their requirements are.
*After week 5, the software and hardware are designed for the improved system. Also, a prototype has been made.
*After week 7, the tracker demo is finished so that it can be shown at the presentations.
*After week 8, the wiki page is finished and updated with the results that were found from testing the prototype. Also, future developments are looked into and added to the wiki page.


===Deliverables===
*A presentation, which is a summary of what was done and what our most important results are
*A prototype
*A video of the tracker demo


===Planning===
| Make a draft planning
| Add requirements for drone on wiki
| Regulations and present situation
| Write pseudocode for interceptor drone
| Mobile app development
| Proofread the wiki page and correct mistakes
| Review wiki page
| Make a final presentation
|-
| Summarize project ideas
| Wireframing the Dashboard App
| Build UML diagram for software architecture
| Improve UML with net rounds
| Add system security considerations
| Add new references on wiki (FAA)
| Add final planning on wiki
|-
| Write wiki introduction
|  
| Start design for dashboard mobile app
| Add user interface form
| Work on demo (YOLO neural network)
| Export app screens to presentation
|  
|-
|  
|  
| Add login page to app
|  
|  
| Elaborate on the SotA
| Research the hardware components
| Mathematical model of net launcher
| Write on the security of the system
| Write on hardware/software interface
| Work on the Presentation
| Present the work done
|-
| Write about the USE aspects
| Review the whole Wiki page
| Draw the schematics of electronics used
| What makes a drone friend or enemy
| Try to fix the Latex
| Check on the realisation of all objectives
|  
| Finalize writing on base station
|-
|  
| Improve approach
|  
| Start with the simulation
| Expand on the material on the Wiki more
|  
|  
| Review the Wiki
|-
|  
| Update SotA
| Research hardware options
| List of parts and estimation of costs
| Target recognition & target following
| Review wiki page
| Finalise demo and presentation
| Finalise wiki page
|-
| Fill in draft planning
| Check USE
| Start thinking about electronics layout
| Target recognition & target following
| Image recognition
| Contact bluejay
| Make Aruco marker detection demo
|  
|-
|  
|  
| Work on demo
|  
|  
| Update milestones and deliverables
|  
| Finish hardware research
| User interview
| Review wiki page
| Finish experts part
| Finalise wiki page
|-
| Search information about subject
| Research hardware components
| Make a bill of costs and list of parts
| Contact the army
| Improve Justification
|  
| Write future developments
| Write conclusion/results part
|-
| Write problem statements
| USE analysis including references
|  
| Improve design of net gun
|  
| Justify requirements and constraints
| Improve planning
|  
|-
| Research hardware components
| Interface design for the mobile app
| Finish the software parts
| Brainstorm about the presentation
| Work on the final presentation
|  
|-
| Expand on the state of the art
| Research systems for stopping drones
| Start working on 3D drone model
| Finish simulation of drone forces
| Make a draft for the presentation
| Finish the demo
|  
|-
| Society and enterprise needs
|  
| Start thinking about the software
| Start contacting experts
| Look for ideas for the demo
| Improve the planning
|  
|-
| Work on UML activity diagram
|  
|  
|  
|  
| Add requirements to wiki page
| USE analysis finished
| 3D drone model finished
| Simulation of drone forces
| Mobile app prototype finished
| The demo(s) work
| Wiki page is finished
|-
| Add research papers to wiki
| Add state of the art to wiki page
| UML activity diagram finished
|  
|  
| Information from experts is processed
| Presentation is finished
|  
|-
|  
| Mobile app prototype
| Demo tracker(s)
| Final presentation
|-


=USE=
In this section, we will focus on analyzing the different aspects involving users, society and enterprises in the context of interceptor drones. We will start by identifying key stakeholders for each of the categories and proceed by giving a more in-depth analysis. After identifying all these stakeholders, we will continue by stating what our project will mainly focus on in terms of stakeholders. Since the topic of interceptor drones is quite vast, depending on the angle from which we choose to tackle the problem, focusing on a specific group of stakeholders will help us produce a better prototype and conduct better research for that group. Moreover, each of these stakeholders experiences different concerns, which are going to be elaborated separately.


===Users===
When analyzing the main users for an interceptor drone, we quickly see that airports are the most interested in having such a technology. Since the airspace within and around airports is heavily restricted and regulated, unauthorized flying drones are a real danger not only to the operation of airports and airlines, but also to the safety and comfort of passengers. As was the case with previous incidents, intruder drones which violate airport airspaces lead to airport shutdowns, which result in big delays and huge losses.

Another key group of users is represented by governmental agencies and civil infrastructure operators that want to protect certain high-value assets such as embassies. As one can imagine, having intruder drones flying above such a place could lead to serious problems such as diplomatic disputes or even impact the relations between the two countries involved. Therefore, one could argue that such a drone could indeed be used with malicious intent to directly cause such tense relations.

Another good example worth mentioning is the incident involving the match between Serbia and Albania in 2014, when a drone invaded the pitch carrying an Albanian nationalist banner, which led to a pitch invasion by the Serbian fans and a full riot. This incident led to retaliations from both Serbians and Albanians, which resulted in significant material damage and further damaged the fragile relations between the two countries.<ref name="euro">Wikipedia (2019). [https://en.wikipedia.org/wiki/Serbia_v_Albania_(UEFA_Euro_2016_qualifying) Serbia v Albania (UEFA Euro 2016 qualifying)]</ref>

We can also imagine such an interceptor drone being used by the military or other government branches for fending off terrorist attacks. Being able to deploy such a countermeasure (on a battlefield) would improve not only the safety of people but would also help in deterring terrorists from carrying out such acts of violence in the first place.

Lastly, a smaller group of users, but still worth considering, are individuals who are prone to being targeted by drones and therefore having their privacy violated by such systems. This could be the case with celebrities or other VIPs who are targeted by the media to get more information about their private lives.

To summarize, from a user perspective we think that the research which will go into this project can benefit airport security systems the most. One could say that we are taking a utilitarian approach to solving this problem, as implementing a security system for airports would produce the greatest good for the greatest number of people.


===Society===
When thinking about how society could benefit from the existence of a system that detects and stops intruder drones, the best example to consider is again the airport scenario. It is already clear that whenever an unauthorized drone enters the restricted airspace of an airport, this causes major concern for the safety of the passengers. Moreover, it causes huge delays and creates big problems for the airport’s operations and airlines, which will lose a lot of money. Apart from this, rogue drones around airports cause logistical nightmares for airports and airlines alike, since this will not only create bottlenecks in the passenger flow through the airport, but airlines might need to divert passengers on other routes and planes. Cargo planes will also suffer delays, and this could lead to bigger problems down the supply chain, such as medicine not reaching patients in time. All these problems are a great concern for society.


Another big issue for society which interceptor drones hope to solve is the ability to safely stop a rogue drone from attacking large crowds of people, for example at various events. To provide the necessary protection in these situations, it is crucial that the interceptor drone acts very quickly and stops the intruder in a safe and controlled manner, as fast as possible, without putting the lives of other people in danger. Again, when we think in the context of providing the greatest good for the greatest number of people, the airport security example stands out; therefore, this is where the main focus of the research in this project will be aimed.


===Enterprise===
When analyzing the impact interceptor drones will have on enterprises in the context of airport security, we identify two main players: the airport security and the airlines operating from that airport. Moreover, the airlines can be further divided into two categories: those which transport passengers and those which transport cargo (and we can also have airlines that do both).


From the airport’s perspective, a drone sighting near the airport would require a complete shutdown of all operations for at least 30 minutes, as stated by current regulations.<ref name="alex_hern">Alex Hern, Gwyn Topham (2018). [https://www.theguardian.com/technology/2018/dec/20/how-dangerous-are-drones-to-aircraft How dangerous are drones to aircraft?] The Guardian</ref> As long as the airport is closed, it will lose money and cause operational problems. Furthermore, when the airport opens again, there will be even more problems caused by congestion, since all planes will want to leave at the same time, which is obviously not possible. This can lead to incidents on the tarmac involving planes, due to improper handling or lack of space in an airport which is potentially already overfilled with planes.


Whether we are talking about passenger transport or cargo, drones violating an airport’s airspace directly translate into huge losses, big delays and unhappy passengers. Not only will the airline need to compensate passengers in case a flight is cancelled, but in some cases it would also need to cover accommodation expenses. For cargo companies, a delay in delivering packages can literally mean life or death if we talk about medicine that needs to get to patients. Moreover, disruptions in the transport of goods can greatly impact the supply chain of numerous other businesses and enterprises, so these types of events (rogue drones near airports) could have even bigger ramifications.


Finally, after analyzing the USE implications of intruder drones in the context of airport security, we will now focus on researching different types of systems that can be deployed in order to not only detect but also stop and catch such intruders as quickly as possible. This will help mitigate both the risks and the various negative implications that such events have on the USE stakeholders mentioned before.


===Experts===
We have explained who the main users are, but not how much use our idea can find and whether it can actually be implemented at airports. To find out how our idea could be implemented, we contacted different airports: Schiphol Airport, Rotterdam The Hague Airport, Eindhoven Airport, Groningen Eelde Airport and Maastricht Aachen Airport. Only Eindhoven Airport responded to the questions that we posed.
We asked the airports the following questions:
*Are you aware of the problem that rogue intruder drones pose?
*How are you taking care of this problem now?
*Who is responsible if things go wrong?
*How does this process go? (By this we mean the process from when a rogue drone is detected until it is taken down.)
*Do you see room for innovation or are you satisfied with the current process?
 
Their response made clear that they are very aware of the problem, but they told us that it is not their responsibility. The terrain of the airport is the property of the Dutch army and is therefore also the army's responsibility. Because this problem is new, the innovation center of the army takes care of these problems.


This answer gave us some information, but we still did not learn anything about the current process. So, we e-mailed the innovation center of the army. We also contacted Delft Dynamics, because they deal with this problem as well and probably know more about it. Neither the innovation center of the army nor Delft Dynamics responded to any of our emails.


===Requirements===
To better understand the needs and design for an interceptor drone, a list of requirements is necessary. There are clearly different ways in which a rogue UAV can be detected, intercepted, tracked and stopped. However, the requirements for the interceptor drone need to be analyzed carefully, as any design for such a system must ensure the safety of bystanders and minimize all possible risks involved in taking down the rogue UAV. Equally important are the constraints for the interceptor drone and, finally, the preferences we have for the system. For prioritizing the specific requirements for the project, the MoSCoW model was used: each requirement has a specific priority level, which stands for must have (''M''), should have (''S''), could have (''C'') or would have (''W''). We will now give the RPC table for the autonomous interceptor drone and later provide some more details about each specific requirement.
{| class="wikitable" | style="vertical-align:middle;" | border="2" style="border-collapse:collapse"
! style="text-align: center; font-weight:bold;" | ID
| style="text-align: center; font-style:italic;" | R1
| Detect rogue drone
| Long-range system for detecting intruders
| Does not require human action
| Software
| Object recognition
| Accuracy of 100%
| Operator is able to intervene
| Software
| M
| style="text-align: center; font-style:italic;" | R6
| Track target
| Tracking target for at least 30 minutes
| Allows for operator to correct drone
| Software & Hardware
|-
| style="text-align: center; font-style:italic;" | R7
| Velocity of 50 km/h
| Drone is as fast as possible
|  
| Hardware
| M
|-
| style="text-align: center; font-style:italic;" | R8
| Flight time of 30 minutes
| Flight time is maximized
|  
| Records all flight video footage
| Hardware
| C
|-
| style="text-align: center; font-style:italic;" | R10
| Camera with high resolution
| Flight video is as clear as possible
| Allow for human drone identification
| Hardware
| C
| Stop rogue drone
| Is always successful
| Cannot endanger others
| Hardware
| M
| Stable connection to operation base
| Drone is always connected to base
| If connection is lost, drone pauses intervention
| Software
| S
| Sensor monitoring
| Drone sends sensor data to base and app
| Critical sensor information is sent to base
| Software
| C
|-
| style="text-align: center; font-style:italic;" | R14
| Drone autonomously returns home
| Drone always gets home by itself
|  
| Software
| Auto take off
|  
| Drone can take off autonomously at any moment
| Control
| M
| Auto landing
|  
| Drone can autonomously land at any moment
| Control
| M
| Auto leveling (in flight)
| Drone is able to fly in heavy weather
| Does not require human action
| Control
| M
| style="text-align: center; font-style:italic;" | R18
| Minimal weight
|  
|  
| Hardware
|-
| style="text-align: center; font-style:italic;" | R19
| Cargo capacity of 8 kg
|  
| Drone is able to carry captured drone
| Hardware
| M
|-
| style="text-align: center; font-style:italic;" | R20
| Portability
| Drone is portable and easy to transport
| Does not hinder drone's robustness
| Hardware
| C
| Fast deployment
| Drone can be deployed in under 5 minutes
| Does not hinder drone's functionality
| Hardware
| S
|-
| style="text-align: center; font-style:italic;" | R22
| Minimal costs
| System cost is less than 100,000 euros
|  
| Costs
| S
|}
===Justification Requirements===
Starting with R1, we see that our drone must be able to detect a rogue drone in order to take it down. A preference is that the detection is done by a long-range system, as this allows a smaller number of detectors to scan a bigger airspace. As can be seen in the constraint of R1, we want the drone to be autonomous. This is specified again in R2. This system autonomy is required to ensure that the system is as fast as possible, as humans are less reliable in that regard. Next to that, automatic systems allow for higher accuracy, as these systems can be controlled to generate optimal solutions. The difference between just autonomous flight and full autonomy is the detection and deployment part: in our system, the intruder will be detected autonomously, and the interceptor drone will also be deployed autonomously. Only after deployment does the second requirement take effect.
R3 to R5 are about recognition and following. R3 states that our drone must be able to recognize different objects. R4 and R5 have a lower priority but are still important: detecting the flying direction and speed increases the chance of catching the drone but is not strictly necessary, because with this information the interceptor drone can follow the rogue drone better, and better following means better shooting quality from the net launcher. Like R3, we prefer that R4 and R5 are done with 100 percent accuracy, since this will improve our chance of success. The drone must also be able to track the rogue drone in order to catch it. If the rogue drone escapes, we cannot find out why the drone was there and will not be able to prevent it from happening again; therefore, R6 gets a high priority. The constraint has been set such that the tracking process can be interrupted by an operator, to prevent takedowns of friendly targets or to stop the drone if it disproportionately endangers others. R7 is about the minimum top velocity. If our drone cannot keep up with the rogue drone, it might escape. We prefer that our drone is as fast as possible, but we have to keep in mind that the battery should not run out too fast. 50 km/h has been chosen as a safe speed, as it makes sure the drone can get to the other side of a 5 km airfield in roughly six minutes; this way it is also able to follow fast light drones. The drone must have a flight time of at least 30 minutes (R8), because otherwise it might not have enough battery life to participate in a dogfight with an intruding drone. A flight time of 30 minutes is chosen because it is the average flight time of a prosumer drone, the most advanced class of commercially manufactured drones we expect to encounter.
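As a quick sanity check of the speed figure behind R7 (assuming a field length of about 5 km, as above), the crossing time follows directly from distance over speed:

<math>t = \frac{d}{v} = \frac{5\ \text{km}}{50\ \text{km/h}} = 0.1\ \text{h} = 6\ \text{min},</math>

which leaves a comfortable margin within the 30-minute flight time required by R8.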
R9 and R10 are about the cameras on the drone. The drone will need two cameras. R9 concerns the high-speed camera that is used for tracking the intruder; it needs low latency and a high refresh rate to be able to optimize the control (see the ATR section). Next to this camera, a high-resolution camera is needed, since it can be used by the operator to identify the intruder; the stream from this camera is fed back to the base for inspection. R11 is the most important requirement, because stopping the rogue drone is the main goal of our drone, and the preference is that this is always successful. Because we chose a net launcher (justified later on), the drone will need to reload at its base after every shot. The constraint of R11 echoes the constraint of R6: the drone may never endanger humans.
R12 is about the connection to the operation base. A stable connection to the operation base is required because the operation base needs to know what is happening. The drone can act by itself because it is autonomous, but human interference may sometimes be needed, and if the operation base is not connected to the drone, human interference is not possible. We prefer that the drone is always connected to the base, but when the connection is lost, the drone has to continue tracking without launching its net, so that it will not try to capture a friendly drone. R13 concerns the sensor information that is returned to the base. With this information we can adjust the drone for better performance and monitor it, which allows for intervention in case of, for example, mechanical failure. This information also helps us to understand incidents better afterwards, as the data can be saved, and helps us to prepare for upcoming incidents and to solve them even better. R14, R15 and R16 represent the take-off, the return to the station and the landing. These are tasks that can best be performed autonomously, for the same reasons as specified for R1 and R2.
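To make R12 and R13 a bit more concrete, the sketch below shows one possible shape for the periodic status message the drone could send to the base station and the dashboard app. The field names, the once-per-second cadence and the JSON encoding are our own illustrative assumptions, not a fixed protocol of the system.

<syntaxhighlight lang="python">
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class DroneStatus:
    """Critical sensor information sent to the base (R13).
    All field names are illustrative assumptions."""
    drone_id: str
    timestamp: float       # Unix time of the measurement
    latitude: float
    longitude: float
    altitude_m: float
    battery_pct: float
    tracking_target: bool  # True while an intruder is being tracked
    link_ok: bool          # False triggers the R12 behaviour: keep tracking, hold the net

def encode_status(status: DroneStatus) -> bytes:
    """Serialize one status update as JSON for the telemetry link."""
    return json.dumps(asdict(status)).encode("utf-8")

# Example update, as it could be sent roughly once per second.
update = DroneStatus(
    drone_id="interceptor-01", timestamp=time.time(),
    latitude=51.4501, longitude=5.3747, altitude_m=42.0,
    battery_pct=87.5, tracking_target=True, link_ok=True,
)
print(encode_status(update))
</syntaxhighlight>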
Our drone also needs to be able to withstand different weather conditions. Auto-leveling in flight is required because the drone needs to be stable even in changing weather, such as a change in wind speed, and this can best be done by sensors and computing power instead of human control. We prefer that our drone can also fly in heavy weather, at least as heavy as a regular plane can handle, so that the system can always be used. Looking at R7 and R8, we see that we want to reach maximum speed and maximum flight time; for this, it is required that our drone has minimal weight (R18). Because we need to carry the rogue drone, R19 requires a cargo capacity of 8 kg, based on the estimated weight of an intruder with possible cargo such as explosives. In case of emergency the drone should be able to be transported, but a constraint is that this does not hinder the drone's robustness: without optimal robustness the drone breaks more easily, which decreases the chance of success. Lastly, R22 states that the system should be delivered at minimal cost to ensure its use in as many airports as possible. This price is a rough estimate, but it is a lot lower than the potential losses, which is why we feel it is justified.


=== UML Activity Diagram ===
*'''Class B''': Operations may be conducted under IFR, SVFR, or VFR. All aircraft are subject to ATC clearance. All flights are separated from each other by ATC.
*'''Class C''': Operations may be conducted under IFR, SVFR, or VFR. All aircraft are subject to ATC clearance (country-specific variations notwithstanding). Aircraft operating under IFR and SVFR are separated from each other and from flights operating under VFR, but VFR flights are not separated from each other. Flights operating under VFR are given traffic information in respect of other VFR flights.
*'''Class D''': Operations may be conducted under IFR, SVFR, or VFR. All flights are subject to ATC clearance (country-specific variations notwithstanding). Aircraft operating under IFR and SVFR are separated from each other and are given traffic information in respect of VFR flights. Flights operating under VFR are given traffic information in respect of all other flights.
*'''Class E''': Operations may be conducted under IFR, SVFR, or VFR. Aircraft operating under IFR and SVFR are separated from each other and are subject to ATC clearance. Flights under VFR are not subject to ATC clearance. As far as is practical, traffic information is given to all flights in respect of VFR flights.
*'''Class F''': Operations may be conducted under IFR or VFR. ATC separation will be provided, so far as practical, to aircraft operating under IFR. Traffic information may be given as far as is practical in respect of other flights.
*'''Class G''': Operations may be conducted under IFR or VFR. ATC has no authority, but VFR minimums are to be known by pilots. Traffic information may be given as far as is practical in respect of other flights.
Special Airspace: these may limit pilot operation in certain areas. They consist of Prohibited areas, Restricted areas, Warning areas, MOAs (military operation areas), Alert areas and Controlled firing areas (CFAs), all of which can be found on the flight charts.


Currently, each country is responsible for enforcing a set of restrictions and regulations for flying drones, as there are no EU laws on this matter. In most countries, there are two categories of drone pilots: recreational drone pilots (hobbyists) and commercial drone pilots (professionals). Depending on the use of such drones, there are certain regulations that apply and even permits that a pilot needs to obtain before flying.


For example, in The Netherlands, recreational drone pilots are allowed to fly at a maximum altitude of 120m (only in Class G airspace) and they need special permission for flying at a higher altitude. Moreover, the drone needs to remain in sight at all times and the maximum takeoff weight is 25 kg. For recreational pilots, a license is not required, however, flying at night needs special approval.
For commercial drone pilots, one or more permits are required depending on the situation. The RPAS (remotely piloted aircraft system) certificate is the most common license and allows pilots to fly drones for commercial use. The maximum height, distance and takeoff limits are increased compared to recreational use; however, nighttime flying still requires special approval and drones still need to be flown in Class G airspace.


Apart from these rules, there are certain drone ban zones in which flying is strictly forbidden: state institutions, federal or regional authority constructions, airport control zones, industrial plants, railway tracks, vessels, crowds of people, populated areas, hospitals, operation sites of police, military or search and rescue forces and finally the Dutch Caribbean islands of Bonaire, St. Eustatius and Saba. Failing to abide by the rules may result in a warning or a fine, and the drone may also be confiscated. The severity of the fine or punishment depends on the type of violation. For example, the judicial authorities will consider whether the drone was being used professionally or for hobby purposes and whether people have been endangered.


As is usually the case with emerging technologies, the rules and regulations fail to keep up with the technological advancements. However, recent developments in the European Parliament hope to create a unified set of laws concerning the use of drones for all European countries. A recent study<ref name="economy_drones">EU Parliament (2018). [http://www.europarl.europa.eu/news/en/headlines/economy/20180601STO04820/drones-new-rules-for-safer-skies-across-europe Drones: new rules for safer skies across Europe]  Civilian Drones: Rules that Apply to European Countries</ref> suggests that the rapidly developing drone sector will generate up to 150,000 jobs by 2050 and that in the future this industry could account for 10% of the EU’s aviation market, which amounts to 15 billion euros per year.
Therefore, there is definitely a need to change the current regulation which in some cases complicates cross border trade in this fast-growing sector. As shown with the previous example, unmanned aircraft weighing less than 25 kg (drones) are regulated at a nationwide level which leads to inconsistent standards across different countries.
Following a four-month consultation period, the European Union Aviation Safety Agency (EASA) published a proposal<ref name="easa_proposal">European Union Aviation Safety Agency (2019). [https://www.easa.europa.eu/easa-and-you/civil-drones-rpas Civil Drones]  European Union Aviation Safety Agency</ref> for a new regulation for unmanned aerial systems operation in the open (recreational) and specific (professional) categories. On 28 February 2019, the EASA Committee gave its positive vote to the European Commission proposal for implementing the regulations, which were expected to be adopted at the latest on 15 March 2019. Although these are still small steps, the EASA is working on enabling safe operations of unmanned aerial systems (UAS) across Europe’s airspaces and the integration of these new airspace users into an already busy ecosystem.


=The Interceptor Drone=
The next step is to look at the device which we use to intercept the drone. This could be done with another drone, as suggested above. But there is also another option: shooting the drone down with a specialized launcher, like the ‘Skywall 100’ from OpenWorks Engineering.<ref name = "mp200">OpenWorks Engineering [https://openworksengineering.com/]</ref> This British company invented a net launcher which is specialized in shooting down drones. It is manually operated and has a short reload time. This way, taking down the drone is easy and fast, but it has two big problems. The first one is that the drone falls to the ground after it is shot down. It could fall onto people or, even worse, inflict enormous damage when the drone is armed with explosives. Therefore the ‘Skywall 100’ cannot be used in every situation. Another problem is that this launcher is manual, and a human life can be at risk in situations when an armed drone must be taken down.
[[File:Interceptor_from_Delft_Dynamics.jpg|thumb|A drone catching another one by using a net launcher]]
There remain two alternatives in which another drone is used to catch the violating drone. No human lives will be at risk and the violating drone can be delivered to a desired place. The first option is using an interceptor drone which deploys a net in which the drone is caught. This is an existing idea: a French company named MALOU-tech has built the Interceptor MP200.<ref name = "interceptor"> MALOU-tech [http://groupe-assmann.fr/malou-tech/]</ref> But this way of catching a drone has some drawbacks. On the one hand, this interceptor drone can catch a violating drone and deliver it to a desired place. On the other hand, the relatively big interceptor drone must be as fast and agile as the smaller drone, which is hard to achieve. Also, the net is quite rigid, and when there is a collision between the net and the violating drone, the interceptor drone must be stable and able to find its balance, otherwise it will fall to the ground. Another problem is that the violating drone is caught in the net but not sealed in it. It can easily fall out of the net or not get caught at all. Drones with a frame that protects the rotor blades cannot be caught, because the rotor blades cannot get stuck in the net.


Drones like the Interceptor MP 200 are good solutions for violating drones which need to be taken down, but we think that there is a better option. When we mount a net launcher onto the interceptor drone and remove the big net, this results in better performance because of the lower weight. And when the shot is aimed correctly, the violating drone is completely stuck in the net and cannot get out, even if it has blade guards. This is important when the drone is equipped with explosives. In this case we must be sure that the armed drone is neutralized completely, meaning that we know for sure that it cannot escape or crash in an unforeseen location. This is an existing idea, and Delft Dynamics built such a drone.<ref name="dronecatcher">Delft Dynamics [https://dronecatcher.nl/ DroneCatcher]</ref> This drone, however, is, in contrast to our proposed design, not fully autonomous.


==Building the Interceptor Drone==


Building a drone, like most other present-day high-tech systems, consists of hardware as well as software design. In this part we would like to focus on the software and give a general overview of the hardware, because our interests lie more with the software, where a lot more innovative leaps can still be made. In the hardware part we provide an overview of drone design considerations and a rough estimation of what such a drone would cost. In the software part we look at the software that makes this drone autonomous. First, we look at how to detect the intruder, how to target it and how to assure that it has been captured. Also, a dashboard app is shown, which displays the real-time system information and provides critical controls.


===Hardware/Software interface===
[[File:Hard.jpg|center|800px|The hardware design]]


The main unit of the drone is the flight controller, which is built around an ST Cortex M4 processor. This unit serves as the brain of the drone. The drone itself is powered by high-capacity lithium-ion batteries. The power of the batteries goes through a power module that makes sure the drone is fed constant power, while measuring the voltage and current going through it, detecting when an anomaly occurs or when the drone needs recharging. The code for autonomous flight runs on the Cortex M4 chip, which through the Electronic Speed Controllers (ESC) controls the brushless DC motors which spin the propellers to make the drone fly. Each motor has its own ESC, meaning that each motor is controlled separately.


To know the drone's position in real time, a GPS module is used, which provides a fairly accurate location for outdoor flight. This module communicates with the Cortex M4 to process and update the location of the drone in relation to the target location.


To be able to detect an intruder drone with the help of computer vision, a Raspberry Pi is placed on the drone to offer the extra computation power needed. The Raspberry Pi is connected to a camera, with which it can detect the attacking drone. Once the target has been locked, the net is launched towards the attacking drone.
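
To illustrate how the two processors could cooperate, below is a minimal sketch (not the actual firmware) of the Raspberry Pi side forwarding the detected target offset to the flight controller over a serial link. The port name, baud rate and message format are assumptions made for the example.

<syntaxhighlight lang="python">
# Sketch: the Raspberry Pi runs the vision code and forwards the target's
# pixel offset to the flight controller over a serial port (pyserial).
import json
import serial  # pip install pyserial

link = serial.Serial("/dev/ttyAMA0", baudrate=57600, timeout=0.1)  # assumed wiring

def send_target_offset(err_x_px: int, err_y_px: int, locked: bool) -> None:
    """Send one newline-terminated JSON message per processed camera frame."""
    msg = json.dumps({"ex": err_x_px, "ey": err_y_px, "locked": locked})
    link.write(msg.encode() + b"\n")

# Example: target detected 42 px right of and 17 px above the image centre.
send_target_offset(42, -17, locked=False)
</syntaxhighlight>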


From the requirements a general design of the drone can be created. The main requirements concerning the hardware design of the drone are:
*Velocity of 50 km/h
*Flight time of 30 minutes
*FPV live feed
*Stop rogue drone
*At minimal cost


A big span is required to stay stable while carrying a heavy load (i.e. an intruder with heavy explosives). This is due to the higher leverage of rotors at larger distances. But with the large format, portability and maneuverability decrease. Based on the intended maximum target weight of 8 kg, a drone of roughly 80 cm has been chosen. The chosen design uses eight rotors instead of the more common four or six (also called an octocopter, [https://3dinsider.com/hexacopters-quadcopters-octocopters/]) to increase maneuverability and carrying capacity. By doing so, the flight time will decrease and costs will increase. Flight time, however, is not a big concern, as a typical interceptor routine will not take more than ten minutes. For quick recharging, a system with automated battery swapping can be deployed.<ref name= "lee"> D. Lee, J. Zhou, W. Tze Lin [https://ieeexplore.ieee.org/abstract/document/7152282/ Autonomous battery swapping system for quadcopter] (2015) </ref>
Another important aspect of this drone is its ability to track another drone. To do so, it is equipped with two cameras. The FPV camera has low resolution and low latency and is used for tracking. Because of its low resolution, its feed can also be streamed to the base station in real time. The drone is also equipped with a high-resolution camera which captures images at a lower framerate. These images are streamed to the base station and can be used for identification of intruders. Both cameras are mounted to the drone on a gimbal to keep their feeds steady at all times, such that targeting and identification become easier.
The drone is equipped with a net launcher to stop the targeted intruder as described in the section “How to catch a drone”.


To clarify our hardware design, we have made a 3D model of the proposed drone and the net launcher. This model is based on work from Felipe Westin on GrabCad [https://grabcad.com/library/drone-octocopter-z8-1] and extended with a net launcher.
===Bill of costs===


If we wanted to build the drone, we would start with the frame, which must be able to accommodate eight motors. Suitable frames can be bought for around 2000 euros. The next step is to fit motors into this frame. Our drone must be able to follow the violating drone, so it must be fast. Motors with 1280kv can reach a velocity of 80 kilometers per hour; these motors cost about 150 euros each. The motors need power, which comes from the battery. The price of the battery is a rough estimation, because we do not know exactly how much power our drone needs and how much it is going to weigh. For this price estimation we take a 4500 mAh battery, which costs about 200 euros. Also, two cameras are needed as explained before. FPV cameras for racing drones cost about 50 euros. High-resolution cameras can cost as much as you want, but we need to keep the price reasonable, so the camera we pick records 5.2K Ultra HD at 30 frames per second. These two cameras and the net launcher need to be attached to a gimbal; a strong and stable enough gimbal costs about 2000 euros. Further costs are electronics such as a flight controller, an ESC, antennas, transmitters and cables.
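
To keep the cost estimate reproducible, the rough numbers above can be collected in a short script; the quantities and the price assumed for the high-resolution camera are our own assumptions and not fixed design choices.

<syntaxhighlight lang="python">
# Rough bill-of-materials estimate for the octocopter (prices in euros,
# taken from the estimates above; the 5.2K camera price is an assumption).
components = {
    "frame (8 rotors)":         (1, 2000),
    "brushless motor (1280kv)": (8, 150),
    "battery (4500 mAh)":       (1, 200),
    "FPV camera":               (1, 50),
    "5.2K Ultra HD camera":     (1, 1500),
    "gimbal":                   (1, 2000),
}

total = 0
for name, (qty, price) in components.items():
    total += qty * price
    print(f"{qty} x {name:<26} {qty * price:>6} EUR")
print(f"{'total (excl. small electronics)':<30} {total:>6} EUR")
</syntaxhighlight>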






The net launcher of the drone uses a pneumatic launcher to shoot the net. To fully understand its capabilities and how to design it, a model needs to be derived first.
[[File:Schematic_and_free_body_diagram_of_the_pneumatic_Launcher.JPG |none|800px|Schematic and free body diagram]]

From Newton's second law, the sum of forces acting on the projectile attached to the corners of the net is

<math display="block"> \sum F = F_{pres} = ma, </math>

where <math display="inline">F_{pres}=pA</math>, with <math display="inline">p</math> being the pressure on the projectile and <math display="inline">A</math> the cross-sectional area. Using these equations, we find that

<math display="block"> pA=mv\frac{\mathrm{d}v }{\mathrm{d} y}. </math>

For the pneumatic launcher to function we need a chamber with carbon dioxide, which is used to push the projectile. We will assume that no heat is lost through the tubes of the net launcher and that the gas expands adiabatically. The equation describing this process is

<math display="block"> p_{0}V_{0}^{\gamma }=pV^{\gamma }, </math>

where <math display="inline"> p_{0} </math> is the initial pressure, <math display="inline"> V_{0} </math> is the initial volume of the gas, <math display="inline">\gamma</math> is the ratio of the specific heats at constant pressure and at constant volume (1.4 for air between 26.6 and 49 degrees) and <math display="inline">V= V_{0} + Ay </math> is the volume at any point in time. Combining the previous formulas, we get

<math display="block">\left [ p_{0}\left ( \frac{V_{0}}{V_{0}+Ay} \right ) ^{\gamma }\right ]A=mv\frac{\mathrm{d} v}{\mathrm{d} y}. </math>

After integrating from <math display="inline"> y=0 </math> to <math display="inline"> y=L </math>, where <math display="inline">L</math> is the length of the projectile tube, we find the speed at which the projectile leaves the muzzle:

<math display="block">v_{out}=\sqrt{\frac{2p_{0}V_{0}}{m(\gamma -1)}\left [ 1-\left ( \frac{V_{0}}{\frac{\pi }{4}d^2L+V_{0}} \right )^{\gamma } \right ]}. </math>

This equation helps us design the net launcher, more specifically its diameter and length. As the net needs to be shot at a certain speed, we will have to adjust these parameters accordingly, so the required speed is met.

As the net follows a projectile motion, and assuming that friction and tension forces are negligible, we get the following equations for the x and y position at any point in time t (taking upward as positive):

<math display="block"> y= y_{0} +v_{out}\sin (\alpha )t - \frac{gt^2}{2}, </math>
<math display="block"> x= v_{out}\cos (\alpha )t , </math>

where <math display="inline"> y_{0} </math> is the initial height and <math display="inline"> \alpha </math> is the angle that the projectile forms with the x-axis.

Assuming a muzzle speed of 60 m/s, we can plot the position of the net at a given point in time, helping us predict where our drone needs to be for it to successfully launch the net at the other drone.

[[File:Plot2.jpg|none|500px]]

From the plot, it is clear that the drone can be used from a very large distance to capture the drone it is attacking, as far as 130 m away. Although this is an advantage of our drone, shooting from such a distance should only be seen as a last resort, as the model does not predict the projectile path perfectly. Assumptions such as the use of an ideal gas or the neglect of air friction introduce inaccuracies, which only get amplified at larger distances.
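
To get a feeling for these equations, the sketch below evaluates the muzzle velocity and the resulting ballistic range for one set of assumed launcher parameters (chamber pressure, volume, barrel size, projectile mass, launch angle and height are all example values, not the final design).

<syntaxhighlight lang="python">
import math

# Assumed launcher parameters (illustrative only).
p0, V0 = 4e5, 1e-4       # initial chamber pressure [Pa] and gas volume [m^3]
gamma  = 1.4             # ratio of specific heats for the gas
m      = 0.05            # projectile mass [kg]
d, L   = 0.03, 0.30      # barrel diameter and length [m]

# Muzzle velocity from the adiabatic-expansion model derived above.
V_barrel = math.pi / 4 * d**2 * L
v_out = math.sqrt(2 * p0 * V0 / (m * (gamma - 1))
                  * (1 - (V0 / (V_barrel + V0))**gamma))

# Ballistic range when fired from height y0 at angle alpha; drag is neglected,
# so the real reachable distance will be considerably shorter.
g, alpha, y0 = 9.81, math.radians(20.0), 2.0
v_up = v_out * math.sin(alpha)
t_flight = (v_up + math.sqrt(v_up**2 + 2 * g * y0)) / g
x_range = v_out * math.cos(alpha) * t_flight

print(f"muzzle velocity: {v_out:.1f} m/s, horizontal range: {x_range:.1f} m")
</syntaxhighlight>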
 
===Automatic target recognition and following===
''Many thanks to Duarte Antunes for helping us with drone detection and drone control theory''
 
For the drone to be able to track and target the intruder, it needs to know where the intruder is at any point in time. To do so, various techniques have been developed over the years, all belonging to the area of Automatic Target Recognition (also referred to as ATR). There has been a high need for this research field for a long time; it has applications in, for example, guided missiles, automated surveillance and automated vehicles.
 
ATR started with radars and manual operators, but as cameras improved and computation power became more accessible, vision-based methods have come to play a big part in today's developments. The camera can supply high amounts of information at low cost and weight, which is why it has been chosen for this drone. Additionally, the drone's camera is mounted on a gimbal to make the video stream more stable and thus lower the needed filtering, thereby increasing the quality of the information.
 
This high amount of information does induce the need for a lot of filtering to get the required information from the camera. Doing so is computationally heavy, although a lot of effort has been made to reduce the computational load. To do this, our drone carries a Raspberry Pi computation unit next to its regular flight controller to do the heavy image processing. One example algorithm is contour tracking, which detects the boundary of a predefined object. Typically, the computational complexity of these algorithms is low, but their performance in complex environments (like when mounted to a moving parent or tracking objects that move behind obstructions) is also low. An alternative technique is based around particle filters. Using such a particle filter on color-based information, a robust estimation of the target's position, orientation and scale can be made.<ref name="teuliere">C. Teuliere, L. Eck, E. Merchand (2011) [https://ieeexplore.ieee.org/abstract/document/6094404 Chasing a moving target from a flying UAV] ''2011 IEEE/RSJ International Conference on Intelligent Robots and Systems''</ref>
 
Another option is a neural network trained to detect drones. This way, a pretrained network could operate on the Raspberry Pi on the drone and detect the other drone in real time. The advantages of these networks are, among others, that they can cope with varying environments and low-quality images. A downside of these networks is their increased computational cost, but other methods like Haar cascades cannot cope very well with changing environments, partially visible objects or changing objects (like rotating rotors). [https://www.baseapp.com/deepsight/opencv-vs-yolo-face-detector/] A very popular network is YOLO (You Only Look Once), which is open source and available for the common computer vision library OpenCV. This network must be trained with a big dataset of images that contain drones, from which it will learn what a drone looks like.
 
To actually track the intruding drone using visual information, one can use one of two approaches. One way is to determine the 3D pose of the drone and use this information to generate the error values for a controller which in turn tries to follow this target. Another way is to use the 2D position of the target in the camera frame and use the distance of the target from the center of the frame as the error term of the controller. To complete this last method, you need a way to know the distance to the target. This can be done with an ultrasonic sensor, which is cheap and fast, with stereo cameras, or with an approximation based on the perceived size of the drone in the camera frame.
 
Which controller to use depends on the required speed, the complexity of the system and the computational budget. LQR and MPC are two popular control methods which are well suited to generate near-optimal solutions, even in complex environments. Unfortunately, they require a lot of tuning and are relatively computationally heavy. A well-known alternative is the PID controller, which is not always able to generate the most optimal solution but is very easy to tune and easy to implement. Therefore, such a controller would best suit this project. For future versions, we do advise investigating more advanced controllers (like LQR and MPC).
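
As a concrete illustration of the last paragraph, here is a minimal sketch of an image-based tracking loop: the error is the offset of the detected target from the image centre, fed to two independent PID controllers (one for yaw, one for pitch). The gains, frame size and update rate are assumptions for the example, not tuned values.

<syntaxhighlight lang="python">
class PID:
    """Textbook PID controller acting on a scalar error signal."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


FRAME_W, FRAME_H = 640, 480                    # assumed FPV camera resolution
yaw_pid   = PID(kp=0.004, ki=0.0, kd=0.001)    # illustrative gains
pitch_pid = PID(kp=0.004, ki=0.0, kd=0.001)

def track_step(target_px, dt=1 / 30):
    """target_px: (x, y) pixel position of the detected drone in the frame."""
    err_x = target_px[0] - FRAME_W / 2         # positive: target right of centre
    err_y = target_px[1] - FRAME_H / 2         # positive: target below centre
    yaw_rate   = yaw_pid.update(err_x, dt)     # rate commands for the flight controller
    pitch_rate = pitch_pid.update(err_y, dt)
    return yaw_rate, pitch_rate

# Example: target detected slightly right of and above the image centre.
print(track_step((400, 200)))
</syntaxhighlight>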
 
===YOLO demo===
To demonstrate the ATR capabilities of a neural network we have built a demo. This demo shows a neural network called YOLO (You Only Look Once) trained to detect humans. Based on the location of these humans in the video frame, a Python script outputs control signals to an Arduino. This Arduino controls a two-axis gimbal on which the camera is attached. This way the system can track a person in its camera frame. Due to the computational weight of this specific neural net and the lack of a GPU to run it on, the performance is quite limited. In controlled environments, the demo setup was able to run at roughly 3 FPS. This meant that running out of the camera frame was easy and caused the system to “lose” the person.
 
Video of the YOLO demo:
[https://youtu.be/1KOLKnDnK0I https://youtu.be/1KOLKnDnK0I]
 
To also show a faster tracker, we have built a demo that tracks so-called ArUco markers. These markers are designed for robotics and are easy to detect for image recognition algorithms. This means that the demo can run at a higher framerate. A new Python script determines the error as the distance from the center of the marker to the center of the frame and, using this, generates a control output to the Arduino. This Arduino in turn moves the gimbal. This way, the tracker tries to keep the target at the center of the frame.
 
Video of the Aruco demo:
[https://youtu.be/ARxtBIWKNT8 https://youtu.be/ARxtBIWKNT8]
 
See [https://github.com/martinusje/USE_robots_everywhere] for all code.
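
For reference, the core of the ArUco demo can be sketched in a few lines of OpenCV. This follows the classic pre-4.7 cv2.aruco API (newer OpenCV releases wrap the same functionality in an ArucoDetector class), and the serial output to the Arduino is omitted here.

<syntaxhighlight lang="python">
# Sketch of the ArUco tracking demo: detect a marker and compute the pixel
# error between the marker centre and the frame centre.
import cv2

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
cap = cv2.VideoCapture(0)  # webcam used on the demo setup

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is not None:
        # Centre of the first detected marker; this error would be sent to
        # the Arduino that drives the two-axis gimbal.
        cx = corners[0][0][:, 0].mean()
        cy = corners[0][0][:, 1].mean()
        err_x = cx - frame.shape[1] / 2
        err_y = cy - frame.shape[0] / 2
        print(f"error: ({err_x:.0f}, {err_y:.0f}) px")
    cv2.imshow("aruco demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</syntaxhighlight>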
 
===Confirmation of acquired target===
One of the most important steps is actually shooting the net, but after that the drone must know what to do next. When it misses, it must go back to base and reload the net gun. Otherwise, when the target is acquired and the intruder is caught, the drone should deliver it to a safe place, and its mission has succeeded.


To confirm whether the drone has been caught, a robust and sensitive force sensor is attached between the net and the net launcher. Based on vibrations in the net (caused by the target trying to fly) and the added weight, the drone can determine whether the other drone has been caught. Next to that, the drone is equipped with a microphone that recognizes the pitched noise of the rotors of the other drone. If this sound has disappeared, the drone uses this information along with its other sensors to determine whether the target has been acquired or not.
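
A minimal sketch of this decision logic is given below; the thresholds, the empty-net weight and the way the rotor noise level is obtained are all assumptions for illustration.

<syntaxhighlight lang="python">
# Sketch: decide whether the target has been captured, combining the force
# sensor between net and launcher with the microphone listening for rotors.
from statistics import mean, pstdev

def target_captured(force_samples_n, rotor_noise_db, empty_net_weight_n=2.0):
    """force_samples_n: recent force readings [N]; rotor_noise_db: band-filtered
    sound level around the target's rotor pitch frequency."""
    extra_weight = mean(force_samples_n) - empty_net_weight_n
    vibration    = pstdev(force_samples_n)   # a struggling drone shakes the net
    rotors_gone  = rotor_noise_db < 30.0     # rotors no longer audible (assumed threshold)
    return extra_weight > 5.0 and (vibration > 0.5 or rotors_gone)

# Example: roughly 1 kg of extra load in the net and the rotor noise has faded.
print(target_captured([12.1, 11.7, 12.6, 11.9], rotor_noise_db=25.0))
</syntaxhighlight>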


=Base Station=


The base station is where all the data processing for drone detection happens and where all the decisions are made. Drone detection is the most important part of the system, as it should always be accurate to prevent any unwanted situation.


There is a lot of state-of-the-art technology that can be employed for drone detection, such as mmWave radar <ref name="Droz"> J. Drozdowicz, M. Wielgo, P. Samczynski, K. Kulpa, J. Krzonkalla, M. Mordzonek, M. Bryl, and Z. Jakielaszek [https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7497351 35 GHz FMCW drone detection system] in Proc. Int. Radar Symposium (IRS). IEEE, 2016, pp. 1–4. </ref>, UWB radar <ref name="Fontana"> R. J. Fontana, E. A. Richley, A. J. Marzullo, L. C. Beard, R. W. Mulloy, and E. Knight [https://ieeexplore.ieee.org/abstract/document/1006344/ An ultra-wideband radar for micro air vehicle applications] in Proc. IEEE Conf. Ultra Wideband Syst. Technol., 2002, pp. 187–191</ref>, Acoustic Tracking <ref name="Benyamin"> M. Benyamin and G. H. Goldman [https://www.arl.army.mil/arlreports/2014/ARL-TR-7086.pdf Acoustic Detection and Tracking of a Class I UAS with a Small Tetrahedral Microphone Array] Army Research Laboratory Technical Report (ARL-TR-7086), DTIC Document, Tech. Rep., Sep. 2014. </ref> and Computer Vision <ref name="Boddhu"> S. K. Boddhu, M. McCartney, O. Ceccopieri, and R. L. Williams [https://www.researchgate.net/publication/271452517_A_collaborative_smartphone_sensing_platform_for_detecting_and_tracking_hostile_drones A collaborative smartphone sensing platform for detecting and tracking hostile drones] SPIE Defense, Security, and Sensing, pp. 874 211–874 211, 2013. </ref>. For use in an airport environment, radar-based systems work best, as they can identify a drone in any weather condition and even with high background noise, conditions under which Computer Vision and Acoustic Tracking respectively run into problems.


As for Acoustic Tracking, there is a lot of interference in an airport, such as airplanes, jets and numerous other machines at work. For such a system to work reliably it needs to detect all these noises and differentiate them from the ones a drone makes, which can prove troublesome. With Computer Vision, a high degree of feasibility can be achieved by using special cameras equipped with night vision, thermal sensors or near-infrared optics. Such cameras are suggested to be used in addition to radar technology, combined through data fusion techniques to get the best result, since with Computer Vision alone a large number of cameras is needed and the distance to the drone is not computed as accurately as with a radar system.
 
[[File:Base.JPG|none|400px]]
The radar systems considered for drone detection are mmWave and UWB radar. While mmWave offers a roughly 20% higher detection range, distance measurement and drone classification give better results with UWB radar systems, so UWB is the preferred type of radar for our application <ref name="Guve"> Güvenç, İ., Ozdemir, O., Yapici, Y., Mehrpouyan, H., & Matolak, D. [https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20170009465.pdf Detection, localization, and tracking of unauthorized UAS and jammers] In 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC) (pp. 1-10). IEEE. </ref>. A higher cost is involved in installing more radar systems, but since the user of the system is an airport, accuracy holds higher importance than cost.
 
Use of such radars enables the user to accurately analyze the Doppler spectrum, which is important for distinguishing between birds and drones and for identifying the type of drone that is interfering in the airport area. By studying the characteristics of UWB radar echoes from a drone and testing with different types of drones, a lot of information about the attacking drone can be extracted, such as its range, radial velocity, size, type, shape and altitude <ref name=Droz />. A database with all this data can be built, which makes it easy to classify an attacking drone once one is located near an airport.
 
The location of the enemy drone can be estimated by the various UWB radars using triangulation <ref name="Park"> Hyunwook Park, Jaewon Noh and Sunghyun Cho [https://journals.sagepub.com/doi/pdf/10.1177/1550147716671720 Three-dimensional positioning system using Bluetooth low-energy beacons] International Journal of Distributed Sensor Networks. 12. 10.1177/1550147716671720. </ref>. The accuracy of the triangulation depends on the number of radars that detect the drone, but the estimate is good enough to bring our drone within sight of the attacking drone. Once this happens, another method is used to accurately locate the 3D position of the attacking drone. 3D localization is realized with UWB transceivers, one in the ground station and another in our drone. The considered approach uses the two-way time-of-flight technique and can work at communication ranges of up to 80 m. Finally, a Kalman filter can be used to track the range of the target, since the measurements will be noisy. Results have shown that the noisy range estimates can be smoothed to obtain an accurate range estimate to the attacking drone <ref name=Guve />.
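
To make the triangulation step concrete, the sketch below estimates a 3D position from the ranges reported by several UWB radars with a linear least-squares multilateration; the radar positions and measured ranges are made-up example numbers, and the subsequent Kalman smoothing is left out.

<syntaxhighlight lang="python">
# Sketch: multilateration of the intruder from UWB range measurements.
import numpy as np

def locate(anchors, ranges):
    """anchors: (N,3) radar positions [m]; ranges: (N,) measured distances [m]."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    # Subtracting the first range equation from the others linearizes the system:
    # 2 (a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2
    A = 2 * (anchors[1:] - a0)
    b = r0**2 - ranges[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

radars = [(0, 0, 0), (80, 0, 0), (0, 80, 0), (80, 80, 10)]   # example layout [m]
measured = [60.2, 55.8, 49.9, 47.5]                          # example ranges [m]
print(locate(radars, measured))
</syntaxhighlight>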


=Application=
''Note:'' there will be no commands being sent from the application to the drone. The application's main purpose is to display the various data coming from the drones such that users can better analyze the performance of the drones and maintenance personnel can service the drones quicker.


For building the application, the UI wireframe was first designed. This will make building the final application easier since the overall layout is already known. The UI wireframe for the application’s main pages is shown below together with some details for each of the application's screens explaining the overall functionality.


{|
| [[File:New-Location.png|thumb|none|alt=ALT|250px|Radar Map page]]
| [[File:System Settings.png|thumb|none|alt=ALT|250px|System Settings page]]
| [[File:Login.png|thumb|none|alt=ALT|250px|Login page]]
|}




The overall security of the system is very important for us. As one can imagine, having an interceptor drone or even a fleet of them could be used with malicious intent by some. Therefore, both the application and the ground control stations would need to be secured in order to prevent such attacks from taking place. This can be done by requiring the users to make an account for using the application and also having the users verified when installing the interceptor drone system as described above. Moreover, the protocols used by the drones to send data to the ground station will be secured to avoid any possible attacks. Also, the communication between all the subsystems will happen in a network which will be secured and closely monitored for suspicious traffic.
Although we strive to build a package that is as secure as possible, this will not be the main point of this section and in the following, we will concentrate on the application’s user interface. The system security section provides more details as to how the whole system will be secured.


===Home Page===


This page can be accessed from the Home page by clicking on the Settings page button in the bottom navigation menu. From this page users can change different settings which are not related to the security of the system. These can be things such as assigning different identifiers for the drones to be displayed on the home page and also changing the configuration settings for the communication between the drones and the ground stations.
===Login Page===
This page is shown to the user each time the application is opened. It requires an email and password. If these credentials are correct, the user is asked to input a token which is generated at the base station (either by a token generator provided with the system or by software running in a secured environment, such as a computer which is not connected to the internet). Once the token is entered, the user can proceed to log in, which gives access to all the other application features. This way we have two-factor authentication and can ensure a greater level of security overall. For more details about how the communication between the drone and the base station is secured, please read the System Security section.
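
A standard way to generate such a token is a time-based one-time password (TOTP); the sketch below uses only the Python standard library, and the shared secret is of course a placeholder.

<syntaxhighlight lang="python">
# Sketch: time-based token as generated at the base station and entered by the
# user as the second authentication factor.
import hmac, hashlib, struct, time

SECRET = b"base-station-shared-secret"   # placeholder, provisioned on the offline generator

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    counter = int(time.time()) // period           # 30-second time window
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 6238 style)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp(SECRET))   # the code the user types into the login screen
</syntaxhighlight>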


===User Testing===


For testing whether the application and its user interface are intuitive and easy to use, a questionnaire was built. The main purpose for this was to see what needed to be improved for the application to be useful for the end users. We will provide both the questionnaire and its results once the user testing phase is finished for the application.
We used Google Forms to build a questionnaire in order to understand more not only about the application but about the system as a whole. The form can be found at the following [https://docs.google.com/forms/d/e/1FAIpQLSckDJSP6x5e6TlEIGnprqFJY-HWB3pMpVnng6htKwQwhjnI_w/viewform link] together with the results. From analysing the data, it was pretty clear that a lot of people working in the industry are well aware of the problems concerning rogue drones flying around airports, which is our main focus for the project. For the application, although the majority had favorable opinions, there is still room for improvement. In the future, more options can be added to both the security screen and the settings pages. Moreover, the radar map can be improved to provide more accurate data and allow the user to better customize certain options. A search function for locating flights would also be a nice addition to this screen.
All in all, we do think that there are indeed benefits from having the application for the drone interception system.
==Security==
===System Security===
This section provides more details as to how the whole system (base station and drone fleet) will be secured. For passing data between the base station and the drone the Transport Layer Security (TLS) protocol will be used. This is widely used over the internet for applications which involve securing sensitive information such as bank payments. It is a cryptographic protocol designed to provide communication security over a computer network. The TLS protocol aims primarily to provide privacy and data integrity between two or more communicating computer applications such as a server and a client.
When using the TLS protocol for securing the data that is being communicated, the system will be using a symmetric key for encryption and decryption. Moreover, the identity of the parties which are communicating (in this case the base station and the drones) can be authenticated using public-key cryptography. Public-key cryptography entails that all the parties in the network have a public key which is known by everyone. Associated with this public key, the parties also have a private key which should not be disclosed (it must be kept secret). For this specific case, the base station will have a private key generated upon installation of the system (at some airport or other facility). This will be kept in cold storage somewhere offline, protected from any entity that might be interested in stealing it, and only accessed when required. Moreover, each drone will also be assigned a private key to be used in the communication protocol, which again will be stored on the drone’s internal storage. Since the drones will not have an external IP and can only be accessed from the base station’s servers, this will be secure.
Upon leaving the base station, a TLS handshake between the drone and server can be made to ensure the connection will be established throughout the mission and data can be securely passed back.
Finally, using the TLS protocol for passing data between the drones and the base station will also provide reliability, because each transmitted message includes an integrity check using a message authentication code to prevent undetected loss or alteration of the data during transmission. Therefore, messages that do not match the integrity check will be discarded by the server and resent by the drone.
Similarly, all the data that is sent from the base station’s servers (only in case of a mission abort) will also be encrypted. Once the data reaches the drone, it will be checked and if it does not match the signature of the server it will not be executed.
Using the Transport Layer Security (TLS) protocol for communicating data between the base station and the drone fleet will make the overall system secure and protected from any hacker that might want to gain access to our system and pass commands to the drone or other malicious data to the servers.
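
As an illustration of this setup, the sketch below shows how a drone could open a mutually authenticated TLS connection to the base station using the Python standard library; the host name, port, certificate file names and message format are placeholders, not part of the actual design.

<syntaxhighlight lang="python">
# Sketch: drone-side client opening a mutually authenticated TLS connection
# to the base station and sending one encrypted status message.
import socket, ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.load_verify_locations("base_station_ca.pem")               # trust the station's CA
context.load_cert_chain("drone01_cert.pem", "drone01_key.pem")     # drone's own key pair

with socket.create_connection(("basestation.local", 8443)) as raw:
    with context.wrap_socket(raw, server_hostname="basestation.local") as tls:
        tls.sendall(b'{"drone": "drone01", "battery": 87, "state": "patrolling"}')
        reply = tls.recv(4096)   # e.g. an acknowledgement or an abort command
</syntaxhighlight>
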
===Application Security===
The drone can be designed to operate autonomously or with a human controlling it. Both methods have their own pros and cons. If we were to design the drone to operate autonomously, it could completely prevent the possibility of someone hacking into it remotely and preventing it from intercepting an attacking drone. Even though this operation mode would be preferred, it means that if something were to go wrong and the drone started making dangerous maneuvers, human beings could be endangered by such actions. Because of this, the main legislation governing the essential health and safety requirements for machinery at EU level requires the implementation of an emergency stop button in such types of machines <ref name="Eu"> EU Machinery Legislation (2014) [http://ec.europa.eu/DocsRoom/documents/10661/attachments/1/translations/en/renditions/native Emergency Stop Devices ]  '' Emergency Stop Devices (MD Annex I 1.2.4.3)''</ref>. This legislation makes the option of a completely autonomous system illegal, and another solution should be implemented where the user has some control over the drone.
If the drone is to be controlled by a user, what this user can control is very important. The end goal of such an interceptor drone is to be as autonomous as possible, leaving as little as possible in the hands of humans. But what should the interceptor drone system do once it has detected an unwanted drone inside the operating area of the airport? Should it intercept immediately or wait for a confirmation from an airport employee?
Drone flights are unacceptable anywhere in an airport area, so the goal is to get rid of an intruder as soon as possible. Introducing a human factor to confirm the interception of the attacking drone would add an extra step to the whole process, which could cause unwanted delays and cost the airport a lot of money, as the Gatwick incident has shown. The best approach is for these drones to try to intercept the target as soon as it is detected, as every drone flying within the airport perimeter can be considered unwanted (an enemy), whoever is operating it.
In the meantime, the system should notify its user that a target has been detected. This way the responsible employee can also inform Air Traffic Control (ATC) to hold up the incoming and departing planes. While this can cause inconvenience and create more queues at the airport, it is the best available option. In case of a real emergency with planes holding in the pattern for too long, the ATC can coordinate with the pilots for a safe landing. The alternative could be too costly, as a drone could hit the engine of an airplane with devastating results. When a drone is detected by our system, a human will monitor the situation and pass this information to the ATC. This is to avoid acting on false positives, since pausing all air traffic would cause chaos and incur big losses for the airport and must therefore be avoided at all costs.
The only control the user has over the drone must be the emergency stop. As this function is of high importance, because it can render the whole system useless if activated, it must be thoroughly secured against any outside party trying to hack into it. Only putting a stop button in the application that we designed would not be secure enough, because a backdoor could be found and this would compromise the whole system. That is why physical buttons are added to the system, one in the ATC tower and one at the drone fleet HQ. This means that having access to only one of them is not enough to stop the system, as both need to be pressed to halt the operation. Only a handful of people authorized by the airport would have the power to stop these drones, while anyone trying to maliciously disable the system would need physical access to the airport, which, considering its high level of security, would be hard to achieve.
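To make this intent concrete, the sketch below is a small simplification of our own (not a safety-certified design) of the rule that the fleet only aborts when both independent physical stop buttons, the one in the ATC tower and the one at the drone fleet HQ, are pressed.
<pre>
from dataclasses import dataclass

@dataclass
class EmergencyStopInputs:
    """State of the two independent, physically separated stop buttons."""
    atc_tower_pressed: bool
    fleet_hq_pressed: bool

def abort_requested(inputs: EmergencyStopInputs) -> bool:
    # Both buttons must be pressed: access to only one location is never
    # enough to disable the interceptor system.
    return inputs.atc_tower_pressed and inputs.fleet_hq_pressed

# Example: only the tower button is pressed, so the mission continues.
print(abort_requested(EmergencyStopInputs(True, False)))   # False
print(abort_requested(EmergencyStopInputs(True, True)))    # True
</pre>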


=State of the Art=
===Tracking/Targeting===
To target a moving target from a moving drone, a way of tracking the target is needed. Many articles on how to do this, or on closely related problems, have been published:
*Continuous wave radar for drone detection. This paper addresses the detection of drones by FMCW-type radars, i.e. mmWave radars, using the Doppler effect. It calculates the ranges within which these radars need to be from the drone for detection to be successful and validates them with real-life trials <ref name=Droz />.
*Moving Target Classification and Tracking from Real-time Video. This paper describes a way of extracting moving targets from a real-time video stream and classifying them into predefined categories. This is a useful technique which can be used at a ground station or on a drone to extract relevant data about the target (a minimal illustration of this idea is sketched after this list).<ref name="lipton">A.J. Lipton, H.Fujiyoshi, R.S. Patil [https://ieeexplore.ieee.org/abstract/document/732851 Moving target classification and tracking from real-time video] (1998) ''Proceedings Fourth IEEE Workshop on Applications of Computer Vision. WACV'98 (Cat. No.98EX201)''</ref>
*Target tracking using television-based bistatic radar. This article describes a way of detecting and tracking airborne targets from a ground based station using radar technology. In order to determine the location and estimate the target’s track, it uses the Doppler shift and bearing of target echoes. This allows for tracking and targeting drones from a large distance. <ref name="howland">P.F. Howland [https://digital-library.theiet.org/content/journals/10.1049/ip-rsn_19990322 Target tracking using television-based bistatic radar] (1999) ''IEE Proceedings - Radar, Sonar and Navigation Volume 146, Issue 3'' p. 166 – 174</ref>
*Patent for scanning environments and tracking unmanned aerial vehicles. This patent refers to systems and methods for scanning environments and tracking unmanned aerial vehicles within the scanned environments. It also provides a method for identifying points of interest in an image and generating a map of the region. <ref name="asa">Asa Hammond Nathan. Schuett Naimisaranya Das Busek. (2016). [https://patentimages.storage.googleapis.com/5d/43/94/7703df7619d8ea/US20160292872A1.pdf Scanning environments and tracking unmanned aerial vehicles] ''U.S. Patent No. US20160292872A1. Washington, DC: U.S. Patent and Trademark Office''</ref>
*Algorithms based on Multiplayer Differential Game Theory, such as the two-player decomposition approach, the maximum principle approach and the minimum-time decomposition approach, are presented, each arriving at an efficient way of intercepting an attacking UAV but focusing on optimizing a different variable based on the number of drones controlled and attacking the UAV. <ref name="Reimann"> Johan M. Reimann [https://pdfs.semanticscholar.org/2f1f/097e7a14337d274f921cb3ba069568eef6e4.pdf USING MULTIPLAYER DIFFERENTIAL GAME THEORY TO DERIVE EFFICIENT PURSUIT-EVASION STRATEGIES FOR UNMANNED AERIAL VEHICLES] (2007) ''School of Electrical and Computer Engineering, Georgia Institute of Technology''</ref>
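To illustrate the kind of moving-target extraction described in the Lipton et al. entry above, the following is a minimal sketch using OpenCV background subtraction. It is our own simplified illustration rather than the method from any of the cited papers; the video source, blob-size threshold and subtractor parameters are assumptions.
<pre>
import cv2

# Hypothetical video source: a ground-station camera or an onboard feed.
cap = cv2.VideoCapture(0)
# MOG2 background subtraction separates moving pixels from the static scene.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)          # clean up the foreground mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 150:        # ignore noise-sized blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("moving targets", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
</pre>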
===Autonomous flying===
*Patent for flight control using computer vision. This patent provides methods for computing a three-dimensional relative location of a target with respect to the reference aerial vehicle based on the image of the environment. <ref name="guy">Guy Bar-Nahum. Hong-Bin Yoon. Karthik Govindaswamy. Hoang Anh Nguyen. (2018). [https://patentimages.storage.googleapis.com/9d/6e/74/8cb3c250ec50a3/US20190025858A1.pdf Flight control using computer vision] ''U.S. Patent No. US20190025858A1. Washington, DC: U.S. Patent and Trademark Office''</ref>
*Aerodynamics and control of autonomous quadrotor helicopters in aggressive maneuvering. This article improves on previous work on aerodynamic effects impacting quadrotors. These effects are used to develop new control techniques that allow for more aggressive maneuvering, which is also useful in the pursuit of another agile drone or UAV (a simple pursuit sketch is given at the end of this subsection). <ref name = "huang"> H. Huang, G. M. Hoffmann, S. L. Waslander, C. J. Tomlin [https://ieeexplore.ieee.org/abstract/document/5152561 Aerodynamics and control of autonomous quadrotor helicopters in aggressive maneuvering] (2009) </ref>
One big challenge surrounding drones is that their flight time is quite limited. A way to mitigate this and keep drones constantly surveilling the area they are programmed to patrol is autonomous mid-air battery swapping. <ref name= "Reed"> Jacobsen, Reed; Ruhe, Nikolai; and Dornback, Nathan [https://ideaexchange.uakron.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1694&context=honors_research_projects Autonomous UAV Battery Swapping] (2018) </ref>
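As a rough illustration of the pursuit behavior these works aim at, the sketch below implements a simple proportional pursuit law of our own. It is not taken from the cited papers; the gain, lead time and speed limit are illustrative assumptions (the 13.9 m/s cap roughly matches the drone's 50 km/h speed requirement).
<pre>
import numpy as np

# Simple proportional pursuit: command a velocity toward the target's
# predicted (lead) position. Gains and limits are illustrative assumptions.
PURSUIT_GAIN = 0.8        # 1/s, proportional gain on position error
MAX_SPEED = 13.9          # m/s, roughly the 50 km/h requirement

def pursuit_velocity(own_pos, target_pos, target_vel, lead_time=1.0):
    """Return a velocity command (m/s) chasing the target's lead point."""
    own_pos = np.asarray(own_pos, dtype=float)
    predicted = np.asarray(target_pos, dtype=float) + lead_time * np.asarray(target_vel, dtype=float)
    command = PURSUIT_GAIN * (predicted - own_pos)
    speed = np.linalg.norm(command)
    if speed > MAX_SPEED:
        command = command / speed * MAX_SPEED   # saturate at the drone's top speed
    return command

# Example: intruder 40 m ahead and 10 m above, drifting sideways at 5 m/s.
print(pursuit_velocity([0, 0, 20], [40, 0, 30], [0, 5, 0]))
</pre>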
===Collision with Drones===
The FAA has explained why it is necessary to determine the potential severity of sUAS mid-air collisions with aircraft in order to define an Equivalent Level of Safety to manned aviation. The organization has created four reports based on various drone collisions with manned aircraft and the dangers that these situations entail. The four reports are presented below.
Volume I: UAS Airborne Collision Severity Evaluation: Summary of Structural Evaluation <ref name="assure_report_1">ASSURE (2017). [http://www.assureuas.org/projects/deliverables/a3/Volume%20I%20-%20UAS%20Airborne%20Collision%20Severity%20Evaluation%20-%20Structural%20Evaluation.pdf Volume I: UAS Airborne Collision Severity Evaluation: Summary of Structural Evaluation] Alliance for System Safety of UAS through Research Excellence (ASSURE)</ref>
Volume II: UAS Airborne Collision Severity Evaluation: Quadcopter <ref name="assure_report_2">ASSURE (2017). [http://www.assureuas.org/projects/deliverables/a3/Volume%20II%20-%20UAS%20Airborne%20Collision%20Severity%20Evaluation%20-%20Quadcopter.pdf Volume II: UAS Airborne Collision Severity Evaluation: Quadcopter] Alliance for System Safety of UAS through Research Excellence (ASSURE)</ref>
Volume III: UAS Airborne Collision Severity Evaluation: Fixed-Wing UAS <ref name="assure_report_3">ASSURE (2017). [http://www.assureuas.org/projects/deliverables/a3/Volume%20III%20-%20UAS%20Airborne%20Collision%20Severity%20Evaluation%20-%20Fixed-wing.pdf Volume III: UAS Airborne Collision Severity Evaluation: Fixed-Wing UAS] Alliance for System Safety of UAS through Research Excellence (ASSURE)</ref>
Volume IV: UAS Airborne Collision Severity Evaluation: Engine Ingestion <ref name="assure_report_4">ASSURE (2017). [http://www.assureuas.org/projects/deliverables/a3/Volume%20IV%20-%20UAS%20Airborne%20Collision%20Severity%20Evaluation%20-%20Engine%20Ingestion.pdf Volume IV: UAS Airborne Collision Severity Evaluation: Engine Ingestion] Alliance for System Safety of UAS through Research Excellence (ASSURE)</ref>
===Other===
Although the use of drones for intervention is not directly military, this application can be seen as military. In a paper by Bradley Jay Strawser, the duty to employ UAVs is discussed. It describes why there is nothing wrong in principle with using UAVs. <ref name="Strawser">Bradley Jay Strawser [https://doi.org/10.1080/15027570.2010.536403 Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles] Journal of Military Ethics, Volume 9, 2010 – Issue 4</ref>

Latest revision as of 19:09, 6 April 2019

Preface

Group Members

Name Study Student ID
Claudiu Ion Software Science 1035445
Endi Selmanaj Electrical Engineering 1283642
Martijn Verhoeven Electrical Engineering 1233597
Leo van der Zalm Mechanical Engineering 1232931

Initial Concepts

After discussing various topics, we came up with this final list of projects that seemed interesting to us:

  • Drone interception
  • A tunnel digging robot
  • A firefighting drone for finding people
  • Delivery UAV - (blood in Africa, parcels, medicine, etc.)
  • Voice controlled robot - (general technique that has many applications)
  • A spider robot that can be used to get to hard to reach places

Chosen Project: Drone Interception

Introduction

According to the most recent industry forecast studies, the unmanned aerial systems (UAS) market is expected to reach 4.7 million units by 2020.[1] Nevertheless, regulations and technical challenges need to be addressed before such unmanned aircraft become as common and accepted by the public as their manned counterpart. The impact of an air collision between an UAS and a manned aircraft is a concern to both the public and government officials at all levels. All around the world, the primary goal of enforcing rules for UAS operations into the national airspace is to assure an appropriate level of safety. Therefore, research is needed to determine airborne hazard impact thresholds for collisions between unmanned and manned aircraft or even collisions with people on the ground as this study already shows.[2].

With the recent developments in small and cheap electronics, unmanned aerial vehicles (UAVs) are becoming more affordable for the public and we are seeing an increase in the number of drones flying in the sky. This has started to pose several potential risks which may jeopardize not only our daily lives but also the security of various high value assets such as airports, stadiums or similar protected airspaces. The latest incident involving a drone which invaded the airspace of an airport took place in December 2018, when Gatwick airport had to be closed and hundreds of flights were cancelled following reports of drone sightings close to the runway. The incident caused major disruption and affected about 140,000 passengers and over 1,000 flights. This was the biggest disruption since ash from an Icelandic volcano shut down all traffic across Europe in 2010.[3]

Tests performed at the University of Dayton Research Institute show that even a small drone can cause major damage to an airliner's wing if they meet at more than 300 kilometers per hour.[4]

The Alliance for System Safety of UAS through Research Excellence (ASSURE), which is the FAA's Center of Excellence for UAS Research, also conducted a study[5] regarding the collision severity of unmanned aerial systems and evaluated the impact that these might have on passenger airplanes. This is very interesting as it shows how much damage these small drones, or even radio-controlled airplanes, can inflict on large airplanes, which poses a huge safety threat to planes worldwide. As one can imagine, airplanes are most vulnerable to these types of collisions when taking off or landing, therefore protecting the airspace of an airport is of utmost importance.

This project will mostly focus on the importance of interceptor drones for an airport's security system and the impact of rogue drones on such a system. However, a discussion on drone terrorism, privacy violation and drone spying will also be given, and the impact that these drones can have on users, society and enterprises will be analyzed. As this topic has become widely debated worldwide over the past years, we shall provide an overview of current regulations concerning drones and the restrictions that apply when flying them in certain airspaces. With the research that is going to be carried out for this project, together with all the various deliverables that will be produced, we hope to shed some light on the importance of having systems such as interceptor drones in place for protecting the airspace of the future.

Problem Statement

The problem statement is: how can autonomous unmanned aerial systems be used to quickly intercept and stop unmanned aerial vehicles in airborne situations without endangering people or other goods?

A UAV is defined as an unmanned aerial vehicle and differs from an unmanned aerial system (UAS) in one major way: a UAV refers only to the aircraft itself, not to the ground control and communications units.[6]

Objectives

  • Determine the best UAS that can intercept a UAV in airborne situations
  • Improve the chosen concept
  • Create a design for the improved concept, including software and hardware
  • Build a prototype
  • Make an evaluation based on the prototype

Project Organisation

Approach

The aim of our project is to deliver a prototype and model on how an interceptor drone can be implemented. The approach to reach this goal contains multiple steps.

Firstly, we will go through research papers and other sources which describe the state of the art of such drones and their respective components. This allows our group to get a grasp of the current technology of such a system and introduces us to the new developments in this field. It also helps to create a foundation for the project, which we can build upon. The state of the art also gives valuable insight into possible solutions we can think of and whether their implementation is feasible given the knowledge we possess and the limited time. The SotA research will be carried out by studying the literature, recent reports from research institutes and the media, and by analyzing patents which are strongly connected to our project.

Furthermore, we will continue to analyze the problem from a USE – user, society, enterprise – perspective. An important source for this analysis is the state of the art research, where the impact of these drone systems on different stakeholders is discussed. The USE aspects will be of utmost importance for our project, as every engineer should strive to develop new technologies that help not only the users but also society as a whole, and to avoid the possible negative consequences of the system they develop. This analysis will finally lead to a list of requirements for our design. Moreover, we will discuss the impact of these solutions on the categories listed prior.

Finally, we hope to develop a prototype for an interceptor drone. We do not plan on making a physical prototype, as the project's timeframe does not allow for this. We plan on creating a 3D model of the drone and detailing how such a system would be implemented in an airport. To show the functioning of the tracking capabilities, we are building a demo tracker. To complement this, we also plan on building an application which serves as a dashboard for the drone, tracking different parameters of the drones such as position and overall status. Next to this, a list of hardware components will be researched which would be feasible for our project and result in a cost-effective product. Concerning the software, a UML diagram will be created first to represent the system which will be implemented later. Together with the wiki page, these will be our final deliverables for the project.

Below we summarize the main steps in our approach of the project.

  • Doing research on our chosen project using SotA literature analysis
  • Analyzing the USE aspects and determining the requirements of our system
  • Consider multiple design strategies
  • Choose the Hardware and create the UML diagram
  • Work on the prototype (3D model and mobile application)
  • Create a demo of the tracking functionality
  • Evaluate the prototype

Milestones

Within this project there are four major milestones:

  • After week 2, the best UAS is chosen, options for improvements of this system are made and also there is a clear vision on the user. This means that it is known who the users are and what their requirements are.
  • After week 5, the software and hardware are designed for the improved system. Also, a prototype has been made.
  • After week 7, the tracker demo will be finished in order to be shown at the presentations.
  • After week 8, the wiki page is finished and updated with the results that were found from testing the prototype. Also, future developments are looked into and added to the wiki page.

Deliverables

  • This wiki page, which contains all of our research and findings
  • A presentation, which is a summary of what was done and what our most important results are
  • A prototype
  • A video of the tracker demo

Planning

The plan for the project is given in the form of a table in which every team member has a specific task for each week. There are also group tasks which every team member should work on. The plan also includes a number of milestones and deliverables for the project.

Name Week #1 Week #2 Week #3 Week #4 Week #5 Week #6 Week #7 Week #8
Research Requirements and USE Analysis Hardware Design Software Design Prototype / Concept Proof Reading Future Developments Conclusions
Claudiu Ion Make a draft planning Add requirements for drone on wiki Regulations and present situation Write pseudocode for interceptor drone Mobile app development Proof read the wiki page and correct mistakes Make a final presentation Review wiki page
Summarize project ideas Improve introduction Wireframing the Dashboard App Build UML diagram for software architecture Improve UML with net rounds Add system security considerations Add new references on wiki (FAA) Add final planning on wiki
Write wiki introduction Check approach Start design for dashboard mobile app Add user interface form Work on demo (YOLO neural network) Export app screens to presentation
Find 5 research papers Add login page to app
Endi Selmanaj Research 6 or more papers Elaborate on the SotA Research the hardware components Mathematical model of net launcher Write on the security of the system Write on hardware/software interface Work on the Presentation Present the work done
Write about the USE aspects Review the whole Wiki page Draw the schematics of electronics used What makes a drone friend or enemy Try to fix the Latex Write on the Base Station Finalize writing on base station
Improve approach Start with the simulation Research Drone Detection radars Review the Wiki
Check introduction and requirements
Martijn Verhoeven Find 6 or more research papers Update SotA Research hardware options List of parts and estimation of costs Target recognition & target following Review wiki page Finalise demo and presentation Finalise wiki page
Fill in draft planning Check USE Start thinking about electronics layout Target recognition & target following Image recognition Contact bluejay Make Aruco marker detection demo
Write about objectives Work on demo
Leo van der Zalm Find 6 or more research papers Update milestones and deliverables Finish hardware research User interview Review wiki page Finish experts part Finalise wiki page
Search information about subject Check SotA Research hardware components Make a bill of costs and list of parts Confirmation of having acquired target Contact the army Improve Justification
Write problem statements USE analysis including references Improve design of net gun Justify requirements and constraints Improve planning
Write objectives
Group Work Introduction Expand on the requirements Research hardware components Interface design for the mobile app Finish the software parts Brainstorm about the presentation Work on the final presentation
Brainstorming ideas Expand on the state of the art Research systems for stopping drones Start working on 3d drone model Finish simulation of drone forces Make a draft for the presentation Finish the demo
Find papers (5 per member) Society and enterprise needs Start thinking about the software Start contacting experts Look for ideas for the demo Improve the planning
User needs and user impacts Work on UML activity diagram
Define the USE aspects Improve week 1 topics
Milestones Decide on research topic Add requirements to wiki page USE analysis finished 3D drone model finished Simulation of drone forces Mobile app prototype finished The demo(s) work Wiki page is finished
Add research papers to wiki Add state of the art to wiki page UML activity diagram finished List of parts and estimation of costs Information from experts is processed Presentation is finished
Write introduction for wiki Research into building a drone
Finish planning Research into the costs involved
Deliverables Mobile app prototype Demo tracker(s) Final presentation
3D drone model Wiki page

USE

In this section, we will focus on analyzing the different aspects involving users, society and enterprises in the context of interceptor drones. We will start by identifying key stakeholders for each of the categories and proceed by giving a more in-depth analysis. After identifying all these stakeholders, we will continue by stating what our project will mainly focus on in terms of stakeholders. Since the topic of interceptor drones is quite vast depending on which angle we choose to tackle the problem from, focusing on a specific group of stakeholders will help us produce a better prototype and conduct better research for that group. Moreover, each of these stakeholders has different concerns, which are going to be elaborated on separately.

Users

When analyzing the main users for an interceptor drone, we quickly see that airports are the most interested in having such a technology. Since the airspace within and around airports is heavily restricted and regulated, unauthorized flying drones are a real danger not only to the operation of airports and airlines, but also to the safety and comfort of passengers. As was the case with previous incidents, intruder drones violating airport airspaces lead to airport shutdowns, which result in big delays and huge losses.

Another key group of users is represented by governmental agencies and civil infrastructure operators that want to protect certain high value assets such as embassies. As one can imagine, having intruder drones flying above such a place could lead to serious problems such as diplomatic disputes, or could even impact the relations between the two countries involved. Therefore, one could argue that such a drone could indeed be used with malicious intent to directly cause tense relations.

Another good example worth mentioning is the incident involving the match between Serbia and Albania in 2014, when a drone invaded the pitch carrying an Albanian nationalist banner, which led to a pitch invasion by the Serbian fans and a full riot. This incident led to retaliations from both Serbians and Albanians, which resulted in significant material damage and further damaged the fragile relations between the two countries [7].

We can also imagine such an interceptor drone being used by the military or other government branches for fending off terrorist attacks. Being able to deploy such a countermeasure (on a battlefield) would improve not only the safety of people but would also help in deterring terrorists from carrying out such acts of violence in the first place.

Lastly, a smaller group of users, but still worth considering are individuals who are prone to get targeted by drones, therefore having their privacy violated by such systems. This could be the case with celebrities or other VIPs who are targeted by the media to get more information about their private lives.

To summarize, from a user perspective we think that the research which will go into this project can benefit airports the most. One could say that we are taking a utilitarian approach to solving this problem, as implementing a security system for airports would produce the greatest good for the greatest number of people.

Society

When thinking about how society could benefit from the existence of a system that detects and stops intruder drones, the best example to consider is again the airport scenario. It is already clear that whenever an unauthorized drone enters the restricted airspace of an airport this causes major concern for the safety of the passengers. Moreover, it causes huge delays and creates big problems for the airport’s operations and airlines which will be losing a lot of money. Apart from this, rogue drones around airports cause logistical nightmares for airports and airlines alike since this will not only create bottlenecks in the passenger flow through the airport, but airlines might need to divert passengers on other routes and planes. The cargo planes will also suffer delays, and this could lead to bigger problems down the supply chain such as medicine not reaching patients in time. All these problems are a great concern for society.

Another big issue for society, which interceptor drones hope to solve, is the ability to safely stop a rogue drone from attacking large crowds of people at various events, for example. To provide the necessary protection in these situations, it is crucial that the interceptor drone acts very quickly and stops the intruder in a safe and controlled manner, as fast as possible, without putting the lives of other people in danger. Again, when we think in the context of providing the greatest good for the greatest number of people, the airport security example stands out, and this is therefore where the research of this project will focus.

Enterprise

When analyzing the impact interceptor drones will have on enterprise in the context of airport security, we identify two main players: airport security and the airlines operating from that airport. Moreover, the airlines can be further divided into two categories: those which transport passengers and those which transport cargo (and we can also have airlines that do both).

From the airport's perspective, a drone sighting near the airport would require a complete shutdown of all operations for at least 30 minutes, as stated by current regulations [8]. As long as the airport is closed, it will lose money and suffer operational problems. Furthermore, when the airport opens again, there will be even more problems caused by congestion, since all planes would want to leave at the same time, which is obviously not possible. This can lead to incidents on the tarmac involving planes, due to improper handling or lack of space in an airport which is potentially already overfilled with planes.

Whether we are talking about passenger transport or cargo, drones violating an airport's airspace directly translate into huge losses, big delays and unhappy passengers. Not only will the airline need to compensate passengers in case the flight is canceled, but it would also need to cover accommodation expenses in some cases. For cargo companies, a delay in delivering packages can literally mean life or death if we are talking about medicine that needs to get to patients. Moreover, disruptions in the transport of goods can greatly impact the supply chain of numerous other businesses and enterprises, so these types of events (rogue drones near airports) could have even bigger ramifications.

Finally, after analyzing the USE implications of intruder drones in the context of airport security, we will now focus on researching different types of systems that can be deployed in order to not only detect but also stop and catch such intruders as quickly as possible. This will help mitigate both the risks and the various negative implications that such events have on the USE stakeholders mentioned before.

Experts

We have explained who the main users are, but not how much use our idea can find and whether it can be implemented in airports. To find out how our idea could be implemented, we contacted different airports: Schiphol Airport, Rotterdam The Hague Airport, Eindhoven Airport, Groningen Eelde Airport and Maastricht Aachen Airport. Only Eindhoven Airport responded to the questions that we posed. We asked the airports the following questions:

  • Are you aware of the problem that rogue intruder drones pose?
  • How are you taking care of this problem now?
  • Who is responsible if things go wrong?
  • How does this process go? (By this we mean the process from when a rogue drone is detected until it is taken down.)
  • Do you see room for innovation or are you satisfied with the current process?

Their response made clear that they are very aware of the problem, but they told us that it is not their responsibility. The terrain of the airport is the property of the Dutch army and therefore also the responsibility of the Dutch army. Because this problem is new, the innovation center of the army takes care of it.

This answer gave us some information, but we still do not know anything about the current process. So, we e-mailed the innovation center of the army. We also contacted Delft Dynamics, because they also deal with this problem and probably know more about it. Neither the innovation center of the army nor Delft Dynamics responded to any of our emails.

Requirements

To better understand the needs and design for an interceptor drone, a list of requirements is necessary. There are clearly different ways in which a rogue UAV can be detected, intercepted, tracked and stopped. However, the requirements for the interceptor drone need to be analyzed carefully as any design for such a system must ensure the safety of bystanders and minimize all possible risks involved in taking down the rogue UAV. Equally important are the constraints for the interceptor drone and finally the preferences we have for the system. For prioritizing the specific requirements for the project, the MoSCoW model was used. Each requirement has a specific level of priority which stands for must have (M), should have (S), could have (C), would have (W). We will now give the RPC table for the autonomous interceptor drone and later provide some more details about each specific requirement.

ID | Requirement | Preference | Constraint | Category | Priority
R1 | Detect rogue drone | Long range system for detecting intruders | Does NOT require human action | Software | M
R2 | Autonomous flight | Fully autonomous drone | Does NOT require human action | Software & Control | M
R3 | Object recognition | Accuracy of 100% | Able to be intervened | Software | M
R4 | Detect rogue drone's flying direction | Accuracy of 100% | | Software & Hardware | S
R5 | Detect rogue drone's velocity | Accuracy of 100% | | Software & Hardware | S
R6 | Track target | Tracking target for at least 30 minutes | Allows for operator to correct drone | Software & Hardware | M
R7 | Velocity of 50 km/h | Drone is as fast as possible | | Hardware | M
R8 | Flight time of 30 minutes | Flight time is maximized | | Hardware | M
R9 | FPV live feed (with 60 FPS) | Drone records and transmits flight video | Records all flight video footage | Hardware | C
R10 | Camera with high resolution | Flight video is as clear as possible | Allow for human drone identification | Hardware | C
R11 | Stop rogue drone | Is always successful | Cannot endanger others | Hardware | M
R12 | Stable connection to operation base | Drone is always connected to base | If connection is lost drone pauses intervention | Software | S
R13 | Sensor monitoring | Drone sends sensor data to base and app | Critical sensor information is sent to base | Software | C
R14 | Drone autonomously returns home | Drone always gets home by itself | | Software | C
R15 | Auto take off | Drone can take off autonomously at any moment | | Control | M
R16 | Auto landing | Drone can autonomously land at any moment | | Control | M
R17 | Auto leveling (in flight) | Drone is able to fly in heavy weather | Does not require human action | Control | M
R18 | Minimal weight | | | Hardware | S
R19 | Cargo capacity of 8 kg | Drone is able to carry captured drone | | Hardware | M
R20 | Portability | Drone is portable and easy to transport | Does not hinder drone's robustness | Hardware | C
R21 | Fast deployment | Drone can be deployed in under 5 minutes | Does not hinder drone's functionality | Hardware | S
R22 | Minimal costs | System cost is less than 100000 euros | | Costs | S

Justification Requirements

Starting with R1, we see that our drone must be able to detect a rogue drone in order to take it down. A preference is that the detection is done by a long-range system, as this allows a smaller number of detectors to scan a bigger airspace. As can be seen from the constraint of R1, we want the drone to be autonomous. This is again specified in R2. This system autonomy is required to ensure that the system is as fast as possible, as humans are less reliable in that regard. Next to that, automatic systems allow for higher accuracy, as these systems can be controlled to generate optimal solutions. The difference between just autonomous flight and full autonomy lies in the detection and deployment part. In our system, the intruder will be detected autonomously, and the interceptor drone will also be deployed autonomously. Only after deployment does the second requirement take effect.

R3 through R5 are about recognition and following. R3 states that our drone must be able to recognize different objects. R4 and R5 have a lower priority but are still important: detecting the flying direction and speed of the intruder increases the chance of catching it but is not strictly necessary. With this information the interceptor drone can follow the rogue drone better, and better following leads to better shooting quality from the net launcher. Like R3, we prefer that R4 and R5 are done with 100 percent accuracy, since this will improve our chance of success. Also, the drone must be able to track the rogue drone in order to catch it. If the rogue drone escapes, we cannot find out why it was here and will not be able to prevent it from happening again. Therefore, R6 gets a high priority. The constraint has been set such that the tracking process can be interrupted by an operator to prevent takedowns of friendly targets or to stop the drone if it disproportionately endangers others. R7 is about the minimum top speed. If our drone cannot keep up with the rogue drone, it might escape. We prefer that our drone is as fast as possible, but we have to keep in mind that the battery should not run out too fast. 50 km/h has been chosen as a safe speed, as it ensures the drone can get to the other side of a 5 km airfield in roughly six minutes. This way it is also able to follow fast, light drones. The drone must have a flight time of at least 30 minutes (R8), because otherwise it might not have enough battery life to participate in a dogfight with an intruding drone. A flight time of 30 minutes is chosen because it is the average flight time of a prosumer drone, which are the most advanced commercially manufactured drones we are likely to encounter.
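As a quick check of this figure (a back-of-the-envelope calculation of our own): t = d / v = 5 km / 50 km/h = 0.1 h = 6 minutes.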

R9 and R10 are about the cameras on the drone. The drone will need two cameras. R9 concerns the high-speed camera that is used for tracking the intruder. It needs to have low latency and a high refresh rate to be able to optimize the control (see section ATR). Next to this camera, a high-resolution camera is needed, since it can be used by the operator to identify the intruder. The stream from this camera is fed back to the base for inspection. R11 is the most important RPC, because it is the main goal of our drone. The requirement is to stop the rogue drone, and the preference is that this is always successful. Because we chose a net launcher (justified later on), the drone will need to reload at its base after every shot. The constraint of R11 echoes the constraint of R6, because here too we see that the drone may never endanger humans.

R12 is about the connection to the operation base. A stable connection to the operation base is required because the base needs to know what is happening. The drone can act by itself because it is autonomous, but human intervention may sometimes be needed. If the operation base is not connected to the drone, human intervention is not possible. We prefer that the drone is always connected to the base, but when the connection is lost, the drone has to continue tracking without launching its net, so that it will not try to capture a friendly drone. R13 concerns the sensor information that is returned to the base. With this information we can adjust the drone for better performance and monitor it. This allows for intervention based on, for example, mechanical failure. This information also helps us to understand incidents better afterwards, since the data can be saved, and helps us to prepare for upcoming incidents and to handle them even better. R14, R15 and R16 cover returning to the station, take-off and landing. These are tasks that can best be performed autonomously, for the same reasons as specified for R1 and R2.
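To illustrate the R12 constraint in isolation, the sketch below (a simplification of our own, not the final control logic) keeps tracking but withholds the net whenever the link to the base is lost:

def engagement_allowed(link_ok: bool, target_confirmed_hostile: bool) -> dict:
    """Decide what the drone may do given the base-station link state (R12).

    While the link is up, the drone may track and, once the target is
    confirmed hostile, fire. If the link drops, it keeps tracking but holds
    fire, so it can never net a friendly drone without human override being possible.
    """
    return {
        "track": True,                                   # tracking always continues
        "fire_net": link_ok and target_confirmed_hostile,
    }

print(engagement_allowed(link_ok=False, target_confirmed_hostile=True))
# {'track': True, 'fire_net': False}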

Our drone also needs to be able to withstand different weather types. Auto-leveling in flight is required because the drone needs to be stable even in changing weather conditions, like a change in wind speed. This can best be done by sensors and computing power instead of human control. We prefer that our drone can also fly in heavy weather, at least as heavy as a regular plane can handle, so that the system can always be used. Looking at R7 and R8, we see that we want to reach maximum speed and maximum flight time. For this, it is required that our drone has minimal weight (R18). Because we need to carry the rogue drone, R19 states that we require a cargo capacity of 8 kg. This is based on the estimated weight of an intruder with possible cargo such as explosives. In case of emergency, the drone should be easy to transport, but a constraint is that this does not hinder the drone's robustness. Not having optimal robustness means the drone can break more easily and thereby decreases the chance of success. Lastly, R22 states that the system should be delivered at minimal cost to ensure its use in as many airports as possible. This price is based on a rough estimate, but it is a lot lower than the potential losses. That is why we feel this price is justified.

UML Activity Diagram

Activity diagrams, along with use case and state machine diagrams, describe what must happen in the system that is being modeled, and are therefore also called behavior diagrams. Since stakeholders have many issues to consider and manage, it is important to communicate clearly what the overall system should do and to map out process flows in a way that is easy to understand. For this, we will give an activity diagram of our system, including an overview of how the interceptor drone will work. By doing this, we hope to demonstrate the logic of our system and also model some of the software architecture elements such as methods, functions and the operation of the drone.

[Image: UML activity diagram of the interceptor drone system (Uml.png)]
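As a rough, code-level companion to the diagram, the sketch below captures the main activity flow we have in mind for the interceptor. It is our own simplification: the state and event names are assumptions for illustration, not the exact elements of the UML diagram.

from enum import Enum, auto

class State(Enum):
    IDLE = auto()        # docked at the base station, fully charged
    DEPLOYED = auto()    # launched towards the reported intruder position
    TRACKING = auto()    # intruder in view, closing in
    CAPTURING = auto()   # net fired, verifying the intruder is caught
    RETURNING = auto()   # carrying the captured drone (or nothing) back to base

TRANSITIONS = {
    (State.IDLE, "intruder_detected"): State.DEPLOYED,
    (State.DEPLOYED, "target_in_view"): State.TRACKING,
    (State.TRACKING, "net_fired"): State.CAPTURING,
    (State.CAPTURING, "capture_confirmed"): State.RETURNING,
    (State.CAPTURING, "capture_missed"): State.TRACKING,   # reload and re-engage
    (State.RETURNING, "landed"): State.IDLE,
}

def next_state(state, event):
    # The emergency stop overrides everything and sends the drone home.
    if event == "emergency_stop":
        return State.RETURNING
    return TRANSITIONS.get((state, event), state)   # unknown events keep the state

# Example walk through a nominal interception.
s = State.IDLE
for e in ["intruder_detected", "target_in_view", "net_fired", "capture_confirmed", "landed"]:
    s = next_state(s, e)
    print(e, "->", s.name)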

Regulations

The world's airspace is divided into multiple segments, each of which is assigned to a specific class. The International Civil Aviation Organization (ICAO) specifies this classification, to which most nations adhere. In the US, however, there are also special rules and other regulations that apply to the airspace for reasons of security or safety.

The current airspace classification scheme is defined in terms of flight rules: IFR (instrument flight rules), VFR (visual flight rules) or SVFR (special visual flight rules), and in terms of interactions with ATC (air traffic control). Generally, different airspaces allocate the responsibility for avoiding other aircraft to either the pilot or the ATC.

ICAO adopted classifications

Note: These are the ICAO definitions.

  • Class A: All operations must be conducted under IFR. All aircraft are subject to ATC clearance. All flights are separated from each other by ATC.
  • Class B: Operations may be conducted under IFR, SVFR, or VFR. All aircraft are subject to ATC clearance. All flights are separated from each other by ATC.
  • Class C: Operations may be conducted under IFR, SVFR, or VFR. All aircraft are subject to ATC clearance (country-specific variations notwithstanding). Aircraft operating under IFR and SVFR are separated from each other and from flights operating under VFR, but VFR flights are not separated from each other. Flights operating under VFR are given traffic information in respect of other VFR flights.
  • Class D: Operations may be conducted under IFR, SVFR, or VFR. All flights are subject to ATC clearance (country-specific variations notwithstanding). Aircraft operating under IFR and SVFR are separated from each other and are given traffic information in respect of VFR flights. Flights operating under VFR are given traffic information in respect of all other flights.
  • Class E: Operations may be conducted under IFR, SVFR, or VFR. Aircraft operating under IFR and SVFR are separated from each other and are subject to ATC clearance. Flights under VFR are not subject to ATC clearance. As far as is practical, traffic information is given to all flights in respect of VFR flights.
  • Class F: Operations may be conducted under IFR or VFR. ATC separation will be provided, so far as practical, to aircraft operating under IFR. Traffic Information may be given as far as is practical in respect of other flights.
  • Class G: Operations may be conducted under IFR or VFR. ATC has no authority, but VFR minimums are to be known by pilots. Traffic Information may be given as far as is practical in respect of other flights.

Special Airspace: these may limit pilot operation in certain areas. These consist of Prohibited areas, Restricted areas, Warning Areas, MOAs (military operation areas), Alert areas and Controlled firing areas (CFAs), all of which can be found on the flight charts.

Note: Classes A–E are referred to as controlled airspace. Classes F and G are uncontrolled airspace.

Currently, each country is responsible for enforcing its own set of restrictions and regulations for flying drones, as there are no EU laws on this matter. In most countries, there are two categories of drone pilots: recreational drone pilots (hobbyists) and commercial drone pilots (professionals). Depending on the use of such drones, certain regulations apply and there are even permits that a pilot needs to obtain before flying.

For example, in The Netherlands, recreational drone pilots are allowed to fly at a maximum altitude of 120 m (only in Class G airspace) and they need special permission for flying at a higher altitude. Moreover, the drone needs to remain in sight at all times and the maximum takeoff weight is 25 kg. For recreational pilots, a license is not required; however, flying at night needs special approval. For commercial drone pilots, one or more permits are required depending on the situation. The RPAS (remotely piloted aircraft system) certificate is the most common license and allows pilots to fly drones for commercial use. The maximum height, distance and takeoff weight limits are increased compared to the recreational use of such drones; however, night-time flying still requires special approval and drones still need to be flown in Class G airspace.

Apart from these rules, there are certain drone ban zones in which flying is strictly forbidden: state institutions, federal or regional authority constructions, airport control zones, industrial plants, railway tracks, vessels, crowds of people, populated areas, hospitals, operation sites of police, military or search and rescue forces and, finally, the Dutch Caribbean islands of Bonaire, St. Eustatius and Saba. Failing to abide by the rules may result in a warning or a fine. The drone may also be confiscated. The amount of the fine or the punishment depends on the type of violation. For example, the judicial authorities will consider whether the drone was being used professionally or for hobby purposes and whether people have been endangered.
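To make these recreational limits concrete, the sketch below encodes only the thresholds mentioned in the paragraph above (Class G airspace, 120 m ceiling, 25 kg takeoff weight, visual line of sight, special approval for night flights). It is an illustrative simplification of our own, not a legal compliance tool.

def recreational_flight_allowed(altitude_m, airspace_class, takeoff_weight_kg,
                                in_sight, at_night, night_approval=False):
    """Rough check against the Dutch recreational-drone limits described above."""
    if airspace_class != "G":
        return False                  # hobby flights are limited to Class G airspace
    if altitude_m > 120:
        return False                  # 120 m ceiling without special permission
    if takeoff_weight_kg > 25:
        return False                  # maximum takeoff weight for this category
    if not in_sight:
        return False                  # drone must remain in visual line of sight
    if at_night and not night_approval:
        return False                  # night flying needs special approval
    return True

# Example: a 2 kg drone at 80 m in Class G airspace, daytime, in sight.
print(recreational_flight_allowed(80, "G", 2.0, True, False))   # True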

As is usually the case with new emerging technologies, the rules and regulations fail to keep up with the technological advancements. However, recent developments in the European Parliament aim to create a unified set of laws concerning the use of drones for all European countries. A recent study[9] suggests that the rapidly developing drone sector will generate up to 150,000 jobs by 2050 and that in the future this industry could account for 10% of the EU's aviation market, which amounts to 15 billion euros per year. Therefore, there is definitely a need to change the current regulations, which in some cases complicate cross-border trade in this fast-growing sector. As shown with the previous example, unmanned aircraft weighing less than 25 kg (drones) are regulated at a national level, which leads to inconsistent standards across different countries. Following a four-month consultation period, the European Union Aviation Safety Agency (EASA) published a proposal[10] for a new regulation for unmanned aerial system operations in the open (recreational) and specific (professional) categories. On 28 February 2019, the EASA Committee gave its positive vote to the European Commission proposal for implementing the regulations, which are expected to be adopted at the latest on 15 March 2019. Although these are still small steps, the EASA is working on enabling safe operations of unmanned aerial systems (UAS) across Europe's airspaces and the integration of these new airspace users into an already busy ecosystem.

The Interceptor Drone

Catching a Drone

[Image: The Skywall 100, a manual drone intercepting device]

The next step is to look at the device which we use to intercept the drone. This could be done with another drone, as we suggested above, but there is also another option: shooting the drone down with a specialized launcher, like the ‘Skywall 100’ from OpenWorks Engineering.[11] This British company invented a net launcher which is specialized in shooting down drones. It is manual and has a short reload time. This way, taking down the drone is easy and fast, but it has two big problems. The first one is that the drone falls to the ground after it is shot down. It could fall onto people or, even worse, inflict enormous damage when the drone is armed with explosives. Therefore the ‘Skywall 100’ cannot be used in every situation. Another problem is that this launcher is manual, and a human life can be at risk in situations where an armed drone must be taken down.

[Image: A drone catching another one by using a net launcher]

There remain two alternatives in which another drone is used to catch the violating drone. No human lives will be at risk and the violating drone can be delivered to a desired place. The first option is using an interceptor drone which deploys a net in which the drone is caught. This is an existing idea: a French company named MALOU-tech has built the Interceptor MP200.[12] But this way of catching a drone has some side effects. On the one hand, this interceptor drone can catch a violating drone and deliver it to a desired place. On the other hand, the relatively big interceptor drone must be as fast and agile as the smaller drone, which is hard to achieve. Also, the net is quite rigid, and when there is a collision between the net and the violating drone, the interceptor drone must be stable and able to regain its balance, otherwise it will fall to the ground. Another problem is that the violating drone is caught in the net but not sealed in it: it can easily fall out of the net, or may not even get caught in it at all. Drones with a frame that protects the rotor blades cannot get caught, because the rotor blades cannot get stuck in the net.

Drones like the Interceptor MP 200 are good solutions for violating drones that need to be taken down, but we think there is a better option. If we implement a net launcher on the interceptor drone and remove the big net, this will result in better performance because of the lower weight. And when the shot is aimed correctly, the violating drone is completely stuck in the net and cannot get out, even if it has blade guards. This is important when the drone is equipped with explosives. In this case we must be sure that the armed drone is neutralized completely, meaning that we know for sure that it cannot escape or crash in an unforeseen location. This is an existing idea, and Delft Dynamics has built such a drone.[13] This drone, however, in contrast to our proposed design, is not fully autonomous.

Building the Interceptor Drone

Building a drone, like most other current-day high-tech systems, involves hardware as well as software design. In this part we would like to focus on the software and give a general overview of the hardware. This is because our interests lie more with the software part, where far more innovative leaps can still be made. In the hardware part we will provide an overview of drone design considerations and a rough estimation of what such a drone would cost. In the software part we are going to look at the software that makes this drone autonomous. First, we look at how to detect the intruder, how to target it and how to confirm that it has been captured. Also, a dashboard app is shown, which displays the real-time system information and provides critical controls.

Hardware/Software interface

The hardware design

The main unit of the drone is the flight controller, which is an ST Cortex-M4 processor. This unit serves as the brain of the drone. The drone itself is powered by high-capacity lithium-ion batteries. The power from the batteries goes through a power module, which makes sure the drone is fed constant power while measuring the voltage and current going through it, detecting when a power anomaly occurs or when the drone needs recharging. The code for autonomous flight is programmed into the Cortex-M4 chip, which, through the Electronic Speed Controllers (ESCs), controls the brushless DC motors that spin the propellers and make the drone fly. Each motor has its own ESC, meaning that each motor is controlled separately.

To know the drone's position in real time, a GPS module is used, which provides an accurate location for outdoor flight. This module communicates with the Cortex-M4, which processes and updates the location of the drone in relation to the target's location.

To be able to detect an intruder drone with the help of computer vision, a Raspberry Pi is placed in the drone to offer the extra computation power needed. The Raspberry Pi is connected to a camera with which it can detect the attacking drone. After the target has been locked, the net is launched towards the attacking drone. The drone communicates with a system located near the surveilled area. It uses XBee wireless communication to receive and transmit data. The drone uses this link first to communicate with the app, where all the data of the drone is displayed. The other use of the wireless link is to communicate with the detection system placed around the area. This can consist of a radar-based system or external cameras connected to the main server, which then communicates the rough location of the intruder drone back to the drone.

Hardware

From the requirements a general design of the drone can be created. The main requirements concerning the hardware design of the drone are:

  • Velocity of 50 km/h
  • Flight time of 30 minutes
  • FPV live feed
  • Stop rogue drone
  • Minimal weight
  • Cargo capacity of 8 kg
  • Portability
  • Fast deployment
  • At minimal cost

A large span is required to stay stable while carrying a heavy load (i.e. an intruder with heavy explosives), because rotors at a larger distance from the centre provide more leverage. With a large format, however, portability and maneuverability decrease. Based on the intended maximum target weight of 8 kg, a drone of roughly 80 cm has been chosen. The chosen design uses eight rotors instead of the more common four or six (an octocopter [3]) to increase maneuverability and carrying capacity. This decreases the flight time and increases costs. Flight time, however, is not a big concern, as a typical interception routine will not take more than ten minutes. For quick recharging, a system with automated battery swapping can be deployed.[14] Another important aspect of this drone is its ability to track another drone. To do so, it is equipped with two cameras. The FPV camera has low resolution and low latency and is used for tracking; because of its low resolution, it can also be streamed to the base station in real time. The drone is also equipped with a high-resolution camera which captures images at a lower framerate. These images are streamed to the base station and can be used for identification of intruders. Both cameras are mounted to the drone on a gimbal to keep their feeds steady at all times, so that targeting and identification become easier. The drone is equipped with a net launcher to stop the targeted intruder, as described in the section “How to catch a drone”.

To clarify our hardware design, we have made a 3D model of the proposed drone and the net launcher. This model is based on work by Felipe Westin on GrabCAD [4] and extended with a net launcher.

A render of the proposed drone design
A render of the proposed net launcher design

Bill of costs

If we wanted to build the drone, we would start with the frame, which must be able to accommodate eight motors. Suitable frames can be bought for around 2000 euros. The next step is to fit motors into this frame. Our drone must be able to follow the violating drone, so it must be fast. Motors of 1280 KV can reach a velocity of 80 kilometers per hour; these motors cost about 150 euros each. The motors need power, which comes from the battery. The price of the battery is a rough estimate, because we do not know exactly how much power our drone needs and how much it is going to weigh. For this estimate we take a battery of 4500 mAh, which costs about 200 euros. Also, two cameras are needed, as explained before. FPV cameras for racing drones cost about 50 euros. High-resolution cameras can cost as much as you want, but we need to keep the price reasonable, so the camera we implement gives us 5.2K Ultra HD at 30 frames per second. These two cameras and the net launcher need to be attached to a gimbal; a sufficiently strong and stable gimbal costs about 2000 euros. Further costs are electronics such as a flight controller, ESCs, antennas, transmitters and cables.


Parts                     Number    Estimated cost
Frame and landing gear    1         €2000,-
Motor and propeller       8         €1200,-
Battery                   1         €200,-
FPV camera                1         €200,-
High resolution camera    1         €2000,-
Gimbal                    1         €2000,-
Net launcher              1         €1000,-
Electronics               -         €1000,-
Other                     -         €750,-
Total                               €10350,-

Net launcher mathematical modeling

The net launcher of the drone uses a pneumatic launcher to shoot the net. To fully understand its capabilities and how to design it, a model needs to be derived first.

Schematic and free body diagram

From Newton's second law, the net force acting on the projectiles attached to the corners of the net is:

[math]\displaystyle{ m\frac{dv}{dt} = F }[/math]

where

[math]\displaystyle{ F = pA }[/math]

with [math]\displaystyle{ p }[/math] being the pressure on the projectile and [math]\displaystyle{ A }[/math] the cross-sectional area. Rewriting the time derivative in terms of the position [math]\displaystyle{ y }[/math] along the barrel, these equations give:

[math]\displaystyle{ mv\frac{dv}{dy} = pA }[/math]

For the pneumatic launcher to function we need a chamber with pressurized carbon dioxide, which pushes the projectile. We assume that no heat is lost through the tubes of the net launcher and that the gas expands adiabatically. The equation describing this process is:

[math]\displaystyle{ p_0 V_0^{\gamma} = p V^{\gamma} }[/math]

where [math]\displaystyle{ p_0 }[/math] is the initial pressure, [math]\displaystyle{ V_0 }[/math] is the initial volume of the gas, [math]\displaystyle{ \gamma }[/math] is the ratio of the specific heats at constant pressure and at constant volume, which is 1.4 for air between 26.6 degrees and 49 degrees, and

[math]\displaystyle{ V(y) = V_0 + Ay }[/math]

is the volume at any point in time. Combining the previous formulas, we get the following:

[math]\displaystyle{ mv\frac{dv}{dy} = \frac{p_0 V_0^{\gamma} A}{(V_0 + Ay)^{\gamma}} }[/math]

After integrating from [math]\displaystyle{ y = 0 }[/math] to [math]\displaystyle{ y = L }[/math], where [math]\displaystyle{ L }[/math] is the length of the projectile tube, we find the speed at which the projectile leaves the muzzle:

[math]\displaystyle{ v_L = \sqrt{\frac{2 p_0 V_0^{\gamma}}{m(\gamma - 1)}\left( V_0^{1-\gamma} - (V_0 + AL)^{1-\gamma} \right)} }[/math]

This equation helps us design the net launcher, more specifically its diameter and length. As we require a certain launch speed for the net, we have to adjust these parameters so that the required speed is met.
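
To get a feel for the numbers, a minimal numerical sketch of this muzzle-speed formula is given below. The chamber pressure, chamber volume, barrel dimensions and projectile mass are illustrative assumptions, not final design parameters.

  import math

  def muzzle_speed(p0, v0, area, length, mass, gamma=1.4):
      """Muzzle speed in m/s for an adiabatically expanding gas chamber."""
      return math.sqrt(
          (2.0 * p0 * v0**gamma / (mass * (gamma - 1.0)))
          * (v0**(1.0 - gamma) - (v0 + area * length)**(1.0 - gamma))
      )

  # Example (assumed values): 3 bar chamber of 0.5 L, 4 cm bore, 40 cm barrel,
  # 50 g projectile. With these numbers the result is on the order of 60 m/s.
  bore_area = math.pi * 0.02**2
  print(muzzle_speed(p0=3e5, v0=0.5e-3, area=bore_area, length=0.4, mass=0.05))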

As the net then undergoes projectile motion, and assuming that friction and tension forces are negligible, we get the following equations for the x and y positions at any time t:

[math]\displaystyle{ x(t) = v_L \cos(\alpha)\, t }[/math]
[math]\displaystyle{ y(t) = y_0 + v_L \sin(\alpha)\, t - \tfrac{1}{2} g t^2 }[/math]

where [math]\displaystyle{ y_0 }[/math] is the initial height and [math]\displaystyle{ \alpha }[/math] is the angle that the projectile's velocity forms with the x-axis.

Assuming a muzzle speed of 60 m/s, we can plot the position of the net at each point in time, which helps us predict where our drone needs to be to successfully launch the net at the other drone.

Plot2.jpg

From the plot it is clear that the drone can capture its target from a very large distance, as far as 130 m. Although this is an advantage of our drone, shooting from such a distance should only be seen as a last resort, as the model does not predict the projectile's path perfectly. Assumptions such as ideal-gas behaviour and the neglect of air friction introduce inaccuracies, which are only amplified at larger distances.
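
The sketch below implements the projectile-motion equations used for this plot. The 60 m/s muzzle speed matches the assumption above; the launch angle and initial height are illustrative assumptions, and drag is neglected just as in the model.

  import math

  G = 9.81  # gravitational acceleration, m/s^2

  def net_position(t, v_muzzle=60.0, angle_deg=10.0, y0=1.5):
      """(x, y) position of the net in metres, t seconds after launch, without drag."""
      a = math.radians(angle_deg)
      x = v_muzzle * math.cos(a) * t
      y = y0 + v_muzzle * math.sin(a) * t - 0.5 * G * t * t
      return x, y

  # Time at which the net is back at its launch height, and the horizontal
  # distance it has covered by then.
  t_back = 2.0 * 60.0 * math.sin(math.radians(10.0)) / G
  print(net_position(t_back))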

Automatic target recognition and following

Many thanks to Duarte Antunes for helping us with drone detection and drone control theory.

For the drone to be able to track and target the intruder, it needs to know where the intruder is at any point in time. Various techniques have been developed over the years to do so, all belonging to the area of Automatic Target Recognition (ATR). This research field has been in high demand for a long time; it has applications in, for example, guided missiles, automated surveillance and automated vehicles.

ATR started with radars and manual operators, but as cameras improved and computation power became more accessible, camera-based approaches have taken a big part in today's developments. A camera can supply large amounts of information at low cost and weight, which is why it has been chosen for this drone. Additionally, the camera is mounted on a gimbal to stabilize the video stream, which lowers the filtering needed and thereby increases the quality of the information.

This large amount of information does induce the need for a lot of filtering to extract the required information from the camera. Doing so is computationally heavy, although a lot of effort has been made to reduce the computational load. To handle it, our drone carries a Raspberry Pi next to its regular flight controller to do the heavy image processing. One example algorithm is contour tracking, which detects the boundary of a predefined object. Typically, the computational complexity of these algorithms is low, but their performance in complex environments (such as when mounted on a moving parent or when tracking objects that move behind obstructions) is also low. An alternative technique is based on particle filters. Using a particle filter on color-based information, a robust estimate of the target's position, orientation and scale can be made. [15]

Another option is a neural network trained to detect drones. This way, a pretrained network could run on the Raspberry Pi on the drone and detect the other drone in real time. Among the advantages of these networks is that they can cope with varying environments and low-quality images. A downside is their higher computational cost. However, other methods like Haar cascades cannot cope very well with changing environments, partially visible objects or changing objects (like rotating rotors) [5], so a neural network remains the preferred option. A very popular network is YOLO (You Only Look Once), which is open source and available for the common computer vision library OpenCV. This network must be trained with a large dataset of images containing drones, from which it learns what a drone looks like.
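
As an illustration of how such a detector could run on the Raspberry Pi, the sketch below loads a YOLO (Darknet) model through OpenCV's DNN module. The configuration and weights file names and the confidence threshold are assumptions; a drone-specific model would still need to be trained and exported in this format.

  import cv2

  # Load a Darknet-format YOLO model (file names are assumed placeholders).
  net = cv2.dnn.readNetFromDarknet("yolo-drone.cfg", "yolo-drone.weights")
  output_layers = net.getUnconnectedOutLayersNames()

  def detect_drones(frame, conf_threshold=0.5):
      """Return (confidence, (cx, cy, w, h)) tuples for detections in the frame."""
      h, w = frame.shape[:2]
      blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
      net.setInput(blob)
      detections = []
      for output in net.forward(output_layers):
          for det in output:
              confidence = float(det[5:].max())
              if confidence > conf_threshold:
                  # det[0:4] holds the box centre and size, normalised to the frame
                  cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                  detections.append((confidence, (cx, cy, bw, bh)))
      return detections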

To actually track the intruding drone using visual information, one of two approaches can be used. One way is to determine the 3D pose of the drone and use this information to generate the error values for a controller, which in turn tries to follow this target. The other way is to use the 2D position of the target in the camera frame and use the distance of the target from the center of the frame as the error term of the controller. To complete this last method, you need a way to know the distance to the target. This can be done with an ultrasonic sensor, which is cheap and fast, with stereo cameras, or with an approximation based on the perceived size of the drone in the camera frame.
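
For the size-based approximation, a minimal sketch using the pinhole camera model is shown below; the focal length in pixels and the physical width of the target drone are assumed values used only for illustration.

  FOCAL_LENGTH_PX = 800.0   # camera focal length expressed in pixels (assumed)
  DRONE_WIDTH_M = 0.5       # physical width of the target drone in metres (assumed)

  def estimate_distance(bbox_width_px):
      """Approximate distance to the target in metres from its width in the image."""
      return DRONE_WIDTH_M * FOCAL_LENGTH_PX / bbox_width_px

  # A drone that appears 40 pixels wide would be roughly 10 m away with these values.
  print(estimate_distance(40.0))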

Which controller to use depends on the required speed, the complexity of the system and the computational weight. LQR and MPC are two popular control methods which are well suited to generate near-optimal solutions, even in complex environments. Unfortunately, they require a lot of tuning and are relatively computationally heavy. A well-known alternative is the PID controller, which does not always generate the optimal solution but is very easy to tune and implement. Therefore, such a controller would best suit this project. For future versions, we do advise investigating more advanced controllers (like LQR and MPC).
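
A minimal PID controller sketch is shown below. The gains are illustrative assumptions, and the error is taken as the horizontal offset of the target from the centre of the camera frame, as described above.

  class PID:
      def __init__(self, kp, ki, kd):
          self.kp, self.ki, self.kd = kp, ki, kd
          self.integral = 0.0
          self.prev_error = 0.0

      def update(self, error, dt):
          self.integral += error * dt
          derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
          self.prev_error = error
          return self.kp * error + self.ki * self.integral + self.kd * derivative

  # Example: keep the target horizontally centred in a 640-pixel-wide frame.
  yaw_pid = PID(kp=0.005, ki=0.0005, kd=0.001)
  target_x = 410.0                                   # pixel position from the detector
  yaw_rate_command = yaw_pid.update(target_x - 320.0, dt=0.033)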

YOLO demo

To demonstrate the ATR capabilities of a neural network we have built a demo. This demo shows a neural network called YOLO (You Only Look Once) trained to detect humans. Based on the location of these humans in the video frame, a Python script outputs control signals to an Arduino. This Arduino controls a two-axis gimbal on which the camera is attached. This way the system can track a person in its camera frame. Due to the computational weight of this specific neural net and the inability to run it on a GPU, the performance is quite limited. In controlled environments, the demo setup was able to run at roughly 3 FPS. This meant that moving out of the camera frame was easy and caused the system to “lose” the person.

Video of the YOLO demo: https://youtu.be/1KOLKnDnK0I

To also show a faster tracker, we have built a demo that tracks so-called ArUco markers. These markers are designed for robotics and are easy to detect with image recognition algorithms, which means the demo can run at a higher framerate. A new Python script determines the error as the distance from the center of the marker to the center of the frame and, using this, generates a control output to the Arduino. The Arduino in turn moves the gimbal. This way, the tracker tries to keep the target at the center of the frame.

Video of the Aruco demo: https://youtu.be/ARxtBIWKNT8

See [6] for all code.
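
The snippet below is not the demo code from [6], but a minimal sketch of the same idea: detect an ArUco marker with OpenCV's contrib aruco module, compute the pixel error from the frame centre and send it to the Arduino over a serial link. The serial port, baud rate and marker dictionary are assumptions, and the aruco API names differ slightly between OpenCV versions.

  import cv2
  import serial

  aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
  params = cv2.aruco.DetectorParameters_create()   # DetectorParameters() in newer OpenCV
  arduino = serial.Serial("/dev/ttyUSB0", 9600)    # assumed port and baud rate
  cap = cv2.VideoCapture(0)

  while True:
      ok, frame = cap.read()
      if not ok:
          break
      corners, ids, _ = cv2.aruco.detectMarkers(frame, aruco_dict, parameters=params)
      if ids is not None:
          # Pixel error between the first marker's centre and the frame centre
          cx, cy = corners[0][0].mean(axis=0)
          h, w = frame.shape[:2]
          error_x, error_y = cx - w / 2, cy - h / 2
          arduino.write(f"{error_x:.0f},{error_y:.0f}\n".encode())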

Confirmation of acquired target

One of the most important steps is actually shooting the net, but after that the drone must know what to do next. When it misses, it must go back to base to reload the net gun. When the target is acquired and the intruder is caught, the drone should deliver it to a safe place, and its mission has succeeded.

To confirm whether the target drone has been caught, a robust and sensitive force sensor is attached between the net and the net launcher. Based on vibrations in the net (caused by the target trying to fly) and the measured weight, the drone can determine whether the other drone has been caught. In addition, the drone is equipped with a microphone that recognizes the high-pitched noise of the other drone's rotors. If this sound has disappeared, the drone uses this information along with its other sensors to determine whether the target has been acquired or not.
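
A minimal sketch of this capture-confirmation logic is given below. The force and noise thresholds are assumptions and would have to be calibrated for the actual net, sensors and target class.

  EMPTY_NET_FORCE_N = 2.0      # force reading of the empty net (assumed)
  PAYLOAD_THRESHOLD_N = 5.0    # extra force that indicates a captured drone (assumed)
  ROTOR_NOISE_LIMIT_DB = 40.0  # rotor-noise level above which the target still flies (assumed)

  def target_captured(net_force_n, rotor_noise_db):
      """True when the net carries significant extra weight and the target's rotors are quiet."""
      has_payload = (net_force_n - EMPTY_NET_FORCE_N) > PAYLOAD_THRESHOLD_N
      rotors_stopped = rotor_noise_db < ROTOR_NOISE_LIMIT_DB
      return has_payload and rotors_stopped

  print(target_captured(net_force_n=9.5, rotor_noise_db=25.0))   # -> True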

Base Station

The base station is where all the data processing for drone detection happens and where all the decisions are made. Drone detection is the most important part of the system, as it should always be accurate to prevent any unwanted situations.

There is a lot of state-of-the-art technology that can be employed for drone detection, such as mmWave radar [16], UWB radar [17], acoustic tracking [18] and computer vision [19]. For use in an airport environment, radar-based systems work best, as they can identify a drone without problems in any weather condition or in instances of high noise, conditions under which computer vision and acoustic tracking, respectively, run into problems.

As for acoustic tracking, there is a lot of interference in an airport, such as airplanes, jets and numerous other machines working on the premises. For such a system to work well, it needs to detect all these noises and differentiate them from the ones that a drone makes, which can prove troublesome. With computer vision, a high degree of feasibility can be achieved with the use of special cameras equipped with night vision, thermal sensors or near-infrared cameras. Such cameras are suggested to be used in addition to radar technology for detecting drones, using data fusion techniques to get the best result, since if only computer vision is used a large number of cameras is needed and the distance to the drone is not computed as accurately as with a radar system.

Base.JPG

The radar systems considered for drone detection are mmWave and UWB radar. While mmWave offers a 20% higher detection range, distance detection and drone distinguishing offer better results with UWB radar systems, so it is the preferred type of radar for our application [20]. Installing more radar systems involves higher cost, but since the user of the system is an airport, accuracy holds higher importance than cost.

Use of such radars enables the user to accurately analyze the Doppler spectrum, which is important for distinguishing between birds and drones and for knowing the type of drone that is interfering in the airport area. By studying the characteristics of UWB radar echoes from a drone and testing with different types of drones, a lot of information can be derived about the attacking drone, such as its range, radial velocity, size, type, shape and altitude [16]. A database with all this data can be formed, which makes it easy to identify an attacking drone once one is located near an airport.

The location of the intruder drone can be estimated by the various UWB radars using triangulation [21]. The accuracy of triangulation depends on the number of radars that detect the drone, but the estimate is good enough for our drone to get the attacking drone in sight. Once this happens, another method is used for accurately locating the 3D position of the attacking drone. 3D localization is realized with UWB radar by employing a transceiver at the ground station and another on our drone. The considered approach uses the two-way time-of-flight technique and can work at communication ranges up to 80 m. A Kalman filter can finally be used to track the range of the target, since the measurements contain noise which we want to filter out. Results have shown that the noisy range likelihood estimates can be smoothed to obtain an accurate range estimate to the attacking drone [20].
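
A minimal sketch of a one-dimensional Kalman filter for smoothing the noisy range measurements is given below; the process and measurement noise variances are illustrative assumptions, not values taken from [20].

  class RangeKalmanFilter:
      def __init__(self, initial_range, process_var=0.5, measurement_var=4.0):
          self.x = initial_range    # estimated range to the target in metres
          self.p = 1.0              # variance of the estimate
          self.q = process_var      # process noise variance (assumed)
          self.r = measurement_var  # measurement noise variance (assumed)

      def update(self, measured_range):
          # Predict: the range is assumed roughly constant between measurements
          self.p += self.q
          # Correct with the new, noisy measurement
          k = self.p / (self.p + self.r)            # Kalman gain
          self.x += k * (measured_range - self.x)
          self.p *= (1.0 - k)
          return self.x

  kf = RangeKalmanFilter(initial_range=75.0)
  for z in [74.2, 76.9, 73.5, 75.8]:                # noisy UWB range readings (metres)
      print(kf.update(z))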

Application

For the interceptor drone system, a mobile application will be developed from which the most important statistics about the interceptor drones can be viewed. This application will communicate with the interceptor drone (or drones, if multiple are deployed) in real time, thus making the whole operation of intercepting and catching an intruder drone much faster. Being able to see the stats of the various interceptor drones in one's fleet is also very useful for the users and the maintenance personnel who will need to service the drones. A good example of how useful the application will be: when a certain drone returns from a mission, the battery levels can immediately be checked using the app and the drone can be charged accordingly.

Note: no commands will be sent from the application to the drone. The application's main purpose is to display the various data coming from the drones, so that users can better analyze the performance of the drones and maintenance personnel can service the drones more quickly.

For building the application, the UI wireframe was designed first. This makes building the final application easier, since the overall layout is already known. The UI wireframes for the application's main pages are shown below, together with details for each of the application's screens explaining the overall functionality.

UI wireframes: Home page, Drone Status page, Drone Map page, Drone Settings page, Security page, Radar Map page, System Settings page, Login page

Securing the Application

Since the application will allow configuration of sensitive information regarding the interceptor drone fleet, users will be required to log in prior to using the app. This ensures that users can only make modifications and change settings for the drone fleet that they own or have permission to change. The accounts require that the owner of the interceptor drone system is verified prior to using the application, which also helps control who has access to this security system.

The overall security of the system is very important to us. As one can imagine, an interceptor drone, or even a fleet of them, could be used with malicious intent. Therefore, both the application and the ground control stations need to be secured in order to prevent such attacks from taking place. This is done by requiring users to make an account for using the application and by verifying the users when installing the interceptor drone system, as described above. Moreover, the protocols used by the drones to send data to the ground station will be secured to prevent attacks. Also, the communication between all the subsystems will happen in a network that is secured and closely monitored for suspicious traffic. Although we strive to build a package that is as secure as possible, this is not the main point of this section, and in the following we concentrate on the application's user interface. The System Security section provides more details on how the whole system will be secured.

Home Page

This is where users get an overview of all the interceptor drones in their fleet. The screen displays an informative picture representative of each drone. By clicking on a drone's card, the user is taken to that drone's Status page. The Home page also has a bottom navigation bar to all the other pages in the application: Security, Location and Settings. When the user logs in to the application, the Home page is the first screen that shows up.

Status Page

Each drone has a status page in the app, which can be accessed by clicking on a drone from the Home page. From this page, users can view whether the drone is operational or not. Any maintenance problems or error logs are also displayed here, which will be of tremendous help for the personnel who have to take care of and maintain the drone fleet. Moreover, the battery level of the drone is displayed on this page. Each individual subsystem of the drone, such as the motors, electronic speed controllers, flight controller, cameras and the net launcher, has its operational status displayed on this page.

Map Page

The Map page can be accessed from the drone's status page. On the Map page the current location of the drone is displayed, and the map can be dragged around for a better view of where the drone is located. This helps not only to monitor the progress of the interceptor drone during a mission, but also in situations where the drone might not be able to fly back to base successfully, in which case it would need to be recovered from its last known location.

Settings Page

The Settings page can be accessed from the drone's status page. On the Settings page the users can select different configurations for the drone and also view a number of graphs displaying useful information such as battery drain levels.

Security Page

This page can be accessed from the Home page by clicking on the Security page button in the bottom navigation menu. From this page users can change settings related to the security of the protected asset, such as an airport or a governmental institution. For example, the no-fly zone coordinates can be configured here, together with the sensitivity levels for the whole detection system.

Radar Map Page

This page can be accessed from the Home page by clicking on the Radar Map page button in the bottom navigation menu. On this page, the users can view the location of all the drones displayed on a radar-like map. This differs from a drone's Map page in that it shows where all the drones in the fleet are positioned and gives a better overall view of the whole system. It also displays all the airplane-related traffic around the airport.

System Settings Page

This page can be accessed from the Home page by clicking on the Settings page button in the bottom navigation menu. From this page users can change settings that are not related to the security of the system, such as assigning different identifiers for the drones displayed on the Home page and changing the configuration settings for the communication between the drones and the ground stations.

Login Page

This page is shown to the user each time the application is opened. It requires an email and password. If these credentials are correct, the user is expected to input a token which is generated at the base station (either by a token generator provided with the system or by software running in a secured environment, such as a computer which is not connected to the internet). Once the token is entered, the user can proceed to log in, which gives access to all the other application features. This way we have two-factor authentication and we can ensure a greater level of security overall. For more details about how the communication between the drone and the base station will be secured, please read the System Security section.
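
The base-station token generator is not specified further here, but one possible implementation of the second factor is a time-based one-time password (TOTP), sketched below with the pyotp library; the shared secret and how it is provisioned on the offline generator are assumptions.

  import pyotp

  # The shared secret is generated once during installation and stored only on
  # the offline token generator and the authentication server (assumption).
  shared_secret = pyotp.random_base32()
  totp = pyotp.TOTP(shared_secret)

  def second_factor_ok(user_entered_token):
      """Accept the login only if the token matches the current TOTP value."""
      return totp.verify(user_entered_token)

  # The offline generator would display totp.now(); the user types it into the app.
  print(second_factor_ok(totp.now()))   # -> True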

User Testing

To test whether the application and its user interface are intuitive and easy to use, a questionnaire was built. The main purpose was to see what needed to be improved for the application to be useful for the end users. We will provide both the questionnaire and its results once the user testing phase for the application is finished.

We used Google Forms to build a questionnaire in order to understand more not only about the application but also about the system as a whole. The form can be found at the following link together with the results. From analysing the data, it was clear that a lot of people working in the industry are well aware of the problems concerning rogue drones flying around airports, which is our main focus for the project. For the application, although the majority had favorable opinions, there is still room for improvement. In the future, more options can be added to both the security screen and the settings pages. Moreover, the radar map can be improved to provide more accurate data and to allow the user to better customize certain options. A search function for locating flights would also be a nice addition to this screen. All in all, we do think that the application has clear benefits for the drone interception system.

Security

System Security

This section provides more details on how the whole system (base station and drone fleet) will be secured. For passing data between the base station and the drone, the Transport Layer Security (TLS) protocol will be used. This protocol is widely used over the internet for applications that involve sensitive information, such as bank payments. It is a cryptographic protocol designed to provide communication security over a computer network, and it aims primarily to provide privacy and data integrity between two or more communicating applications, such as a server and a client.

When using the TLS protocol to secure the data being communicated, the system will use a symmetric key for encryption and decryption. Moreover, the identity of the communicating parties (in this case the base station and the drones) can be authenticated using public-key cryptography. Public-key cryptography entails that all parties in the network have a public key which is known by everyone. Associated with this public key, each party also has a private key which must be kept secret. For this specific case, the base station will have a private key generated upon installation of the system (at an airport or other facility). This key will be kept in cold storage somewhere offline, protected from any entity that might be interested in stealing it, and only accessed when required. Moreover, each drone will also be assigned a private key to be used in the communication protocol, which will be stored on the drone's internal storage. Since the drones will not have an external IP address and can only be accessed from the base station's servers, this will be secure. Upon leaving the base station, a TLS handshake between the drone and the server can be made to ensure the connection is established throughout the mission and data can be securely passed back.
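
As an illustration, the sketch below shows how the drone could open a mutually authenticated TLS connection to the base station using Python's ssl module. The host name, port and certificate file names are assumptions, not part of the actual design.

  import socket
  import ssl

  context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
  context.load_verify_locations("base_station_ca.pem")                 # trust the base station's CA
  context.load_cert_chain(certfile="drone.pem", keyfile="drone.key")   # the drone's own key pair

  with socket.create_connection(("basestation.local", 8883)) as raw_sock:
      with context.wrap_socket(raw_sock, server_hostname="basestation.local") as tls_sock:
          tls_sock.sendall(b"STATUS battery=87 lat=51.44 lon=5.49\n")
          reply = tls_sock.recv(1024)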

Finally, using the TLS protocol for passing data between the drones and the base station also provides reliability, because each transmitted message includes an integrity check using a message authentication code to prevent undetected loss or alteration of the data during transmission. Messages that do not pass the integrity check will be discarded by the server and resent by the drone. Similarly, all the data that is sent from the base station's servers (only in case of a mission abort) will also be encrypted. Once the data reaches the drone, it will be checked, and if it does not match the server's signature it will not be executed.

Using the Transport Layer Security (TLS) protocol for communicating data between the base station and the drone fleet makes the overall system secure and protected from any attacker who might want to gain access to the system in order to send commands to the drones or malicious data to the servers.

Application Security

The drone can be designed to operate autonomously or with a human controlling it. Both methods have their own pros and cons. If we were to design the drone to operate fully autonomously, it could completely prevent the possibility of someone hacking into it remotely and preventing it from intercepting an attacking drone. Even though this operation mode would be preferred, it means that if something were to go wrong and the drone started making dangerous maneuvers, human beings could be endangered. Because of this, the main legislation governing the essential health and safety requirements for machinery at EU level requires the implementation of an emergency stop button in such types of machines [22]. This legislation makes a completely autonomous system illegal, so another solution should be implemented where the user has some control over the drone.

If the drone is to be controlled by a user, what this user can control is really important. The end goal of such an interceptor drone is to be as autonomous as possible, leaving as little as possible in the hands of humans. But what should the interceptor drone system do once it has detected an unwanted drone inside the operating area of the airport? Should it intercept immediately or wait for confirmation from an airport employee?

Drone flights are unacceptable in any airport area, so the goal would be to get rid of the intruder as soon as possible. Introducing a human to confirm the interference of the attacking drone would add an extra step to the whole process, which could cause unwanted delays and cost the airport a lot of money, as the Gatwick incident has shown. The best thing to do is for these drones to try to intercept the target as soon as it is detected, as every drone flying in the airport perimeter can be considered unwanted (hostile), whoever is operating it.

In the meantime, the system should notify its user that a target has been detected. This way the responsible employee can inform Air Traffic Control (ATC) to hold the incoming and departing planes. While this can cause inconvenience and create more queues at the airport, it is the best available option. Also, in case there is a real emergency with planes holding in the pattern for too long, ATC can coordinate with the pilots for a safe landing. The alternative could be too costly, as a drone could hit the engine of an airplane with devastating results. When a drone is detected by our system, a human will monitor the situation and pass this information to ATC. This is to avoid false positives, since pausing all air traffic would cause chaos and incur big losses for the airport and must therefore be avoided at all costs.

The only control over the drone must be the emergency stop button. As this button is of high importance, because it can render the whole system useless if activated, it must be completely secured against any outside actor trying to hack into it. Only putting a stop button in the application that we designed would not be secure enough, because some backdoor could be found and this would compromise the whole system. That is why physical buttons are added to the system, one in the ATC tower and one at the drone fleet HQ. Having access to only one of them is not enough to stop the system, as both need to be pressed to stop the operation. Only a handful of people authorized by the airport would have the power to stop these drones, while anyone trying to maliciously disable the system would need physical access to the airport, which, considering its high level of security, would be hard to achieve.
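
A minimal sketch of this two-button logic, assuming the two stop signals are read in as simple booleans:

  def emergency_stop_active(atc_button_pressed, hq_button_pressed):
      """The stop command is only valid when both independent buttons agree."""
      return atc_button_pressed and hq_button_pressed

  print(emergency_stop_active(True, False))   # only the ATC button: fleet keeps flying
  print(emergency_stop_active(True, True))    # both buttons: abort all missions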

State of the Art

In this section the State of the Art (or SotA) concerning our project will be discussed.

Tracking/Targeting

To target a moving target from a moving drone, a way of tracking the target is needed. A lot of articles on how to do this, or related to this problem have been published:

  • Continuous wave radar for drone detection. This paper is about the detection of drones by FMCW-type radars, i.e. mmWave radars, using the Doppler effect. It calculates the ranges at which these radars need to be from the drone for detection to be successful and validates them with real-life trials [16].
  • Moving Target Classification and Tracking from Real-time Video. This paper describes a way of extracting moving targets from a real-time video stream and classifying them into predefined categories. This is a useful technique which can be mounted on a ground station or on a drone to extract relevant data about the target. [23]
  • Target tracking using television-based bistatic radar. This article describes a way of detecting and tracking airborne targets from a ground based station using radar technology. In order to determine the location and estimate the target’s track, it uses the Doppler shift and bearing of target echoes. This allows for tracking and targeting drones from a large distance. [24]
  • Detecting, tracking, and localizing a moving quadcopter using two external cameras. In this paper a way of tracking and localizing a drone using the bilateral view of two external cameras is presented. This technique can be used to monitor a small airspace and detect and track intruders. [25]
  • Aerial Object Following Using Visual Fuzzy Servoing. In this article a technique is presented to track a 3D moving object from another UAV based on the color information from a video stream with limited information. The technique as presented is able to do following and pursuit, flying in formation, as well as indoor navigation. [26]
  • Patent for an interceptor drone tasked to the location of another tracked drone. This patent proposes a system which includes LIDAR detection sensors and dedicated tracking sensors. [27]
  • Patent for detecting, tracking and estimating the speed of vehicles from a moving platform. This patent proposes an algorithm operated by the on-board computing system of an unmanned aerial vehicle that is used to detect and track vehicles moving on a roadway. The algorithm is configured to detect and track the vehicles despite motion created by movement of the UAV.[28]
  • Patent for scanning environments and tracking unmanned aerial vehicles. This patent refers to systems and methods for scanning environments and tracking unmanned aerial vehicles within the scanned environments. It also provides a method for identifying points of interest in an image and generating a map of the region. [29]
  • Algorithms based on multiplayer differential game theory, such as the two-player decomposition approach, the maximum principle approach and the minimum-time decomposition approach, are presented, each arriving at an efficient way of intercepting an attacking UAV but focusing on optimizing a different variable based on the number of drones controlled and attacking. [30]

Autonomous flying

  • Patent for flight control using computer vision. This patent provides methods for computing a three-dimensional relative location of a target with respect to the reference aerial vehicle based on the image of the environment. [31]
  • Cooperative Control Method Algorithm. This paper presents experimental results for the simultaneous interception of targets by a team of UAVs. It includes an overview of the cooperative control strategy, which can also be used for this project's drones. [32]
  • Framework for Autonomous On-board Navigation. This article presents a framework for independent autonomous flying of a drone solely based on its onboard sensors. In this framework, the high-level navigation, computer vision and control tasks are carried out in an external processing unit. [33]
  • Towards a navigation system for autonomous indoor flying. This article provides a framework for autonomous indoor flying for small UAVs derived from existing systems for ground-based robots. This system can be utilized for indoor drone interception, where dodging objects and humans is one of the most important features. [34]
  • Mission path following for an autonomous unmanned airship. In this project (AURORA) multiple flight path following techniques through a set of pre-defined points are described and compared through simulation. The tests were conducted both with and without wind to show the performance of the controllers. [35]
  • Patent for autonomous tracking and surveillance. This patent refers to a method of protecting an asset by imposing a security perimeter around it which is further divided in a number of zones protected by unmanned aerial vehicles.[36]

Stopping drones

  • One way of catching a drone is by shooting at it with a net. Extensive research has been done on shooting nets, mainly for wildlife purposes. [37][38]
  • Looking at drones shooting nets specifically, pneumatic launchers have been implemented successfully. [39]
  • Electromagnetic launchers present another opportunity for launching a net at another drone. This article explains the mathematical model of such a system and shows the power of electromagnetic launchers, which offer very high accelerations.[40]
  • This paper shows that another option for shooting a net is a hybrid pneumatic-electromagnetic launcher, giving the mathematical model of this system and simulation results. It shows that a hybrid system offers much greater control over the acceleration than a purely pneumatic or purely electromagnetic launcher. [41]

General design of drones

  • Design and control of quadrotors with application to autonomous flying. This paper describes a design of a micro quadrotor, its simulation and linear and nonlinear control techniques. The techniques presented in this paper are broadly applicable and can be used in other drone environments, like drone interception activities. [42]
  • Aerodynamics and control of autonomous quadrotor helicopters in aggressive maneuvering. This article improves on previous work on aerodynamic effects impacting quadrotors. These are used to develop new control techniques that allow for more aggressive maneuvering. This is also useful in the pursuit of another agile drone or UAV. [43]

One big challenge surrounding drones is that their flight time is very limited. A way to mitigate this and to keep drones constantly surveilling the area they are assigned to is autonomous mid-air battery swapping. [44]

Collision with Drones

The FAA has explained why it is necessary to determine the potential severity of sUAS mid-air collisions with aircraft in order to define an Equivalent Level of Safety to manned aviation. The organization has created four reports based on various drone collisions with manned aircraft and the dangers that these situations entail. The four reports are presented below.

Volume I: UAS Airborne Collision Severity Evaluation: Summary of Structural Evaluation [45]

Volume II: UAS Airborne Collision Severity Evaluation: Quadcopter [46]

Volume III: UAS Airborne Collision Severity Evaluation: Fixed-Wing UAS [47]

Volume IV: UAS Airborne Collision Severity Evaluation: Engine Ingestion [48]

Other

Although the use of drones for interception is not directly military, this application can be seen as military in nature. In a paper by Bradley Jay Strawser, the duty to employ UAVs is discussed; it describes why there is, in principle, nothing wrong with using UAVs. [49]

There is also an existing company in Delft, Delft Dynamics, that makes drone-intercepting drones; they have built the DroneCatcher.[13]

References

  1. Allianz Global Corporate & Specialty (2016). Rise of the Drones Managing the Unique Risks Associated with Unmanned Aircraft Systems
  2. Federal Aviation Administration (FAA) (2017). UAS Airborne Collision Severity Evaluation Air Traffic Organization, Washington, DC 20591
  3. From Wikipedia, the free encyclopedia (2018). Gatwick Airport drone incident Wikipedia
  4. Pamela Gregg (2018). Risk in the Sky? University of Dayton Research Institute
  5. ASSURE (2017). ASSURE Alliance for System Safety of UAS through Research Excellence (ASSURE)
  6. From Wikipedia, the free encyclopedia (2019). Unmanned aerial vehicle Wikipedia
  7. From Wikipedia, the free encyclopedia (2019). Serbia v Albania (UEFA Euro 2016 qualifying) Wikipedia
  8. Alex Hern, Gwyn Topham (2018). How dangerous are drones to aircraft? The Guardian
  9. EU Parliament (2018). Drones: new rules for safer skies across Europe Civilian Drones: Rules that Apply to European Countries
  10. European Union Aviation Safety Agency (2019). Civil Drones European Union Aviation Safety Agency
  11. OpenWorks Engineering [1]
  12. MALOU-tech [2]
  13. Delft Dynamics. DroneCatcher
  14. D. Lee, J. Zhou, W. Tze Lin Autonomous battery swapping system for quadcopter (2015)
  15. C. Teuliere, L. Eck, E. Merchand (2011) Chasing a moving target from a flying UAV 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems
  16. J. Drozdowicz, M. Wielgo, P. Samczynski, K. Kulpa, J. Krzonkalla, M. Mordzonek, M. Bryl, and Z. Jakielaszek, “35 GHz FMCW drone detection system,” in Proc. Int. Radar Symposium (IRS). IEEE, 2016, pp. 1–4.
  17. R. J. Fontana, E. A. Richley, A. J. Marzullo, L. C. Beard, R. W. Mulloy, and E. Knight, “An ultra-wideband radar for micro air vehicle applications,” in Proc. IEEE Conf. Ultra Wideband Syst. Technol., 2002, pp. 187–191. https://ieeexplore.ieee.org/abstract/document/1006344/
  18. M. Benyamin and G. H. Goldman, “Acoustic Detection and Tracking of a Class I UAS with a Small Tetrahedral Microphone Array,” Army Research Laboratory Technical Report (ARL-TR-7086), DTIC Document, Tech. Rep., Sep. 2014.
  19. S. K. Boddhu, M. McCartney, O. Ceccopieri, and R. L. Williams, “A collaborative smartphone sensing platform for detecting and tracking hostile drones,” SPIE Defense, Security, and Sensing, pp. 874211–874211, 2013. https://www.researchgate.net/publication/271452517_A_collaborative_smartphone_sensing_platform_for_detecting_and_tracking_hostile_drones
  20. Güvenç, İ., Ozdemir, O., Yapici, Y., Mehrpouyan, H., & Matolak, D., “Detection, localization, and tracking of unauthorized UAS and jammers,” in 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC), pp. 1–10. IEEE. https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20170009465.pdf
  21. Hyunwook Park, Jaewon Noh and Sunghyun Cho, “Three-dimensional positioning system using Bluetooth low-energy beacons,” International Journal of Distributed Sensor Networks, 12. 10.1177/1550147716671720. https://journals.sagepub.com/doi/pdf/10.1177/1550147716671720
  22. EU Machinery Legislation (2014) Emergency Stop Devices Emergency Stop Devices (MD Annex I 1.2.4.3)
  23. A.J. Lipton, H.Fujiyoshi, R.S. Patil Moving target classification and tracking from real-time video (1998) Proceedings Fourth IEEE Workshop on Applications of Computer Vision. WACV'98 (Cat. No.98EX201)
  24. P.F. Howland Target tracking using television-based bistatic radar (1999) IEE Proceedings - Radar, Sonar and Navigation Volume 146, Issue 3 p. 166 – 174
  25. M. Dreyer, S. Raj, S. Gururajan, J. Glowacki Detecting, Tracking, and Localizing a Moving Quadcopter Using Two External Cameras (2018) 2018 Flight Testing Conference, AIAA AVIATION Forum, (AIAA 2018-4281)
  26. O. Méndez, M. Ángel, M. Bernal, I. Fernando, C. Cervera, P. Alvarez, M. Alvarez, L. Luna, M. Luna, C. Viviana Aerial Object Following Using Visual Fuzzy Servoing (2011)
  27. Brian R. Van Voorst (2017). Intercept drone tasked to location of lidar tracked drone U.S. Patent No. US20170261604A1. Washington, DC: U.S. Patent and Trademark Office
  28. Eric Saund Christopher. Paulson Gregory. Burton Eric Peeters. (2014). System and method for detecting, tracking and estimating the speed of vehicles from a mobile platform U.S. Patent No. US20140336848A1. Washington, DC: U.S. Patent and Trademark Office
  29. Asa Hammond Nathan. Schuett Naimisaranya Das Busek. (2016). Scanning environments and tracking unmanned aerial vehicles U.S. Patent No. US20160292872A1. Washington, DC: U.S. Patent and Trademark Office
  30. Johan M. Reimann USING MULTIPLAYER DIFFERENTIAL GAME THEORY TO DERIVE EFFICIENT PURSUIT-EVASION STRATEGIES FOR UNMANNED AERIAL VEHICLES (2007) School of Electrical and Computer Engineering, Georgia Institute of Technology
  31. Guy Bar-Nahum. Hong-Bin Yoon. Karthik Govindaswamy. Hoang Anh Nguyen. (2018). Flight control using computer vision U.S. Patent No. US20190025858A1. Washington, DC: U.S. Patent and Trademark Office
  32. Timothy W. McLain and Randal W. Beard, Jed M. Kelsey Experimental Demonstration of Multiple Robot Cooperative Target Intercept (2007)
  33. J. J. Lugo, A. Zell Framework for Autonomous On-board Navigation with the AR.Drone (2013)
  34. S. Grzonka, G. Grisetti, W. Burgard Towards a navigation system for autonomous indoor flying(2009)
  35. J.R. Azinheira, E. Carneiro de Paiva, J.G. Ramos, S.S. Beuno Mission path following for an autonomous unmanned airship (2000)
  36. Kristen L. Kokkeby Robert P. Lutter Michael L. Munoz Frederick W. Cathey David J. Hilliard Trevor L. Olson (2008). System and methods for autonomous tracking and surveillance U.S. Patent No. US20100042269A1. Washington, DC: U.S. Patent and Trademark Office
  37. STEPHEN L. WEBB, JOHN S. LEWIS, DAVID G. HEWITT, MICKEY W. HELLICKSON, FRED C. BRYANT Assessing the Helicopter and Net Gun as a Capture Technique for White‐Tailed Deer (2008) The Journal of Wildlife Management Volume 72, Issue 1,
  38. Andrey Evgenievich Nazdratenko (2007) Net throwing device U.S. Patent No. US20100132580A1. Washington, DC: U.S. Patent and Trademark Office
  39. Mohammad Rastgaar Aagaah, Evandro M. Ficanha, Nina Mahmoudian (2016) Drone with pneumatic net launcher U.S. Patent No. US20170144756A1. Washington, DC: U.S. Patent and Trademark Office
  40. Leubner, Karel & Laga, Radim & Dolezel, Ivo. (2015). Advanced Model of Electromagnetic Launcher. Advances in Electrical and Electronic Engineering. 13. 223-229. 10.15598/aeee.v13i3.1419
  41. Domin, Jaroslaw & Kluszczyński, K. (2013). Hybrid pneumatic-electromagnetic launcher - general concept, mathematical model and results of simulation. 89. 21-25.
  42. S. Bouabdallah, R. Siegwart Design and control of quadrotors with application to autonomous flying (2007)
  43. H. Huang, G. M. Hoffmann, S. L. Waslander, C. J. Tomlin Aerodynamics and control of autonomous quadrotor helicopters in aggressive maneuvering (2009)
  44. Jacobsen, Reed; Ruhe, Nikolai; and Dornback, Nathan Autonomous UAV Battery Swapping (2018)
  45. ASSURE (2017). Volume I: UAS Airborne Collision Severity Evaluation: Summary of Structural Evaluation Alliance for System Safety of UAS through Research Excellence (ASSURE)
  46. ASSURE (2017). Volume II: UAS Airborne Collision Severity Evaluation: Quadcopter Alliance for System Safety of UAS through Research Excellence (ASSURE)
  47. ASSURE (2017). Volume III: UAS Airborne Collision Severity Evaluation: Fixed-Wing UAS Alliance for System Safety of UAS through Research Excellence (ASSURE)
  48. ASSURE (2017). Volume IV: UAS Airborne Collision Severity Evaluation: Engine Ingestion Alliance for System Safety of UAS through Research Excellence (ASSURE)
  49. Bradley Jay Strawser Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles Journal of Military Ethics, Volume 9, 2010 – Issue 4