PRE2024 1 Group2

From Control Systems Technology Group

Group Members

Name Student ID Department
Max van Aken 1859455 Applied Physics
Robert Arnhold 1847848 Mechanical Engineering
Tim Damen 1874810 Applied Physics
Ruben Otter 1810243 Computer Science
Raul Sanchez Flores 1844512 Computer Science / Applied Mathematics

Problem Statement

Drones play a major role in modern warfare. They are relatively cheap to produce and can cause substantial harm, while relatively little is done to counter them. Large anti-drone systems exist that protect important areas from drone attacks, but individuals at the front line are not protected by such systems, as they are expensive and too large to carry around. This leaves individuals at the front line vulnerable to drone attacks. We aim to show that an anti-drone system can be made cheap, lightweight and portable in order to protect these vulnerable individuals.

Objectives

To show that an anti-drone system can be made cheap, lightweight and portable, we will do the following:

  • Explore and determine ethical implications of the portable device.
  • Determine how drones and projectiles can be detected.
  • Determine how a drone or projectile can be intercepted and/or redirected.
  • Build a prototype of this portable device.
  • Create a model for the interception.
  • Prove the system’s utility.

Planning

We will work on this project over the upcoming 8 weeks. The table below shows when we aim to finish the different tasks within those 8 weeks.

Week Task
1 Initial planning and setting up the project.
2 Literature research.
3 Create ethical framework.
Conduct an interview with an expert to confirm and construct the use cases.
Start constructing prototype and software.
Determine potential problems.
4 Continue constructing prototype and software
5 Finish prototype and software
6 Testing prototype to verify its effectiveness and use cases.
Evaluate testing results and make final changes.
7 Create final presentation.
8 Finish Wiki page.

Risk Evaluation

A risk evaluation matrix can be used to determine where the risks lie within our project. It is based on two factors: the consequence if a task is not fulfilled and the likelihood that this happens. Both factors are rated on a scale from 1 to 5, and the matrix below then yields a final risk level: low, medium, high or critical. Knowing the risks beforehand makes it possible to prevent failures, as it is known where special attention is required.
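The lookup can be sketched as a small function over the product of the two ratings. The thresholds below are an assumed reading of a standard 5×5 risk matrix, chosen to be consistent with the risk table in this section; the project's actual matrix may draw the boundaries differently.

```python
def risk_level(consequence: int, likelihood: int) -> str:
    """Map consequence and likelihood ratings (each 1-5) to a risk category.

    Thresholds are an assumed reading of a standard 5x5 risk matrix.
    """
    if not (1 <= consequence <= 5 and 1 <= likelihood <= 5):
        raise ValueError("ratings must be between 1 and 5")
    score = consequence * likelihood
    if score <= 3:
        return "Low"
    if score <= 9:
        return "Medium"
    if score <= 16:
        return "High"
    return "Critical"

print(risk_level(5, 2))  # e.g. proving the system's utility
```

With these thresholds, (1,1) and (1,2) come out Low, (4,1), (3,3) and (5,1) Medium, and (5,2) High, matching the table below.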


Risk evaluation matrix
Task Consequence (1-5) Likelihood (1-5) Risk
Collecting 25 articles for the SoTA 1 1 Low
Interviewing a front line soldier 1 2 Low
Finding features for our system 4 1 Medium
Making a prototype 3 3 Medium
Making the wiki 5 1 Medium
Finding a detection method for drones and projectiles 4 1 Medium
Determining an (ethical) method to intercept or redirect drones and projectiles 5 1 Medium
Proving the system's utility 5 2 High

Interviews

TO BE ADDED

Users and their Requirements

We currently have two main usages for this project in mind, which are the following:

  • Military forces facing threats from drones and projectiles.
  • Privately-managed critical infrastructure in areas at risk of drone-based attacks.

The users of the system will require the following:

  • Minimal maintenance
  • High reliability
  • System should not pose an additional threat to its surroundings
  • System must be personally portable
  • System should work reliably in dynamic, often extreme, environments
  • System should be scalable and interoperable in concept

Ethical Framework

Description

The goal of this project is to create an easily portable system that can be used as the last line of defense against incoming projectiles. In order to come up with a sufficient ethical framework, this description needs to be specified and categorized further. The device under consideration is capable of neutralizing an incoming projectile. However, an incoming projectile does not always have to be neutralized, and this can differ between situations. Because the main purpose of this device is to be used in combat circumstances, we will focus on this sort of scenario, which is described as follows:

  1. This device could be used in war-zone situations. For example, if soldiers are in trenches it is hard for the enemy to hit them; a solution that is nowadays used in Ukraine is therefore drones [???]. A war zone is in general a rapidly evolving environment, so soldiers and equipment need to be able to adapt to that [1]. To give certainty that the device will neutralize the harmful projectile, there needs to be an extensive software framework that can, for example, distinguish birds from drones, but can also detect grenades dropped from higher altitude.

In this scenario the device will be actively used in a war zone. It should therefore comply with the Geneva Conventions: it must be able to examine the impact its actions would have on civilians, and it must be certain it will not harm civilians directly or indirectly. To illustrate, if a kamikaze drone is heading towards a military vehicle, the device will either redirect the drone or let it explode above the ground, depending on the type of interception chosen. If the drone is redirected away from the military vehicle, it might injure civilians. If the drone explodes above the ground, the fragments will still shoot downwards, causing injuries over a larger area, possibly including civilians [???]. This not only violates the Geneva Conventions, but also misses the point of preventing harm.

In principle this can be seen as a trolley problem in disguise. On the one hand there are the soldiers who are killed if the system is not activated; on the other hand there are the civilians who are killed if it is. To make sure this does not happen, the device must be able to choose a desired and achievable location to redirect an incoming drone towards. Seen from a utilitarian point of view, we want to minimize harm in order to maximize happiness. Triggering the drone's explosion in the air, with proper warnings for the surrounding people, might achieve this maximization; this is looked into further in chapter ????.

After all, we have to remember that a situation in which there is no harmless place to redirect the drone to is fairly specific and by no means an everyday problem. Moreover, this specific problem can be seen as a side effect that is expected to be far outweighed by the gains and benefits of the product.


[1] https://ndupress.ndu.edu/Portals/68/Documents/jfq/jfq-101/jfq-101_78-83_Lynch.pdf?ver=Gu3iNHVHh5wYTbAPOqwd7Q%3d%3d

Is it ethical to deflect or redirect drones and projectiles?[2][3][4][5][6][7][8]

The goal in this project is to deflect or redirect drones and projectiles so that soldiers are safe. But deflected or redirected drones and projectiles do not simply disappear; chances are that other people get injured by them. So is it even ethical to deflect or redirect these drones and projectiles?


Let's start by noting that this piece of technology is in no way meant to harm anyone, but instead to keep people safe. However, according to International Humanitarian Law (IHL), article 49 specifically, an attack is any act of violence, whether in offence or defence. This means that the deflection or redirection of drones and projectiles is itself seen as an attack. Since this technology is used in war zones, where attacks are common, this in itself need not be a big problem: deflecting or redirecting an attack is simply a continuation of an already ongoing attack. This would suggest that it is ethical to deflect or redirect drones and projectiles. However, deflection or redirection can also harm other people on the defending side, or even civilians; there is no guarantee that the attack will be deflected or redirected towards the attacking side. In this respect the technology is comparable to autonomous cars: both are made with the intention of protecting people, both have the side effect that they can bring harm to people, and both can make life-or-death decisions. A lot of research has gone into the ethics of autonomous cars, and this information can be used to study the ethics of our product. This will be done by looking at different scenarios that could be encountered in war.


In a one-on-one situation in war, with one person on the attacking side and one on the defending side, deflecting or redirecting drones and projectiles is ethical, as argued above: an already existing attack is simply continued. Since it is a one-on-one situation, there is no one else who can possibly get harmed.


If we expand the situation to a battlefield with more than one person on the attacking side and more than one person on the defending side, there is a chance that the deflected or redirected drone or projectile hits a member of your own team, or someone on the other team who did not send the attack. The second case is not a big problem: in a war you may attack those who are fighting you, and even if this person did not send the drone or projectile towards you, he is on the attacking side and intends to fight you. Deflecting or redirecting a drone or projectile towards a member of your own team, however, means attacking someone who is not attacking you, which is not allowed. This can be compared to an autonomous car that, in case of an accident, tries to minimize the risk for the driver by increasing the risk for the passengers, while in a perfect scenario the car should minimize the risk for everyone involved in the accident and should not be biased towards one side or one person.

Although not all risks of autonomous cars can be resolved, we still use them because they have the potential to reduce the number of accidents, especially as the technology advances. For the same reason, deflection or redirection of drones and projectiles may be done in this scenario, provided we aim to further develop the technology to decrease potential harm and remove bias from the system.


A final expansion of the situation is one where civilians are nearby who can be harmed if a drone or projectile is deflected or redirected. For autonomous cars this is comparable to a situation where the car, in order to prevent a collision, hits a pedestrian who posed no threat to the person in the car (the car being autonomous). The ethics of autonomous cars teaches us that applying the technology in this situation is not straightforward, since it depends on how the technology is designed, used and regulated, which requires a multidisciplinary approach.

Specifications

ID Requirement Preference Constraint Category Priority Testing Method
R1 Detect Rogue Drone Detection range of at least 30m No false negatives Hardware & Software M Simulate rogue drone scenarios in the field
R2 Object Detection 100% recognition accuracy Detects even small, fast-moving objects Software M Test with various object sizes and speeds in the lab
R3 Detect Drone Direction Accuracy of 90% Must account for evasive drone movements Hardware & Software M Use drones flying in random directions for validation
R4 Detect Drone Speed Accuracy within 5% of actual speed Must be effective up to 20m/s Hardware & Software M Measure speed detection in controlled drone flights
R5 Detect Projectile Speed Accurate speed detection for fast projectiles Must handle speeds above 10m/s Hardware & Software M Fire projectiles at varying speeds and record accuracy
R6 Detect Projectile Direction Accuracy within 5 degrees No significant deviation in direction detection Hardware & Software M Test with fast-moving objects in random directions
R7 Track Drone with Laser Tracks moving targets within a 1m² radius Must follow targets precisely within the boundary Hardware S Use a laser pointer to follow a flying drone in real-time
R8 Can Intercept Drone/Projectile Drone/Projectile is within the 1m² square Must not damage surroundings or pose threat Hardware C Test in a field, using projectiles and drones in motion
R9 Low Cost-to-Intercept Interception cost under $50 per event Hardware & Software S Compare operational cost per interception in trials
R10 Low Total Cost Less than $2000 Should include all components (detection + net) Hardware C Budget system components and assess affordability
R11 Portability System weighs less than 3kg Hardware C Test for total weight and ease of transport
R12 Easily Deployable Setup takes less than 5 minutes Must require no special tools or training Hardware C Timed assembly by users in various environments
R13 Quick Reload/Auto Reload Reload takes less than 30 seconds Must be easy to reload in the field Hardware C Measure time to reload net launcher in real-time scenarios
R14 Environmental Durability Operates in temperatures between -20°C and 50°C Must work reliably in rain, dust, and strong winds Hardware W Test in extreme weather conditions (wind, rain simulation)
(Figures: mass predictions 1–3.)


Mechanical design

TO BE ADDED

Detection

Drone Detection

The Need for Effective Drone Detection

With the rapid advancement and production of unmanned aerial vehicles (UAVs), particularly small drones, new security challenges have emerged for the military sector.[9] Drones can be used for surveillance, smuggling, and launching explosive projectiles, posing threats to infrastructure and military operations.[9] Within our project we will mostly be looking at the threat of drones launching explosive projectiles. Our objective is to develop a portable, last-line-of-defense system that can detect drones and intercept and/or redirect the projectiles they launch. An important aspect of such a system is the capability to reliably detect drones in real time, possibly in dynamic environments.[10] The challenge is to create a solution that is not only effective but also lightweight, portable, and easy to deploy.

Approaches to Drone Detection

Numerous approaches have been explored in the field of drone detection, each with its own set of advantages and disadvantages.[11][10] The main methods include radar-based detection, radio frequency (RF) detection, acoustic-based detection, and vision-based detection.[9][11] It is essential for our project to analyze these methods within the context of portability and reliability, to identify the most suitable method, or combination of methods.

Radar-Based Detection

Radar-based systems are considered one of the most reliable methods for detecting drones.[11] Radar systems transmit short electromagnetic waves that bounce off objects in the environment and return to the receiver, allowing the system to determine the object's attributes, such as its range, velocity, and size.[11][10] Radar is especially effective in detecting drones in all weather conditions and can operate over long ranges.[9][11] Radars such as active pulse-Doppler radar can track the movement of drones and distinguish them from other flying objects based on the Doppler shift caused by the motion of their propellers (the micro-Doppler effect).[9][10][11]

Despite its effectiveness, radar-based detection comes with certain limitations that must be considered. First, traditional radar systems are rather large and require significant power, making them less suitable for a portable defense system.[11] Additionally, radar can struggle to detect small drones flying at low altitudes due to their limited radar cross-section (RCS), particularly in cluttered environments like urban areas.[11] Millimeter-wave radar technology, which operates at high frequencies, offers a potential solution by providing better resolution for detecting small objects, but it is also more expensive and complex.[11][9]
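To make the Doppler relation concrete, the toy sketch below simulates the slow-time phase rotation a moving target imposes across radar pulses and recovers its radial speed from an FFT peak. The carrier frequency, pulse repetition frequency and speed are illustrative assumptions, not specifications of any real system.

```python
import numpy as np

# A target at radial speed v shifts the carrier by f_d = 2*v/wavelength.
# We simulate the per-pulse phase progression and recover v from the FFT.
c = 3e8
f_carrier = 24e9            # assumed 24 GHz radar carrier
wavelength = c / f_carrier
prf = 10_000                # pulse repetition frequency [Hz]
n_pulses = 512
v_true = 15.0               # assumed drone radial speed [m/s]

f_doppler = 2 * v_true / wavelength
t = np.arange(n_pulses) / prf
echo = np.exp(2j * np.pi * f_doppler * t)   # slow-time phase rotation

spectrum = np.abs(np.fft.fft(echo))
freqs = np.fft.fftfreq(n_pulses, d=1 / prf)
v_est = freqs[np.argmax(spectrum)] * wavelength / 2
print(round(v_est, 1))  # recovered radial speed [m/s]
```

The micro-Doppler effect mentioned above would show up as additional sidebands around this peak, modulated by the propeller rotation rate.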

Radio Frequency (RF)-Based Detection

Another common method is detecting drones through radio frequency (RF) analysis.[9][10][11][12] Most drones communicate with their operators via RF signals, using the 2.4 GHz and 5.8 GHz bands.[9][11] RF-based detection systems monitor the electromagnetic spectrum for these signals, allowing them to identify the presence of a drone and its controller on these bands.[11] One advantage of RF detection is that it does not require line of sight, meaning the detection system does not need a view of the drone.[11] It can also operate over long distances, making it effective in a wide range of scenarios.[11]

However, RF-based detection systems do have their limitations. They cannot detect drones that do not rely on communication with an operator, such as autonomous drones.[10] The systems are also less reliable in environments where many RF signals are present, such as cities.[11] In situations where high precision and reliability are a must, RF-based detection might therefore be unsuitable.
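At its simplest, RF-based detection is an energy detector: compare the power measured in the monitored band against the noise floor. The sketch below illustrates only this principle on simulated baseband samples; the threshold and signal levels are assumptions, not a working receiver design.

```python
import numpy as np

rng = np.random.default_rng(0)

def band_power_db(samples: np.ndarray) -> float:
    """Average power of complex baseband samples, in dB."""
    return 10 * np.log10(np.mean(np.abs(samples) ** 2))

n = 4096
# Unit-power complex noise models a quiet 2.4 GHz band (~0 dB floor).
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
# A strong tone models an assumed drone control link well above the floor.
tone = 3.0 * np.exp(2j * np.pi * 0.1 * np.arange(n))

threshold_db = 6.0  # assumed margin above the noise floor
print(band_power_db(noise) > threshold_db)          # quiet band: no alarm
print(band_power_db(noise + tone) > threshold_db)   # link present: alarm
```

Real systems additionally classify the detected signal (protocol, hopping pattern) to tell a drone link apart from Wi-Fi traffic, which is exactly why cluttered RF environments are problematic.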

Acoustic-Based Detection

Acoustic detection systems rely on the unique sound signature produced by drones, particularly the noise generated by their propellers and motors.[11] These systems use highly sensitive microphones to capture these sounds and then analyze the audio signals to identify the presence of a drone.[11] The advantages of this type of detection are its relatively low cost and the fact that it does not require line of sight; it is therefore mostly used for detecting drones behind obstacles in non-open spaces.[11][9]

However, it also has disadvantages. In noisy environments, such as battlefields, these systems are less effective.[10][11] Additionally, some drones are designed to operate silently.[10] They also only work at rather short range, since sound weakens over distance.[11]

Vision-Based Detection

Vision-based detection systems use cameras, either in the visible or infrared spectrum, to detect drones visually.[9][11] These systems rely on image-recognition algorithms, often based on machine learning.[11][12] Drones are then detected based on their shape, size and movement.[12] The main advantage of this type of detection is that the operators themselves can confirm the presence of a drone and differentiate between a drone and other objects such as birds.[11]

However, vision-based detection systems also have disadvantages.[10][11] They are highly dependent on environmental conditions: they need a clear line of sight and good lighting, and weather conditions can also affect their accuracy.[10][11]
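A trained CNN is needed for actual classification, so as a minimal stand-in the sketch below shows the frame-differencing step often used to propose moving regions (such as a drone crossing an otherwise static sky) before they are passed to a classifier. The frames and threshold are synthetic illustrations.

```python
import numpy as np

def moving_pixels(prev: np.ndarray, curr: np.ndarray,
                  thresh: float = 25.0) -> np.ndarray:
    """Boolean mask of pixels whose intensity changed by more than thresh."""
    return np.abs(curr.astype(float) - prev.astype(float)) > thresh

sky_prev = np.full((48, 64), 200, dtype=np.uint8)  # uniform bright sky
sky_curr = sky_prev.copy()
sky_curr[20:24, 30:36] = 60                        # dark blob = simulated drone

mask = moving_pixels(sky_prev, sky_curr)
print(int(mask.sum()))  # number of changed pixels (the blob)
```

The same sensitivity to lighting and weather noted above shows up here directly: a change in illumination between frames would trigger the mask everywhere, which is why real pipelines add background modeling and classification on top.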

Best Approach for Portable Drone Detection

For our project, which focuses on a portable system, the ideal drone detection method must balance effectiveness, portability and ease of deployment. Based on this, a sensor fusion approach appears to be the most appropriate.[11]

Sensor Fusion Approach

Given the limitations of each individual detection method, a sensor fusion approach, which would combine radar, RF, acoustic and vision-based sensors, offers the best chance of achieving reliable and accurate drone detection in a portable system.[11] Sensor fusion allows the strengths of each detection method to complement the weaknesses of the others, providing more effective detection in dynamic environments.[11]

  1. Radar Component: A compact, millimeter-wave radar would provide reliable detection in different weather conditions and across long ranges.[10] While radar systems are traditionally bulky, recent advancements make it possible to develop portable radar units that can be used in a mobile system.[11] These would most likely be less effective, which the sensor fusion approach compensates for.[11]
  2. RF Component: Integrating an RF sensor allows the system to detect drones communicating with remote operators.[11] This component is lightweight and relatively low-power, making it ideal for a portable system.[11]
  3. Acoustic Component: Acoustic sensors help detect drones flying at low altitudes or behind obstacles, where radar may struggle.[9][11] This component is essentially just a microphone, with the rest handled in software, and is therefore also ideal for a portable system.[11]
  4. Vision-Based Component: A camera system equipped with machine-learning image recognition can provide visual confirmation of detected drones.[12][11] This can be realized with a lightweight, wide-angle camera, which again does not prevent the device from being portable.[11]

Conclusion

To keep our system portable we have to restrict certain sensors and/or components; to still achieve effective drone detection, the best approach is therefore sensor fusion. The system would integrate radar, RF, acoustic and vision-based detection, which together compensate for each other's limitations, resulting in an effective, reliable and portable system.

Sensor Fusion

When it comes to detection, sensor fusion is essential for integrating inputs from multiple sensor types (in our case radar, radio frequency, acoustic and vision-based) to achieve higher accuracy and reliability in dynamic conditions. Sensor fusion can occur at different stages: early or late fusion.[13]

Early fusion integrates raw data from the various sensors at the initial stage, creating a unified dataset for processing. This approach captures relations between the different data sources, but requires extensive computational resources, especially when the sensors' data are not of the same type (consider acoustic versus visual data).[13]

Late fusion integrates the processed outputs or decisions of each sensor. This method allows each sensor to apply its own processing approach before the data are fused, making it more suitable for systems where sensor outputs vary in type. According to recent studies in UAV tracking, late fusion improves robustness by allowing each sensor to operate independently under its optimal conditions.[13][14]

Therefore, we conclude that late fusion is best suited for our system.
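A minimal late-fusion sketch: each sensor runs its own pipeline and reports only a detection confidence in [0, 1], and the fused decision is a weighted average over those confidences. The weights and threshold below are illustrative assumptions, not tuned values.

```python
# Assumed relative trust in each sensor's own decision pipeline.
SENSOR_WEIGHTS = {"radar": 0.35, "rf": 0.25, "acoustic": 0.15, "vision": 0.25}

def fuse(confidences: dict[str, float], threshold: float = 0.5) -> bool:
    """Weighted average of per-sensor confidences; 'drone' if above threshold."""
    score = sum(SENSOR_WEIGHTS[s] * c for s, c in confidences.items())
    return score > threshold

# An autonomous drone emits no RF link, but radar and vision agree:
print(fuse({"radar": 0.9, "rf": 0.0, "acoustic": 0.4, "vision": 0.8}))
```

This illustrates the robustness argument above: one silent sensor (here RF) does not prevent a detection, because the fused decision only needs the remaining sensors to agree.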

Algorithms for Sensor Fusion in Drone Detection

  1. Extended Kalman Filter (EKF): EKF is widely used in sensor fusion for its ability to handle nonlinear data, making it suitable for tracking drones in real-time by predicting trajectory despite noisy inputs. EKF has proven effective for fusing data from radar and LiDAR, which is essential when estimating an object's location in complex settings like urban environments.[14]
  2. Convolutional Neural Networks (CNNs): Primarily used in vision-based detection, CNNs process visual data to recognize drones based on shape and movement. CNNs are particularly useful in late-fusion systems, where they can add a visual confirmation layer to radar or RF detections, enhancing overall reliability.[13][14]
  3. Bayesian Networks: These networks manage uncertainty by probabilistically combining sensor inputs. They are highly adaptable in scenarios with varied sensor reliability, such as combining RF and acoustic data, making them suitable for applications where conditions can impact certain sensors more than others.[13]
  4. Decision-Level Fusion with Voting Mechanisms: This algorithmic approach aggregates sensor outputs based on their agreement or “votes” regarding an object's presence. This simple yet robust method enhances detection accuracy by prioritizing consistent detections across sensors.[13]
  5. Deep Reinforcement Learning (DRL): DRL optimizes sensor fusion adaptively by learning from patterns in sensor data, making it particularly suited for applications requiring dynamic adjustments, like drone tracking in unpredictable environments. DRL has shown promise in managing fusion systems by balancing multiple inputs effectively in real-time applications.[13][14]

These algorithms have demonstrated efficacy across diverse sensor configurations in UAV applications. EKF and Bayesian networks are particularly valuable when fusing data from radar, RF, and acoustic sources, given their ability to manage noisy and uncertain data, while CNNs and voting mechanisms add reliability in vision-based and multi-sensor contexts. However, without testing, no conclusions can be drawn on which algorithms can be applied effectively and which would work best.[13][14]
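The predict/update cycle underlying the EKF can be illustrated with its linear special case, a plain Kalman filter tracking a constant-velocity target from noisy position fixes; the EKF replaces the fixed matrices with local linearizations of nonlinear models. The noise settings and target below are illustrative assumptions.

```python
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])   # constant-velocity state transition [pos, vel]
H = np.array([[1.0, 0.0]])        # we measure position only
Q = 1e-3 * np.eye(2)              # assumed process noise
R = np.array([[0.25]])            # assumed measurement noise (std 0.5 m)

x = np.array([[0.0], [0.0]])      # initial estimate: at origin, stationary
P = np.eye(2)

rng = np.random.default_rng(1)
true_pos, true_vel = 0.0, 10.0    # simulated target moving at 10 m/s
for _ in range(100):
    true_pos += true_vel * dt
    z = np.array([[true_pos + rng.normal(0, 0.5)]])
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend in the measurement via the Kalman gain.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(round(float(x[1, 0]), 1))  # estimated velocity [m/s]
```

Note that the filter recovers the velocity even though only noisy positions are measured, which is what makes this family of filters attractive for trajectory prediction from detections.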

Interception

Drone interception refers to a range of methods used to incapacitate or destroy rogue drones once they have been detected and identified as threats. The rise of small drones, including both off-the-shelf and custom-built UAVs, has led to the development of many counter-drone systems that aim to neutralize drones in a non-destructive or destructive manner. The choice of method depends on various factors, including the environment, threat level, and available technology.

Key Approaches to Interception

  1. Kinetic Interceptors: Kinetic methods physically destroy or incapacitate drones through direct impact. These include missile systems and kinetic projectiles, which engage drones at medium to long ranges. While effective, kinetic interceptors are typically expensive and may pose risks of collateral damage. Systems like the U.S. Army’s 40mm net grenade are non-lethal alternatives that physically trap drones without destruction, and at a low cost.
  2. Electronic Warfare (Jamming and Spoofing): One of the most common drone neutralization techniques involves electronic warfare, such as radio frequency (RF) and GNSS jamming. These methods disrupt the drone’s control signals or GPS navigation, forcing it to lose connectivity and potentially crash. Spoofing, on the other hand, involves hijacking the drone’s communication system, allowing operators to redirect it. While jamming is non-lethal, it may affect other nearby electronics and is ineffective against autonomous drones that don’t rely on external control signals.
  3. Directed Energy Weapons (Lasers and Electromagnetic Pulses): Directed energy systems like lasers and electromagnetic pulses (EMP) are designed to disable drones by damaging their electrical components or destroying them outright. Lasers offer precision and instant engagement but are costly and susceptible to environmental conditions like rain or fog. EMP systems can disable multiple drones at once but may also interfere with other electronics in the vicinity.
  4. Net-Based Capture Systems: These systems use physical nets to ensnare drones, rendering them incapable of flight. The nets can be launched from ground-based platforms or other drones and are highly effective against low-speed, low-altitude UAVs. This method is non-lethal and minimizes collateral damage but has limitations in range and reloadability.
  5. Geofencing: Geofencing involves creating virtual boundaries around sensitive areas using GPS coordinates. Drones equipped with geofencing technology are automatically programmed to avoid flying into restricted zones. This method is proactive but can be bypassed by modified or non-compliant drones.
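Geofencing in its simplest form is a containment check of a reported GPS position against a no-fly zone. The sketch below uses a circular zone and a flat-earth distance approximation (adequate for zones of a few kilometers); the coordinates and radius are arbitrary example values.

```python
import math

def inside_zone(lat: float, lon: float,
                zone_lat: float, zone_lon: float, radius_m: float) -> bool:
    """Approximate flat-earth distance check against a circular no-fly zone."""
    m_per_deg = 111_320.0  # meters per degree of latitude
    dx = (lon - zone_lon) * m_per_deg * math.cos(math.radians(zone_lat))
    dy = (lat - zone_lat) * m_per_deg
    return math.hypot(dx, dy) < radius_m

# Example zone with a 500 m radius around an arbitrary point:
print(inside_zone(51.449, 5.492, 51.448, 5.491, 500))   # nearby: inside
print(inside_zone(51.500, 5.600, 51.448, 5.491, 500))   # far away: outside
```

The bypass weakness noted above is visible here: the check runs on the drone's own firmware, so a modified drone simply never executes it.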

Objectives of Effective Drone Neutralization

When designing or selecting a drone interception system, several key objectives must be prioritised:

  1. Low Cost-to-Intercept: Cost-effectiveness is critical: small off-the-shelf and custom-built UAVs often cost far more to intercept than to build, incurring a net cost on militaries. For example, using a $2 million Standard Missile-2 to intercept a drone that may cost as little as $2,000 is a clear example of this asymmetry (https://www.csis.org/analysis/cost-and-value-air-and-missile-defense-intercepts), especially in environments where multiple drones may need to be intercepted over time. Some methods, such as net-based systems, offer a low-cost solution, while others, like lasers or kinetic interceptors, are more expensive.
  2. Portability: The ideal counter-drone system should be portable, lightweight, and collapsible for easy transportation and deployment in various settings. Systems like RF jammers and net throwers are typically more portable than missile-based solutions (AeroDefense Blog).
  3. Ease of Deployment: The ability to quickly set up and deploy a system is vital, particularly in fast-moving scenarios such as military operations or protecting large events. Systems that can be operated from vehicles or drones provide greater flexibility in dynamic environments.
  4. Quick Reloadability or Automatic Reloading: In high-threat environments, rapid reloading or automatic reloading capabilities ensure continuous protection against multiple drone incursions. Systems like lasers or RF jammers offer this advantage, whereas net throwers and kinetic projectiles may require manual reloading.
  5. Minimal Collateral Damage: Especially in civilian areas, minimizing collateral damage is critical. Non-lethal methods such as jamming, spoofing, and net-based systems are preferred in such environments, as they neutralize threats without causing widespread damage.

Evaluation of Drone Interception Methods

Pros and Cons of Drone Interception Methods

  1. Jamming (RF/GNSS)
    • Pros: Effective at neutralizing communication between a drone and its operator. It is non-destructive, widely applicable, and can target multiple drones simultaneously.
    • Cons: Limited effectiveness against autonomous or pre-programmed drones that don’t rely on external signals. Can interfere with other electronics in the area.
  2. Net Throwers
    • Pros: Non-lethal and environmentally safe; nets can physically capture drones without destroying them, making them ideal for urban settings where collateral damage is a concern.
    • Cons: Limited range and only effective on slower, low-altitude drones. Reloading can be slow unless automated.
  3. Missile Launch
    • Pros: High precision and range, effective at engaging fast-moving or long-range drones. Can target multiple drones.
    • Cons: Extremely high cost per intercept and the risk of collateral damage. These systems are less portable and require significant infrastructure to deploy.
  4. Lasers
    • Pros: Silent, fast, and capable of engaging multiple drones quickly without physical debris. Lasers offer precision and minimal collateral damage.
    • Cons: Expensive and susceptible to weather conditions (fog, rain). High energy requirements make portability a challenge.
  5. Hijacking
    • Pros: Allows operators to take control of drones without destroying them. It’s a non-lethal approach, ideal for situations where it’s essential to capture the drone intact.
    • Cons: Collateral damage to surrounding electronics, limited range, and high operational costs.
  6. Spoofing
    • Pros: Redirects or manipulates drone signals to mislead operators. It is non-destructive and can be used to safely divert drones away from sensitive areas.
    • Cons: Complex to execute, especially on drones with advanced anti-spoofing countermeasures.
  7. Geofencing
    • Pros: Prevents drones from entering restricted zones proactively. Non-lethal and offers permanent coverage in geofenced areas.
    • Cons: Can be bypassed by modified or non-compliant drones. Requires drone manufacturers to implement the technology.

Pugh Matrix

Method Cost-to-intercept Portability Ease of Deployment Reloadability Minimum Collateral Damage Effectiveness Total Score
Jamming (RF/GNSS) Medium High High High High Medium 10
Net Throwers Low High High Medium High High 11
Missile Launch Low Low Medium Low Low High 5
Lasers High Medium Medium High High High 8
Hijacking Low High Medium Low High Medium 8
Spoofing Medium High Medium Medium High Medium 8
Geofencing Low High High High High Low 10
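
The totals in the matrix can be reproduced with a short script. This is a minimal sketch, assuming each High/Medium/Low rating maps to 2/1/0 points, with the Cost-to-intercept column inverted so that a low cost per intercept scores highest; this scoring is our reading of the table, not a stated convention.

```python
# Reproducing the Pugh matrix totals. Assumption: High/Medium/Low map to
# 2/1/0 points, except the Cost-to-intercept column, which is inverted
# so that a low cost per intercept scores highest.
SCORE = {"Low": 0, "Medium": 1, "High": 2}

def pugh_total(cost, portability, ease, reload_, collateral, effectiveness):
    # Cost column inverted: Low -> 2, Medium -> 1, High -> 0
    return (2 - SCORE[cost]) + sum(
        SCORE[v] for v in (portability, ease, reload_, collateral, effectiveness))

methods = {
    "Jamming (RF/GNSS)": ("Medium", "High", "High", "High", "High", "Medium"),
    "Net Throwers":      ("Low", "High", "High", "Medium", "High", "High"),
    "Missile Launch":    ("Low", "Low", "Medium", "Low", "Low", "High"),
    "Lasers":            ("High", "Medium", "Medium", "High", "High", "High"),
    "Hijacking":         ("Low", "High", "Medium", "Low", "High", "Medium"),
    "Spoofing":          ("Medium", "High", "Medium", "Medium", "High", "Medium"),
    "Geofencing":        ("Low", "High", "High", "High", "High", "Low"),
}

totals = {name: pugh_total(*ratings) for name, ratings in methods.items()}
```

Under this scoring the computed totals match the table, e.g. Net Throwers scores 11 and Missile Launch 5.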

Path Prediction of Projectile[15][16][17][18][19][20][21][22][23][24][25][26][27][28][29][30][31][32][33]

Theory:

Catching a projectile requires several steps. First the projectile has to be detected, after which its trajectory has to be determined. Once we know how the projectile moves through space and time, the net can be shot to catch it. However, depending on the distance to the projectile, the net takes a different amount of time to reach it, during which the projectile moves to a new location. The net must therefore be shot at a position where the projectile will be in the future, such that the two collide.


Since projectiles do not make sound and do not emit RF waves, they are not as easy to detect as drones. For this part we assume the projectile is visible. Making the system also detect projectiles that are not visible would probably be possible, but it would complicate things considerably. The U.S. Army uses electronic field sensors that can detect passing bullets; something similar could be used to detect projectiles that are not visible, but this will not be done in this project due to the complexity.

To detect a projectile, a camera with tracking software is used. The same camera also has to detect drones, which will be done by training an AI model for drone detection.

Now that the projectile is in sight, its trajectory has to be determined. The projectile only slows down due to air friction and speeds up due to gravity. For a first model, air friction can be neglected to get a good approximation of the flight of the projectile. Since not every projectile experiences the same amount of air resistance, the best friction coefficient should be found experimentally by dropping different projectiles; the best coefficient is the one for which the system catches the most projectiles. An improvement would be to pre-assign friction coefficients to different sizes of projectiles. Since frontal area plays a big role in the amount of friction a projectile experiences, this is a reasonable refinement.
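
The drop experiments described above can be sketched in code. This is a minimal model assuming quadratic air drag and a simple Euler integration; the mass, drag coefficient and frontal area are illustrative placeholders, not measured values.

```python
# Vertical drop of a projectile with quadratic air drag, integrated with
# a simple Euler scheme. Mass, drag coefficient (cd) and frontal area are
# placeholders that would have to be fitted from drop experiments.
G = 9.81     # gravitational acceleration, m/s^2
RHO = 1.225  # air density, kg/m^3

def fall_time(height, mass=0.4, cd=0.5, area=0.005, dt=1e-3):
    """Time for a projectile dropped from rest to fall `height` metres."""
    y, v, t = 0.0, 0.0, 0.0
    while y < height:
        drag = 0.5 * RHO * cd * area * v * v / mass  # deceleration from drag
        v += (G - drag) * dt
        y += v * dt
        t += dt
    return t
```

Setting cd = 0 recovers the vacuum result (a 78.5 m drop takes about 4 s); with drag the fall takes slightly longer, which is exactly the difference the fitted coefficient has to capture.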

With the expected path of the projectile known, the net can be launched to catch the projectile midair. Basic kinematics gives accurate results for this problem, and it can be treated as two dimensional: since we only protect against projectiles coming towards the system, we can always define a plane that contains both the trajectory of the projectile and the system. If the projectile's path were to leave this plane, making the problem three dimensional, the system does not need to act, since a projectile that misses the plane containing the system does not form a threat.

Calculation:

PP1: Output 2D model

A Mathematica script was written which calculates at which angle the net should be shot. The script currently uses placeholder values which have to be determined experimentally based on the hardware that is used; for example, the speed at which the net is shot should be measured and updated in the code. The distance and speed of the projectile can be determined using sensors on the device. The output of the Mathematica script is shown in figure PP1: it gives the angle at which the net should be shot, as well as the trajectories of the net and the projectile, to visualize how the interception will happen.
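
As a sketch of the kind of calculation the Mathematica script performs, the following assumes the projectile is dropped from rest (the assumption adopted for the 3D model later on). Under that assumption gravity affects net and projectile identically, so the net is simply aimed at the release point (the classic "monkey and hunter" result); the launch speed is a placeholder to be measured on the hardware.

```python
import math

# 2D interception with the projectile dropped from rest. Both objects are
# in free fall, so gravity cancels from the collision condition and the
# net is aimed straight at the release point. net_speed is a placeholder.
G = 9.81

def intercept(distance, height, net_speed):
    """Return (launch angle in radians, time to impact), or None if the
    projectile reaches the ground before the net can get there."""
    angle = math.atan2(height, distance)   # aim at the release point
    t = distance / (net_speed * math.cos(angle))
    y_at_t = height - 0.5 * G * t * t      # projectile altitude at impact
    return (angle, t) if y_at_t > 0 else None
```

For example, a projectile released 30 m away at 40 m altitude, with a 25 m/s net, is met after 2.0 s; a slow net against a low, distant projectile returns None, meaning no interception is possible.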



Accuracy:

PP2: Calculations accuracy

The height at which projectiles are dropped can be estimated from footage of projectiles being dropped. The height can be determined by assuming the projectile falls in vacuum, which represents reality well here, and is given by h = 0.5*g*t^2. Using a YouTube video[34] as data, almost every drop takes at least 4 seconds, meaning the projectiles are dropped from at least 78.5 m. Suppose we catch the projectile at two thirds of its fall, leaving plenty of height for it to be redirected safely, and the net is 1 by 1 meter (so half a meter from its center to the closest side). Then, even if everything else goes perfectly, the projectile must not be dropped more than 0.75 meter to the side of us (outside of the plane), or the system will not catch it (see figure PP2 for the calculation). Even at an offset of 0.7 meter, the net would only hit the projectile with its edge, which does not guarantee that the projectile stays in the net.
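
The drop height quoted above follows directly from the free-fall formula and can be checked in one line:

```python
# Drop height from fall time in vacuum: h = 0.5 * g * t^2.
G = 9.81

def drop_height(t):
    return 0.5 * G * t * t

# A 4-second fall corresponds to roughly 78.5 m of drop height.
```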

An explosive projectile will still do damage 0.75 meters away from a person. This means that the earlier assumption that a 2D model suffices, because everything happens in a plane, does not fulfill all our needs. Enemy drones do not need centimeter accuracy, since explosive projectiles like grenades can kill a person even 10 meters away. For better results a 3D model should therefore be used.


3D model:

PP3: Output 3D model

We tried to replicate the 2D model in 3D, but this did not work out with the same generality. For this reason some extra assumptions were made. These assumptions are based on reality and are therefore still valid for this system. The only thing they change is the generality of the model: instead of covering multiple different cases, it now only covers projectiles dropped from drones.


In the 2D model the starting velocity of the projectile could be changed. In reality, however, drones do not have a launching mechanism and simply drop the projectile without any starting velocity, so the projectile falls straight down (apart from some sideways movement due to external forces like wind). This was observed in extensive drone warfare footage, where it was also noted that drones do not usually crash into people, but mainly into tanks, since a tank requires an accurate hit between the base and the turret. Against people, drones drop the projectile from a standstill (to get the required aim). This simplification also makes the 2D model valid again: since the projectile has no sideways motion, it never leaves the plane, and we can always construct a plane containing both the path of the projectile and the mechanism which shoots the net.

Since this mechanism works in the real world (3D), we decided to plot the 2D model at the required angle in 3D, to give a good representation of how the mechanism will work. The new model also gives the required shooting angle and then shows the paths of the net and the projectile in 3D. For further insight, the 2D trajectories of the net and projectile are also plotted; this can be seen in figure PP3.




Accuracy 3D model:

PP5: Interception with wind
PP4: Calculations weight net

The 3D model as it now stands only works in a "perfect" world, without wind, air resistance or any other external factors that may influence the paths of the projectile and the net. We also assume that the system knows where the drone is with infinite accuracy. In reality this is simply not true, so it is important to know how closely this model replicates reality and whether it can be used.

Wind plays a big role in the paths of both the projectile and the net, so the model must also function under these circumstances. To determine the wind-induced acceleration of the projectile and the net, the drag force on both objects must be determined. Two important factors on which the drag force depends are the drag coefficient and the frontal area of the object. Since different projectiles are used during warfare, such as hand grenades or Molotov cocktails, the exact drag coefficient and frontal area of the projectile are unknown. After a dive into the literature it was decided to take average values for the drag coefficient and the frontal area, since the reported values lay in the same range. For the frontal area this was to be expected, since drones can only carry objects of limited size. After some calculations (see figure PP4) it was found that if the net (including the weights on the corners) weighs 3 kg, the accelerations of the projectile and the net due to wind are identical, still leading to a perfect interception, as can be seen in figure PP5. This is based on literature values; at a later stage the exact drag coefficient and surface area of the net must be found and the weight changed accordingly. For projectiles that do not exactly match the assumed drag coefficient or frontal area, the model shows that deviations of up to 50% of the used values do not shift the projectile enough to make the interception miss. This range includes almost all theoretical values found for the different projectiles, making the model highly reliable under the influence of wind.
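
The matching argument can be sketched as follows: the sideways acceleration from a steady wind is a = 0.5*rho*Cd*A*v^2/m, so net and projectile drift together exactly when their Cd*A/m ratios are equal. The drag coefficients and areas below are illustrative placeholders, not the literature values used in figure PP4.

```python
# Wind drift matching: net and projectile feel the same wind-induced
# acceleration when Cd*A/m is equal for both. All Cd and area values
# here are placeholders for illustration only.
RHO = 1.225  # air density, kg/m^3

def wind_acceleration(cd, area, mass, wind_speed):
    return 0.5 * RHO * cd * area * wind_speed ** 2 / mass

def matching_net_mass(cd_proj, area_proj, mass_proj, cd_net, area_net):
    """Net mass for which net and projectile drift identically in wind."""
    return cd_net * area_net * mass_proj / (cd_proj * area_proj)
```

With the placeholder values chosen below in the test, the matching net mass happens to come out at 3 kg, mirroring the result of figure PP4; the real value depends on the measured drag properties of the net.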

An uncertainty within the system is the exact location of the drone. We aim to know accurately where the drone, and thus the projectile, is, but in reality infinitely accurate localization is unachievable; we can only get close. The sensors in the system must therefore be optimized to locate the drone as accurately as possible. Fortunately, sensors exist that achieve high accuracy, for example LiDAR sensors with a range of 2000 m and an accuracy of 2 cm. That range comfortably covers the distances at which our system operates, and an accuracy of 2 cm is far smaller than the net (100 cm by 100 cm), so it should not cause problems for the interception.

Prototype

Components Possibilities

Radar Component (Millimeter-Wave Radar):

Component: Infineon BGT60ATR12C

Component: RFBeam K-LC7 Doppler Radar

  • Price: Around €55
  • Description: A Doppler radar module operating at 24 GHz, designed for short to medium range object detection. It’s used in UAV tracking due to its cost-efficiency and low power consumption.
  • Software: Arduino IDE or MATLAB can be used for basic radar signal processing.
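
For reference, a Doppler module of this kind converts a measured frequency shift into a radial speed as v = f_d * c / (2 * f_tx). A minimal sketch, using a nominal 24 GHz transmit frequency; the example shifts in the test are made up for illustration.

```python
# Doppler radar: radial speed from the measured Doppler shift.
# v = f_d * c / (2 * f_tx), with f_tx the (nominal) 24 GHz carrier.
C = 3.0e8      # speed of light, m/s
F_TX = 24.0e9  # nominal transmit frequency, Hz

def radial_speed(doppler_shift_hz):
    return doppler_shift_hz * C / (2 * F_TX)
```

At 24 GHz, a shift of 160 Hz corresponds to a radial speed of 1 m/s, which gives a feel for the frequency resolution the signal processing needs.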

RF Component:

Component: LimeSDR Mini

  • Price: Not deliverable at the moment
  • Description: A compact, full-duplex SDR platform supporting frequencies from 10 MHz to 3.5 GHz, useful for RF-based drone detection.
  • Software: LimeSuite, a suite of software for controlling LimeSDR hardware and custom signal processing.

Component: RTL-SDR V3 (Software-Defined Radio)

  • Price: Around €30-40
  • Description: An affordable USB-based SDR receiver capable of monitoring a wide frequency range (500 kHz to 1.75 GHz), including popular drone communication bands (2.4 GHz and 5.8 GHz). While not as advanced as higher-end SDRs, it’s widely used in hobbyist RF applications.
  • Software: GNU Radio or SDR# (SDRSharp), both of which are open-source platforms for signal demodulation and analysis.

Acoustic Component:

Component: Adafruit I2S MEMS Microphone (SPH0645LM4H-B)

  • Price: Around €7-10
  • Description: A low-cost MEMS microphone offering high sensitivity, commonly used in sound detection projects for its clarity and noise rejection.
  • Software: Arduino IDE or Python with SciPy for sound signature recognition.
  • Website: https://www.adafruit.com/product/3421

Component: DFRobot Fermion MEMS Microphone Module - S15OT421 (Breakout)

  • Price: Around €4
  • Description: A low-cost MEMS microphone offering high sensitivity, commonly used in sound detection projects for its clarity and noise rejection.
  • Software: Arduino IDE.
  • Website: https://www.dfrobot.com/product-2357.html

Vision-Based Component:

Component: Arducam 12MP Camera (Visible Light)

Component: Raspberry Pi Camera Module v2

  • Price: Around €15-20
  • Description: A small, lightweight 8MP camera compatible with Raspberry Pi, offering high resolution and ease of use for vision-based drone detection. It can be paired with machine learning algorithms for object detection.
  • Software: OpenCV with Python for real-time image processing and detection, or TensorFlow for more advanced machine learning applications.
  • Website: https://www.raspberrypi.com/products/camera-module-v2/

Component: ESP32-CAM Module

Sensor Fusion and Central Control

Component: Raspberry Pi 4

  • Price: Around €35
  • Description: A small, affordable single-board computer that can handle sensor fusion, control systems, and real-time data processing.
  • Software: ROS (Robot Operating System) or MQTT for managing communications between the sensors and the central processing unit.
  • Website: https://www.raspberrypi.com/products/raspberry-pi-4-model-b/

Component: ESP32 Development Board

  • Price: Around €10-15
  • Description: A highly versatile, Wi-Fi and Bluetooth-enabled microcontroller. It’s lightweight and low-power, making it ideal for sensor fusion in portable devices.
  • Software: Arduino IDE or Micropython, with MQTT for sensor data transmission and real-time control.
  • Website: https://www.espressif.com/en/products/devkits

Testing of Prototype components

To verify whether we have achieved some of these requirements, we have to devise test scenarios that allow us to quantitatively determine our prototype's accuracy. The notion of accuracy is of course somewhat ambiguous: we cannot test our prototype in a warzone-like environment, so accuracy in a lab may not translate to accuracy in the trenches. However, we will attempt to simulate such an environment through the use of a projector and speakers.

Requirements verification

In this section we describe a step-wise procedure to test whether certain components meet the requirements.

  • R1: Detect Rogue Drone
    • Objective: Verify that the system can detect a rogue drone within a specific detection range, minimizing false negatives.
    • Setup: 1.) Use a single drone labeled as “rogue.” Start by positioning the drone at a distance of 5 meters from the detection system, and incrementally increase the distance by 2 meters until the maximum possible distance within the room or 30 meters, whichever is smaller. 2.) Mark the room at each meter interval so that the drone can be placed accurately at each distance point.
    • Procedure: 1.) At each distance interval, power on the drone and ensure it is hovering stably; the system should attempt to detect the drone's presence at each point. 2.) Record whether the detection system correctly identifies the rogue drone's presence at each interval.
    • Measurement: 1.) Detection confirmation: the system should provide a visible or audible signal (such as an LED indicator, a beep, or an on-screen message) upon detecting the drone. 2.) Detection log: record each detection attempt, noting the distance at which detection succeeded or failed.
    • Criteria: Pass — the system detects the rogue drone at all distances up to 30 meters (or the room's maximum achievable distance) without any missed detections. Fail — the system fails to detect the drone at any distance within this range.
  • R4: Detect Drone Speed
    • Objective: Validate that the system can measure the drone's speed accurately within an enclosed space.
    • Setup: 1.) Mark two points on the floor, exactly 5 meters apart, to serve as start and end points for the drone to travel in a straight line. 2.) Program the drone to fly between these points at different speeds (e.g., 5 m/s, 10 m/s, 15 m/s) if supported by the drone's settings. 3.) Position a high-speed camera or timer-based tool to measure the time taken for the drone to travel between the start and end points.
    • Procedure: 1.) Set the drone to fly from the starting point to the end point at each speed setting (5 m/s, 10 m/s, 15 m/s). 2.) Use a stopwatch or high-speed camera to record the time taken for the drone to cross the 5-meter distance for each speed trial. 3.) Calculate the actual speed based on the time and distance traveled.
    • Measurement: 1.) Manual speed calculation: calculate the actual speed as Speed = Distance / Time, based on the measured time across the 5-meter distance. 2.) Compare with system detection: log the detected speed from the system and compare it with the manually calculated speed.
    • Criteria: Pass — the system's detected speed is within ±5% of the manually calculated speed for each test. Fail — the detected speed deviates more than ±5% from the calculated speed in more than one trial.
  • R7: Track Drone with Laser
    • Objective: Test whether the system's laser can track the drone accurately as it moves within a 1 m² area.
    • Setup: 1.) Mark a 1 m² square area on the floor, with a 1 m x 1 m boundary. 2.) Place the drone within this marked area and set it to fly in circular, random, or figure-eight patterns within the boundary, keeping it stable and at a consistent height. 3.) Ensure the laser tracking system is positioned to follow the drone's movements within this confined area.
    • Procedure: 1.) Start the drone's movement within the 1 m² square and activate the laser tracking system. 2.) Observe the laser's movement in real time, ensuring it follows the drone as it stays within the boundaries of the 1 m² area.
    • Measurement: 1.) Visual observation: record the laser's accuracy with a high-speed camera to capture any deviations outside the 1 m² boundary. 2.) Boundary tracking analysis: play back the recorded footage and analyse whether the laser consistently stays within the 1 m² boundary around the drone.
    • Criteria: Pass — the laser remains within the 1 m² boundary around the drone for at least 95% of the movement time during the test. Fail — the laser tracking drifts outside the boundary for more than 5% of the time.
  • R8: Can Intercept Drone/Projectile
    • Objective: Verify the system's ability to intercept the drone within a 1 m² area without causing any unintended impact outside the target zone.
    • Setup: 1.) Define a 1 m² target area on the floor, marking it clearly. 2.) Position the drone in the center of this area and program it to hover or move slowly within the 1 m² space. 3.) Configure the interception mechanism (net launcher, tagging device, etc.) to target the drone within this specified area.
    • Procedure: 1.) Once the drone is positioned within the 1 m² area, initiate the interception mechanism. 2.) Repeat the interception test at different points within the 1 m² target area to ensure consistency.
    • Measurement: 1.) Interception accuracy: use a high-speed camera to capture the interception action, confirming that it occurs entirely within the designated 1 m² area. 2.) Surrounding impact analysis: review the footage to ensure the interception mechanism does not affect any area outside the 1 m² boundary.
    • Criteria: Pass — the interception occurs within the 1 m² target area, with no impacts or interference outside the boundary, in at least 95% of trials. Fail — the interception strays outside the target area or impacts surroundings in more than 5% of cases.

Component testing process

In this section we show and explain how the microphone and the camera detect drones, and whether these components meet our requirements.

Acoustic method

To detect drones in a recorded sound we first need a reference sound to compare the recording against. Our reference sound is, naturally, the sound of the drone. This sound file can be plotted as an audio plot with time on the x-axis and the signal's strength in arbitrary units on the y-axis (Fig 15.1). After applying the Fourier transform and normalising, a plot in the frequency domain is obtained, with frequency on the x-axis and normalised strength on the y-axis (Fig 15.2). The same can be done for the recorded sound we want to test. In the analysis the frequency domain is restricted to 500 Hz-3000 Hz, because our test drones and the background noises used fall within this range. What remains is to search the recorded normalised frequency-domain plot (Fig 15.3) for the 'global shape' of the normalised frequency-domain plot of the reference sound (Fig 15.2). This can be approached in multiple ways; we will first explain how our method works and then explain why we chose it.

Figures 15.1-15.5: Total image.png
Equations 15.1 and 15.2: Equations1.png

Our goal is to obtain a set of frequencies that represents the 'fingerprint' of the normalised frequency-domain plot of the drone. We obtain this by setting a threshold value (λ) equal to 1 and lowering λ in small steps, marking each frequency peak we encounter with a red dot, until we have a set of 'n' peak frequencies representing this specific sound. By trial and error, 'n' is set to the optimal value of 100 for the reference sound and to 250 for the recorded sound. This is plotted in (Fig 15.4) for the drone sound and (Fig 15.5) for the recorded sound. With two frequency sets obtained, each representing its own sound, the only thing left is to compare them, which is done with (Eq 15.1): for each frequency in the drone reference set, we check whether there is a frequency in the recorded set such that the absolute value of their difference is smaller than the tolerance. The tolerance is set by trial and error to 2 Hz; this margin is needed because the Doppler effect may play a role if the drone moves, and because the microphone may pick up slightly shifted frequencies. The number of frequencies that meet the requirement of (Eq 15.1), m, thus lies between 0 and 100, where m=100 is perfect recognition of the drone. We chose this method because it is an intuitive concept and relatively easy to implement; in addition, the program is computationally very light and thus fast to run. The number m must then be transformed into a percentage representing the chance of a drone being present. The simplest conversion would be a linear relationship; however, after a few experiments at varying distances, with and without background noise, the relationship did not appear to be linear. Up to roughly m=20 we know with great certainty that there is no drone, because in regular background noise a few peaks will often correspond by chance.
Likewise, at m=90 we know with relatively good certainty that a drone is present, but the experiments showed that, especially with background noise, often not all frequencies from the drone set are reflected in the recorded frequency set. The domain in between is therefore uncertain. The logistic formula (Eq 15.2) fits this behaviour quite well. In the program we have taken a=0.16 and b=55; the value of 'b' is the number of corresponding peaks at which the certainty of a drone being present is 50%, and 55 is chosen because it is the middle of the uncertain domain. The values of 'a' and 'b' are unfortunately not fully validated, as we did not have the proper equipment or enough time to conduct a thorough experiment.
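
The comparison and conversion steps above can be sketched as follows. The peak extraction (Fourier transform and threshold sweep) is assumed to have produced the two frequency sets already; only the matching of Eq 15.1 and the logistic conversion of Eq 15.2 are shown, using the a and b values from the text.

```python
import math

# Peak matching (Eq 15.1) and logistic confidence (Eq 15.2).
TOLERANCE_HZ = 2.0  # matching tolerance from the text
A, B = 0.16, 55     # logistic parameters from the text

def matched_peaks(reference_hz, recorded_hz):
    """m: number of reference peaks with a recorded peak within tolerance."""
    return sum(1 for f in reference_hz
               if any(abs(f - g) <= TOLERANCE_HZ for g in recorded_hz))

def drone_probability(m):
    """Logistic mapping from matched-peak count to detection confidence."""
    return 1.0 / (1.0 + math.exp(-A * (m - B)))
```

By construction, drone_probability(55) is exactly 0.5, while counts below about 20 or above about 90 map to near-certain absence or presence respectively.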

Visual method

Sensor Fusion methodology

From literary research we found five sensor fusion algorithms that could be relevant to our detection system. These being:

  1. Extended Kalman Filter
  2. Convolutional Neural Networks
  3. Bayesian Networks
  4. Decision-Level Fusion with Voting Mechanisms
  5. Deep Reinforcement Learning

The first three listed here we will not be able to apply properly with only a vision-based and an acoustic sensing component. The fifth algorithm is not feasible with our resources and the time limit of the project. Therefore we decided to take a deeper look into Decision-Level Fusion with Voting Mechanisms, and how we could apply it to our tests.

We looked into three different ways of applying this method. In the first, we state there is a drone near if either of the two sensors claims to have detected a drone. In the second, we state there is a drone near only if both sensors claim to have detected a drone. In the third, if one of the sensors claims there is a drone, we lower the detection threshold of the other sensor and still require confirmation from both.

With the first method we found a lot of false positives: the system would indicate there is a drone near while this was not the case. This was because our vision-based component was not very accurate.

The second method worked better than the first, but only in specific scenarios: if the drone was far away, the camera could pick it up but the microphone simply did not have the range, so the detection was missed.

The third method balances the two. It does not produce as many false positives, since it still needs confirmation from both sensors, but it also misses fewer detections than the second method, since the lowered threshold makes confirmation easier.

However, this would still occasionally result in missed detections: consider again the situation with the drone far away. To adapt the method better to our system, we should therefore analyse, per sensor, whether another sensor is even applicable in the situation. If, for example, the vision-based component detects a drone and estimates it to be a certain number of meters away, the acoustic component should be considered uninformative and taken out of the decision making: from the tested specifications we know the microphone can only detect a drone up to a certain range, so if the camera picks up a drone beyond that range, we know the microphone will never confirm it, and its data should not be considered. The exact cut-off distance depends on the components used and would require more testing to determine accurately. This approach works best when more than two components are used, because with only two sensors the third method in some cases degenerates into relying on a single sensor, which brings back the many false positives we would like to avoid.
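
The gated voting scheme described above can be sketched as follows. The confidence thresholds and the microphone range are illustrative placeholders, since the real cut-off distance still has to be determined by testing.

```python
# Decision-level fusion with a range gate: if one sensor fires, the
# other's threshold is lowered, and the microphone is dropped from the
# vote when the target is beyond its usable range. All numeric values
# below are placeholders for illustration.
MIC_RANGE_M = 40.0       # assumed usable microphone range
BASE_THRESHOLD = 0.8     # normal per-sensor detection threshold
LOWERED_THRESHOLD = 0.5  # relaxed threshold for the confirming sensor

def fuse(camera_conf, mic_conf, estimated_range_m):
    """Return True when the fused decision says a drone is present."""
    if estimated_range_m > MIC_RANGE_M:
        # Microphone cannot contribute at this range: camera decides alone.
        return camera_conf >= BASE_THRESHOLD
    if camera_conf >= BASE_THRESHOLD:
        return mic_conf >= LOWERED_THRESHOLD   # relaxed double confirmation
    if mic_conf >= BASE_THRESHOLD:
        return camera_conf >= LOWERED_THRESHOLD
    return False
```

For example, a confident camera detection at close range still needs a weak acoustic confirmation, while beyond the microphone's range the camera's verdict stands on its own.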

Future Work

Expanding this project could focus on several key areas that enhance functionality and reliability. Anyone expanding this project could look into the following areas.

Improvement of Detection Accuracy through Improved Sensor Fusion

We have done research into the different components that could be added to the system for drone detection. We tested two of these but not all, so future work could test the other components as well. Additionally, more testing could be done on sensor fusion once all components are in play.

Testing Interception Method

We have done research into different interception methods, and into how a net could be used to intercept projectiles. Future work could test intercepting objects with a net, determine whether our calculations are correct, and work out how to build a device that holds a net, how to shoot it from the system, and how to reload the net into the system.

Field Testing in Diverse Environments

It is important to analyse whether our system would perform as expected in different environments. Future work could therefore test how adaptable the system is and how to improve its ability to adapt to different environments.

Literary Research

Autonomous Weapons Systems and International Humanitarian Law: Need for Expansion or Not[35]

A significant challenge with autonomous systems is ensuring compliance with international laws, particularly IHL. The paper delves into how such systems can be designed to adhere to humanitarian law and discusses critical and optional features such as the capacity to identify combatants and non-combatants effectively. This is directly relevant to ensuring our system's utility in operational contexts while adhering to ethical norms.

Artificial Intelligence Applied to Drone Control: A State of the Art[12]

This paper explores the integration of AI in drone systems, focusing on enhancing autonomous behaviors such as navigation, decision-making, and failure prediction. AI techniques like deep learning and reinforcement learning are used to optimize trajectory, improve real-time decision-making, and boost the efficiency of autonomous drones in dynamic environments.

Drone Detection and Defense Systems: Survey and Solutions[9]

This paper provides a comprehensive survey of existing drone detection and defense systems, exploring various sensor modalities like radio frequency (RF), radar, and optical methods. The authors propose a solution called DronEnd, which integrates detection, localization, and annihilation functions using Software-Defined Radio (SDR) platforms. The system highlights real-time identification and jamming capabilities, critical for intercepting drones with minimal collateral effects.

Advances and Challenges in Drone Detection and Classification[11]

This state-of-the-art review highlights the latest advancements in drone detection techniques, covering RF analysis, radar, acoustic, and vision-based systems. It emphasizes the importance of sensor fusion to improve detection accuracy and effectiveness.

Autonomous Defense System with Drone Jamming capabilities[36]

This patent describes a drone defense system comprising at least one jammer and at least one radio detector. The system is designed to send out interference signals that block a drone's communication or GPS signals, causing it to land or return. It also uses a technique where the jammer temporarily interrupts the interference signal to allow the radio detector to receive data and locate the drone's position or intercept its control signals.

Small Unmanned Aerial Systems (sUAS) and the Force Protection Threat to DoD[37]

This article discusses the increasing threat posed by small unmanned aerial systems (sUAS) to military forces, particularly the U.S. Department of Defense (DoD). It highlights how enemies are using these drones for surveillance and delivery of explosives.

The Rise of Radar-Based UAV Detection For Military: A Game-Changer in Modern Warfare[10]

This article discusses how radar-based unmanned aerial vehicle (UAV) detection is transforming military operations. SpotterRF’s systems use advanced radar technology to detect drones in all conditions, including darkness or bad weather. By integrating AI, these systems can distinguish between drones and non-threats like birds, improving accuracy and reducing false positives.

Swarm-based counter UAV defense system[38]

This article discusses autonomous systems designed to detect and intercept drones. It emphasizes the use of AI and machine learning to improve the real-time detection, classification, and interception of drones, focusing on autonomous UAVs (dUAVs) that can neutralize threats. The research delves into algorithms and swarm-based defense strategies that optimize how drones are intercepted.

Small Drone Threat Grows More Complex, Deadly as Tech Advances[39]

The article highlights the growing threat of small UAVs to military operations. It discusses how these systems are used by enemies for surveillance and direct attacks, and the various countermeasures the U.S. Department of Defense is developing to stop these attacks. It explores the use of jamming (interfering with the connection between drone and controller), radio-frequency sensing, and mobile detection systems.

US Army invents 40mm grenade that nets bad drones[40]

This article discusses recently developed technology that involves a 40mm grenade that deploys a net to capture and neutralise hostile drones. This system can be fired from a standard grenade launcher, providing a portable, low-cost method of taking down small unmanned aerial systems (sUAS) without causing significant collateral damage.


Making drones to kill civilians: is it ethical?[41]

Harming innocent people is usually considered unethical. By that standard, every company that supplies items for war would be doing something at least partially unethical. However, international law governing warfare does not bind countries to all traditional ethical norms, which makes deciding what is and is not ethical harder.

Sociocultural objections to using killer drones against civilians:

-       Civilians (not participating in the war) are killed by drones, since drones are not able to tell the difference between combatants and non-combatants

-       We should not see war as a ‘clash of civilizations’, as this would imply that civilians are also part of the war

Is it ethical to use drones to kill civilians?:

-       As said above, international law applies during war between countries. This law implies:

o  Killing civilians = murder = prohibited

-       People being attacked by drones say that it is not drones that kill people, but people who kill people

A simple solution is to follow the 3 laws of robotics from Isaac Asimov:

-       A robot may not injure a human being or, through inaction, allow a human being to come to harm

-       A robot must obey orders given to it by human beings except when such orders conflict with the first law

-       A robot must protect its own existence, provided such protection does not conflict with the first or second law

However, just following these laws would be too simplistic, as they are fictional rather than actual laws

The current drone killer’s rationale:

-       A person is targeted only if he/she is harmful to the interests of the country so long as he/she remains alive

This rationale is met with the objection:

-       The rationale is simply assumed, since international law says nothing about the targeting of random individuals

This objection is disproved by:

-       If the target is not in a warzone, he/she is not harmful to the interests of the country, and thus such a person would not be a random target

Is it legal and ethical to produce drones that are used to kill civilians?:

A manufacturer of killer drones may not assume its drones are only being used peacefully.

The manufacturers of killer drones often include cautionary recommendations, which exist to put the manufacturers in a legally safe position.

Conclusion:

The problem is that drone killing is not covered by the traditional laws of war. The literature is not uniform in its opposition to drone killing, but the majority states that killing civilians is unethical.


Ethics, autonomy, and killer drones: Can machines do right?[42]

The article examines the ethics of certain weapons used in war by the US. Because we can look back on weapons that were new at the time (like the atomic bomb), we can compare what was considered ethical then with what we consider ethical now. The article uses two different viewpoints to judge the ethics of a war technology, namely teleology and deontology. Teleology focuses on the outcome of an action, while deontology focuses more on the duty behind an action.

The article first looks at the atomic bomb, which from a teleological viewpoint could be seen as ethical, as it would bring a quick end to the war and save lives in the long term. Deontology also allows it to be seen as ethical, since possessing such strong weapons demonstrates superiority and intimidates other countries in war.

Next, a torture program is discussed. According to teleology this could be ethical, since torturing some people to extract critical information could prevent more deaths in the future.

The article then questions AI-enabled drones. For AI ethics, the AI should always be governed by humans, bias should be reduced (many civilians are being killed now), and there should be more transparency. For a country this is more challenging, since it also has to focus on safety and on winning the war. This is why, in contrast to the atomic bomb, where teleology and deontology agreed, there is now a contrast between the two. Teleology focuses on the outcome, thus security and protection, while deontology focuses on global values such as human rights. The article says the challenge is to use AI technologies effectively while following ethical principles, and to get everyone to do so.


Survey on anti-drone systems: components, designs, and challenges[43]

Requirements an anti-drone system must have:

-       Drone-specialized detection (detect the drone)

-       Multi-drone defensibility (defend against multiple drones)

-       Cooperation with security organizations (restrictions to functionality should be discussed with public security organizations (police/military))

-       System portability (lightweight and using wireless networks)

-       Non-military neutralization (don't use military weapons to defend against drones)

Ways to detect drones:

-       Thermal detection (Motors, batteries and internal hardware produce heat)

o  Works in all weather

o  Affordable

o  Limited range

-       RF scanner (capture wireless signals)

o  Can’t detect drones that don’t emit RF signals

o  Long range

-       Radar detection (Detect objects and determine the shape)

o  Long range

o  Can’t detect a drone that is not moving, since it is mistaken for a static obstacle

-       Optical camera detection (detect from a video)

o  Short range

o  Weather-dependent

Hybrid detection systems to detect drones

-       Radar + vision

-       Multiple RF scanners

-       Vision + acoustic
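As a minimal sketch of how such a hybrid system could combine sensors, the snippet below fuses per-sensor detection confidences with a noisy-OR rule. The function name, the example confidence values, and the assumption that sensors fail independently are our own, not from the article:

```python
def fuse_detections(confidences):
    """Noisy-OR fusion: probability that at least one sensor is right,
    assuming the sensors make independent errors."""
    p_all_miss = 1.0
    for p in confidences:
        p_all_miss *= (1.0 - p)  # every sensor misses simultaneously
    return 1.0 - p_all_miss

# Radar alone 60% confident, vision alone 70% confident:
combined = fuse_detections([0.6, 0.7])  # ≈ 0.88
```

This illustrates why fusing two mediocre sensors can outperform either one alone, which matches the motivation for the hybrid pairings listed above.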

Drone neutralization:

-       Hijacking/spoofing (create fake signals to mislead the drone or take over its control)

-       Geofencing (Prevent drone from approaching a certain point)

-       Jamming (Stopping radio communication between drone and controller)

-       Killer drones (Using drones to damage attacking drones)

-       Capture (physically capture the drone, for example with a net)

o  Terrestrial capture systems (human-held or vehicle-mounted)

o  Aerial capture systems (System on defender drones)

Determination of threat level:

-       Object

-       Flight path

-       Available time

(NOTE: The article goes into more depth about some mathematics to determine the threat level, which could be used in our system)
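The article's actual threat-level mathematics is not reproduced here; as a rough illustration of the idea, the sketch below combines the three listed factors (object type, flight path, available time) into a single score. All names, weights and formulas are our own simplification, not the article's:

```python
def threat_level(object_score, approach_speed, distance):
    """Illustrative threat score in [0, 1].
    object_score   -- 0..1, how dangerous the detected object class is
    approach_speed -- m/s toward the protected point (<= 0 means receding)
    distance       -- meters to the protected point
    """
    if approach_speed <= 0:
        return 0.0  # flight path leads away: no threat
    time_available = distance / approach_speed  # seconds until arrival
    urgency = 1.0 / (1.0 + time_available)      # less time -> higher urgency
    return object_score * urgency

# FPV drone (score 0.9) closing at 20 m/s from 100 m -> 5 s available:
level = threat_level(0.9, 20.0, 100.0)  # ≈ 0.15
```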


Artificial intelligence, robotics, ethics, and the military: a Canadian perspective[44]

The article looks not only at the ethics, but also at the social and legal aspects of using artificial intelligence in the military. For this it considers 3 main aspects of AI, namely Accountability and Responsibility, Reliability, and Trust.

Accountability and Responsibility:

The article states that accountability and responsibility for the actions of an AI system lie with the operator, who is a human. However, when the AI malfunctions it becomes challenging to determine who is accountable.

Reliability:

Current AI is not reliable enough and only performs well in the very specific situations it was made for. In military use you never know what situation an AI will end up in, which causes a lack of reliability. Verification of AI technologies is necessary, especially when dealing with the life and death of humans.

Trust:

People who use AI in the military should be taught how the AI works and to what extent they can trust it. Too much or too little trust in AI can lead to big mistakes. The makers of these AI systems should be more transparent, so that it can be understood what the AI does.

We need a proactive approach to minimize the risks of AI. This means that everyone who uses or is involved with AI in the military should carefully consider the risks that AI brings.


When AI goes to war: Youth opinion, fictional reality and autonomous weapons[45]

The article looks into the responsibilities and risks of fully autonomous robots in war. It does this by asking youth participants about this together with other research and theory.

The article found that the participants felt that humans should be responsible for the actions of autonomous robots. This is supported by theory which says that since robots do not have emotions like humans do, they cannot be responsible for their actions in the same way as humans. If autonomous robots were programmed with some ethics in mind, the robot could in some way be held accountable for its actions as well. How this responsibility should be divided between humans and robots remained unclear in this article. Some said responsibility lay purely with the engineers, designers and government, while others said that the human and robot share responsibility.

The article also found that there are still fears of fully autonomous robots. These stem from old myths and social media, which claim that autonomous robots can turn against humans to destroy them.

As for the legal side of autonomous robots, they can potentially violate laws during war, especially if they are not held responsible for their actions. This worries the youth.

For the youth, the threats that fully autonomous robots bring outweigh the benefits. This is a sign for the scientific community to further develop and implement norms and regulations for autonomous robots.


Advances and Challenges in Drone Detection and Classification Techniques: A State-of-the-Art Review[46]

Summary:

•The paper provides a comprehensive review of drone detection and classification technologies. It delves into the various methods employed to detect and identify drones, including radar, radio frequency (RF), acoustic, and vision-based systems. Each has its strengths and weaknesses, after which the author discusses 'sensor fusion', where combining detection methods leads to improvements in system performance and robustness.

Key takeaways:

•Sensor fusion should be incorporated into the system to improve performance and robustness

Counter Drone Technology: A Review[47]

Summary:

•The article provides a comprehensive analysis of current counter-drone technologies and categorizes counter-drone systems into three main groups: detection, identification, and neutralization technologies. Detection technologies include radar, RF detection, etc. Once a drone is detected, it must be identified as friend or foe. The review discusses methods such as machine learning algorithms and signature signal libraries. It covers various neutralization methods, including jamming (RF and GPS), laser-based systems, and kinetic solutions like nets or projectiles, and the challenges each method faces.

Key takeaways:

•Integration of multiple sensor technologies is critical

•Non-kinetic neutralization methods should be prioritized where possible to avoid unintended consequences

A Soft-Kill Reinforcement Learning Counter Unmanned Aerial System (C-UAS) with Accelerated Training[48]

Summary:

•This article discusses the development of a counter-drone system that utilizes reinforcement learning using non-lethal (“soft-kill") techniques. The system is designed to learn and adapt to various environments and drone threats using simulated environments.

Key takeaways:

•C-UAS systems must be rapidly deployable

•C-UAS systems should be trained in simulated environments to improve robustness and adaptability

Terrorist Use of Unmanned Aerial Vehicles: Turkey's Example[49]

Summary:

•The article examines how terrorist organizations have utilized drones for surveillance, intelligence gathering, and attacks. It highlights the growing accessibility of consumer drones, which are repurposed for malicious use and various counter-UAV technologies and tactics employed by Turkish forces to mitigate this threat.

Key takeaways:

•Running costs must be kept minimal. Access to affordable drones is widespread.

•Both kinetic and non-kinetic interception must be available if the system is to be used in urban or otherwise populated environments

Impact-point prediction of trajectory-correction grenade based on perturbation theory[50]

Summary:

•The article discusses trajectory prediction and correction methods for improving the accuracy of artillery projectiles. By modeling and simulating the effects of small perturbations in projectile flight, the study proposes an impact-point prediction algorithm. While this algorithm is aimed at improving artillery accuracy, it could potentially be used to predict the trajectory and impact location of drone-dropped explosives.

Key takeaways:

•Detailed description of real-time trajectory prediction corrections

•Challenge to balance efficiency with accuracy in path prediction algorithms
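The paper's perturbation-based algorithm itself is not reproduced here. As a first-order illustration of impact-point prediction for a drone-dropped payload, the sketch below uses drag-free ballistics; neglecting air resistance and the example numbers are our own assumptions:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def impact_point(height, vx, vy):
    """Predict where a payload released from a moving drone lands,
    ignoring drag.
    height -- release altitude above flat ground (m)
    vx, vy -- horizontal velocity of the drone at release (m/s)
    Returns (dx, dy, t): horizontal offsets of the impact point (m)
    and the fall time (s)."""
    t = math.sqrt(2.0 * height / G)  # free-fall time from the release altitude
    return vx * t, vy * t, t

# Payload released at 50 m while the drone moves 10 m/s horizontally:
dx, dy, t = impact_point(50.0, 0.0, 10.0)  # falls ~3.2 s, lands ~32 m ahead
```

A real predictor along the paper's lines would correct this baseline for drag, wind and release perturbations in real time.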


Armed Drones and Ethical Policing: Risk, Perception, and the Tele-Present Officer[51]

This paper discusses the tele-present officer operating ‘unmanned drones’. It approaches the topic from the attacker's point of view, but the same questions apply to ‘attacking’ incoming drones, where a person still should or should not ‘pull the trigger’ to intercept a drone, with the potential risk of redirecting it into another crowd.


The Ethics and Legal Implications of Military Unmanned Vehicles[52]

This paper states that human soldiers/marines also do not agree on what ethical warfare is. It gives a few examples of questions with controversial answers among soldiers/marines. (We may use this to argue why our device should or should not be autonomous.)


Countering the drone threat: implications of C-UAS technology for Norway in an EU and NATO context[53]

This paper gives clear insight into different scenarios where drones can be a threat, for example over large crowds but also in warfare. It does not, however, give a concrete solution.


An anti-drone device based on capture technology[54]

This paper explores the capabilities of capturing a drone with a net. It also addresses some other forms of anti-drone devices, such as lasers, hijacking and RF jamming. The rest of the paper is very technical and focuses on net capture.


Four innovative drone interceptors.[55]

This paper describes 5 different ways of detecting drones: acoustic detection and tracking with microphones positioned in a particular grid, video detection by cameras, thermal detection, radar detection and tracking, and finally detection through radio emissions from the drone. Because we also want to be able to catch ‘off-the-shelf’ drones, we have to investigate which of these are appropriate. For taking down the drone they give 6 options: missile launch, radio jamming, net throwers, machine guns, lasers and drone interceptors. The 4 drone interceptors they introduce are a bit above our budget, as they are real drones with various kinds of generators to take down a drone (for example with a high electric pulse), but we could still look into this.


Comparative Analysis of ROS-Unity3D and ROS-Gazebo for Mobile Ground Robot Simulation[56]

This paper examines the use of Unity3D with ROS versus the more traditional ROS-Gazebo for simulating autonomous robots. It compares their architectures, performance in environment creation, resource usage, and accuracy, finding that ROS-Unity3D is better for larger environments and visual simulation, while ROS-Gazebo offers more sensor plugins and is more resource-efficient for small environments.

Target Market Interviews

Interview 1: Introductory interview with F.W., Australian Military Officer Cadet (military as a potential user)

General understanding:

  • What types of anti-drone (/projectile protection) systems are currently used in the military?
    • Trophy - Israeli, system is capable of intercepting anti-tank missiles, would be capable of intercepting dropped payload explosive, or drone
    • Iron Fist - Israel, similar system
    • Anduril
      • https://www.anduril.com/command-and-control/ (and other products in their catalog)
    • Lattice system detection and identification, given operating area in which all possible threats are detected, identified, and if necessary, intercepted
  • What are key features that make an anti-drone system effective in the field?
    • Often open terrain, fields, roads, not often in dense vegetation, drones are not very able to find and enter dense forest
    • Tactics that could be encouraged with use of our product could focus around this, perhaps including a training or tactics manual, something like that?
    • Best-use instructions, e.g. to hide and stay in cover while the system does its work
    • Don’t focus too much on interception, also valuable would be a warning system that will be able to identify incoming enemy drones to alert troops on the ground of the threat to allow troops to take cover
      • DroneShield - RfPatrol Mk2
      • Sensory warning, sound, vibration, etc.
      • Sharing threat with friendly forces
      • https://www.youtube.com/watch?v=JybGaR89Tt0&list=TLPQMTAwOTIwMjQP4piik5abrA&index=2&ab_channel=DailyMail (3:00-3:20)
    • Combination of detection, affordable warning system, with basic interception capability
    • Rewrite instructions to drone, attain control of drone system via jamming and reprogramming
      • Via powerful signal ‘ray’, disabling the FPV feed for example, renders the drone essentially useless, look into tech behind this and if it is realistically replicable
      • Common signal frequencies for FPV feeds are cycled through with signal of strong strength that overpowers the strength of the drone operator’s controller
      • 2.4 GHz and 5.8 GHz are the two most commonly encountered frequencies when dealing with FPV quadcopters. 2.4 GHz is the RF band commonly used for the control link between the ground transmitter and the drone
      • DroneShield takes over control of the drone and gets it to land, or even return to its operator, a process during which it can be tracked and followed
        • How are drones/signals determined to be friendly or not? Pathing? Certain other identifiers?
  • What are the most common types of drones that these systems are designed to counter?
    • Consumer drones used often
    • Homemade FPV drones
      • Incredibly effective and easy to make
      • Can be built to obtain incredible speed and accuracy
      • Specific motor frequency? Could be used to help detect via sound, frequency analysis
    • Interesting build: FPV-’mothership’ via large surveillance drone, signal repeating through the surveillance drone, massively increases FPV range and resistance to jamming
  • What are the most common types of drone interception that these systems employ?
    • Non-kinetic
      • RF jamming
      • Optical laser jamming (check Iron Fist list above)
      • Higher cost for system components but low-to-none cost-to-intercept
    • Kinetic
      • Regular rounds from e.g. shotgun, birdshot, etc.
        • Low cost-per-interception
      • Interception via net, net charge that opens the net and covers the drone
      • Blast interception (Iron Fist system)
        • High cost-per-interception
      • Air-burst ammunition?
        • High chance for interception, high cost
  • Are there any specific examples of successful or failed anti-drone operations you could share?
    • Drone Shield
    • Anduril
    • Epirus
    • Leonidas

Limitation of current systems:

  • What are the most significant limitations of current anti-drone systems?
    • Many systems are not affordable for a single soldier or group of soldiers, mostly just for a company or battalion rather than individual groups if at all (e.g. in Ukraine probably very difficult to afford and source these systems)
    • High cost-to-intercept
    • Complex systems possibly difficult to perform maintenance on when necessary only via a specialist
    • Usually vehicle-mounted if large-scale, so transportable; low-cost detect-and-defeat systems are definitely carryable by one person, but anything else is not transportable by hand for a single person, e.g. the Iron Fist system or comparable hard-kill systems
    • Difficult to take on small, fast missions, trench application difficult due to unavailability of vehicle mounts, would not be possible to quickly move system around discreetly
  • Are there any specific environments (urban, desert, etc.) where anti-drone systems struggle to perform well?
    • Existing systems function well for large open-area applications where system can be relatively stationary, unless provided with vehicle transport, then system can cover a large moving area
    • Vegetation and more closed environments are more of an advantage for soldiers to take cover and prevent detection than a limitation for defense systems
    • Win-win for soldiers, defense system should still operate while finding cover and hiding is much easier

Cost-related:

  • Can you give a rough idea of the costs involved in deploying these systems?
    • Probably difficult to find this information, possible to send companies inquiries on their website but price is likely on a need-to-know basis, unlikely they will volunteer this for non-customers. Perhaps worth a try anyways.
      • Purchase cost
      • Maintenance cost
      • Cost-to-intercept
  • What are usual price ranges for systems like these?
    • Affordable options
    • Full-scale, full-feature systems (military-grade, fully-equipped, etc.) prices varying and hard to find

Ethics discussion:

  • Which ethical concerns may be associated with the use of anti-drone systems, particularly regarding urban, civilian areas?
  • How does the military handle ethical issues when deploying these technologies?
    • Big thing in the military, especially in AUS, huge amount of ethical training received, all new weapons (and non-weapon) systems are thoroughly tested and lawyers go over the ethics behind each of the systems
    • Another entire consideration of its actual use, ethics on a per-use basis as well
    • Should be much easier to justify for non-lethal systems, defense systems, but we need to be able to prevent unintended consequences of the system’s use
    • We would be held responsible for any consequences, however there are theories behind civilian risk and how to quantify and justify these things
    • However, it is not necessarily our responsibility to prove this. Legal and ethical is actually down to the military themselves and the user of the system. We just provide information on the system and they apply it to their discretion

Potential improvements:

  • What improvements do you think are necessary to make anti-drone systems more effective? What are current shortcomings?
  • Are there specific threats (related to the build of the drone or other factors) that these systems are weak against?
  • Do you think AI or machine learning could help enhance anti-drone systems? To what extent is it currently being used?
    • Droneshield systems use AI in their C-UAV tech.
    • Interesting application to anti-missile systems
    • Further research to be done by computer scientists

Technical questions:

  • Is significant training required for personnel to effectively operate anti-drone systems?
    • Time required for training
    • Infrastructure required for training
    • Cost?
    • What is expected of the companies behind these systems in terms of training?
      • DroneShield, essentially no training required, designed for someone with low skill level, straight out-of-the-box
      • Potential customers would largely favor systems for which little-to-no training is required, which is simple to use and straight forward
  • How do these systems usually handle multiple drone threats or swarm attacks?
  • Can you explain how systems differentiate between hostile and non-hostile drones?
  • How are these systems tested and validated before they are deployed in the field?
    • Huge amount of testing, validation, evaluation on the side of the military, putting it through its paces on how effective it is really
    • We would be responsible for establishing training and operating procedures with them, showing them how the system works and constantly being available to support customers in the use of the system, maintenance, spare parts, etc.
    • Maintenance, spare parts, things like this will be paid for but must be absolutely reliable, response time minimized, punctual service, friendly, helpful service will go a long way in keeping customers satisfied and get a good rep with customer-side counterparts

General Discussion, Notes

  • Affordable for individual soldiers
    • Necessary?
      • Identifying a suitable target market
      • Cost and competition considerations, small scale less competition and ability for us to provide affordable solutions
      • Suitable for scenario like Ukraine, where soldiers are required to often purchase their own gear in high-intensity environments where military may not be able to provide these systems
      • If required for modern (quality) militaries, there would likely be a large-scale system and/or tactics and procedures employed to protect soldiers from UAV threats.
  • Portable for individuals
  • Possibly suitable for civilian cars, regular vehicles that regular soldiers use
  • Aim-assisted shotgun for interception, very low cost-to-intercept
  • Optics for a shotgun for manual aim and fire
  • Compact autonomous aiming system
  • Detection and identification
  • Quick access, quick pick-up and exit
  • JAMMING RISK:
    • Fiber-optic cable connection for drones, can’t be jammed via RF
    • Kinetic interception option necessary
    • Skeet shooting practice for the US military: how would we train soldiers to use this system?
      • https://www.twz.com/army-set-to-buy-computerized-rifle-sights-for-shooting-down-drones
  • Fighter jet heads-up system during use of on-board guns helps fighter pilot predict where target will be, i.e. where to aim
  • Shotgun area-of-effect wide, could intercept a drone even though it is maneuverable
  • Interception should occur as early as possible to minimize risk
  • Vehicle-mounted interception system upgrade module
    • https://en.wikipedia.org/wiki/Iron_Fist_(countermeasure)
    • https://elbitsystems.com/media/Catalog-Active-Protection-Systems-_5_-Web.pdf
    • https://elbitsystems.com/product/iron_fist/
  • High-altitude surveillance drone detection
    • Standard military-grade radar should be able to detect these, large size, almost aircraft-like
    • Lack of an advanced radar network therefore difficulties detecting these?
  • Ex. use case: forward-operating base defense system against incoming suicide drones and projectiles, alternatively vehicle-mounted (potentially retaining the possibility to be used by soldiers in individual positions on the front line)
    • Separate these use-cases:
      • Fixed-position protection
      • Vehicle-mount protection, “semi-maneuverable”
      • Light-weight, ‘mini’-system, portable for an individual soldier, quickly deployable and low-cost
        • For personal-carry: weight is absolutely critical

Further interviews:

  • People online who are willing to share their experiences in this environment
  • Youtube, Instagram, online blogs, forums, many of them doing journalism and reporting now
  • Send them a message and see if people are willing to discuss experiences in the field surrounding drones and related systems
    • e.g.: https://www.instagram.com/tiger_in_ukraine
  • With permission to record would be ideal, audio transcript for later analysis and note-taking


Interview 2: Interview with B.D. to specify requirements for system, Volunteer on Ukrainian front lines (volunteers and front line soldiers as potential users)


What types of anti-drone systems are currently used near the front lines?

  • Mainly see RF jammers and basic radar systems deployed here
  • Detection priority
  • Nothing for specifically intercepting munitions dropped by drones
    • Follow-up: What are general ranges for these RF jammers and other detection methods?
      • Very different, RF detectors or radar can have ranges of 100 meters but very much depends on the conditions
      • Rarely available consistently across positions
      • Soldiers usually listen for drones but this can be difficult when moving or during incoming/outgoing fire
      • Any RELIABLE range would be beneficial but to give enough time to react a range around 20 or 30 meters would probably be the minimum
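The 20-30 m minimum range quoted above can be sanity-checked with simple arithmetic; the drone speed used below is our own assumption, not a figure from the interview:

```python
def warning_time(detection_range, drone_speed):
    """Seconds of warning a detection range buys against a drone
    approaching head-on at constant speed."""
    return detection_range / drone_speed

# An FPV drone at an assumed 25 m/s, detected at the 30 m minimum range:
t = warning_time(30.0, 25.0)  # about 1.2 s to react
```

Against fast FPV drones, even the stated minimum range leaves barely over a second, which supports the interviewee's emphasis on reliable detection over raw range.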

What are key features that make an anti-drone system effective in the field?

  • Needs to be easy to move and hide, no bulky shapes or protruding parts, light
    • No space in backpacks, cars, trucks, already often full of equipment. Nothing you couldn’t carry and run with (estimate: 30kg)
    • Pack-up and deployment speed: exfil usually takes drone operator squads multiple hours of packing up, but front line troops are in constant movement. Anything above 30 seconds is insufficient [and that already seems to be pushing it]
  • No language barrier advisable [assuming: basically meaning no training]
  • Detection range is quite critical. A few seconds need to be given for soldiers or others near the front lines to prepare for an incoming impact
    • Follow-up: What kind of alarm to show drone detection would be best?
      • [...] Conclusion: sound is the only real option that does not immediately give away the position of the system; drones will also often fly over positions, so alerting soldiers with lights would risk revealing positions the drone would otherwise have flown past. Even sound is already risky.

What are the most significant limitations of current anti-drone systems?

  • Apart from RF and some alternatives no real solution to drone drops on either side
  • Systems that are light enough to be carried by one person and simple enough to operate without extensive training are rare if any exist at all

General Discussion on Day-to-Day Experiences

  • Drone attacks very unpredictable. Periods of constant drone sounds above, then days of nothing. Constant vigilance necessary, which tires you out and puts you under serious pressure.
  • According to people he has talked to, RF jammers and RF/radar-based detection useful but very difficult to counter an FPV drone with a payload if it is able to reach you and get close to you, but very environment-dependent, e.g. open field vs forest
  • Despite the conditions, there's a shared sense of purpose between locals, volunteers, etc.
  • Constant fundraising needed to gather enough funds to fix vehicles (such as recently the van shown on IG account), as well as equipment and supplies both for locals and soldiers

Logbook

Week 1
Name Total Break-down
Max van Aken 10h Attended lecture (2h), Attended meeting with group (2h), Analysed papers/patents [21], [22], [23], [24], [25] (5h), Summarized and described key takeaways for papers /patents [21], [22], [23], [24], [25] (1h)
Robert Arnhold 16h Attended lecture (2h), Attended meeting with group (2h), Analysed papers/patents [11], [12], [13], [14], [15] (10h), Summarized and described key takeaways for papers/patents [11], [12], [13], [14], [15] (2h)
Tim Damen 16h Attended lecture (2h), Attended meeting with group (2h), Analysed papers [12], [13], [14], [15], [16] (10h), Summarized and described key takeaways for papers [12], [13], [14], [15], [16] (2h)
Ruben Otter 17h Attended lecture (2h), Attended meeting with group (2h), Analysed papers/patents [1], [2], [3], [4], [5] (10h), Summarized and described key takeaways for papers/patents [1], [2], [3], [4], [5] (2h), Set up Wiki page (1h)
Raul Sanchez Flores 16h Attended lecture (2h), Attended meeting with group (2h), Analysed papers/patents [6], [7], [8], [9], [10] (10h), Summarized and described key takeaways for papers/patents [6], [7], [8], [9], [10] (2h),
Week 2
Name Total Break-down
Max van Aken 13h Attended lecture (30min), Attended meeting with group (1h), Researched kinds of situations for the device (2h), wrote about situations (1.5h), researched ethics (6h), wrote about ethics (2h)
Robert Arnhold 6.5h Attended lecture (30min), Attended meeting with group (1h), Worked on interview questions (1h), Organizing introductory interview (2h), Preparing interviews for next weeks(2h)
Tim Damen 13.5h Attended lecture (30min), Attended meeting with group (1h), Risk evaluation (2h), Important features (1h), Research on ethics of deflection (8h), Writing part about deflection (1h)
Ruben Otter 14.5h Attended lecture (30min), Attended meeting with group (1h), Analysed papers [2], [3], [4], [5] for research in drone detection (6h), Wrote about drone detection and its relation to our system using papers [2], [3], [4], [5] (4h), Analysed and summarized paper [6] (2h), Wrote about usage of simulation and its software (1h)
Raul Sanchez Flores 14.5h Attended lecture (30min), Attended meeting with group (1h), Researched and analysed papers about approaches to drone interception (4h), Researched and analysed papers about drone interception (6h), Evaluated different existing drone interception systems (3h)
Week 3
Name Total Break-down
Max van Aken 6h Attended lecture (30min), Meeting with group (1.5h), researched ethics (2h), rewriting ethics (2h)
Robert Arnhold 13h Attended lecture (30min), Meeting with group (1.5h), Completed interview (4h), planning next interview (1h), conceptualizing mechanical-side (3h), state-of-the-art research and review of previously prepared sources (3h)
Tim Damen 10h Attended lecture (30min), Meeting with group (1.5h), Analysed papers [6], [7], [8], [9], [10], [11] (6h), Rewrote ethics of deflection based on autonomous cars (2h)
Ruben Otter 10h Attended lecture (30min), Meeting with group (1.5h), Research into possible drone detection components (6h), Created list of possible prototype components (2h)
Raul Sanchez Flores 4.5h Attended lecture (30min), Meeting with group (1.5h), Created Pugh Matrix to compare different interception methods (2h), Emailed Ruud about components we need (30min)

Week 4

Name Total Break-down
Max van Aken 11h Attended lecture (30min), Meeting with group (1.5h), Researched default drone specifications (2h), Did calculations on mass prediction (6h), Mathematica calculations (1h)
Robert Arnhold 13h Attended lecture (30min), Meeting with group (1.5h), investigating mechanical design concepts and documenting process (7h), summarizing discussion and learnings (2h, not yet finished), contacting new interviewees (2h)
Tim Damen 17h Attended lecture (30min), Meeting with group (1.5h), Analysed 4 papers (4h), Research on how to catch a projectile with a net (6h), Wrote text on how to catch a projectile with a net (2h), Worked on a Mathematica script for calculations to catch a projectile (3h)
Ruben Otter 15h Attended lecture (30min), Meeting with group (1.5h), Attended meeting with Ruud (1h), Set up Raspberry Pi (2h), Research into audio sensors (2h), Started coding and connecting the audio sensor to the Raspberry Pi (8h)
Raul Sanchez Flores Attended lecture (30min), Meeting with group (1.5h), Attended meeting with Ruud (1h)

Week 5

Name Total Break-down
Max van Aken 6h Attended lecture (30min), Meeting with group (1.5h), spring calculations (4h)
Robert Arnhold 11h Attended lecture (30min), Meeting with group (1.5h), discussing schedule with potential new interviewee, developing mechanical design (2h), summarizing focus and key learnings from previous interview (2h), investigating mechanical components, materials, structures, and other design features (5h)
Tim Damen 17h Attended lecture (30min), Meeting with group (1.5h), Finalized Mathematica model to catch projectile in 2D (1h), Research on catching projectile in 3D (4h), Worked on Mathematica model to catch projectile in 3D (6h), Analysis of accuracy of 2D model (2h), Writing the text for the wiki (1h), General research to write text and do calculations (1h)
Ruben Otter 11h Attended lecture (30min), Meeting with group (1.5h), Continued integrating microphone sensor with Raspberry Pi (9h)
Raul Sanchez Flores Attended lecture (30min), Meeting with group (1.5h)

Week 6

Name Total Break-down
Max van Aken 8h Attended lecture (30min), Meeting with group (1.5h), begin with presentation/text (6h)
Robert Arnhold 10h Attended lecture (30min), Meeting with group (1.5h), performing second interview with volunteer in Ukraine (2h), collecting and structuring notes for review (2h), continuing mechanical design research and formalizing CAD design (4h)
Tim Damen 8h Attended lecture (30min), Meeting with group (1.5h), Worked on creating the 3D model in Mathematica (5h), Wrote part about the 3D model (1h)
Ruben Otter 11h Attended lecture (30min), Meeting with group (1.5h), Downloaded and learned MATLAB syntax (3h), Research into different types of drones (2h), Finding drone sound files for specific drones (1h), Coded in MATLAB to analyse the frequency of the sample drone sound (3h)
Raul Sanchez Flores Attended lecture (30min), Meeting with group (1.5h)

Week 7

Name Total Break-down
Max van Aken Attended lecture (30min), Meeting with group (1.5h)
Robert Arnhold 12h Attended lecture (30min), Meeting with group (1.5h), summarizing secondary interview notes into key takeaways and confirmed specifications (3h), completing CAD design for given specifications (2h), 3D printing main body of system (4h) and constructing body of prototype (1h)
Tim Damen 12.5h Attended lecture (30min), Meeting with group (1.5h), Research on types of projectiles, dimensions, effects of air resistance on these projectiles and on the net (4h), Adapted model with wind (0.5h), Specified assumptions made based on research done (2h), Tested accuracy of 3D model (2h), Wrote text about accuracy of 3D model (1h), Cleared up some parts on the wiki page (1h)
Ruben Otter 16h Attended lecture (30min), Meeting with group (1.5h), Continued coding in MATLAB to analyse the frequency of the sample drone sound and compare it with other sound files (4h), Research into sensor fusion (3h), Applied sensor fusion methodology to the test results (2h), Created presentation (2h), Prepared for presentation (3h)
Raul Sanchez Flores Attended lecture (30min), Meeting with group (1.5h)

Week 8

Name Total Break-down
Max van Aken Attended lecture (30min), Meeting with group (1.5h)
Robert Arnhold Attended lecture (30min), Meeting with group (1.5h)
Tim Damen 5h Attended lecture (30min), Meeting with group (1.5h), Changed order on wiki and improved some small parts (1h), Finished up last pieces of my text (1h), Improved the photos (1h)
Ruben Otter 10.5h Attended lecture (30min), Meeting with group (1.5h), Prepared for presentation (1h), Gave the presentation (30min), Some additional research into sensor fusion (2h), Wrote part about sensor fusion from a theoretical point of view (2h), Wrote part about how we applied certain sensor fusion methodology in our testing (2h), Wrote future work section (1h)
Raul Sanchez Flores Attended lecture (30min), Meeting with group (1.5h)

References

  1. How to read a risk matrix used in a risk analysis (assessor.com.au)
  2. The ethics of driverless cars | ACM SIGCAS Computers and Society
  3. The redirection of attacks by defending forces | International Review of the Red Cross (icrc.org)
  4. Is Society Ready for AI Ethical Decision Making? Lessons from a Study on Autonomous Cars - ScienceDirect
  5. Autonomous Cars: In Favor of a Mandatory Ethics Setting | Science and Engineering Ethics (springer.com)
  6. Autonomous decision making for a driver-less car | IEEE Conference Publication | IEEE Xplore
  7. IJCAI17-AlgorithmicBias-Distrib (cmu.edu)
  8. BBC - Ethics - War: In an ethical war, whom can you fight?
  9. Chiper F-L, Martian A, Vladeanu C, Marghescu I, Craciunescu R, Fratu O. Drone Detection and Defense Systems: Survey and a Software-Defined Radio-Based Solution. Sensors. 2022; 22(4):1453. https://doi.org/10.3390/s22041453
  10. The rise of Radar-Based UAV Detection for Military: A Game-Changer in Modern Warfare. (2024, June 11). Spotter Global. https://www.spotterglobal.com/blog/spotter-blog-3/the-rise-of-radar-based-uav-detection-for-military-a-game-changer-in-modern-warfare-8
  11. Seidaliyeva U, Ilipbayeva L, Taissariyeva K, Smailov N, Matson ET. Advances and Challenges in Drone Detection and Classification Techniques: A State-of-the-Art Review. Sensors. 2024; 24(1):125. https://doi.org/10.3390/s24010125
  12. Caballero-Martin D, Lopez-Guede JM, Estevez J, Graña M. Artificial Intelligence Applied to Drone Control: A State of the Art. Drones. 2024; 8(7):296. https://doi.org/10.3390/drones8070296
  13. Svanström, F.; Alonso-Fernandez, F.; Englund, C. Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities. Drones 2022, 6, 317. https://doi.org/10.3390/drones6110317
  14. Samaras, S.; Diamantidou, E.; Ataloglou, D.; Sakellariou, N.; Vafeiadis, A.; Magoulianitis, V.; Lalas, A.; Dimou, A.; Zarpalas, D.; Votis, K.; et al. Deep Learning on Multi Sensor Data for Counter UAV Applications—A Systematic Review. Sensors 2019, 19, 4837. https://doi.org/10.3390/s19224837
  15. Autonomous Ball Catcher Part 1: Hardware — Baucom Robotics
  16. Ball Detection and Tracking with Computer Vision - InData Labs
  17. Detecting Bullets Through Electric Fields – DSIAC
  18. An Introduction to BYTETrack: Multi-Object Tracking by Associating Every Detection Box (datature.io)
  19. Online Trajectory Generation with 3D camera for industrial robot - Trinity Innovation Network (trinityrobotics.eu)
  20. Explosives Delivered by Drone – DRONE DELIVERY OF CBNRECy – DEW WEAPONS Emerging Threats of Mini-Weapons of Mass Destruction and Disruption (WMDD) (pressbooks.pub)
  21. Deadliest weapons: The high-explosive hand grenade (forcesnews.com)
  22. SM2025.pdf (myu-group.co.jp)
  23. Trajectory estimation method of spinning projectile without velocity input - ScienceDirect
  24. An improved particle filtering projectile trajectory estimation algorithm fusing velocity information - ScienceDirect
  25. Generating physically realistic kinematic and dynamic models from small data sets: An application for sit-to-stand actions (PDF, researchgate.net)
  26. https://kestrelinstruments.com/mwdownloads/download/link/id/100/
  27. Normal and Tangential Drag Forces of Nylon Nets, Clean and with Fouling, in Fish Farming. An Experimental Study (mdpi.com)
  28. A model for the aerodynamic coefficients of rock-like debris - ScienceDirect
  29. Explosives Delivered by Drone – DRONE DELIVERY OF CBNRECy – DEW WEAPONS Emerging Threats of Mini-Weapons of Mass Destruction and Disruption ( WMDD) (pressbooks.pub)
  30. Influence of hand grenade weight, shape and diameter on performance and subjective handling properties in relations to ergonomic design considerations - ScienceDirect
  31. 'Molotov Cocktail' incendiary grenade | Imperial War Museums (iwm.org.uk)
  32. My Global issues - YouTube
  33. 4 Types of Distance Sensors & How to Choose the Right One | KEYENCE America
  34. Ukrainian Mountain Battalion drop grenades on Russian forces with weaponised drone (youtube.com)
  35. Willy, Enock, Autonomous Weapons Systems and International Humanitarian Law: Need for Expansion or Not (NOVEMBER 16, 2020). Available at SSRN: https://ssrn.com/abstract=3867978 or http://dx.doi.org/10.2139/ssrn.3867978
  36. Chmielus, T. (2024). Drone defense system (U.S. Patent No. 11,876,611). United States Patent and Trademark Office. https://patentsgazette.uspto.gov/week03/OG/html/1518-3/US11876611-20240116.html
  37. Kovacs, A. (2024, February 1). Small Unmanned aerial Systems (SUAS) and the force protection threat to DOD. RMC. https://rmcglobal.com/small-unmanned-aerial-systems-suas-and-the-force-protection-threat-to-dod/
  38. Brust, M. R., Danoy, G., Stolfi, D. H., & Bouvry, P. (2021). Swarm-based counter UAV defense system. Discover Internet of Things, 1(1). https://doi.org/10.1007/s43926-021-00002-x
  39. Small drone threat grows more complex, deadly as tech advances. (n.d.). https://www.nationaldefensemagazine.org/articles/2023/8/30/small-drone-threat-grows-more-complex-deadly-as-tech-advances
  40. Technology for innovative entrepreneurs & businesses | TechLink. (n.d.). https://techlinkcenter.org/news/us-army-invents-40mm-grenade-that-nets-bad-drones
  41. Making Drones to Kill Civilians: Is it Ethical? | Journal of Business Ethics (springer.com)
  42. Full article: Ethics, autonomy, and killer drones: Can machines do right? (tandfonline.com)
  43. IEEE Xplore Full-Text PDF:
  44. https://ojs.aaai.org/aimagazine/index.php/aimagazine/article/view/2848
  45. When AI goes to war: Youth opinion, fictional reality and autonomous weapons - ScienceDirect
  46. Seidaliyeva, U., Ilipbayeva, L., Taissariyeva, K., Smailov, N., & Matson, E. T. (2024). Advances and Challenges in Drone Detection and Classification Techniques: A State-of-the-Art Review. Sensors, 24(1), 125. https://doi.org/10.3390/s24010125
  47. Gonzalez-Jorge, Higinio & Aldao, Enrique & Fontenla-Carrera, Gabriel & Veiga Lopez, Fernando & Balvís, Eduardo & Ríos-Otero, Eduardo. (2024). Counter Drone Technology: A Review. 10.20944/preprints202402.0551.v1.
  48. Silva, Douglas & Machado, R. & Coutinho, Olympio & Antreich, Felix. (2023). A Soft-Kill Reinforcement Learning Counter Unmanned Aerial System (C-UAS) with Accelerated Training. IEEE Access. PP. 1-1. 10.1109/ACCESS.2023.3253481.
  49. Şen, Osman & Akarslan, Hüseyin. (2020). Terrorist Use of Unmanned Aerial Vehicles: Turkey's Example.
  50. Wang, Yu & Song, W.-D & Song, X.-E & Zhang, X.-Q. (2015). Impact-point prediction of trajectory-correction grenade based on perturbation theory. 27. 18-23.
  51. Armed Drones and Ethical Policing: Risk, Perception, and the Tele-Present Officer https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8367046/ Published online 2021 Jun 19. doi: 10.1080/0731129X.2021.1943844
  52. The Ethics and Legal Implications of Military Unmanned Vehicles, by Elizabeth Quintana, Head of Military Technology & Information Studies, Royal United Services Institute for Defence and Security Studies. https://static.rusi.org/assets/RUSI_ethics.pdf
  53. Countering the drone threat: implications of C-UAS technology for Norway in an EU and NATO context. https://www.researchgate.net/profile/Bruno-Martins-4/publication/348189950_Countering_the_Drone_Threat_Implications_of_C-UAS_technology_for_Norway_in_an_EU_and_NATO_context/links/5ff3240492851c13feeb0e08/Countering-the-Drone-Threat-Implications-of-C-UAS-technology-for-Norway-in-an-EU-and-NATO-context.pdf
  54. An anti-drone device based on capture technology. Yingzi Chen, Zhiqing Li, Longchuan Li, Shugen Ma, Fuchun Zhang, Chao Fan. https://doi.org/10.1016/j.birob.2022.100060 https://www.sciencedirect.com/science/article/pii/S2667379722000237
  55. Four innovative drone interceptors. Svetoslav Zabunov, Garo Mardirossian. https://doi.org/10.7546/CRABS.2024.02.09
  56. Platt, J., Ricks, K. Comparative Analysis of ROS-Unity3D and ROS-Gazebo for Mobile Ground Robot Simulation. J Intell Robot Syst 106, 80 (2022). https://doi.org/10.1007/s10846-022-01766-2