PRE2022 3 Group1: Difference between revisions

#If desired, you can use a hot glue gun to secure the foam material to the heating element and provide additional insulation.


==Results==
In this section we discuss the results of the tests of our design, namely the physical demo test and the model tests.
 
====Demo results====
Our main demo results are that it is possible to detect heat sources and follow a search pattern with a Raspberry Pi drone model. For the demo we executed only the adapted expanding square search pattern, due to practicalities of our test setup: the test rig was relatively small and rectangular.


====Model results====

==Ethical analysis==
The paper "Ethical concerns in rescue robotics: a scoping review"<ref>https://link.springer.com/article/10.1007/s10676-021-09603-0</ref> describes seven ethical concerns regarding rescue robots. In this chapter, each ethical concern from the paper is summarized and then applied to our case.

Revision as of 18:54, 8 April 2023


Group members
Name Student number Major
Geert Touw 1579916 BAP
Luc van Burik 1549030 BAP
Victor le Fevre 1603612 BAP
Thijs Egbers 1692186 BCS
Adrian Kondanari 1624539 BW
Aron van Cauter 1582917 BBT


Abstract

text

Introduction

Problem statement and objectives

Working on a cargo ship can be a dangerous job. Over a period of seven years (2014-2021), 91 fatalities were recorded as a result of man overboard (MOB) incidents[1]. In a 2020 study[2], researchers examined 100 reports of MOB incidents using 114 parameters to create an MOB event profile. Of the 100 incidents, 53 took place on cargo ships. In 88 cases the casualty died as a result of the MOB incident; of those 88 cases, 34 were assumed dead and 54 were witnessed dead. Of the witnessed deaths, 18 died before the rescue and 31 after the rescue; in 5 cases the timing is unknown. The cause of death was indicated for 42 cases, the most common being drowning (26), followed by trauma (9), cardiac arrest (4) and hypothermia (3). During this project we specifically examine incidents that took place on cargo ships. From the 2020 study we conclude that, in a best-case scenario, the survival chance of an MOB incident on a cargo ship is about 22.64%, assuming that all 12 survivors were working on a cargo ship.
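The 22.64% figure can be reproduced from the reported counts (a quick check; the assumption that all 12 survivors were on cargo ships is stated in the text above):

```python
# Figures from the 2020 MOB study cited above.
total_incidents = 100
cargo_ship_incidents = 53
deceased = 88  # 34 assumed dead + 54 witnessed dead

survivors = total_incidents - deceased  # 12

# Best case: assume all 12 survivors were on cargo ships.
survival_chance = survivors / cargo_ship_incidents
print(f"{survival_chance:.2%}")  # 22.64%
```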

With our project we aim to increase the survival chance of MOB victims on large (cargo) ships by developing a device that makes finding the victim at sea more efficient.

Scenario

The 2020 study showed that 53% of MOB events happen on cargo ships; our focus therefore lies on developing a device for a cargo ship. It is worth noting that searching for a victim in the ocean can be a challenging and complex task: factors such as weather conditions, ocean currents, and visibility can all affect the search process. We therefore make a few assumptions to narrow down our solution. Firstly, we assume that the weather conditions are mild at the time of the accident and during the search and rescue operation, i.e. mild waves, a mild breeze, little rain, and an average sea and air temperature. We also assume that it is night-time at the time of the accident and that there is a sole victim who is capable of staying afloat, i.e. has no major or life-threatening injuries. Furthermore, we assume that the time of the incident is known, so that we have a general understanding of the search area.

Users

The end users of our product are personnel of cargo ships, shipping companies that own a fleet of cargo ships, ports, and possibly the military.

State-of-the-art literature study

Before we can get started with the project, it is important to gather knowledge about the state-of-the-art (SotA) concerning safety measures and pre-existing technology that aids victims of an MOB incident. We therefore performed a SotA literature study.

Search Area

A person floating in the ocean can drift up to 10 nautical miles per day[3], which is roughly 0.2 meters per second. We define the search area as a circle whose radius r depends on the time since the person fell overboard: r = 0.2t, with t in seconds and r in meters. The following table shows the search area size over time.

Time overboard (minutes) Search area size (m²)
1 452
5 11310
15 101788
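The table above follows directly from the formula r = 0.2t with t in seconds; a minimal sketch to reproduce it:

```python
import math

DRIFT_SPEED = 0.2  # m/s, from ~10 nautical miles per day

def search_area(minutes):
    """Area (m²) of the drift circle after the given time overboard."""
    r = DRIFT_SPEED * minutes * 60  # radius in meters
    return math.pi * r ** 2

for minutes in (1, 5, 15):
    print(minutes, round(search_area(minutes)))  # 452, 11310, 101788
```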

Search patterns

Search procedures for a man-overboard situation already exist; some of them are explained in the IAMSAR manual. In this paragraph we discuss and analyze different search patterns, since different patterns are recommended for different environments and conditions. When the location of the MOB victim is well known, the Expanding Square or Sector Search is recommended. If the location of the accident is not accurately known, other patterns such as the Sweep Search are recommended.

Expanding Square Search

The expanding square search is a method that can be used to systematically inspect a body of water for signs of an MOB victim. The search should start at the last known location of the victim, from that point the search starts and expands outwards with course alterations of 90°, as can be seen in the image provided. The course of expansion heavily depends on the specification of the device that will be used. Thermal imaging can be used to scan the area, looking for any signs of the victim such as floating debris or the person themselves.
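The expanding square pattern can be expressed as a waypoint generator. The sketch below is an illustrative implementation (function and parameter names are our own), where the leg length grows after every second 90° turn, tracing the square shown in the IAMSAR figure:

```python
def expanding_square(start, leg, n_legs):
    """Waypoints for an expanding square search in local (x, y) meters.

    start: last known position of the victim; leg: initial leg length.
    The leg length increases by `leg` after every second 90° turn.
    """
    x, y = start
    waypoints = [(x, y)]
    directions = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # E, N, W, S
    for i in range(n_legs):
        length = leg * (i // 2 + 1)  # leg lengths: 1, 1, 2, 2, 3, 3, ...
        dx, dy = directions[i % 4]
        x, y = x + dx * length, y + dy * length
        waypoints.append((x, y))
    return waypoints

print(expanding_square((0, 0), 10, 4))
```

A drone would then fly these waypoints in order, starting at the last known position.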

Sector Search

Sector search is another technique that can be used to search for a victim in the ocean. The sector search technique involves dividing the search area into sectors or pie-shaped sections and systematically searching each sector. It again starts at the last known location of the victim. From there the search area is divided and each sector is carefully searched in a clockwise fashion. Sector search can be effective in searching large areas, especially in cases where the search area is relatively circular or symmetric. However, sector search may not be the most efficient or effective method in all situations, especially if the search area is irregular or if weather and ocean conditions make it difficult to maintain a consistent search pattern.

Sweep search

The sweep search pattern is another technique that can be used to search for a victim in the ocean. It involves moving the search vessel or search device in a back-and-forth sweeping motion across the search area.

IAMSAR also mentions other factors that need to be taken into account; for example, a person who falls into the water will be carried away by currents.

Expanding Square, IAMSAR search patterns[4]
Sweep Search, IAMSAR search patterns[4]
Sector Search, IAMSAR search pattern[5]
Image showing the world's shipping routes on the Atlantic ocean[6]

Influence of cold water and oceanic temperatures

Since we are dealing with people who have fallen into water, we need to consider their survival time. When submerged in water, the greatest risk is developing hypothermia, as the body cools down 25 times faster in water than in air of the same temperature. The effects of hypothermia range from confusion and exhaustion to unconsciousness and coma, and ultimately lead to death. It is thus of utmost importance to detect and rescue a person in the water before the severe effects of hypothermia set in. The following table shows the relation between water temperature and the time frame of the effects of hypothermia. The time windows are wide because the onset of hypothermia depends on several factors, such as age and body fat percentage.[7]

Relation between water temperature and hypothermia[7]
Water temperature (°C) Time until unconsciousness Survival time
0 <15 min 15 - 45 min
0 - 5 15 - 30 min 30 - 90 min
5 - 10 30 - 60 min 1 - 3 hours
10 - 15 1 - 2 hours 1 - 6 hours
15 - 20 2 - 7 hours 2 - 40 hours
20 - 25 3 - 12 hours 3 - indefinite
Image showing oceanic temperatures on 22 March 2023[8]

Next, oceanic temperatures need to be considered. The two figures on the right show the world's shipping routes on the Atlantic Ocean and oceanic temperatures, respectively. From these we can deduce that there are busy shipping routes in the colder regions of the Atlantic Ocean, ranging from 3 to 7 degrees Celsius, with the occasional ship in water of 1 degree Celsius. The temperatures were taken in late March, so it may be even colder in the middle of winter. Since these ships will benefit most from a technology that might result in faster rescue times, we conclude that the most desirable performance of the system is to find a person in the water in less than 10 minutes, leaving enough time for the rescue operation to start.[8][6]

Current person-in-water detection technology

Radar technology

Currently, radars are used to detect the distance and velocity of an object by sending out electromagnetic waves and detecting the echo reflected by objects. In water environments, a radome is needed to protect the radar while letting the electromagnetic waves through. However, even with a radome, the signal strength falls to a third in a wet environment compared to a dry one. Multiple radar technologies exist, each with different advantages and disadvantages. Pulse Coherent Radar (PCR) can pulse the transmitted signal so that it uses only 1% of the energy. Another aspect of this radar is coherence: the signal has a consistent time and phase, so the measurements can be very precise. It can also separate the amplitude, time and phase of the received signal to identify different materials, though this is not necessarily needed in our use case. The 'Sparse' service is the best service for detecting objects, since it samples waves every 6 cm. Millimeter-precise measurements are not wanted in a rough environment like the ocean, so these robust measurements are ideal.[9]

Sea rescue drones (SotA)

Intelligent Drone Swarm for Search and Rescue Operations at Sea

The research paper "Intelligent Drone Swarm for Search and Rescue Operations at Sea"[10] presents a novel approach for using a swarm of autonomous drones to assist in search and rescue operations at sea. The authors propose a system that uses machine learning algorithms and computer vision techniques to enable the drones to detect and classify objects in real-time, such as a person in distress or a life raft. The paper describes the hardware and software architecture of the drone swarm system, which consists of a ground station, multiple drones equipped with cameras, and a cloud-based server for data processing and communication. The drones are programmed to fly in a coordinated manner and share information with each other to optimize the search process and ensure efficient coverage of the search area. The authors evaluate the performance of their system in various simulated scenarios, including detection of a drifting boat, a person in the water, and a life raft. The results show that the drone swarm system is capable of detecting and identifying objects accurately and efficiently, with a success rate of over 90%. Overall, the research paper presents a promising solution for improving search and rescue operations at sea, leveraging the capabilities of autonomous drones and artificial intelligence.

"We propose an Intelligent Drone Swarm for tackling the aforementioned problems. The central idea is to use an autonomous fleet of intelligent UAVs to accomplish the task. The self-organizing UAVs network would enable the coverage of a larger area in less time, reducing the impact of the limited battery capacity. Recent works on UAVs networks shown their ability of coverage and connectivity improvements in emergency scenarios [15]. Indeed, the UAVs swarm is capable of generating a multi-hop communication network that will guarantee a higher bandwidth due to aerial-to-aerial communication links provided by longer LoS connections. Artificial Intelligence tools, in this case, would greatly improve not only the autonomy and coordination of the UAVs network, but also the communication efficiency. On one hand, autonomy and coordination are improved due to the autonomous adaptation of mobility actions and communication parameters to non-stationary environmental conditions. On the other, communication efficiency is ensured with the usage of smart detection algorithms locally running on the UAV, which limit the amount of information sent to the base station. Recent advances in AI techniques, such as specialized AI chips [16] and quantization/pruning techniques [17], may also help reducing the energy and computational resources needed for detection."

For larger-scale operations, a drone swarm can be used to cover greater areas for search and rescue. This method is useful when other factors, such as wind and current, make finding the victim more difficult. The drone swarm would be implemented in such a way that areas around the ship, or any search area for that matter, can be searched in a shorter time span, increasing the chances of finding the victim and thus the chances of survival. When multiple drones are deployed, they all take up a specific search pattern in unison. Depending on the situation, different search patterns will be used: if a man overboard is detected quickly, one pattern may be chosen to minimize the time it takes to locate the victim; when other factors such as wind or current influence the case, different patterns will be used to adapt to the situation.

Requirements and Limitations of Thermal Drones for Effective Search and Rescue in Marine and Coastal Areas

The research paper "Requirements and Limitations of Thermal Drones for Effective Search and Rescue in Marine and Coastal Areas"[11] explores the potential of thermal drones for search and rescue (SAR) operations in marine and coastal areas. The authors review the main features and capabilities of thermal drones, including their ability to detect heat signatures and identify objects in low light or night conditions. The paper highlights the key requirements and limitations of using thermal drones for SAR operations, such as the need for high-resolution thermal cameras, long flight time, and reliable communication systems. The authors also discuss the challenges of operating drones in harsh weather conditions and the importance of complying with regulatory and ethical guidelines. To demonstrate the effectiveness of thermal drones for SAR operations, the authors present a case study of a simulated rescue mission in a coastal area. The results show that thermal drones can provide valuable assistance in locating and identifying missing persons or distressed vessels, particularly in remote or inaccessible areas. Overall, the research paper provides a comprehensive overview of the potential of thermal drones for SAR operations in marine and coastal areas, while also highlighting the technical and operational challenges that need to be addressed to ensure their effective use.

"We used a custom hexacopter drone based around a Tarot 680 airframe and Pixhawk 2.1 flight controller. The drone was equipped with a custom designed two-axis brushless gimbal mechanism to provide stabilisation of the camera in both roll and pitch. The gimbal also allowed independent and remote control of the camera’s pitch angle. The thermal camera used was a FLIR Duo Pro R. This consists of a Tau2 TIR detector (pixels, 60 Hz) with a 13 mm lens ( field of view), and a 4K RGB camera (4000 × 3000 pixels, field of view) affixed side-by-side. The camera was interfaced to the flight controller to enable remote triggering and geotagging information to be recorded. Live video feedback was available in flight via a 5.8 GHz analogue video transmitter and receiver system."

AutoSOS: Towards Multi-UAV Systems Supporting Maritime Search and Rescue with Lightweight AI and Edge Computing

The research paper "AutoSOS: Towards Multi-UAV Systems Supporting Maritime Search and Rescue with Lightweight AI and Edge Computing"[12] proposes a novel approach for using multiple unmanned aerial vehicles (UAVs) equipped with lightweight AI and edge computing to support maritime search and rescue (SAR) operations. The authors describe the AutoSOS system, which consists of a ground station, multiple UAVs, and a cloud-based platform for data processing and communication. The UAVs are equipped with cameras and other sensors to detect and classify objects in real-time, such as a person in the water or a life raft. The lightweight AI algorithms are designed to operate on the UAVs, minimizing the need for high-bandwidth communication with the cloud-based platform. The paper highlights the advantages of using a multi-UAV system for SAR operations, such as increased coverage area, improved situational awareness, and faster response times. The authors also discuss the technical challenges of designing and implementing the AutoSOS system, such as optimizing the UAV flight paths and ensuring the reliability of the communication and data processing systems. To demonstrate the effectiveness of the AutoSOS system, the authors present a case study of a simulated SAR mission in a coastal area. The results show that the system is capable of detecting and identifying objects accurately and efficiently, with a success rate of over 90%. Overall, the research paper presents a promising solution for improving SAR operations at sea, leveraging the capabilities of UAVs, lightweight AI, and edge computing.

A Review on Marine Search and Rescue Operations Using Unmanned Aerial Vehicles

The research paper "A Review on Marine Search and Rescue Operations Using Unmanned Aerial Vehicles"[13] provides a comprehensive review of the current state of research on using unmanned aerial vehicles (UAVs) for search and rescue (SAR) operations in marine environments. The paper discusses the main challenges of conducting SAR operations in marine environments, such as limited visibility, harsh weather conditions, and the vast and complex nature of the search areas. The authors then review the different types of UAVs and their applications in SAR operations, such as fixed-wing UAVs for long-range surveillance and multirotor UAVs for close-range inspection and search missions. The paper highlights the advantages of using UAVs for SAR operations, such as increased coverage area, improved situational awareness, and reduced risk to human rescuers. The authors also discuss the technical challenges of designing and operating UAVs in marine environments, such as optimizing the flight paths, ensuring reliable communication and data transmission, and complying with regulatory and ethical guidelines. To demonstrate the effectiveness of UAVs for SAR operations, the authors review several case studies of real-world applications, such as detecting and rescuing distressed vessels and locating missing persons in coastal areas. The results show that UAVs can provide valuable support to SAR operations, improving response times, and increasing the chances of successful rescue. Overall, the research paper provides a comprehensive overview of the potential of UAVs for SAR operations in marine environments, while also highlighting the technical and operational challenges that need to be addressed to ensure their effective use.

Remote Drone Control

A lot of research has already gone into autonomous drone flight; take autonomous drone racing, for example[14]. Many consumer drones are also capable of autonomous flight: DJI, for instance, has a waypoint system based on GPS coordinates, and their drones can also track a moving person or object. DJI's waypoint system works by loading a set of GPS coordinates onto the drone; the drone navigates itself to the first coordinate, then proceeds to the next, until it reaches the final one. Tracking a moving object is more complicated, as it involves image recognition.[15]

However, the firmware on most of these commercial drones is not open to being interfaced with. This makes it much more difficult to build a more autonomous drone without building one from scratch. Another point is that drones, especially the autonomous kind, are quite expensive. Due to the aforementioned lack of interfacing options with the flight controllers of commercial drones, it is also not really possible to take a cheaper non-autonomous drone and hook it up to a controller module that makes the drone fly autonomously.

Drone search speed

Graph: the x-axis is time in seconds and the y-axis is area; the blue line is the area to be searched and the yellow line the area searched.

The drone search speed will be limited by multiple factors, such as flight speed, sensor resolution and the quality/speed of the detection algorithm. Since we still have some unknowns, we make a few assumptions. Let's assume the algorithm can detect a human if they take up at least 25 by 25 pixels, and that a human at sea takes up 25 by 25 cm. A good sensor for detecting humans at sea at night is an IR camera: these cameras take images in the IR spectrum and are thus not hindered by low-light conditions. They have a field of view (FOV), a resolution and a refresh rate. The height at which the drone can fly is determined by the FOV and the resolution. A typical FOV for Raspberry Pi cameras is about 60 degrees horizontal (https://www.raspberrypi.com/documentation/accessories/camera.html), and a typical resolution is 1920 × 1080 pixels, with 1920 in the horizontal direction. With a sensor of these specs, the drone could fly about 15 meters high and scan a line about 19.2 meters wide. Let's also assume the algorithm needs the human in shot for 5 seconds to detect them (https://www.researchgate.net/publication/350981551_Real-Time_Human_Detection_in_Thermal_Infrared_Images). The vertical FOV of Raspberry Pi sensors is about 40 degrees, which corresponds to roughly 10 meters of vertical ground coverage, so the drone can fly at about 2 m/s. The area the drone can cover is thus about 20 m × 2 m/s × t, where t is the time since dispatch. Another approach to detection could be to look not for humans specifically, but for relatively hot areas in the image. This might be an option since, out at sea, there is a significant temperature difference between a human with a body temperature of approximately 37 degrees Celsius and an environment consisting mostly of cold water.
When using this strategy it is, however, necessary to include some kind of verification in the system, so that the drone does not signal that the victim is found when it, for example, detects a marine animal close to the surface of the water.
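The camera geometry above can be sketched in a few lines. Note that the exact numbers come out slightly higher than the rounded values used in the text (about 16.6 m flight height and 12.1 m vertical coverage, versus 15 m and 10 m):

```python
import math

# Assumed sensor parameters from the text (typical Raspberry Pi camera).
H_FOV_DEG, V_FOV_DEG = 60, 40   # horizontal / vertical field of view
H_RES = 1920                    # horizontal resolution in pixels
PIXELS_NEEDED = 25              # pixels needed across the target
TARGET_SIZE = 0.25              # m, size of a human seen from above
DWELL_TIME = 5                  # s the target must stay in frame

gsd = TARGET_SIZE / PIXELS_NEEDED                 # ground size per pixel: 1 cm
swath = gsd * H_RES                               # scanned line width: 19.2 m
height = (swath / 2) / math.tan(math.radians(H_FOV_DEG / 2))
v_cover = 2 * height * math.tan(math.radians(V_FOV_DEG / 2))
speed = v_cover / DWELL_TIME                      # max forward speed
print(round(height, 1), round(v_cover, 1), round(speed, 1))  # 16.6 12.1 2.4
```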

However, for our estimation of the search speed we will still use the option where an algorithm actually detects a human, since this might still be necessary and we prefer to work from the worst-case scenario when bounding the minimum required efficiency. In the graph on the right we see that after about 5 minutes the drone can no longer keep up with the growing search area (even with a perfect search pattern). If we take dispatch time into account, these 5 minutes would result in only about 2.5 minutes of effective search time, moving the yellow line to the left. This also means that one drone is not quick enough, and we would likely need multiple drones to search fast enough.

Communication Systems

In a scenario where a drone needs to send data, such as thermal images, to a container ship on the open sea, there are several reliable methods of communication that can be used, including:

  1. Satellite Communication: One of the most reliable methods of communication in remote areas, such as the open sea, is through satellite communication. The drone can use a satellite modem to transmit the data to a satellite in orbit, which then relays the data to a ground station or directly to the container ship. Satellite communication can support high data rates and can be used to transmit data over long distances.
  2. Long-Range Wi-Fi: Another option for communication between the drone and the container ship is through long-range Wi-Fi. The drone can be equipped with a Wi-Fi transmitter that is capable of transmitting data over a long distance, and the container ship can be equipped with a Wi-Fi receiver that is capable of receiving the data. Long-range Wi-Fi can support high data rates and can be used to transmit data over several kilometers, depending on the power of the transmitter and receiver.
  3. Cellular Communication: Depending on the location of the container ship and the availability of cellular coverage, the drone may be able to use cellular communication to transmit data to the ship. The drone can be equipped with a cellular modem that is capable of connecting to a cellular network, and the container ship can be equipped with a cellular receiver that is capable of receiving the data. Cellular communication can support moderate data rates and can be used to transmit data over several kilometers, depending on the strength of the cellular signal.
  4. Radio Communication: Another option for communication between the drone and the container ship is through radio communication. The drone can be equipped with a radio transmitter that is capable of transmitting data over a long distance, and the container ship can be equipped with a radio receiver that is capable of receiving the data. Radio communication can support low to moderate data rates and can be used to transmit data over several kilometers, depending on the power of the transmitter and receiver.

Overall, the choice of communication method will depend on the specific requirements of the scenario, such as the distance between the drone and the container ship, the data rate requirements, and the availability of communication infrastructure. Satellite communication is typically the most reliable and versatile option in remote areas, but it may also be the most expensive.[16]

FPV (First Person View) drones use a variety of wireless communication technologies to transmit live video feeds to the ground station or the pilot's goggles. The combination of wireless technologies used by an FPV drone depends on the drone's design and the user's preference. Here are some of the most common wireless communication technologies used by FPV drones:

  1. Analog video transmission: Analog video transmission is the most common technology used by FPV drones for video transmission. It uses a 5.8GHz radio frequency (RF) band to transmit the video signal wirelessly to the ground station or the pilot's goggles. The video signal is usually low-latency and provides a real-time video feed for the pilot.
  2. Digital video transmission: Digital video transmission is becoming increasingly popular among FPV drone users due to its higher image quality and reliability compared to analog video transmission. Digital HD (DHD) systems, such as DJI's HD FPV system, use a 2.4GHz RF band to transmit the video signal wirelessly to the ground station or the pilot's goggles. DHD systems provide high-definition video with low latency and low interference, making them ideal for professional FPV pilots.
  3. Wi-Fi video transmission: Some FPV drones use Wi-Fi to transmit the video signal wirelessly to a smartphone or a tablet. Wi-Fi video transmission is not as reliable as analog or digital transmission and can suffer from latency and interference issues. However, it is a convenient option for casual FPV users who want to view the drone's video feed on their mobile device.
  4. Cellular data transmission: Some FPV drones can use cellular data networks to transmit live video feeds to a remote location. This option requires a cellular data plan and may suffer from latency and data transfer limitations. It is typically used for long-range FPV flights and commercial applications such as search and rescue.[17]

Sensors

The main sensor used to locate a victim is a thermal imaging camera (TIC). Since the general search area is small relative to the complete ocean, water temperatures within it will be roughly uniform. The victim will have a much higher temperature than their surroundings, making them easy to detect. There are a few important requirements for the TIC. The camera needs a sufficiently high resolution so that we can clearly distinguish the victim from other warm objects (such as marine life). It also needs a high refresh rate: our goal is to detect and locate a victim as soon as possible so that the survival chance is as high as possible, so we need to scan the search area quickly, which requires a sufficient refresh rate. For the same reason, the thermal camera needs a large field of view (FOV). It also needs the right temperature sensitivity.

Other sensors and actuators that we might want to include are a microphone and a loudspeaker, which would allow us to make contact with the victim. However, the issue with a microphone is that the sound of the ocean is most likely so loud that the victim would be hard to understand, and trying to converse with the victim might also fatigue them. A loudspeaker can be used to notify the victim that rescue is on the way and help them calm down.

Thermal image recognition

For the image recognition we can make use of TensorFlow, Google's neural network library. Alternatively, as mentioned before, we can skip advanced image detection and instead, as soon as a heat source that could reasonably be the victim is detected, send the camera images to the ship, where a human operator can verify whether the heat source is actually the victim.

Another option worth exploring is a hybrid of these two strategies: using the fast, simple heat-source detection until a heat source is found, and then activating the more advanced AI image detection to verify whether it is a human. This would solve part of the efficiency problem and reduce the need for human operators and high data-transfer capacity.
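As a sketch of the first stage of such a hybrid, a simple temperature threshold can flag frames that are worth verifying further. The threshold and minimum blob size below are illustrative assumptions, not calibrated values:

```python
def detect_heat_sources(thermal, threshold_c=30.0, min_pixels=25):
    """First-stage filter: flag frames containing warm regions worth verifying.

    thermal: 2D list of per-pixel temperatures in degrees Celsius.
    Returns True when enough pixels exceed the threshold, which would
    trigger the second stage (AI verification or a human operator on the ship).
    """
    hot_pixels = sum(1 for row in thermal for t in row if t > threshold_c)
    return hot_pixels >= min_pixels

# Cold sea (~8 °C) with one 10x10-pixel warm patch (~34 °C).
frame = [[8.0] * 160 for _ in range(120)]
for y in range(50, 60):
    for x in range(70, 80):
        frame[y][x] = 34.0
print(detect_heat_sources(frame))  # True
```

A real implementation would additionally require the hot pixels to form a connected blob, so that scattered sensor noise does not trigger the second stage.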

Project plan

Requirements

MoSCoW method

Must:
  • Have sensors to detect a person within 10 minutes
  • Have a communication system to communicate with the involved ship
  • Be able to withstand an air temperature range of -3°C to 36°C
  • Be able to operate in the dark
  • Be able to operate in the rain
  • Be able to withstand wind forces of 4 Beaufort

Should:
  • Have life assist systems
  • Resist (bad) weather to a degree
  • A person recognition algorithm

Could:
  • Have communication between victim and ship

Won't:
  • Have the ability to take the person to safety

Design

Components

Sensors

  • Thermal imaging camera
  • Wind sensor
  • Servomotors (only needed if we wish to drop items)

Communication Equipment

  • Microphone
  • Loudspeaker
  • Wireless communication module

Drone with autonomous/waypoint flight capabilities

Small computer capable of:

  • Multithreading/multiprocessing
  • Running tensorflow or other image recognition AI software
  • Interfacing with electronics (reading the camera image and possibly interfacing with servo motors)
  • Running autonomous flight control software
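To illustrate the multiprocessing requirement, a rough sketch of one process reading camera frames while another scans them for heat; the synthetic frames and all function names are our own stand-ins, not the demo code:

```python
import multiprocessing as mp

def camera_reader(frame_queue, n_frames=3):
    # Stand-in for the real camera loop: pushes synthetic 24x32 frames
    # (flattened to 768 temperatures) and a None sentinel when done.
    for i in range(n_frames):
        frame = [20.0] * 768
        frame[100] = 30.0 + i      # one warm pixel per frame
        frame_queue.put(frame)
    frame_queue.put(None)

def heat_scanner(frame_queue, result_queue):
    # Consumes frames until the sentinel and reports the hottest reading.
    hottest = float("-inf")
    while True:
        frame = frame_queue.get()
        if frame is None:
            break
        hottest = max(hottest, max(frame))
    result_queue.put(hottest)

if __name__ == "__main__":
    frames, results = mp.Queue(), mp.Queue()
    reader = mp.Process(target=camera_reader, args=(frames,))
    scanner = mp.Process(target=heat_scanner, args=(frames, results))
    reader.start(); scanner.start()
    reader.join(); scanner.join()
    print(results.get())  # hottest temperature seen across all frames
```

Separating the slow I2C read from the detection work is what lets the detection keep up with the camera.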

Programming

All of the code used in this project is uploaded on the following github repository: https://github.com/ThijsEgberts/ProjectRobotsEverywhere.

Drone

For our real design we would like to use a drone that can fly accurately in mild wind and carry some weight. Since the drone has to fly autonomously, it will most likely be necessary to build a custom drone for this project: almost all commercially available drones do not allow interfacing with the flight controller in such a way that an external device/computer can control the drone. We have not researched in depth how to build a drone from scratch, since this is outside the scope of the project. For the autonomous flight, however, a Pixhawk flight controller can be used; as far as we can tell, the Pixhawk is currently the most popular option for autonomous flight. The Pixhawk can then be interfaced with a Raspberry Pi to control the autonomous movement of the drone.

We have also not researched the details of controlling autonomous drones, since we found out early on that using an actual drone for the proof of concept would not be feasible. The details of the control software would not contribute much to the rest of the design, so for our demo we assumed it is possible to transform Cartesian waypoints into flight instructions for the drone.
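As an indication of what that transformation could look like, a sketch of a flat-earth approximation that converts local Cartesian offsets (metres east/north of a reference point) into GPS waypoints; this is our own assumption of an approach, not something the demo implements:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def cartesian_to_gps(ref_lat, ref_lon, dx_east, dy_north):
    """Flat-earth approximation: fine for the small offsets of a search pattern.

    ref_lat/ref_lon: datum in degrees; dx_east/dy_north: offset in metres.
    Returns (lat, lon) in degrees.
    """
    dlat = math.degrees(dy_north / EARTH_RADIUS_M)
    # A degree of longitude shrinks with latitude, hence the cos() factor.
    dlon = math.degrees(dx_east / (EARTH_RADIUS_M * math.cos(math.radians(ref_lat))))
    return ref_lat + dlat, ref_lon + dlon
```

The resulting latitude/longitude pairs are what a waypoint-capable flight controller would then be handed.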

Computer

We have decided to use a Raspberry Pi as our computer due to its computing power relative to its size and weight. The Raspberry Pi can do everything we need from the computer, which makes it a good choice.

3D-model of entire setup

Proof of concept

Designing and building

Materials:

  • Plastic container 40cm x 60cm
  • 2 Metal rods 8 mm x 100cm
  • Linear ball bearing
  • ABS (to print pieces)
3D model of slider

The prototype was designed in Fusion 360. This program allows for detailed 3D modeling and easy export to PrusaSlicer, the slicer program, after which the designed pieces are ready to print. The 3D printer used was a Voron V0. We made a total of 4 corner attachment pieces, 2 sliders and 1 slider platform. The whole process required many small design changes before everything fitted and the setup could be assembled. The final assembly was then easily done with some screws and bolts. The search pattern was drawn on the bottom of the container to help the people moving the camera follow the movement instructions, and to make it clearer in the demonstration video what was going on.

3D model of slider platform
3D model of corner attachment piece






Demo

A Raspberry Pi and a thermal camera were mounted on a rig to demonstrate the proof of concept. Ropes were used to manually move the camera. To make it easier to follow the search path, a map of the search pattern was drawn on paper. A small tea light was used as a heat source to imitate the relevant scenario and provide a distinct heat signature, and a rope tied to the tea light served to imitate ocean currents. The heat source was identified by following the search pattern according to the instructions output by the Raspberry Pi.

Thermal camera

For our demo we used the MLX90640 thermal camera. This camera has a resolution of 32x24 pixels and a 55° field of view. The camera itself supports a refresh rate of up to 64 Hz, but in our application this is limited by the speed of the I2C interface. We could therefore process images for heat sources at about 7 fps when not displaying the image, and at about 1 fps when displaying it. This speed may partly be due to the use of Python, but Python did make the image manipulation and output much easier.
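A sketch of the per-frame hot-pixel lookup, assuming the frame arrives the way the MLX90640 delivers it: a flat list of 768 temperatures in row-major 24x32 order. The driver call for actually reading a frame is hardware-specific, so it is shown only as a comment with an assumed API:

```python
FRAME_ROWS, FRAME_COLS = 24, 32   # MLX90640 resolution

def hottest_pixel(frame):
    """frame: flat list of 768 temperatures (deg C), row-major.

    Returns ((row, col), temperature) of the hottest pixel."""
    idx = max(range(len(frame)), key=frame.__getitem__)
    return (idx // FRAME_COLS, idx % FRAME_COLS), frame[idx]

# On the Pi the frame would come from the camera driver, roughly (assumed API):
#   frame = [0.0] * 768
#   mlx.getFrame(frame)            # fills `frame` over I2C
#   (row, col), temp = hottest_pixel(frame)
```

The (row, col) position is what the search logic uses to decide which way the heat source lies relative to the camera's centre.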

Heating element

For the test setup we need to replicate a heat signature so the thermal camera can detect it. The two requirements are that the heat signature is large enough for the camera to detect, yet small enough to fit in the test setup. The idea is to use a coil hooked up to a battery, which will create a heat signature. The setup also needs to float on water.

We initially wanted to make a good heating element, however due to issues with the availability of the necessary materials we have used a simple tea light in the end.

For our better heating element we can use a coil made of Nichrome wire of around 24 to 28 gauge. This wire is a popular choice for heating elements due to its high resistivity and high melting point. A coil made from around 5-10 cm of wire should be sufficient for a small heating element.

For the battery, we can use a single 18650 Lithium-ion battery, which is a compact and powerful battery commonly used in small electronics.

With this combination of Nichrome wire coil and 18650 Lithium-ion battery, we should be able to create a small heating element that reaches a high temperature quickly. However, be sure to take appropriate safety precautions, such as using proper insulation and avoiding touching the coil directly while it is heating.
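A rough back-of-the-envelope check of this combination; all figures are assumptions on our part (~1.1 µΩ·m resistivity for Nichrome, 26 AWG ≈ 0.405 mm wire diameter, 7.5 cm of wire, 3.7 V nominal cell voltage):

```python
import math

RESISTIVITY_NICHROME = 1.1e-6   # ohm*m, typical for Nichrome (assumed)
WIRE_DIAMETER_M = 0.405e-3      # 26 AWG
WIRE_LENGTH_M = 0.075           # 7.5 cm of wire in the coil
CELL_VOLTAGE = 3.7              # nominal 18650 voltage

area = math.pi * (WIRE_DIAMETER_M / 2) ** 2
resistance = RESISTIVITY_NICHROME * WIRE_LENGTH_M / area   # R = rho * L / A
current = CELL_VOLTAGE / resistance                        # I = V / R
power = CELL_VOLTAGE * current                             # P = V * I

print(f"R = {resistance:.2f} ohm, I = {current:.1f} A, P = {power:.0f} W")
```

On these assumptions the coil draws roughly 5-6 A and dissipates around 20 W, so a high-drain 18650 cell (and wiring and a switch rated for that current) would be needed.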


Materials:

  • Nichrome wire (24-28 gauge, 5-10cm length)
  • 18650 Lithium-ion battery
  • Battery holder or case
  • Single pole single throw (SPST) switch
  • Electrical tape or heat shrink tubing
  • Styrofoam or closed-cell foam material
  • Scissors or knife
  • Hot glue gun (optional)

Setup:

  1. Cut a length of Nichrome wire to create your coil, and wrap it around a cylindrical object of the desired diameter to form the coil.
  2. Connect one end of the Nichrome wire to the positive terminal of the battery holder or case.
  3. Connect the other end of the Nichrome wire to one of the terminals of the SPST switch.
  4. Connect a piece of wire between the other terminal of the SPST switch and the negative terminal of the battery holder or case.
  5. Insulate the connections with electrical tape or heat shrink tubing.
  6. Insert the battery into the holder or case, and turn on the switch to activate the heating element.
  7. Cut a piece of styrofoam or closed-cell foam material to a size that will fit around the heating element.
  8. Cut a small hole in the center of the foam material that is slightly smaller than the diameter of the Nichrome wire coil.
  9. Insert the heating element through the hole in the foam material, so that the Nichrome wire coil is completely covered by the foam material.
  10. Insulate the connections between the heating element, switch, and battery holder with electrical tape or heat shrink tubing.
  11. If desired, you can use a hot glue gun to secure the foam material to the heating element and provide additional insulation.

Results

In this section we discuss the results of the tests with our design, namely our physical demo test and our model tests.

Demo results

Our main demo result is that it is possible to detect heat sources and follow a search pattern with a Raspberry Pi drone model. For the demo we only executed the adapted expanding square search pattern, due to practicalities with our test setup: our test rig was relatively small and rectangular.
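For reference, a sketch of how expanding square waypoints can be generated: in the classic IAMSAR pattern the leg lengths grow as 1, 1, 2, 2, 3, 3, ... times the track spacing. Our adapted rectangular variant would scale the two axes differently; this sketch shows only the square case:

```python
def expanding_square(start, spacing, n_legs):
    """Generate waypoints for an expanding square search.

    start: (x, y) of the datum; spacing: track spacing; n_legs: legs to fly."""
    directions = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W
    x, y = start
    waypoints = []
    for leg in range(n_legs):
        dx, dy = directions[leg % 4]
        length = (leg // 2 + 1) * spacing   # 1,1,2,2,3,3,... times spacing
        x, y = x + dx * length, y + dy * length
        waypoints.append((x, y))
    return waypoints
```

For example, `expanding_square((0, 0), 10, 4)` yields `[(0, 10), (10, 10), (10, -10), (-10, -10)]`, spiralling outward from the datum where the victim was last seen.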

Model results

Ethical analysis

The paper "Ethical concerns in rescue robotics: a scoping review"[18] describes seven ethical concerns regarding rescue robots. In this chapter, each ethical concern from the paper is summarized and then applied to our case.

Fairness and discrimination

Hazards and benefits should be fairly distributed to avoid the possibility of some subjects incurring only costs while other subjects enjoy only benefits. This condition is particularly critical for search and rescue robot systems, e.g., when a robot makes decisions about prioritizing the order in which the detected victims are reported to the human rescuers or about which detected victim it should try to transport first.

Fairness and discrimination in our scenario

When multiple people have fallen overboard, discrimination is something to look out for. During the search itself no discrimination can take place: if the robot has found one person it will continue looking for the others, and since it does not take people out of the water it does not need to prioritize between them. However, we are also considering dropping lifebuoys, and there prioritization does come into play. In the situation where the robot can only carry one lifebuoy, we propose that it drops the lifebuoy on the first person it finds, returns to the ship to restock, and then continues the search. This is the most efficient approach and also prevents discrimination between multiple victims.

False or excessive expectations

Stakeholders generally have difficulty accurately assessing the capabilities of a rescue robot. This may cause them to overestimate and rely too much on the robot, giving false hope that the robot will save a victim. On the other hand, if a robot is underestimated, it may be underutilized in cases where it could have saved a person.

False or excessive expectations in our scenario

It is important for us to accurately inform stakeholders about the capabilities and limitations of the rescue robot.

Labour replacement

Robots replacing human rescuers might reduce human contact, situational awareness and manipulation capabilities. Robots might also interfere with rescuers' attempts to provide medical advice and support.

Labour replacement in our scenario

Since it is important not to take the human aspect fully out of the equation, the robot should be equipped with speakers so people on the ship can talk to the victim. They can check how the victim is responding and reassure them that help is coming, which will most likely help calm the person. Once the rescue attempt is being made by the crew or other rescuers, the drone should fly away, since it is no longer useful and should not interfere with the rescue attempt.

Privacy

The use of robots generally leads to an increase in information gathering, which can jeopardize the privacy of personal information. This may be personal information about rescue workers, such as images or data about their physical and mental stress levels, but also about victims or people living or working in the disaster area.

Privacy in our scenario

The information gathered by the robots is not shared with anyone outside professional rescue organizations and is exclusively used for rescue purposes. We should also try to limit the gathering of irrelevant data as much as possible.

Responsibility

The use of (autonomous) rescue robots can lead to issues in ascribing responsibility, for example when accidents happen.

Responsibility in our scenario

We should clearly state who is (legally) responsible in case accidents happen during a rescue (the operator, the manufacturer, etc.). Proposal: to ensure trust in our product, we claim full (legal) responsibility as a manufacturer when accidents are caused by design flaws or by decisions made by the autonomous system.

Safety

Rescue missions necessarily involve safety risks. Some of these risks can be mitigated by replacing operators with robots, but robots in turn introduce other safety risks, mainly because they can malfunction. Even when they perform correctly, robots can still be harmful: they may, for instance, fail to identify a human being and collide with them. In addition, robots can hinder the well-being of victims in subtler ways. For example, the authors argue, being trapped under a collapsed building, wounded and lost, and suddenly being confronted with a robot, especially if there are no humans around, can in itself be a shocking experience.

Safety in our scenario

Risks should be contained as much as possible, but we acknowledge that rescue missions are never completely risk-free.

Trust

Trust depends on several factors, reputation and confidence being the most important ones. Humans often lack confidence in autonomous robots, as they find them unpredictable.

Trust in our scenario

Ensure that the robot has a good reputation by informing people about successful rescue attempts and/or tests, and limit unpredictable behaviour as much as possible. We can also build trust by claiming full (legal) responsibility as a manufacturer when accidents are caused by design flaws or by decisions made by the autonomous system.

Conclusion


Discussion


Appendix

Task division

Person Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8
Luc Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study (4-5 articles): Search patterns, Remote drone control > necessary equipment Search pattern and area, TIC, (remote control) get familiar with raspberry pi
Thijs Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study(4-5 articles): Communication systems > necessary equipment Communication systems get familiar with raspberry pi
Geert Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study(4-5 articles):  Influence of cold water, Oceanic Weather Ethics of rescue robots, write on scenario (specifics) get familiar with raspberry pi, look into code for search algorithm/pattern.
Victor Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study(4-5 articles): Image recognition /sensors > necessary equipment, Take care of the wiki page TIC, other sensors and image recognition get familiar with raspberry pi, look into code for search algorithm/pattern.
Adrian Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study(4-5 articles):  Night time deck procedures > What does the victim have on him/her? Finalize state-of-the-art, communication system (from thursday) get familiar with raspberry pi
Aron Literature study (4-5 articles)+Subject picking Further Brainstorming and subject refining Literature study(4-5 articles):  Current tech > write a section on the current state of person detection tech in the wiki Look into dropping equipment from the drone. (take into account wind direction and ocean currents) get familiar with raspberry pi

Logbook

Week Name Work done & hours spent Total hours
1 Geert Touw Group setup etc (2h)
Luc van Burik Group setup etc (2h), Subject brainstorm (2 h) 4h
Victor le Fevre Group setup etc (2h)
Thijs Egbers Group setup etc (2h), Subject brainstorm (2h)
Adrian Kondanari Group setup etc (2h)
Aron van Cauter Group setup etc (2h)
2 Geert Touw Meeting 1 (1h), Meeting 2(1h)
Luc van Burik Meeting 1 (1h), Meeting 2(1h), Subject brainstorm (2h) 4h
Victor le Fevre Meeting 1 (1h), Meeting 2(1h)
Thijs Egbers Meeting 1 (1h), Meeting 2(1h), Subject brainstorm (2h) 4h
Adrian Kondanari Meeting 1 (1h)
Aron van Cauter Meeting 1 (1h)
3 Geert Touw
Luc van Burik Meeting 2 (1h), IAMSAR reading (3.5h), Drone navigation research (1.5h) 6h
Victor le Fevre Meeting 1 (1h), Meeting 2 (1h), Made the wiki look a bit more coherent (1h), Wrote the problem statement and objective, users, scenario and added points to the MoSCoW (1.5h), Read Sensor and Image Recognition papers (1h) 5.5h
Thijs Egbers Meeting 1 (1h), meeting 2 (1h), sea communication research (2h) 4h
Adrian Kondanari
Aron van Cauter
4 Geert Touw Meeting 1 (1h), Meeting 2(1h)
Luc van Burik Meeting 1 (1h), Meeting 2(1h), Search area (1.5hr), Drone search speed (1.5 hr), Wiki writing (0.5 hr) 5.5 hr
Victor le Fevre Meeting 1 (1h), Meeting 2(1h)
Thijs Egbers Meeting 1 (1h), Meeting 2(1h), drone control research (3h)
Adrian Kondanari Meeting 1 (1h), Meeting 2(1h)
Aron van Cauter
5 Geert Touw Meeting 1(1h)
Luc van Burik Meeting 1(1h), Working on Image detection(2.5h), meeting 2 (1h), Square search (2.5h), adapted square search(WIP) (1h) 8h
Victor le Fevre Meeting 1(1h), designed and tried to make an xy-rig (3h), worked on the wiki (2h) 6h
Thijs Egbers Meeting 1(1h), meeting 2 (1h), worked on getting the raspi setup (3h), try to get the camera and soldering (0.5h) 5.5h
Adrian Kondanari Meeting 1(0.5h)
Aron van Cauter Meeting 1(1h)
6 Geert Touw
Luc van Burik
Victor le Fevre Meeting 1 (1h), Meeting 2 (1h), Worked on wiki (4h) 6h
Thijs Egbers Meeting 1 (1h), Meeting 2 (1h), get the camera setup (1.5h), try to optimise camera reading (1h), convert search pattern to work inside the rest of the program (3h) 7.5h
Adrian Kondanari
Aron van Cauter
7 Geert Touw
Luc van Burik
Victor le Fevre Meeting 2 (demo) (4h), Wiki (1h) 5h
Thijs Egbers Meeting 1 (1h), meeting 2/video making (4h), finishing touches test program (2h), make presentation (WIP)

Adrian Kondanari
Aron van Cauter
8 Geert Touw
Luc van Burik
Victor le Fevre
Thijs Egbers Presentations (2.5h), short meeting with Luc (0.25h), working on the wiki (4h)
Adrian Kondanari
Aron van Cauter


Brainstorm Phase

Possible projects

Man over board (MOB) drone (Final project)

During the MOB protocol, the most challenging part is locating the victim. This can prove especially difficult during stormy weather or at night. A device equipped with adequate sensors to locate the victim, plus life-saving equipment, would drastically increase the chances of survival for a man overboard. Another problem our drone needs to tackle is providing appropriate care for the possibilities of drowning, hypothermia or other injuries.[19][20]

User: Ship's crew, rescue teams (coast guard); Problem: MOB, Requirement: Locate and provide appropriate care for the victim.

Manure silo suffocation

Manure silos need to be cleaned. When this is done, people can suffocate in the toxic gases released by the manure (even if the silo is almost empty). We want to develop a robot that alarms people when conditions become dangerous and, if the person is not able to leave the silo in time, supplies clean air to them.[21][22]

User: Farmers, Problem: Manure silo suffocation, Requirement: Supply clean air before suffocation.

Extreme Sports Accidents

Thrillseekers are often a bit reckless when it comes to safety. We want to design a flying drone that can help people out of sticky situations during parachute jumping, BASE jumping or even rock climbing. The victim will be able to attach themselves to the drone using their parachute or rock-climbing equipment, and the drone will put them safely on the ground.[23][24]

User: Extreme athletes, rescue teams, Problem: Dangerous accidents, Requirement: Can safely attach to people and put them on the ground.

Literature study

To determine the state of the art surrounding our project we will do a literature study.

Disaster robotics

This article gives an overview of rescue robotics and some characteristics that may be used to classify rescue robots. It also contains a case study of the Fukushima Daiichi nuclear power plant accident, giving an overview of how some robots were used. In addition, the article lists some challenges that are still present in rescue robotics.[25]

A Survey on Unmanned Surface Vehicles for Disaster Robotics: Main Challenges and Directions

This article gives an overview of the use of unmanned surface vehicles (USVs) and gives some recommendations concerning them.[26]

Underwater Research and Rescue Robot

This article is about an underwater rescue robot that gives necessary feedback during rescue missions. This underwater robot has more computing power than current underwater drones and reduces delay by using an ethernet cable.[27]

Mechanical Construction and Propulsion Analysis of a Rescue Underwater Robot in the case of Drowning Persons

This article is about an unmanned life-saving system that recovers conscious or unconscious people. This prevents other people from putting themselves in a dangerous situation by trying to save others. The drone is not fully autonomous, since it needs to be operated by humans.[28]

Design and Dynamic Performance Research of Underwater Inspection Robots

Power plants along the coastline use water for cooling. The underwater drone presented in this paper is used to inspect the water near power plants and to clean filtering systems, to optimize the efficiency of the power plant.[29]

Semi Wireless Underwater Rescue Drone with Robotic Arm

This article highlights the challenges of underwater rescue of people and valuable objects. The biggest challenge is wireless communication, due to the harsh environment. The drone is also equipped with a robotic arm to grab objects and a 4K camera with fog lights to navigate properly underwater.[30]

Rescue Robots and Systems in Japan

This paper discusses the development of intelligent rescue systems using high-information and robot technology to mitigate disaster damages, particularly in Japan following the 1995 Hanshin-Awaji earthquake. The focus is on developing robots that can work in real disaster sites for search and rescue tasks. The paper provides an overview of the problem domain of earthquake disasters and search and rescue processes.[31]

Two multi-linked rescue robots: design, construction and field tests

This paper proposes the design and testing of two rescue robots, a cutting robot and a jack robot, for use in search and rescue missions. They can penetrate narrow gaps and hazardous locations to cut obstacles and lift heavy debris. Field tests demonstrate their mobility, cutting, and lift-up capacity, showing their potential use in rescue operations.[32]

The current state and future outlook of rescue robotics

This paper surveys the current state of robotic technologies for post-disaster scenarios, and assesses their readiness with respect to the needs of first responders and disaster recovery efforts. The survey covers ground and aerial robots, marine and amphibious systems, and human-robot control interfaces. Expert opinions from emergency response stakeholders and researchers are gathered to guide future research towards developing technologies that will make an impact in real-world disaster response and recovery.[33]

Mobile Rescue Robot for Human Body Detection in Rescue Operation of Disaster

The paper proposes a mobile robot based on a wireless sensor network to detect and rescue people in emergency situations caused by disasters. The robot uses sensors and cameras to detect human presence and condition, and communicates with a network of other robots to coordinate rescue efforts. The goal is to improve the speed and efficiency of rescues in order to save more lives.[34]

Mine Rescue Robot System – A Review

Underground mining carries many risks, and it is very difficult for rescuers to reach trapped miners. It is therefore valuable to deploy a wireless robot with gas sensors and cameras in this situation, to inform rescuers about the state of the trapped miners.[35]

Ethical concerns in rescue robotics: a scoping review

We also have to take the ethics of rescue robots into account. There are seven core ethical themes: fairness and discrimination; false or excessive expectations; labor replacement; privacy; responsibility; safety; and trust.[36]

Rescue robots for mudslides: A descriptive study of the 2005 La Conchita mudslide response

Robots assisted the rescuers who responded to the 2005 mudslide in La Conchita. The robots were waterproof and could thus be deployed in wet conditions, but they failed to navigate through the rubble, vegetation and soil. The paper therefore suggests that rescue robots should be tested in a variety of environments, and advises manufacturers to be more conservative in their performance claims.[37]

Emergency response to the nuclear accident at the Fukushima Daiichi Nuclear Power Plants using mobile rescue robots

The 2011 earthquake and tsunami in Japan resulted in a meltdown at the Fukushima nuclear power plant. Due to the radiation levels, robots were deployed because it was too dangerous for humans. First, various issues needed to be resolved, such as the ability of the robots' electrical components to withstand radiation. The ability to navigate and communicate was tested at a different nuclear power plant similar to Fukushima.[38]

A Coalition Formation Algorithm for Multi-Robot Task Allocation in Large-Scale Natural Disasters

Robots are more reliable than humans in many cases. This paper discusses prior research on older algorithms and presents a new algorithm for multi-robot task allocation in rescue situations. Such algorithms have to take a lot into account, like which sensors are needed for which problems. The authors compare their algorithm with older ones in multiple cases, such as different problem sizes.[39]

References

  1. ANNUAL OVERVIEW OF MARINE CASUALTIES AND INCIDENTS 2022. (2022). In EMSA (Ares(2022)8241169). European Maritime Safety Agency. from https://www.emsa.europa.eu/newsroom/latest-news/download/7362/4867/23.html
  2. Gonel, O., & Çiçek, I. (2020, 25 December). STATISTICAL ANALYSIS OF MAN OVER BOARD (MOB) INCIDENTS. ResearchGate. https://www.researchgate.net/publication/348266442_STATISTICAL_ANALYSIS_OF_MAN_OVER_BOARD_MOB_INCIDENTS
  3. The Mechanics of Drifting on the Open Sea. (2012, 12 July). Oregon Beachcomber Blog. http://www.oregonbeachcomber.com/2012/07/mechanics-of-drifting-on-open-sea.html#:~:text=Your%20average%20ocean%20wind%20is,blows%20552%20miles%20a%20day
  4. O. (2021, 5 April). IAMSAR Search Patterns Explained with Sketches - Oways Online. Oways Online. https://owaysonline.com/iamsar-search-patterns/
  5. O. (2021, 5 April). IAMSAR Search Patterns Explained with Sketches - Oways Online. Oways Online. https://owaysonline.com/iamsar-search-patterns/
  6. https://www.vox.com/2016/4/25/11503152/shipping-routes-map
  7. https://aceboater.com/en-us/cold-water-immersion-and-hypothermia
  8. https://www.seatemperature.org/
  9. Frotan, D. F., & Moths, J. M. (2022). Human Body Presence Detection in Water Environments Using Pulse Coherent Radar [Bachelor Thesis]. Malmo University.
  10. https://arxiv.org/abs/1811.05291
  11. https://www.mdpi.com/2504-446X/3/4/78
  12. https://arxiv.org/abs/2005.03409
  13. https://publications.waset.org/10001953/a-review-on-marine-search-and-rescue-operations-using-unmanned-aerial-vehicles
  14. https://link.springer.com/article/10.1007/s10514-021-10011-y
  15. http://acta.uni-obuda.hu/Stojcsics_56.pdf
  16. https://dev.px4.io/v1.11_noredirect/en/qgc/video_streaming_wifi_broadcast.html
  17. https://www.dronezon.com/learn-about-drones-quadcopters/learn-about-uav-antenna-fpv-live-video-transmitters-receivers/
  18. https://link.springer.com/article/10.1007/s10676-021-09603-0
  19. https://www.ussailing.org/news/man-overboard-recovery-procedure/
  20. https://www.sciencedirect.com/science/article/pii/S1877705812021492?via%3Dihub
  21. https://www.ad.nl/binnenland/vader-beukt-wanhopig-in-op-silo-maar-zoon-bezwijkt~a4159109/?referrer=https%3A%2F%2Fcstwiki.wtb.tue.nl%2F
  22. https://www.mestverwaarding.nl/kenniscentrum/1309/twee-gewonden-bij-ongeval-met-mestsilo-in-slootdorp
  23. https://www.nzherald.co.nz/travel/aussie-base-jumpers-two-hour-ordeal-after-parachute-gets-stuck-in-tree/HCN6DYMSSA4ZUBE3GVV2WCTNRQ/
  24. https://www.tmz.com/2022/11/30/base-jumper-crash-cliff-dangling-parachute-death-defying-video-moab-tombstone-utah/
  25. https://link.springer.com/chapter/10.1007/978-3-319-32552-1_60
  26. https://www.mdpi.com/1424-8220/19/3/702?ref=https://githubhelp.com
  27. https://www.researchgate.net/publication/336628369_Underwater_Research_and_Rescue_Robot
  28. https://www.mdpi.com/2076-3417/8/5/693
  29. https://www.hindawi.com/journals/wcmc/2022/3715514/
  30. https://www.researchgate.net/publication/363737479_Semi_Wireless_Underwater_Rescue_Drone_with_Robotic_Arm
  31. https://ieeexplore.ieee.org/abstract/document/1521744
  32. https://www.jstage.jst.go.jp/article/jamdsm/10/6/10_2016jamdsm0089/_pdf/-char/ja
  33. https://doi.org/10.1002/rob.21887
  34. https://d1wqtxts1xzle7.cloudfront.net/58969822/12_Mobile20190420-67929-tn7req-libre.pdf?1555765880=&response-content-disposition=inline%3B+filename%3DMobile_Rescue_Robot_for_Human_Body_Detec.pdf&Expires=1676230737&Signature=YQXJqYheT6M0hsHXSWDx4FbuCauvv9o9uvDR1Hl8dJL~SmI~KObXAhXbq7dDYZAMLhsydh7ipP5RBOayNkzsM~K0xP7pcXLmOKcW3-WFdt1aTyHvQWeG5hUKzhb5KLaVAj4Frfb313Yi5oyhFaHVb~ODSxbtpN73SGd3YE3UouzuexfeGSVqFyWTWi-3qMqMIQ3qfUKGiBF24QfyArHlj9mKkq8gVItdJsAS9OGBUGeBQaf~8j37WsIauoABw8cO5V73RFxhfLR~ehXXMgJegTRxzwT1tBMhE14OVMK~PkfcpYSAVkHFi3gqf~sawW4SFIut7MetNdUcKfcAwHEBHA__&Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA
  35. https://www.sciencedirect.com/science/article/pii/S187852201500096X
  36. https://link.springer.com/article/10.1007/s10676-021-09603-0
  37. https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.20207
  38. https://onlinelibrary.wiley.com/doi/full/10.1002/rob.21439
  39. https://www.researchgate.net/publication/316283106_A_Coalition_Formation_Algorithm_for_Multi-Robot_Task_Allocation_in_Large-Scale_Natural_Disasters