PRE2019 3 Group17


=== Time Sheets ===
[[File:group17final.png | 1000px]]


=== Peer Review ===
 
{| class="wikitable" style="margin: 1em; border-spacing:2px; border: 1px solid #aaa; border-collapse: collapse; color: black; background-color:#f9f9f9;"
|-
! scope="col" style="border: 1px #aaa solid; padding: 0.3em 0.4em; background-color:#f2f2f2; font-weight:bold; text-align: center;" | Name
! scope="col" style="border: 1px #aaa solid; padding: 0.3em 0.4em; background-color:#f2f2f2; font-weight:bold; text-align: center;" | Peer review individual relative grades
 
|-
! scope="row" style="text-align:left; border: 1px #aaa solid; padding:0.3em 0.4em; background-color:#f2f2f2;" | Amit Gelbhart
| style="text-align: left; padding: 0.3em 0.4em; border: 1px #aaa solid;" | 0
 
|-
! scope="row" style="text-align:left; border: 1px #aaa solid; padding:0.3em 0.4em; background-color:#f2f2f2;" | Marleen Luijten
| style="text-align: left; padding: 0.3em 0.4em; border: 1px #aaa solid;" | 0
 
|-
! scope="row" style="text-align:left; border: 1px #aaa solid; padding:0.3em 0.4em; background-color:#f2f2f2;" | Myrthe Spronck
| style="text-align: left; padding: 0.3em 0.4em; border: 1px #aaa solid;" | 0
 
|-
! scope="row" style="text-align:left; border: 1px #aaa solid; padding:0.3em 0.4em; background-color:#f2f2f2;" | Ilvy Stoots
| style="text-align: left; padding: 0.3em 0.4em; border: 1px #aaa solid;" | 0
 
|-
! scope="row" style="text-align:left; border: 1px #aaa solid; padding:0.3em 0.4em; background-color:#f2f2f2;" | Linda Tawafra
| style="text-align: left; padding: 0.3em 0.4em; border: 1px #aaa solid;" | 0
 
|}


== Project Goal ==
=== The Problem ===
One of the great benefits of the coral reef is that in the case of natural hazards, such as coastal storms, the reef on average can reduce wave energy by 97% [https://doi.org/10.1038/ncomms4794 (Ferrario, 2014)]. This means it can dampen storm waves and flooding and thus protect coastal inhabitants. Since roughly 40% of the world’s population lives within 100 km of the coast [https://doi.org/10.1038/ncomms4794 (Ferrario, 2014)], protecting the coral reef prevents a great amount of damage, not only to human lives but also to the environment. In the case of these natural hazards, it is the government that will be held accountable for the devastation caused. Sadly, the coral reefs have been degrading for a long time, and their recovery is lacking.
[[File:corals.jpg|thumb|right|350 px|Healthy vs dying corals. Image retrieved from https://phys.org/news/2015-08-high-seas-die.html]]
Due to ocean acidification and warming caused by climate change, as well as pollution and destructive fishing practices [http://www.sciencedirect.com/science/article/pii/S0025326X13003020 (Ateweberhan et al., 2013)], the amount and variety of coral reefs are declining. Understanding and tracking how these developments happen is key to figuring out ways to protect and help the coral reefs.
Monitoring the reefs is labor-intensive and time-intensive work, which often involves diving down to take measurements and pictures. This kind of work is structured, often repeated, and takes place in environments humans are not suited to; it could benefit greatly from robotic assistance.
However, for scientific work in a fragile environment like the coral reefs, not just any robot will do. In this project, we will investigate how to design a specialized robot that will help coral reef researchers sustain and improve the coral reefs.
 
=== Approach ===
To design a robot specialized for coral reef environments, we will investigate both the needs of coral reef researchers and the needs of the corals themselves. When using the robot in an already threatened ecosystem, it is not acceptable for it to cause any harm to the corals. Each aspect of the robot will need to be assessed in light of these needs. Interviewing experts will give us insights into both of these categories, and we will also do our own research into the coral reefs. Since this project focuses on the specialized needs of robots that operate near coral reefs, not on underwater robot design in general, we will base much of our design on existing underwater robots and focus on what needs to be modified.
 
=== Deliverables ===
* An in-depth exploration of existing robots similar to our envisioned one and an assessment of how suitable they are for coral reefs.
* Guidelines for a robot specialized for coral reefs.
* A report on what coral reef researchers desire and how our guidelines meet those needs.
 
=== Problem Background ===
One of the great benefits of the coral reef is that in the case of natural hazards, such as coastal storms, the reef on average can reduce wave energy by 97% [https://doi.org/10.1038/ncomms4794 (Ferrario, 2014)]. This means it can dampen storm waves and flooding and thus protect coastal inhabitants. Since roughly 40% of the world’s population lives within 100 km of the coast [https://doi.org/10.1038/ncomms4794 (Ferrario, 2014)], protecting the coral reef prevents a great amount of damage, not only to human lives but also to the environment. Coral reefs are also a source of income for many people, through the attraction of tourists but also as a source of fish. Coral reef fisheries contribute to the degradation of the reefs through overfishing, but they are also currently extremely important as a source of food and income for many people [https://doi.org/10.1016/j.cub.2007.02.054 (Newton et al., 2007)]. Healthy coral reefs are important for people and the environment, which makes it such a problem that the coral reefs have been degrading for a long time while their recovery is lacking.
 
Proper monitoring of the reefs to track their development over time and identify factors that help and harm them is the main task that our research focussed on. However, a specialized coral reef robot could do much more. The initial idea behind this project was to design a robot that would not assist the researchers working with the corals, but would rather directly help the reefs themselves. While this focus was shifted later on in the project to allow us to work more directly on the user aspect, potential other uses for a coral reef robot were always on our minds.
 
For instance, one of the factors that could prevent the downgrading of a reef is its resistance, its ability to prevent permanent phase-shifts, and its resilience, its ability to bounce back from phase-shifts [https://doi.org/10.1007/s00338-008-0426-z (Nyström et al., 2008)]. These phase-shifts are undesirable because the reef ends up in a state where it can no longer return to a coral-dominated state [http://www.sciencedirect.com/science/article/pii/S0169534710001825 (Hughes et al., 2010)]. If the reef has better resilience, it will be able to bounce back quicker. One of the ways to improve the resilience of the reef is increasing species richness and abundance through the use of acoustics [https://www.nature.com/articles/s41467-019-13186-2 (Gordon et al., 2019)], which improves the reef’s resilience by giving it protection from macroalgae [https://www.pnas.org/content/105/42/16201 (Burkepile and Hay, 2008)]. For a coral reef to flourish, a wide biodiversity of animals is needed. Fish that lay their larvae on corals are one of the essential components of a healthy reef ecosystem. However, once corals are dying, the fish no longer use them for their larvae and the whole system ends up in a negative cycle. By playing sounds with different frequencies, fish are tricked into believing that the corals are alive and come back with their larvae. This attracts other marine animals, which causes the entire system to flourish again [https://www.nature.com/articles/s41467-019-13186-2 (Gordon et al., 2019)].
 
A good coral-reef-oriented robot with a modular design could be modified to go down near the reefs to place speakers for acoustic enrichment, which would allow them to be placed faster and in areas potentially dangerous to humans.
 
Additionally, while our robot discussion is oriented towards the current standard coral reef research method, which is mainly photographing, there are other ways of monitoring reefs that a robot could assist in. Monitoring the sounds of a reef has been suggested as a way of analyzing the fauna in the reef [https://asa.scitation.org/doi/abs/10.1121/1.2836780 (Lammers et al., 2008)], and any number of other sensors could be attached to the robot. Specifications for a good coral-reef-oriented robot can therefore be applied to a wide variety of applications.
 
== User Needs Research ==


=== The User ===
Since the goal of this project is to design a robot that is especially adapted to the coral reefs, the robot can be used for many things. The users of this robot will be researchers who could use it to study coral reefs, interact with them and support them; more broadly, it could be used by anybody who is trying to help the corals by making changes in the reefs or by researching them.
Because the robot will support different jobs and attachment modules, the users and use cases can differ somewhat. The user could be a researcher trying to find out the current state of the coral reefs, but it could also be a member of an organization that is trying to help the corals through acoustic enrichment or delivering larvae.
All of the members of our user groups are passionate about the coral reefs. One of their priorities will therefore be the safety of the corals, so the robot must not harm its environment. The robot will also need the possibility to connect a variety of attachment modules, since it will be used in many different use cases.
We have made contact with members of one of our user groups: an interview was held with researchers at a university to find out what their perspectives on this project are.
===Interview===
[[File:IntExmpl.png|thumb|right|350 px|Example UI's we showed during the interview for feedback. Retrieved from https://www.kickstarter.com/projects/openrov/openrov-trident-an-underwater-drone-for-everyone]]


We interviewed [https://www.wur.nl/en/Persons/Erik-dr.-EHWG-Erik-Meesters.htm Erik Meesters] and [https://www.wur.nl/nl/Personen/Oscar-dr.-OG-Oscar-Bos.htm Oscar Bos], both researchers at Wageningen University & Research. Dr. Meesters works with coral reefs and specifically studies the long-term development of the reefs in Bonaire and Curacao, and dr. Bos focuses on the preservation of biodiversity in the North Sea. They recently bought a robot that is helping them with their research. The interview was aimed at finding user needs, so we asked a series of questions about what they would want from such a robot.


We only managed to arrange this meeting further along in our project, when we had already started our research into current underwater robots and coral reefs. This made the interview flow better, as we could refer to our findings during it and relate them to the information we already had.


While we intended to let the discussion flow naturally and get as much information on what the researchers want without pushing them in any particular direction with guided questions, we did prepare a list of questions to ask if the discussion ever slowed down and to make sure we covered all relevant parts of our project:
* What are issues you are facing while studying/managing/maintaining coral reefs?
** How do you deal with them now?
** How could a robot help?
** Is an underwater robot that is specialized for coral reefs relevant?
* Do you like the robot you bought recently?
** Why/why not?
** What is the primary reason you chose to invest in this robot over the other robots on the market?
** What is the most important aspect of the design which we need to consider?
** Which aspects of your robot should be carried over into our design? And are those aspects already good enough, or do they need to be optimized further?
** Is it easy to control the robot? If not, is the problem more in the UI or the input method?
** Are you satisfied with the control method? (At this point, we can show the various control methods we have seen for existing robots and ask for their preferences.)
** Regarding the UI, what kind of information would you like to be visible on screen while you are controlling the robot?
* What things should we keep in mind regarding useful and safe movement of the robot?
** In videos of current drones moving around coral reefs, they usually float near the top; they do not go into grooves or move between the corals. Would that be a helpful function?
** How much fluidity and freedom do you need in movement? (Here we can show the robots with very complex movement, like the Scubo, and those with more basic movement, like the RangerBot.)
** Most underwater robots use thrusters; how do you feel about some alternative movement methods? (Here we can show the Aqua2 and the JenniFish.)
** What are your concerns about having a robot moving near the corals? Is it just collisions, or also the way it moves through the water?
** How fast do you need the robot to move? Is fast movement necessary for your research? Is there potential danger in having fast-moving propellers?
** Is changing the depth of the robot via buoyancy control a viable method in your opinion?
* What materials would a robot need to be made from to be considered safe?
** Are there certain chemicals to avoid specifically?
** Are there certain battery types you would recommend avoiding?
** Are there any parts of your current drone that concern you in this area?
* What would you consider a reasonable price for a robot that could assist in your research?
** If a robot was available at a reasonable price, what applications would you want to use it for, besides just research?
 
=== Interview Results ===
 
We found out that the primary application the robot would be used for is taking photographs of known reefs to monitor their development over time. This already happens, but robots would allow it to go faster and over larger areas. In the interview, dr. Meesters mentioned this being more worthwhile for researchers than single-application targeted robots like the RangerBot. This focus on research and data collection informs a lot of their needs.
 
Needs:
# Good lighting is needed for high-quality photographs. If the reefs are very near the surface of the water this is less needed, but ideally the robot would also be useful in deeper reefs and when it is less sunny.
# A positioning system is also important, since the location where a picture was taken should be recorded to make repeated studies of the same area valuable. The robot that dr. Meesters and dr. Bos described used an acoustic positioning device that measures the relative distance to a pole that is hung into the water from a boat. Sonar or GPS were also suggested. This kind of positioning system can be really expensive; it doubled the cost of their robot.
# A scale on the pictures would be very useful, so they can be compared more easily.
# The topographic complexity of the reef (the variation in coral heights) is very important, so that information should also be recorded. 3D images would be one way to visualize it, but the time cost (of taking the pictures and, more importantly, processing them) is too high compared to their use. Instead, lasers or another distance-measuring tool, like a small chain dropped down from the robot, are much easier. A multibeam echosounder is also worth looking into.
# The closer the robot can get to the corals, the higher quality the pictures will be.
# The pictures should be consistent, so the camera should be steady and preferably not rotate. If it does rotate, the angle should be part of the image data.
# The main concern regarding protecting the corals is not bumping into them. Collision detection is highly recommended. It should not only detect the corals, but also what is ahead and preferably also what is above the robot, so that it does not get stuck under something. This is particularly important in applications where the robot is being controlled live instead of travelling along a pre-programmed route.
# If the robot is being controlled live, rather than travelling along a pre-programmed route, this control should be as simple as possible. The researchers will likely be on a boat as they are controlling the robot, and the movement of the boat and seasickness makes controlling the robot much harder. We cannot take it for granted that everyone is used to gamepad controllers or joysticks.
# For the UI, it should be as clean and simple as possible, with no unnecessary information on screen. However, things like current depth, map position and the angle and direction of the camera would be useful. Multiple screens could make the interface easier to understand and calmer on the eyes.
# The robot need not be fast, since it needs to take clear pictures anyway, which means stopping every so often or moving very slowly. However, the robot should be able to resist the current. We showed some of the less conventional movement systems (the Aqua2 and JenniFish), but they preferred the traditional thrusters, which allow for movement in multiple directions and are more flexible and precise.
 
Other points:
# There is not actually a need for the robot to go into the grooves of the reef, which was one of our concerns since this would require the robot to be much smaller and more flexible, and would increase the risk of bumping into the corals. Due to the degradation of the reefs, there are no deep grooves anymore, so floating above the reef is sufficient.
# If the robot kicks up sand when thrusting up near the ocean floor, this would hurt the picture quality and the sand could also get into the thrusters and damage them. They recommended making the robot slightly buoyant so that it could float a little bit away from the ocean floor before thrusting up.
# Unexpectedly for us, since we saw a lot of robots advertise their real-time controls, both of the interviewees were very interested in having the robot follow pre-planned routes. Since a lot of the work is monitoring known sites, it would be ideal to be able to tell the robot where to go and how to take the pictures and have it just do that without further input needed. Real-time control is only needed for exploring unknown locations. In that scenario, more complicated movements are needed. Having some of these movements pre-programmed rather than being dependent on advanced user control is recommended, to make control easier.
# A tether causes drag on the robot, which is not good. This problem could be addressed by putting a weight on the line, so that it can rest securely on the ocean floor and be connected to the robot from that point, instead of running directly up from the robot to the boat.
# The safety of the reefs was not a major concern when it came to material; instead, the resistance of the material to salt water is the primary concern. If oil and/or grease is used, it should be biologically degradable.
# Batteries can cause problems, not just in use but also in transport. Not all types of batteries are allowed to be taken onto airplanes. Some batteries can also cause explosions when charged, so they are a much bigger concern than we anticipated.
# The budget for a robot like this could range anywhere from 15 thousand to 80 thousand euros. To make a robot more appealing for the university to invest in, a modular design that makes it useful in multiple different projects helps a lot.
# The surveys that dr. Meesters described doing in the Caribbean cover 115 sites with lines of around 150 meters over the reefs, where divers have to dive down and take pictures along those lines. This is exactly the kind of work that robots could assist with. Each picture should show roughly 0.5 to 1 m2 of reef (a rough sketch of what this implies for camera distance follows this list).
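To make the picture requirement above a bit more concrete, the sketch below estimates how much reef area a single straight-down photo covers at a given height above the reef, and what ground resolution that gives for adding a scale to the image. The field of view and pixel count are placeholder values for a typical wide-angle camera, not figures from the interview or from any specific robot.

<syntaxhighlight lang="python">
import math

def photo_footprint(distance_m, hfov_deg=80.0, vfov_deg=64.0, px_width=4000):
    """Estimate the reef area covered by one photo taken straight down from
    `distance_m` above the reef, and the resulting ground resolution.
    The field of view and pixel count are placeholder values for a typical camera."""
    width = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    height = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    area = width * height                    # square metres covered by the image
    mm_per_pixel = width / px_width * 1000   # ground sample distance
    return width, height, area, mm_per_pixel

# How close must the robot fly so that one picture shows roughly 0.5 to 1 m^2?
for d in (0.4, 0.6, 0.8, 1.0):
    w, h, a, gsd = photo_footprint(d)
    print(f"{d:.1f} m above the reef -> {w:.2f} m x {h:.2f} m ({a:.2f} m^2), ~{gsd:.2f} mm/pixel")
</syntaxhighlight>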
 
== Literature Research ==
=== Coral Reefs ===
To design a robot for use in and around the coral reefs, we needed to research the reefs themselves, their vulnerabilities and general environments. This information is needed so we can make sure the robot does not damage the corals.
 
==== Collisions ====
It is important to consider the risks associated with having the robot move close to the corals. If these risks are manageable, then there might be value in making a smaller, more agile robot that can get up close to the reefs if needed. However, if the risks are so great that having the robot too near the reefs is unacceptable, that option can be ignored and the robot can be slightly larger and bulkier, which opens up other design opportunities.
Direct contact between the robot and the corals could cause great damage to the corals. Corals are quite fragile, and bumping into them could easily cause them to break or damage their outer layer, which might make them more vulnerable to disease [https://doi.org/10.1016/j.biocon.2004.03.021 (Barker and Roberts, 2004)]. Branching corals in particular are vulnerable to breaking, as their physical shape makes them very fragile (Hawking and Roberts, 1997).
Additionally, if the robot causes pieces of the coral to break off and this is not recorded, this would not be good for research as it would seem like pieces of the corals were being broken off more often without clear cause.
This makes it clear that physical contact between the robot and the corals should absolutely be avoided. This affects things like collision detection and how close the robot is allowed to get to the corals, but it also means that the robot's movement systems should be able to reliably resist the current, so that it is not pushed into the corals.


==== Movement ====
The damage might also be more indirect. The propellers could kick up the sediment and cause turbidity in the water; if this persists too long, the light penetration to the corals is affected, which can cause them to die off. If these effects persist for a very long time, this could even change the coral reef's biodiversity, as the more tolerant corals survive and the sensitive corals die (PIANC, 2010). The scale of the environmental effects of the robot should therefore be kept in mind: it should not affect the environment in the long term.


==== Material ====
To keep the robot from damaging the reefs, the materials used should also be reef safe. In discussions surrounding coral reefs, plastics are often mentioned as a problem. When they wrap around corals, they can seal them off from light and oxygen and release toxins which can make the corals ill [https://www.plasticsoupfoundation.org/en/2018/01/plastic-is-making-coral-reefs-sick/ (Plastic Soup Foundation, 2018)] [https://science.sciencemag.org/content/359/6374/460 (Lamb et al., 2018)]. However, these concerns are with loose pieces of plastic that might wrap around the corals, not with a robot that will quickly leave again. Other frequent plastic concerns surround plastic pellets, which can spread organic pollutants [https://doi.org/10.1016/j.marpolbul.2007.03.022 (Rios, Moore and Jones, 2007)], but this is a problem caused by the manufacturing of plastic materials, not by the presence of an object made out of plastic in the ocean. Toxins are also not a great concern in this application, due to the extremely slow degradation of plastic [https://www.mdpi.com/2073-4360/5/1/1/htm#B35-polymers-05-00001 (Webb et al., 2013)]. From all of this we can conclude that using plastic for the robot is not a problem for the health of the corals, as long as the robot is not present in the same place for a long stretch of time. This means that there should be an easy way to retrieve the robot should it break down, since that is the only scenario that would lead to it being stationary in one place for that long.


If waterproof coatings or paints are used, it is important that these do not contain chemicals that might damage the corals. The [https://haereticus-lab.org/protect-land-sea-certification-3/ Haereticus Environmental Laboratories] test various products for chemical pollutants; they specifically recommend avoiding anything that contains:
* Any form of microplastic spheres or beads.
* Any nanoparticles like zinc oxide or titanium dioxide.
* Oxybenzone
* Octinoxate
* 4-methylbenzylidene camphor
* Octocrylene
* Para-aminobenzoic acid (PABA)
* Methyl Paraben
* Ethyl Paraben
* Propyl Paraben
* Butyl Paraben
* Benzyl Paraben
* Triclosan


==== General Coral Reef Information ====
Finally, some general information on where the coral reefs are is useful, as it tells us in what environments our robot might be used. Most of our information comes from the [https://www.unep-wcmc.org/resources-and-data/world-atlas-of-coral-reefs-2001 World Atlas of Coral Reefs (1st edition)], the first chapter of which provides a lot of general explanation of what the coral reefs are. There are many types of corals, but for the coral reefs it is the hermatypic (reef-building) corals that are important. They flourish in warm, shallow water. Hermatypic corals grow extremely slowly, just a few millimetres each year. Branching corals grow much faster, but even they only grow about 150 millimetres each year. Hermatypic corals lay down a skeleton of calcium carbonate, and these structures form the basis of the coral reefs, since other corals and organisms can then grow on these structures. The atlas also distinguishes five types of reef (in order of distance from the coast): fringing, patch, barrier, atoll, and bank or platform reefs; corals need warm, well-lit water and solid surfaces to settle on, and the atlas gives the reef area per region (in km2) as well as the species diversity of reefs across the world. This source provides much more information regarding the different types of reefs, their spread over the Earth and the organisms found in reefs, but this more specific information is not relevant at this stage of our project: if it turns out that the exact nature of the corals in a reef has a strong influence on the requirements of the robot, this is a topic worth revisiting, but otherwise this general information is enough.

Spalding, M. D., Green, E. P., & Ravilious, C. (2001). World Atlas of Coral Reefs (1st edition). Berkeley, Los Angeles, London: University of California Press.


According to the Scripps Institution of Oceanography at the University of California San Diego: “Coral reefs can be found at depths exceeding 91 m (300 ft), but reef-building corals generally grow best at depths shallower than 70 m (230 ft). The most prolific reefs occupy depths of 18–27 m (60–90 ft), though many of these shallow reefs have been degraded.” [https://scripps.ucsd.edu/projects/coralreefsystems/about-coral-reefs/biology-of-corals/ (Scripps, n.d.)] Studying these degrading coral reefs will likely be one of the main applications of a research assisting robot, so the information that those coral reefs are largely close to the surface is useful.


=== General Underwater Robotics ===
Operating in water, particularly salt water, has a great impact on the design of a robot. It needs to be well adapted to this environment: it should be able to resist the corrosion of the salt water and be waterproof so that the electronics are not damaged. It should also be able to move freely through the water.
==== Coral Reef Considerations ====
Risks of direct contact with coral:
* Coral breaking when the robot bumps into it. Branching corals are especially vulnerable, since their growth form makes them fragile (Hawking and Roberts, 1997).
* Coral breaking when getting stuck in the propeller.
* Anchoring can break the coral.
* When not capable of handling strong currents, the robot could damage the coral by being pushed to the coral with great force.


Risks of indirect contact with coral:
* Sedimentation/turbidity caused by the propeller, which can lead to mortality of coral species as it reduces light penetration. If sedimentation or turbidity persists for too long, the coral reef’s diversity can change, with tolerant species replacing the sensitive coral species (PIANC, 2010).
* Plastic can seal light and oxygen from the corals and can release toxins, which can increase the chance of coral becoming ill [https://www.plasticsoupfoundation.org/en/2018/01/plastic-is-making-coral-reefs-sick/ (Plastic Soup Foundation, 2018)].
* Bacteria can travel on plastic. When pathogenic bacteria reach the coral, nothing can be done to fix it [https://www.plasticsoupfoundation.org/en/2018/01/plastic-is-making-coral-reefs-sick/ (Plastic Soup Foundation, 2018)].
* Anti-fouling paint can damage coral (PIANC, 2010)


==== Movement ====
Many ROVs (Remotely Operated Vehicles) and AUVs (Autonomous Underwater Vehicles) use a wide variety of movement methods and techniques to navigate underwater. Some employ biomimicry: these robots move through the water in ways that are inspired by nature. However, most robots that are oriented more towards professional users, such as marine researchers, use a number of propellers to move around in all directions underwater, which provides multidirectional movement. It is important that these thrusters are powerful enough to move against the current in the ocean.


Moving along the x- and y-axes is not a great problem: thrusters can be used to push the robot in a direction, or some kind of steering surface could be manipulated to allow for turning. Moving up and down is a bit more complicated. Thrusters could also be used for this, but an underwater robot has an alternative option as well: using buoyancy. If the robot is naturally slightly buoyant, it will start floating up if there is no downward force acting on it. In this construction, thrusters are used for downward movement, and going up just means turning the thrusters off. Alternatively, the density of the robot could be made flexible by having it suck water into a tank to increase its density and move down, and having it push the water out again to return to its natural, slightly buoyant state and move up. If this movement system is chosen, it will affect the chosen materials for the robot, since the final robot will need to be less dense than salt water.
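As a small illustration of the buoyancy option, the sketch below works out (with made-up example numbers for mass and displaced volume, not values from any actual design) how much water a slightly buoyant robot would need to take into an internal tank to become neutrally buoyant.

<syntaxhighlight lang="python">
# Minimal sketch with illustrative numbers (not a real design): how much ballast
# water a slightly buoyant robot must take in to become neutrally buoyant.
RHO_SALTWATER = 1025.0   # kg/m^3, typical density of seawater near the surface
G = 9.81                 # m/s^2

robot_mass = 9.5         # kg, dry mass of the robot (assumed)
robot_volume = 0.010     # m^3 of water displaced by the hull (assumed), i.e. 10 litres

# Net vertical force with an empty ballast tank: buoyancy minus weight.
net_force = (RHO_SALTWATER * robot_volume - robot_mass) * G
print(f"Net upward force with empty tank: {net_force:.1f} N")  # positive means it floats up

# Ballast water needed for neutral buoyancy: the added mass must equal the excess buoyant mass.
# (Water pumped into an internal tank adds mass while the displaced volume stays the same.)
ballast_mass = RHO_SALTWATER * robot_volume - robot_mass
print(f"Ballast water for neutral buoyancy: {ballast_mass:.2f} kg "
      f"({ballast_mass / RHO_SALTWATER * 1000:.2f} litres)")
</syntaxhighlight>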
==== Sensors ====
It is likely we will want some form of obstacle avoidance, regardless of whether the movement of the robot is fully automated or controlled by the user. This system will make sure that in case the user makes a wrong judgment, no collision will happen. This is important since no harm should come to the corals.
Sonars are frequently used for collision detection. The resolution and reliability of sonar sensors degrade when being close to objects. Since the robot will have to manoeuvre a lot in coral reefs, sonar will not work sufficiently (Dunbabin, Dayoub, Lamont & Martin, 2018).


There are real-time vision-based perception approaches that make it possible to provide a robot in coral reefs with collision-avoidance technology. To make obstacle avoidance possible, the challenges of real-time vision processing in coral reef environments need to be overcome. To do so, image enhancement, obstacle detection and visual odometry can be used. Cameras are not used frequently in underwater robots, but they will work well in coral reefs, since the coral reefs are quite close to the surface (Spalding, Green & Ravilious, 2001) and the water in these areas is very clear.


Image enhancement is useful for making the processed images more valuable; to do so, different types of colour correction are applied. For detection, semantic monocular obstacle detection can be used. Dunbabin et al. (2018) explain that for shallow-water position estimation, visual odometry combined with operational methods to limit odometry drift was explored and evaluated in early work using a vision-only AUV. This already showed navigation performance errors of <8% of distance travelled (Dunbabin et al., 2018). Therefore this visual odometry can be used for the robot.
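To make the colour-correction step a bit more concrete, below is a minimal sketch of one common and simple enhancement method (gray-world white balancing) using OpenCV. This is our own illustration of the general idea, not the specific enhancement pipeline used by Dunbabin et al. (2018), and the file name is just a placeholder.

<syntaxhighlight lang="python">
import cv2
import numpy as np

def gray_world_correction(bgr):
    """Simple gray-world colour correction: scale each channel so that all
    channels end up with the same mean. Underwater images are typically
    blue/green heavy because red light is absorbed first; this pulls the
    channels back towards a neutral balance before further processing."""
    img = bgr.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean of B, G and R
    gain = channel_means.mean() / channel_means       # per-channel gain factors
    return np.clip(img * gain, 0, 255).astype(np.uint8)

# "reef_frame.jpg" is a placeholder name for a frame from the robot's camera.
frame = cv2.imread("reef_frame.jpg")
if frame is not None:
    cv2.imwrite("reef_frame_corrected.jpg", gray_world_correction(frame))
</syntaxhighlight>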


==== Localization ====
[[File:Localization image.png|thumb|right|350 px|'''fig. 1''' Illustrating how an AUV can be located by means of transducers sending out signals to transponders located on the seafloor. ''Note.'' Reprinted from “Absolute Positioning of an Autonomous Underwater Vehicle Using GPS and Acoustic Measurements”, by Kussat, N. H., Chadwell, C. D., & Zimmerman, R., 2005, IEEE Journal of Oceanic Engineering, 30(1), 153–164.]]
As a user, you want to know where your robot is, particularly if you are controlling it from a distance. For this, localization is needed.


GPS is known to not work subsurface and thus cannot be used on its own to detect the location of our underwater robot. It is, however, possible for GPS positions to be transferred underwater. As Kussat, Chadwell, & Zimmerman (2005, p. 156) state, the location of autonomous underwater vehicles (AUVs) can be determined by acquiring ties to GPS before and after submerging and integrating the acceleration, velocity, and rotation of the vehicle during the time spent underwater. However, they go on to state that this method causes an error of 1% of the distance traveled, which means a 10 m error over an AUV track of 1 km. According to them, this error occurs due to the quality of the inertial measurement unit (IMU).


Having such an error in an underwater robot that is being teleoperated can create many issues with relocating where the robot has been. The ocean is still a largely unknown and widely spread out space with constant movement that can disturb the position estimate. This could be problematic if one were to want to do research in the same location again. For this reason, it is highly important to be able to detect an object's location as accurately as possible, both above and below water. Kussat et al. (2005, p. 156) continue by explaining how a much more precise localization for AUVs can be achieved by combining precise underwater acoustic ranging and kinematic GPS positioning.


To use such a method, precise measurement of travel time is required. Travel time data with a resolution of only a couple of microseconds can be achieved by improving the correlator system (Spiess, 1997). This can be done by having fixed delay lines and cross-correlation of a coded signal in the transponders (Kussat et al., 2005, p. 156). Kussat et al. (2005, p. 156) continue to explain that the method starts by determining the location of the transducers, usually aboard a ship, by means of kinematic GPS. In their method, transponders were placed on the seafloor, receiving signals from the transducers; this was done so that a globally referenced coordinate frame could be established. As a second step, the autonomous underwater vehicle was located relative to these transponders by means of acoustic signals (see fig. 1).
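As a rough sketch of how the ranging step turns into a position estimate, the toy example below converts (made-up) one-way acoustic travel times into ranges using a nominal speed of sound and then solves for the vehicle position by least squares. The transponder positions and travel times are invented for illustration; they are not values from Kussat et al. (2005).

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import least_squares

SOUND_SPEED = 1500.0  # m/s, rough speed of sound in seawater (varies with temperature, salinity and depth)

# Hypothetical, globally referenced transponder positions on the seafloor (x, y, z in metres)
transponders = np.array([
    [   0.0,   0.0, -30.0],
    [ 120.0,  10.0, -28.0],
    [  60.0, 100.0, -32.0],
])

# Hypothetical one-way acoustic travel times from the vehicle to each transponder (seconds)
travel_times = np.array([0.0373, 0.0571, 0.0518])
ranges = travel_times * SOUND_SPEED  # convert travel time to distance

def residuals(pos):
    # Difference between the measured ranges and the distances from a candidate position.
    return np.linalg.norm(transponders - pos, axis=1) - ranges

# Solve for the vehicle position that best matches the measured ranges.
solution = least_squares(residuals, x0=np.array([50.0, 50.0, -20.0]))
print("Estimated AUV position (m):", np.round(solution.x, 1))
</syntaxhighlight>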
==== User Interaction and Communication ====
If the robot is sent out along a preprogrammed path, it can collect and store all its data while traveling and does not need to transfer it during its movement underwater. It would be good to have the robot send some signal to the users of roughly where it is, so that if something goes wrong the users know where to pick it up, but this information is quite simple and does not need to be updated constantly; regular updates of its current position would be enough. On the other hand, if we want the user to be able to control the robot in real time, the user should have access to real-time information about where the robot is and what its surroundings are. This will likely involve video footage and potentially the data from some other sensors as well, for instance close-range sonar data to see if the robot is about to bump into something in any of the directions the camera is not pointing. This is a lot of information that needs to be transmitted very quickly to the user, and the user should also be able to give commands to the robot that it responds to very quickly. "High-frequency EM signals cannot penetrate and propagate deep in underwater environments. The EM properties of water tend to resist their propagation and cause severe attenuation." (Qureshi et al., 2016). Due to this, a tether must be used for most communication with the robot in the use case of live control. While systems for underwater wireless communication exist, they are rare, pricey, and reserved for military and large-scale project use.
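To give a rough sense of why a tether is the practical choice for live control, the sketch below compares an assumed compressed video bitrate with order-of-magnitude throughputs for underwater wireless links. All figures are ballpark assumptions for illustration, not measured values for any specific product.

<syntaxhighlight lang="python">
# Back-of-the-envelope comparison with assumed, order-of-magnitude figures.
video_bitrate_mbps = 5.0      # a compressed 1080p video stream, roughly
acoustic_modem_kbps = 30.0    # typical throughput of a commercial acoustic modem
optical_link_mbps = 10.0      # short-range underwater optical link in clear water

video_kbps = video_bitrate_mbps * 1000
print(f"Live video needs roughly {video_kbps:.0f} kbit/s")
print(f"An acoustic modem carries roughly {acoustic_modem_kbps:.0f} kbit/s, "
      f"about {video_kbps / acoustic_modem_kbps:.0f}x too little for live video")
print(f"An optical link can carry roughly {optical_link_mbps * 1000:.0f} kbit/s, "
      "but only over a few metres, which is why live control falls back on a tether")
</syntaxhighlight>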
== State of the Art and Literature Study - Potentially Scrapped ==
(This is left over from when we were still going to design our own robot from scratch. This research can still be used, so it is left here, though it will be reformatted or removed for the final delivery.)

=== Requirements, Preferences and Constraints ===
(Table still needs to be made, but for consideration:)

(In order to prevent further downgrading and instead increase the growth of the reef, a large biodiversity of animals is required. It is therefore important that the robot can navigate through the water to guide fish towards the reef. For this, the robot should first be able to detect where fish should be guided and then manoeuvre in the ocean without damaging any of the already existing reefs. Establishing which parts of the reef need to be tackled can be done by scanning the reef, taking pictures and comparing them to a database of coral reef images. A camera that operates well underwater is therefore a must. This camera will also be used in combination with a filter to navigate the robot in the water.

Since fish can be tricked into believing the coral is still alive through sounds of different frequencies, the robot will have to be capable of creating these sounds underwater. To achieve this, an underwater sound system needs to be implemented inside the robot. Research done by Enger, Karlsen, Knudsen, and Sand (1993) shows that there is a wide range in the frequencies fish can hear; in this research, tests were done on different fish, which resulted in hearing thresholds that differ between species. To know what frequency to send out at what moment, the robot will have to know what kinds of fish are around and adjust the sound it emits. A database of fish types is needed to compare against the fish detected by the camera.

Currently, the work needed to make sure the coral reef is either still intact or regrowing requires divers to go down into the ocean to explore and help out. Since the robot is going to replace these divers, it will have to be able to do exactly what the divers already can. This means that the robot should not solely be capable of helping the coral grow back, but also be able to communicate the current state of the ocean back to researchers. For this reason, underwater wireless communication should be possible.

For the robot to move around in the water and go fairly deep, it should contain a battery that lasts long enough for the robot to go down, explore the area and lure fish to an area long enough that they will migrate there. However, implementing a system inside the robot that generates energy would increase the robot's functionality. This way the robot could be used more effectively over greater distances.)

=== Movement ===
* Options to move left and right:
** Servo motors (or steppers) can be used to turn the tail → need to research the difference and see which one is easier and better to use
** The rotor itself can turn → does this exist?
* Options to move up and down:
** Regulate the density by sucking water into or out of a water tank
** Use a servo or stepper motor to tilt a tail up and down (which causes the robot to move up or down) → need to prototype to find out whether it works
* The robot can move down to the reef and be navigated to a spot where it can stay stationary. Two clips will grab a piece of rock to attach it to the ground
* Waterproofing/materials/manufacturing process:
** 3D printing is not the most viable option, as the robot would have to be printed in multiple parts and a 3D print needs a coating to make it waterproof. It is also a challenge to make a waterproof system that clicks together, so that the prototype can be opened up for testing and maintenance
** We could use a mould and make it from rubber or silicone. These parts can easily be screwed together as they are less rigid than 3D-printed plastic. Through tests we need to find out whether this is waterproof
** We could also put all the electronics in a bottle or waterproof bag so that they will not get damaged if the casing leaks

=== Sensors ===
* https://backend.orbit.dtu.dk/ws/portalfiles/portal/112200773/ChristensenPAAR2015.pdf
* Fraunhofer-Gesellschaft. (2010, November 23). Underwater robots on course to the deep sea. ScienceDaily. Retrieved February 8, 2020 from https://www.sciencedaily.com/releases/2010/11/101123121105.htm
** "The engineers from the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation in Karlsruhe, Germany are working on the 'eyes' for underwater robots. Optical perception is based on a special exposure and analysis technology which even permits orientation in turbid water as well. First of all, it determines the distance to the object, and then the camera emits a laser impulse which is reflected by the object, such as a wall. Microseconds before the reflected light flash arrives, the camera opens the aperture and the sensors capture the incident light pulses."
** "The powerful but lightweight lithium batteries"

=== User Interaction and Communication ===
Kaushal and Kaddoum (2016) describe different underwater communication technologies. Each technology has its own benefits and downsides; the paper explains how each works and what speed, distance and power it needs or can reach.

Kaushal, H., & Kaddoum, G. (2016). Underwater Optical Wireless Communication. IEEE Access, 4, 1518–1547. https://doi.org/10.1109/access.2016.2552538

* Our implementation needs to be long-range
* Our implementation should not bother the fish or reefs
* We should consider/mention having some basic autonomous movement in case the wireless connection is disrupted. Also, in case of interference the robot should reject commands it believes will damage the coral reefs (the risk there is that it will misunderstand its environment, overrule the commands it receives and damage the coral reefs as a result)
* The kind of telecommunication systems used to reach the depths of the ocean will be too expensive in our example case. We will probably use a much cheaper module in our example case and mention what could be used in the final product.
* Best currently available to normal people: http://www.top10drone.com/best-underwater-drones/
** The first, third and fourth ones use wi-fi (the fourth mentions having a wi-fi module inside the robot itself, but it is intended for close-to-civilization work)
** The second one is connected to a cord but also has bluetooth functionality
** Note: these all seem to go at most 100 m deep, whereas corals can be 400-6000 m deep (https://ocean.si.edu/ocean-life/invertebrates/corals-and-coral-reefs). But there are also shallow-water coral reefs that are as little as 15 m below the surface, so we should be good. We could look into what coral reefs we can reach with our modules. (Amit has a source that says up to 70 m deep is fine)
* Options for the real deep sea:
** Bluetooth
** Wi-fi (the previously mentioned drones report interference when using wi-fi; https://www.powervision.me/en/product/powerdolphin/specs mentions using wi-fi to connect to a mobile device on the shore, so assume there is no interference)
** Ultrasound (will it interfere with the fish? They are bothered more by low-frequency sounds than high-frequency sounds, but is it completely safe? We cannot quickly find sources that say it is no problem at all, but a lot of researchers have suggested using ultrasound to research coral reefs, so it is likely fine. We should also consider whether it is a reliable way of doing things, since the sound waves could bounce off the corals, which would cause a lot of interference)
** The worst-case option is using a tether - but that could cause damage or be damaged
** We do not really have to worry about the water-air barrier; we can use a buoy with the signal receiver underwater to avoid interference. https://www.oceantechnologysystems.com/store/ffm-buddy-phone-packages/interspiro-aga-mkii-ffm-buddy-phone-package/ is an example of ultrasonic underwater communication between divers
* Options for the prototype: we can just use a cheap wi-fi module or ultrasound. It can be quite weak, since we will be prototyping it in clear, shallow water.

=== Components ===
* Building an Arduino-powered underwater ROV - could be useful to see the software and component selection process
* Arduino-Based Submersible Robot Maps the Threatened Coral Reefs
* Underwater robot control system based on Arduino platform and robot vision module
* Motors
** Waterproof servos
* Camera
** Arduino-designed vs external camera
* Arduino based
* Wireless communication module (Wi-Fi?)
** Arduino and HC-12 Long Range Wireless Communication Module
* Power source
* Weights?
** We need to calculate the density it needs to navigate based on the motor
* Sensors
** Location
** Pressure
** Gyro/accelerometer

== Robot Design Requirements ==
From our literature research and our user interview, we can conclude a number of important factors to keep in mind when looking at a robot for coral reef researchers. A brief summary of those points:
* The robot's movement should not damage the reefs or the fish around the reefs.
** This means that obstacle avoidance, or some other way of preventing damage to the reefs in case of collision, is required. According to dr. Meesters, obstacle avoidance is the way to go here (a minimal sketch of such a safety stop is given below this list).
** The method of movement also should not damage the reefs; for instance, there should be no danger of fish or parts of the corals getting stuck in the thrusters, if thrusters are used. While we expected this to be important, our interview told us it was not a major concern, mainly since the robot will likely not be getting that close to the corals.
** The movement of the robot should not cause permanent changes to the environment of the reef. If there are no collisions, this is less of a major concern. Kicking up sand when taking off is a problem to consider, but this is less important for the reef environment and more for the quality of the photos and the robot not getting damaged.
* The materials that make up the robot should not be damaging to the coral reefs, even if the robot gets damaged itself.
** Some way of locating and retrieving the robot in case of failure will be needed, to prevent the robot from getting lost and becoming waste in the ocean.
** Batteries should be chosen so as to minimize damage in case of leakage, and they should also be safe to transport and charge.
** Chemicals or materials that might damage the reefs should not be used.
* The robot's physical design should suit the coral reefs.
** Fish should not see it as prey or predators, to avoid the robot being damaged by fish or causing too great a disruption.
** Depending on how close the robot moves to the corals, it should not have parts that stick out and might get caught in the corals. Based on our interview there will not be a major need for the robot to move that close to the corals, but a sleek design will still be useful for movement.
* The robot should suit the needs of the researchers (our user group) in use.
** The main task the robot will be assisting with is taking pictures for research into the state and development of the reefs. This means at least one good camera is needed, and the pictures should be taken in such a way that the information is consistent and useful. This means the camera should take all pictures at the same angle and should include information like the scale of the picture and the location the picture was taken at.
** The user interface and control methods should be adapted to the needs of the user; simplicity is key here. The robot should not be too confusing or difficult to control, and more nuanced control might actually make it harder to get consistent results.
** Specific attention should be paid to how the needs of coral reef researchers differ from those of hobbyists or general divers, since our research into existing robots will include robots aimed at different user bases.
* The researchers should want to buy this robot.
** Besides the usefulness of the robot, this involves researching what cost would be acceptable. The main point here seems to be that it is important to consider what features might make a university, research institute or organisation willing to pay for such a product; for instance, the robot would be more likely to be bought if it could be applied in multiple different ways.
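The obstacle-avoidance requirement above can be made concrete with a very small safety-stop rule: if a range sensor in some direction reports a distance below a threshold, any commanded motion in that direction is blocked. This is our own minimal sketch, not a design from the interview or the literature; the sensor layout and the threshold are hypothetical.

<syntaxhighlight lang="python">
# Minimal safety-stop sketch (hypothetical sensor directions and threshold).
MIN_CLEARANCE_M = 1.0  # block motion towards anything closer than this

def filter_thrust_command(command, distances):
    """Zero out any commanded motion towards an obstacle that is too close.
    `command` maps direction -> thrust (0..1); `distances` maps direction -> metres (None = no reading)."""
    safe = dict(command)
    for direction, dist in distances.items():
        if dist is not None and dist < MIN_CLEARANCE_M and safe.get(direction, 0) > 0:
            safe[direction] = 0.0  # block motion towards the nearby obstacle
    return safe

# Example: the pilot pushes forward and down, but the reef is only 0.6 m below the robot.
command = {"forward": 0.8, "down": 0.5, "up": 0.0}
distances = {"forward": 4.2, "down": 0.6, "up": None}
print(filter_thrust_command(command, distances))  # the 'down' command is zeroed out
</syntaxhighlight>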


== Research into Current Underwater Drones ==
We are not the first people to think of using robots in underwater environments or for research, so it is worth considering what already exists on the market. This gives us insight into the standards for underwater robots and what might still be missing for our specific user group.
=== Scubo / Scubolino ===
[[File:Scubo2.png|thumb|right|350 px|Scubo 2.0. Image retrieved from https://tethys-robotics.ch/index.php/robots/]]


Scubo is an ROV built by Tethys Robotics, a robotics team from ETH Zurich [https://tethys-robotics.ch/index.php/robots/ (Tethys, 2019)]. Scubo's defining features are its omnidirectional movement and its modularity. Scubo uses 8 propellers that protrude from its main body, allowing the tethered robot to move with extreme agility underwater. The robot is made out of a carbon cuboid, which features a hole through the middle for better water flow and cooling of the electronic components. It is constructed to be neutrally buoyant, so depth is controlled naturally through the 8 propellers. On its body, there are 5 universal ports for modularity. The robot's tether provides power for the onboard batteries, as well as allowing direct control from a computer outside the water. It is controlled with a SpaceMouse joystick. For processing, Tethys say they use an Arduino Due for the hard real-time tasks and an Intel NUC for high-performance calculations [https://blog.arduino.cc/2016/06/13/scubo-is-an-omnidirectional-robot-for-underwater-exploration/ (Arduino Team, 2016)].


=== LarvalBot / RangerBot ===
===== Descriptions =====
 
[[File:RangerBot.jpg|thumb|right|350 px|The RangerBot robot swimming above the a coral reef. Image retrieved from https://good-design.org/projects/rangerbot/]]
 
Initially, the Queensland University of Technology (QUT) designed the COTSbot to deal with the dangerous crown-of-thorns starfish that threatens coral reefs. This design was improved considerably (most importantly, it was reduced in size and cost) to create the RangerBot, which has the same purpose as the COTSbot. The design was then adapted to create the LarvalBot, which is very similar to the RangerBot, but instead of killing starfish it is used to deliver coral larvae to promote the growth of new coral. All three robots were designed specifically for coral reefs. Most sources on the robots focus on their video analysis capabilities, so that seems to be where the innovation of this robot lies, not the coral reef application.
The RangerBot has some extra features as well, such as water quality sensors and the ability to collect water samples [https://www.smithsonianmag.com/innovation/sea-star-murdering-robotsa-are-deployed-in-great-barrier-reef-180970177/ (Braun, 2018)].


===== Movement =====
Both RangerBot and LarvalBot have 6 thrusters [https://doi.org/10.1115/1.2018-OCT-2 (Zeldovich, 2018)] that allow full six degrees-of-freedom control, including hover capabilities [https://www.semanticscholar.org/paper/Real-time-Vision-only-Perception-for-Robotic-Coral-Dunbabin-Dayoub/6bae34be3181362f6ebec71815fc2d6c39c8f305 (Dunbabin et al., 2019)].
Both are controlled with an app that, according to the creators, takes just 15 minutes to learn [https://www.qut.edu.au/research/article?id=135108 (Dunbabin, August 2018)].
The LarvalBot follows only preselected paths at a constant altitude, with the release of the larvae being controlled by user input [https://www.qut.edu.au/news?id=137688 (Dunbabin, November 2018)]. It is unclear whether the RangerBot also follows a pre-selected path, but this is likely, since it operates fully automatically to dispatch the crown-of-thorns starfish and there is no mention of automatic pathfinding in any of the reports.


===== Collision Detection =====
RangerBot and LarvalBot have 2 stereo cameras [https://doi.org/10.1115/1.2018-OCT-2 (Zeldovich, 2018)]. The RangerBot uses these for obstacle avoidance amongst other functions; it is not specified whether the LarvalBot has obstacle avoidance [https://www.qut.edu.au/research/article?id=135108 (Dunbabin, August 2018)]. RangerBot uses video for obstacle avoidance because the usual method, sonar, does not work in coral reef environments. Underwater use of sonar and ultrasound is difficult either way, but the degree of complexity in coral reefs makes it unmanageable. Video is usually not used for this purpose because deep in the water it quickly gets dark and murky. However, coral reefs are quite close to the surface (or at least, the coral reefs we are concerned with are) and they lie in clear water, so this environment is uniquely suited for cameras as a sensor [https://www.semanticscholar.org/paper/Real-time-Vision-only-Perception-for-Robotic-Coral-Dunbabin-Dayoub/6bae34be3181362f6ebec71815fc2d6c39c8f305 (Dunbabin et al., 2019)].
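To make the stereo-based approach concrete, below is a minimal sketch (not the RangerBot's actual software) of how a distance estimate can be obtained from stereo disparity with the standard pinhole relation Z = f·B/d and used to flag anything closer than a stop distance. The focal length, baseline and threshold values are placeholders.

<syntaxhighlight lang="python">
# Minimal sketch of stereo-based obstacle flagging (illustrative only,
# not the RangerBot's actual implementation).

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Classic pinhole-stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # no reliable match, treat as far away
    return focal_px * baseline_m / disparity_px

def obstacle_ahead(disparities_px, focal_px=800.0, baseline_m=0.12, stop_distance_m=1.0) -> bool:
    """Flag an obstacle if any matched feature is closer than stop_distance_m."""
    return any(
        depth_from_disparity(d, focal_px, baseline_m) < stop_distance_m
        for d in disparities_px
    )

# Example: three matched features at 5, 40 and 120 pixels of disparity
print(obstacle_ahead([5.0, 40.0, 120.0]))  # True: 120 px is about 0.8 m away here
</syntaxhighlight>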
===== Design, Materials, and Shape =====
RangerBot weighs 15 kg and is 75 cm long [https://www.qut.edu.au/research/article?id=135108 (Dunbabin, August 2018)]; the LarvalBot will be bigger because it carries the larvae on board, but the base robot is the same size. The RangerBot floats above the reefs and can only reach the easy-to-reach crown-of-thorns starfish [https://doi.org/10.1115/1.2018-OCT-2 (Zeldovich, 2018)], which is why it can have a slightly bulky shape and a handle at the top without bumping into the reefs being a concern.
There is unfortunately no information available online on what materials the robot was made out of.
===== Power Source =====
RangerBot and LarvalBot can both be untethered. However, the current version of LarvalBot is still tethered [https://www.barrierreef.org/news/news/Robot%20makes%20world-first%20baby%20coral%20delivery%20to%20Great%20Barrier%20Reef (Dunbabin, December 2018)]. It is not mentioned in the report why, but it can be assumed this is because reliably real-time video is needed to tell the LarvalBot to drop the larvae at the right time. Without a tether, the robot can last 8 hours, and it has rechargeable batteries [https://www.youtube.com/watch?v=7zjKTvj0lB4 (Queensland University of Technology, 2018)].




=== OpenROV v2.8 ===
[[File:openROV.jpg|thumb|right|350 px|The OpenROV v2.8. Image retrieved from https://openrov.dozuki.com/c/OpenROV_v2.8_%28Kit_Assembly%29]]
The purpose of the OpenROV robot kit is to bring an affordable underwater drone to the market. It costs about 800 euro to buy the kit and 1300 euro to buy the assembled robot (OpenROV, n.d.). The user is challenged to correctly assemble the robot, a fun and rewarding experience. Once the robot is made, the user is free to decide where and how to use the robot. The most common use is exploring the underwater world and making underwater videos.
=====Movement=====


The robot is remotely controlled and has a thin tether. Via a software program, which is provided and can be installed on a computer, the user is able to control the robot. The camera on the robot provides live feedback on the robot's surroundings and the way it reacts to direction changes.
Other crucial parts are a Genius WideCam F100 camera, a BeagleBone Black processor, an Arduino, a tether, lithium batteries and extra weights.


===== Design, Material, and Shape =====
All the components are held in place by a frame and stored in waterproof casings. The frame and multiple casings are made from laser-cut acrylic parts. The acrylic parts are glued together with either acrylic cement or super glue. Where wires are soldered together, heat shrink is used to ensure a waterproof connection. Epoxy is used to fill up areas between wires to prevent leakage.


The design of the robot does not look very streamlined, but it moves surprisingly smoothly. A drawback of the design is that the lower part, which contains the batteries, is always visible on the camera feed. This can be annoying when watching the video.


The robot is relatively small, comparable in size to a toaster. This is ideal for moving through places such as shipwrecks and coral reefs.


===== Power Source =====
The robot uses 26650 (26.5 mm × 65.4 mm) lithium batteries. The batteries need to be charged beforehand. The guide warns the user that the batteries can be dangerous and must be charged at the correct voltage, which is 3 V.


=== Aqua2 ===
 
[[File:aqua2.jpg|thumb|right|350 px|The Aqua2. Image retrieved from https://auvac.org/configurations/view/179]]
 
The Aqua2 robot is designed to assist human divers. It is currently employed for monitoring underwater environments and provides insights regarding robotic research and propulsion methods. The robot can go 30 meters deep and has a maximum speed of 1 m/s (AUVAC, n.d.).
 
The robot can operate autonomously or be controlled via a tablet. The tablet is waterproof, so that the divers can control the robot while being underwater. When the user tilts the tablet in a certain direction, the robot moves in the same direction (De Lange, 2010).
 
Aqua2 has six flippers which move independently. The fins move slowly, which causes little disturbance in the robot's surroundings. The robot can also be employed on land, where it walks at a speed of 0.7 m/s.
 
The rectangular body of the Aqua2 robot is made from aluminum. The flippers are made from vinyl and have steel springs inside. These materials are resistant to seawater.
 
The robot is powered by two lithium batteries. The batteries have a voltage of 28.8 V and a capacity of 7.2 Ah. After 5 hours, the batteries must be recharged, which takes 8 hours.
 
It is equipped with image collection and processing software. This allows the robot to operate autonomously. The image processing is made available through the ROS development environment and the OpenCV vision library.
 
=== Ocean One ===
 
[[File:ocean1.jpeg|thumb|right|350 px|The Ocean One. Image retrieved from https://spectrum.ieee.org/automaton/robotics/humanoids/stanford-humanoid-submarine-robot]]
 
The robot is designed for research purposes. Due to its design, the robot can take over tasks from human divers such as exploring archaeological sites or collecting a specimen. Moreover, it can access places where human divers cannot go, such as places below 50 meters deep.
 
The Ocean One has a high level of autonomy and is controlled via an intuitive, haptic interface. The interface “provides visual and haptic feedback together with a user command center (UCC) that displays data from other sensors and sources” (Khatib et al., 2016, p.21). There are two haptic devices (sigma.7), a 3D display, foot pedals, and a graphical UCC. A relay station, which is connected to the controller, allows the robot to function without a tether.
 
“The body is actuated by eight thrusters, four on each side of the body. Four thrusters control the yaw motion and planar translations, while four others control the vertical translation, pitch, and roll. This thruster redundancy allows full maneuverability in the event of a single thruster failure” (Khatib et al., 2016, p.21).
 
It has an anthropomorphic shape as it should have the same capabilities as human divers. The design of the robot’s hands allow “delicate handling of samples, artifacts, and other irregularly-shaped objects” (Khatib et al., 2016, p.21). “The lower body is designed for efficient underwater navigation, while the upper body is conceived in an anthropomorphic form that offers a transparent embodiment of the human’s interactions with the environment through the haptic-visual interface” (Khatib et al., 2016, p.21).
 
The relay station can be used as a nearby charging station.
 
In the paper, the authors seem to hint that the robot is equipped with a collision detection system. The hardware and software behind it are not explained.


=== M-AUE ===
===== Current Use =====
 
The M-AUEs were developed to answer questions about plankton and to study environmental processes in the ocean (Reisewitz, 2017).
[[File:maue.PNG|thumb|right|350 px|An exploded view of the M-AUE. Image retrieved from https://www.nature.com/articles/ncomms14189/figures/2]]
===== Movement =====
 
 
The idea of the M-AUEs is that they move just like plankton in the ocean: by adjusting their buoyancy (under program control) they go up and down while drifting with the current (Reisewitz, 2017). The author continues stating that the M-AUEs move vertically against the currents that are caused by the internal waves; they do this by repeatedly changing their buoyancy.


“The big engineering breakthroughs were to make the M-AUEs small, inexpensive, and able to be tracked continuously underwater,” said Jaffe (inventor of the M-AUEs) in Reisewitz (2017).
===== Control =====
 
The M-AUEs are designed not to go so deep that they come close to the seafloor. This means that, in case of being used for coral research, they would not maneuver through the coral but instead hover above it. The M-AUEs are preprogrammed with a PID control algorithm, meaning that they sense the depth they are currently at and adjust toward the desired depth.
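As a small illustration of that control idea, here is a minimal PID depth-control sketch; the gains, time step and the mapping of the output to a buoyancy command are placeholder assumptions of ours, not values from the M-AUE papers.

<syntaxhighlight lang="python">
# Minimal PID depth-control sketch (illustrative; gains and the actuator
# mapping are placeholders, not the M-AUE's actual parameters).

class DepthPID:
    def __init__(self, kp=1.0, ki=0.1, kd=0.5, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_depth_m: float, measured_depth_m: float) -> float:
        """Return a buoyancy command: positive = become heavier (descend),
        negative = become lighter (ascend)."""
        error = target_depth_m - measured_depth_m  # positive if we are too shallow
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = DepthPID()
print(controller.update(target_depth_m=10.0, measured_depth_m=8.5))  # positive: sink toward 10 m
</syntaxhighlight>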
===== Localizing =====
 
According to Reisewitz (2017), acoustic signals are used to keep track of the M-AUEs while submerged, since GPS does not work under water. Three-dimensional information on the location is received every 12 seconds, showing exactly where the M-AUEs are in the ocean. Augliere (2019) says that GPS-equipped moorings (which float on the surface of the ocean) are used to send out sonar pings every 12 seconds. These pings are then received by the hydrophones of the M-AUEs (which are located around 50 meters below the surface), giving data which researchers can use to localize them.
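The basic calculation behind this kind of localization can be sketched as follows: each ping's travel time gives a range to a GPS-tracked mooring, and several ranges are combined by least squares into a position. The mooring layout, the sound speed and the assumption that depth is known from the float's own pressure sensor are ours, purely for illustration.

<syntaxhighlight lang="python">
# Sketch of acoustic localization from ping travel times (illustrative only;
# mooring positions and sound speed are made-up numbers). Depth is assumed to be
# known from the float's own pressure sensor, so only (x, y) is solved for.
import numpy as np

SOUND_SPEED = 1500.0  # m/s, rough value for seawater

def locate_xy(moorings_xy, travel_times, depth_m):
    """Least-squares horizontal position from one-way ping travel times
    to GPS-tracked surface moorings."""
    slant = SOUND_SPEED * np.asarray(travel_times)            # slant ranges
    horiz = np.sqrt(np.maximum(slant**2 - depth_m**2, 0.0))   # horizontal ranges
    p = np.asarray(moorings_xy, float)
    p0, r0 = p[0], horiz[0]
    A = 2 * (p[1:] - p0)
    b = r0**2 - horiz[1:]**2 + np.sum(p[1:]**2, axis=1) - np.dot(p0, p0)
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy

# Four surface moorings and the travel times for a float at (200, 300), 50 m deep
moorings = [(0, 0), (500, 0), (0, 500), (500, 500)]
true_xy, depth = np.array([200.0, 300.0]), 50.0
times = [np.hypot(np.linalg.norm(true_xy - m), depth) / SOUND_SPEED
         for m in np.asarray(moorings, float)]
print(locate_xy(moorings, times, depth))  # ~ [200. 300.]
</syntaxhighlight>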
===== Battery =====
 
As stated by Augliere (2019), the M-AUEs have batteries with enough power to last for several days, and data storage that can last just as long. “The system is powered by a battery pack made from 6 Tenergy 1.2 V, 5000 mA hour NiMH cells in a series configuration. The battery pack is fused with a 2 A fuse and is recharged at an optimal rate of 600 mA” (Jaffe et al., 2017).
===== Design and Material =====
 
The M-AUEs are about the size of a grapefruit. Each consists of two concentric shells made out of syntactic foam (Jaffe et al., 2017) which slide over each other. Inside these shells the batteries, a piston and an antenna are positioned, and on the bottom, on the outside of the shell, multiple sensors are placed.
===== Possible Future Work =====
 
Reisewitz (2017) states that adding a camera to the M-AUEs would provide images of the state of the ocean, making it possible to map the coral habitats and the movement of larvae, or to do further research on the creatures in the ocean such as plankton.
Enhancing the robot with audio recording devices (hydrophones) would let the M-AUEs function as an ‘ear’, able to keep track of the sounds in the ocean.
=== JenniFish ===
[[File:Robotjellyfish.jpg|thumb|right|350 px|The JenniFish and its jellyfish-inspired movement. Image retrieved from Florida Atlantic University]]
Designed at Florida Atlantic University [https://www.fau.edu/research/magazine/2019/01/dor-robo-jelly-fish.php (Florida Atlantic University, 2019)], this robot is based on the way jellyfish move through the water, offering an alternative to the standard underwater robot design of a hard body propelled forward by thrusters. Coral reefs are amongst the environments that this robot is intended to be used in. This design is not well suited to our needs, as its erratic movement makes it unsuitable for taking pictures [https://www.bbc.com/news/technology-45574309 (Turner, 2018)]. It is also too small to really have the variability we want for a modular design. However, the robot does present an interesting alternative way of movement, and gives some insight into what kinds of soft materials might be used near the coral reefs, should we come to the conclusion that we want some kind of shock-absorbing material.
The JenniFish is slightly negatively buoyant, but can propel itself upwards by filling its "tentacles" with water and pushing this water out again. These tentacles are soft, which allows the JenniFish to squeeze through narrow spaces. Various plastics, including polyester, silicone and plexiglass, were used for this robot.
Two 6V submersible impeller pumps were used to control the absorption and ejection of water. A 9V primary cell alkaline battery was used to power the system [https://fau.digital.flvc.org/islandora/object/fau%3A33675 (Frame, 2016)], though it should be noted that this description comes from an early prototype design and the choice of battery was largely motivated by price and availability; this might not be the best choice for the final product.
== Research Conclusions ==
From our research into user needs, coral reef needs, underwater robotics in general and current underwater robots already used in practice, we can draw conclusions on what a good robot for coral reef researchers would need to look like.
=== Movement ===
Through interviewing the researchers, we gained some insights into the range and methods of movement the robot will need. First and foremost, the robot should have 2 operation modes: tethered live control (teleoperation) and tetherless pre-programmed routes. These 2 modes ensure that the robot can be used comfortably and for a wide range of uses; a robot that can be used for many different tasks is, after all, a more cost-effective and sustainable solution. The teleoperated mode would ideally be used to explore new sectors and find new sites to scan underwater. The tetherless operation is intended for the scanning and photographing of known sites, which eliminates the need for operation by an expert in order to study certain sites.

The researchers we interviewed were asked about biomimicry: robots that are inspired by nature for their design and operation, such as the Aqua2 and the JenniFish. Our conclusion is that biomimicry is not the answer for most types of research done on reefs, because most forms of movement that these robots use are not nearly as stable and precise as a thruster system can be. The way underwater animals move is inherently different from what this robot needs to accomplish: providing stability for underwater pictures and precise positioning for measurements. A multi-propeller system, such as that of Scubo 2.0, is therefore much preferred, due to its agility and ability to move fluently in all directions. Furthermore, omnidirectional thrusters can be used to stabilize the robot while taking photographs or scans underwater, and steady and clear scans and photographs are, according to the researchers, one of the single most important features this robot can have. The thrusters must be powerful enough to counter some strong currents (up to 0.5 knots) in order to remain stable in the water.
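To illustrate how an omnidirectional thruster layout is typically commanded, the sketch below maps a desired body force and torque to individual thruster outputs through an allocation matrix and its pseudoinverse. The four-thruster planar layout and all numbers are a toy example, not the configuration of Scubo or any other robot discussed above.

<syntaxhighlight lang="python">
# Toy thruster-allocation sketch for a planar, omnidirectional layout
# (illustrative only; not the configuration of any robot discussed above).
import numpy as np

# Columns = thrusters, rows = (forward force, sideways force, yaw torque).
# The entries describe how much each thruster contributes to each quantity.
c = np.cos(np.pi / 4)
ALLOCATION = np.array([
    [  c,    c,    c,    c ],   # contribution to forward force
    [  c,   -c,   -c,    c ],   # contribution to sideways force
    [ 0.2, -0.2,  0.2, -0.2],   # contribution to yaw torque (0.2 m lever arm)
])

def thruster_commands(surge_n, sway_n, yaw_nm):
    """Least-squares mapping from a desired body wrench to thruster forces."""
    wrench = np.array([surge_n, sway_n, yaw_nm])
    return np.linalg.pinv(ALLOCATION) @ wrench

# Hold position against a sideways current: pure sideways force, no rotation
print(thruster_commands(0.0, 5.0, 0.0))
</syntaxhighlight>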
For controlling upwards and downwards movement, the method preferred by dr. Meesters and dr. Bos was to have the downward movement done by thrusters, while the upward movement is facilitated by building the robot to be slightly buoyant, so that it floats towards the surface when the thrusters are turned off. This affects the materials of the robot. This method means that the robot can float away from the ocean floor without kicking up sand, which could damage the thrusters and degrade the images taken.
=== UI and Control ===
[[File:Sigma7.jpg|thumb|right|350 px|The Sigma7 haptic feedback control device. Image retrieved from https://www.forcedimension.com/products/sigma-7/gallery/55]]
[[File:User_interface.jpg|350px|thumb|right|An exploration of a preferred user interface]]
The user interface (UI) and control of an ROV are crucial: without an easy method to send instructions to the robot, the operator cannot efficiently carry out the required task, which in research use cases could mean bad data collection and wasted time. Currently, most robots use a few different kinds of input methods to control the robot via a computer interface. Most amateur-oriented robots use on-screen controls on a tablet or smartphone, while most professionally oriented robots use either standard gamepads, joysticks, or more specialised systems for more unique robot operation (such as the sigma.7 haptic controller and foot pedals used to control the Ocean One robot). The use of a standard joystick or gamepad seems to be the most common control input method, which is likely due to these hardware devices being widely available, easy to operate, and very cost-effective (a standard Xbox One controller costs 60 euros (Xbox)). On-screen controls seem to be missing from most serious professional ROVs. The user interfaces of most of these robots seem to follow a general feature list, while the organization on screen varies slightly from system to system. Features that can be found on the user interface of these robots include: a live camera feed, speedometer, depth meter, signal strength, battery life, orientation of the robot, compass, current robot location, and other less relevant features on some systems.
The system used by the researchers we interviewed uses an Xbox One controller to navigate the robot; they had little to say on this matter, as neither of them is an avid gamer or has used this robot yet. This leads us to believe that the specific control input method is less relevant in a general sense; the key point is that it should be standardized in such a way that any user can plug in their controller of choice. This ensures not only that users are using their preferred hardware, but also that they can use existing hardware and avoid having to spend more resources on buying another item. Each person has their preferred method to control the robot (which is made increasingly difficult when operating from a moving vessel on the water) and will therefore be more comfortable in controlling it.
Furthermore, we discussed the case of the UI with the researchers, and the conclusion was that the less busy the display is, the easier the system is to use. The most important features, we concluded, are the depth meter, the compass and orientation display, and the current location of the robot. According to the researchers, the most important thing when controlling a robot underwater is knowing where you are at all times (this is why underwater location systems are so complex, expensive, and crucial to these operations). Finally, there should be some way of indicating the scale of the picture, i.e. how far away from the corals it was taken, so that their size can be assessed.
In the movement section, 2 operation methods were discussed: tethered live control and tetherless pre-programmed operation. For the tetherless operation, the UI can be just as crucial, for two distinct reasons. The first is that you want to be able to easily program the path and instructions you want the robot to follow. Most marine researchers will not also be programmers, so the software to program the paths needs to be very intuitive, with the most important features being the ability to define the area of operation and the type and size of photographs (or other types of readings) the robot will take. The second use of this software would be similar to the tethered operation, but without the live control: it would be used for monitoring purposes and ideally has the exact same features as the tethered operation setting, with the small addition of a progress meter to indicate how the route is progressing.
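As a rough illustration of what "programming the path and instructions" could look like from the researcher's side, the sketch below lists the kind of parameters such mission-planning software might expose; the field names and default values are our own invention, not an existing interface.

<syntaxhighlight lang="python">
# Sketch of the parameters a pre-programmed survey mission could expose to the
# researcher (field names and defaults are invented for illustration).
from dataclasses import dataclass
from math import dist
from typing import List, Tuple

@dataclass
class SurveyMission:
    waypoints: List[Tuple[float, float]]   # (x, y) in metres relative to the survey start
    altitude_m: float = 1.5                # constant height above the reef
    photo_interval_m: float = 2.0          # distance travelled between photos
    photo_footprint_m: float = 1.0         # approximate width covered per photo
    log_position: bool = True              # store location and scale with every image

    def estimated_photos(self) -> int:
        """Rough photo count from the total path length and the photo interval."""
        length = sum(dist(a, b) for a, b in zip(self.waypoints, self.waypoints[1:]))
        return int(length / self.photo_interval_m) + 1

mission = SurveyMission(waypoints=[(0, 0), (0, 50), (10, 50), (10, 0)])
print(mission.estimated_photos())  # 56 photos over a 110 m path at 2 m spacing
</syntaxhighlight>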
=== Materials and Shape ===
Before the user interview, we had assumed that the shape of the robot would be very important in letting it safely move near the reefs. After the interview, it became clear that the robot will not be getting that up close and personal with the corals, so the shape does not need to be adapted to this. Instead, the ease of movement and the option of putting various components, like extra cameras or sensors, on the robot is much more important. A simple, squarish shape like that of the Scubo or the OpenROV is good enough. However, when it comes to specific guidelines for robots for coral reef researchers, shape is not a major concern, so there is no strong recommendation to give here.
For the materials, the main points are that they should stand up to salt water and that the final robot should be naturally slightly buoyant, to allow it to float up slowly without needing to use its thrusters near the ocean floor. This limits the use of metal: stainless steel is quite resistant to salt water, but its density (about 7 times that of salt water) means that it cannot be used in large quantities. Plastics are therefore more viable for large parts of the robot; the specific type of plastic will depend on the hardness and flexibility requirements of each part, but this is no longer part of the guidelines specific to coral reef research robots.
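The buoyancy requirement can be checked with a simple mass budget: the robot's average density (total mass divided by displaced volume) should come out just below that of salt water. The component values below are placeholders, not a real parts list.

<syntaxhighlight lang="python">
# Sketch of the buoyancy check described above. All component values are
# placeholders, not a real parts list.
SALT_WATER_DENSITY = 1025.0  # kg/m^3, typical surface value

components = {             # name: (mass in kg, displaced volume in m^3)
    "acrylic hull (incl. sealed air volume)": (4.0, 0.0062),
    "stainless steel fasteners":              (0.5, 0.0000625),  # ~8000 kg/m^3, so used sparingly
    "batteries":                              (2.0, 0.0011),
    "thrusters and wiring":                   (1.5, 0.0007),
}

total_mass = sum(m for m, _ in components.values())
total_volume = sum(v for _, v in components.values())
average_density = total_mass / total_volume

print(f"average density: {average_density:.0f} kg/m^3")
print("slightly buoyant" if average_density < SALT_WATER_DENSITY else "add flotation")
</syntaxhighlight>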
In general, the shape and materials of the robot turned out not to need much change for the coral reef application, since the safety of the corals will largely be covered by the obstacle avoidance systems. Some of the more unique designs, like the JenniFish and the Ocean One, are very interesting but do not offer any benefits for the functions that our users want.
===Power Source===
In the robots which were researched, power was supplied either by batteries, by a tether, or by a combination of both. There are different kinds of tethers: some tethers only provide power, others transfer data and some do both (Kohanbash, 2016). The most commonly used batteries are lithium-based. As mentioned in the online guides from OpenROV, one should be cautious when using lithium batteries as they can be dangerous (OpenROV, n.d.). During the interview, the researchers mentioned that something once went wrong with the charging of a battery and caused a fire on a ship.
As described in the movement chapter, the robot we will be designing operates in two settings. Firstly, the user can program a path via software on a laptop, upload the path, and the robot will autonomously move along it. In this case, the batteries alone will suffice in providing power, and without a tether the robot can move closer to the reefs without the risk of getting entangled. In the second setting, the user controls the robot via a controller and real-time video feedback. In that case, a tether is needed for the data transfer. Power can still be supplied by the batteries only.
During the interview, the possibility of incorporating both settings was discussed. The researchers got very excited about this idea and the opportunities it provides for research. As mentioned in the movement chapter, having both functions would make the robot unique.
===Additional Features===
Collision detection is an important feature to keep the robot safe around the coral reefs. As described in the general underwater robots research section and shown by the RangerBot, obstacle avoidance for obstacles in the direction the robot is moving can be done with the cameras. However, even with a separate camera for movement control and one for taking the pictures, cameras still cannot cover all sides of the robot. Particularly in the tethered, exploration application of the robot this could be a problem. In the interview it was suggested that sonar could be used for simple obstacle detection when it is just used to avoid getting stuck directly under something, so sonar could be used for detecting obstacles directly above the robot.
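A minimal sketch of how such an upward-facing sonar check could gate vertical movement is given below; the sensor read function and the clearance threshold are placeholders, not an existing driver.

<syntaxhighlight lang="python">
# Sketch of the simple overhead check suggested above: block upward thrust
# when the sonar reports something too close overhead. The read_sonar_up()
# function and the threshold are placeholders, not a real driver.
MIN_OVERHEAD_CLEARANCE_M = 0.75

def read_sonar_up() -> float:
    """Placeholder for an upward-facing sonar range reading, in metres."""
    return 0.6  # pretend something is 0.6 m above the robot

def safe_vertical_command(requested_up_speed: float) -> float:
    """Allow downward or zero commands; block upward commands if something is close overhead."""
    if requested_up_speed > 0 and read_sonar_up() < MIN_OVERHEAD_CLEARANCE_M:
        return 0.0  # too little clearance above, do not ascend
    return requested_up_speed

print(safe_vertical_command(0.3))   # 0.0  -> ascent blocked
print(safe_vertical_command(-0.3))  # -0.3 -> descending is still allowed
</syntaxhighlight>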
The M-AUEs use a method for localization that is very similar to the one described in the general underwater robots research section. The difference is that Kussat et al. (2005, p.156) use transponders that are located on the seafloor, whereas the M-AUEs receive the acoustic signal with their built-in hydrophones and respond back to floating moorings. The moorings send out 8-15 kHz signals at a time interval of 2 seconds, localizing the M-AUEs with an accuracy of ± 0.5 km (Jaffe et al., 2017), whereas the method using transponders on the seafloor achieves an accuracy of ± 1 m (2-σ) for the horizontal positioning of the AUVs (Kussat et al., 2005, p. 156). Which method is best to use depends on the type of research for which a robot is going to be used. When, for example, globally analyzing the state of the coral reefs, it is less important to have the exact location of where images have been taken than when repetitive research in one area is being done, since in that scenario you would need to know exactly where each picture was taken.
A feature that is not mentioned for any of the robots, and that also did not come up in the general research into underwater robots, is a sensor that can measure the topographic complexity of the coral reefs. In our interview, it became clear that this is a much desired feature. Because the importance of this feature was only discovered later on in the project, we could not research it in detail, and the implementation of this sensor is something that would need to be investigated further. The chain-and-tape method that dr. Meesters described in the interview (where a chain is laid over the corals so that it follows the grooves between them, and the total distance is then measured with a tape) would likely not work for the basic robot we have outlined so far, since it would need ways of manipulating the chain, marking where the measurement starts and ends, and retrieving the chain in such a way that the measurement can be completed. While this might be feasible as a module added onto the robot, it could not be part of the basic design. An alternative option is giving the robot a way to detect the highest and lowest points within an area, for which it would only need to be able to measure the distance to a single point; this could be accomplished with sonar or laser distance measurement techniques. A final option is to use a technique similar to the multibeam echosounder that ships use to map the topography of the ocean floor. The validity of these options, and how they are affected by having a robot that can move close to the reefs, would need to be investigated further.
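One way the chain-and-tape idea could be approximated from distance measurements is sketched below: sample the bottom profile along a transect and compare its contour length with the straight-line length, which gives the classic rugosity index used for topographic complexity. The sample spacing and depth values are made up for illustration.

<syntaxhighlight lang="python">
# Sketch of a digital "chain-and-tape" rugosity index: the ratio between the
# length of the measured bottom profile and the straight-line distance.
# The depth samples are made-up values, e.g. from a downward-facing sonar or
# laser ranger sampled every 0.1 m along a transect.
from math import hypot

def rugosity_index(depths_m, spacing_m=0.1) -> float:
    """Contour length / straight-line length; 1.0 means a perfectly flat bottom."""
    contour = sum(hypot(spacing_m, b - a) for a, b in zip(depths_m, depths_m[1:]))
    straight = spacing_m * (len(depths_m) - 1)
    return contour / straight

flat_profile   = [5.0, 5.0, 5.0, 5.0, 5.0]
rugged_profile = [5.0, 5.3, 4.9, 5.4, 5.0]
print(rugosity_index(flat_profile))    # 1.0
print(rugosity_index(rugged_profile))  # > 1.0: more topographic complexity
</syntaxhighlight>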
=== Overview Table ===
[[File:16.jpg|550 px|]]
== Final Results ==
=== Guidelines ===
The main result of our project is an understanding of what coral reef researchers need out of a robot to help in their research, and of where those needs are not yet being met by the robots currently on the market. The results of the work we have done here could be used by coral researchers themselves, who want to know where the robots they might invest in might be lacking, but they could also be used by the people developing these robots to see what features and attributes are worth investing in to make such a robot more appealing and useful for users. As this is very much a project that was built on collecting and organising information, not all the useful insights we have written down can necessarily be translated to concrete development guidelines, but we can summarise the main takeaways in this form.
# For the robot to be able to move around a fragile environment like the coral reefs, it must have reliable obstacle avoidance methods built in. It is worth considering the use of cameras and visual obstacle detection methods for this, since the coral reefs are uniquely suited to this method of obstacle avoidance.
# If the robot uses thrusters to move, it should not kick up sand when moving up and away from the ocean floor. One way of avoiding this is by making the robot slightly naturally buoyant, which will allow it to float up when the thrusters are turned off. Therefore, the average density of the robot should be slightly below the density of salt water, and the materials should also be resistant to salt water corrosion.
# The main application the robot will likely be used for is taking pictures of coral reefs that are being studied to track their development over time. A camera is therefore needed, as well as lights to make sure the pictures are of high quality. The pictures must be consistent and useful, so the angle of the camera should not change, and the scale that the pictures are taken at must be recorded, as well as the precise location where they were taken. Ideally, there would be some system for tracking the height of the corals as well, so that the topographic complexity can be studied.
# If the robot is only going to be moving along known sites of study, it does not need to have live control functionality. It can follow pre-programmed paths with a few set parameters, like how often it should take a picture and how big the area covered in each picture should be. This method of control is preferred to real-time user control in this application.
# However, the option of real-time user control is a very desirable additional feature. The robot will need to be tethered in this case so that live video footage is possible, and users will likely want more detailed environmental information, like simple sonar-based obstacle detection above the robot. This would allow the robot to be used to explore unknown environments.
# When the robot is being used for research, it is important to know the robot's location so it is known where the images have been taken. Since GPS does not work underwater, acoustic signals should be used instead to determine where the robot is.
# For ease of use, particularly while the user is on a boat, the interface should be as clean and simple as possible. There should be a live camera feed if the robot is being controlled live, as well as depth meters, the compass and orientation display, and the current location of the robot.
# The robot will be bought by universities and organizations, not individuals. To make the robot more appealing to these buyers, it is better to give it some basic functionalities and the ability to add modular components to give it extra applications, rather than build it from the ground up for just a single application.
===Infographic===
[[File:infog5.png|550 px|]]
== Discussion ==
As there are quite a few underwater robots already out there, a robot built according to our guidelines will be one that is suitable for research, and more specifically for research on coral reefs. Getting in touch with two experts during this project has allowed us to take the perspective of the user and understand what needs to be looked at when designing such an underwater robot. Since there is only so much you can find online on the functionalities of products, collecting this information by interviewing the users was greatly needed. For us to gather as much valuable information as possible from the interview, it was important to do a lot of research beforehand: by knowing what already exists and by being able to envision ourselves interacting with the robots, higher quality questions could be asked. This preparatory work, however, was somewhat underestimated during this project. As this project only has a duration of 8 weeks, and as time was spent at the start finding a concrete research question, the interview ended up taking place a bit later in the process than initially hoped, and we were only able to interview two researchers, which limits the generality of our observations. This, and the sudden need to change the form of our research due to COVID-19, took away the opportunity to realize our findings in a working robot, which had been our original intention to validate our design; a lot of early effort had gone into planning how we would approach building this physical prototype. However, as a result, the research that was eventually conducted has led to much more thorough guidelines for how such a robot should be made. These guidelines and their corresponding research can function as a basis for future work.
== Process ==
Since the project changed direction a few times, here is an overview of the process of this project. Due to shifting priorities over the course of our project, not everything we did ended up being relevant to our final research.
===Phase 1===
* Discussed our strengths, weaknesses and interests in order to find out what we want to do during the project
* Decided that we wanted to focus on coral reefs, a topic with real global importance and that would hopefully be specific enough that we could focus on good design for a specific environment
* Wanted to make (parts of) a physical robot
===Phase 2===
* Did a lot of research into coral reefs, what threatens them and how they are being helped. Some of this research ended up in the problem background and general coral reef information sections, but most was never developed in enough detail to be added to the wiki. It was an exploration of the topic
* Came across the topic of acoustic enrichment; a lot of research was done into how acoustic enrichment works exactly and how we could use a robot to assist with it
* Decided to make a robot which contains a speaker
[[File:IMG-20200304-WA0007.jpg|250 px|Robot which contains a speaker.]]
* Specified the parts of the robot we would be working on: movement and motor system, design and user interface
[[File:WhatsApp Image 2020-02-13 at 16.25.46.jpeg|250 px|Robot which contains a speaker.]]
* Made sketches to iterate on the shape and design
[[File:shape.jpg|250 px|Shape sketches.]]
* Did research on user interfaces and controllers
* Built a small physical prototype of the way the movement of the robot could be controlled
[[File:WhatsApp_Image_2020-03-26_at_16.30.40.jpeg|250 px|Physical prototype of the movement control.]]
===Phase 3===
This phase is elaborated, discussed and explained on this wiki.
* The acoustic enrichment focus was let go when it became clear that work in that area would mean we would have to focus more on collecting and analyzing audio databases, which is not what interested us or where our specialties lie. Instead, we decided to focus on making a robot suitable to the coral reefs, with the application still being somewhat open. The researcher can add modules according to their needs
* Decided to step away from actually building the robot and to focus more on making guidelines for what the robot should be able to do and which components it needs. This decision was taken due to the narrow time frame (ordering components and waiting for the delivery would take up about a week). Moreover, coming together to work on the prototype was no longer possible due to the COVID-19 virus and restrictions from the TU/e and the government
* Did a lot of research on current robots. Individual robots were studied based on their valuable aspects.
* Research into collision detection and localization was done, since there was a need to really understand how these work before making conclusions about these areas.
* Did user research. An interview with researchers was done to get user needs.
* Finally, combined the research into general underwater robots and coral reefs from earlier in the project with the information we had received from the interview and research into existing underwater robots to make guidelines for robots useful for coral reef researchers


== References ==


*[https://blog.arduino.cc/2016/06/13/scubo-is-an-omnidirectional-robot-for-underwater-exploration/ Arduino Team. (2016, June 13). Scubo is an omnidirectional robot for underwater exploration. Retrieved from https://blog.arduino.cc/2016/06/13/scubo-is-an-omnidirectional-robot-for-underwater-exploration/]
 
*[https://www.sciencedirect.com/science/article/abs/pii/S0025326X13003020 Ateweberhan, M., Feary, D. A., Keshavmurthy, S., Chen, A., Schleyer, M. H., & Sheppard, C. R. C. (2013). "Climate change impacts on coral reefs: Synergies with local effects, possibilities for acclimation, and management implications", ''Marine Pollution Bulletin'', volume 74, issue 2, pages 526-539].
 
*[https://www.earthmagazine.org/article/eyes-sea-swarms-floating-robots-observe-oceans Augliere, B. (2019). "Eyes in the sea: Swarms of floating robots observe the oceans", ''Earth magazine''. Retrieved March 17, 2020.]
 
*[https://auvac.org/configurations/view/179 AUVAC. (n.d.). "AUV System Spec Sheet". Retrieved March 25, 2020]
 
*[https://doi.org/10.1016/j.biocon.2004.03.021 Barker, N. H. L., & Roberts, C. M. (2004). "Scuba diver behaviour and the management of diving impacts on coral reefs", ''Biological Conservation'', volume 120, issue 4, pages 481-489.]
 
*[https://www.smithsonianmag.com/innovation/sea-star-murdering-robotsa-are-deployed-in-great-barrier-reef-180970177/ Braun, A. (2018). "Sea-Star Murdering Robots Are Deployed in the Great Barrier Reef", ''Hakai Magazine''. Retrieved on March 20, 2020.]
 
*[https://www.pnas.org/content/105/42/16201 Burkepile, D. E., & Hay, M. E. (2008). "Herbivore species richness and feeding complementarity affect community structure and function on a coral reef", ''Proceedings of the National Academy of Sciences'', volume 105, issue 42, pages 16201-16206.]
 
*[https://www.qut.edu.au/research/article?id=135108 Dunbabin, M. (August 2018). "Robot reef protector sees a new way to check Great Barrier Reef health", ''Queensland University of Technology website''. Retrieved on March 20, 2020.]
 
*[https://www.qut.edu.au/news?id=137688 Dunbabin, M. (November 2018). "Reef RangerBot becomes ‘LarvalBot’ to spread coral babies", ''Queensland University of Technology website''. Retrieved on March 20, 2020.]
 
*[https://www.barrierreef.org/news/news/Robot%20makes%20world-first%20baby%20coral%20delivery%20to%20Great%20Barrier%20Reef Dunbabin, M. (December 2018). "Robot makes world-first baby coral delivery to Great Barrier Reef", ''Great Barrier Reef Foundation website''. Retrieved on March 20, 2020.]
 
*[https://www.semanticscholar.org/paper/Real-time-Vision-only-Perception-for-Robotic-Coral-Dunbabin-Dayoub/6bae34be3181362f6ebec71815fc2d6c39c8f305 Dunbabin, M., Dayoub, F., Lamont, R. & Martin, S. (2019). "Real-time Vision-only Perception for Robotic Coral Reef Monitoring and Management", ''International Conference on Robotics and Automation''.]
 
*[http://www.sciencedirect.com/science/article/pii/0160932782900047 Endean, R. (1982). "Crown-of-thorns starfish on the great barrier reef", ''Endeavour'', volume 6, issue 1, pages 10-14.]
 
*[https://pdfs.semanticscholar.org/7491/2a618da033b24796f48e88e71eaa00a9b57d.pdf Enger, P. S., Karlsen, H. E., Knudsen, F. R., & Sand, O. (1993). "Detection and reaction of fish to infrasound", ''ICES mar. Sei. Symp.'', volume 196, pages 108–112.]
 
*[https://doi.org/10.1038/ncomms4794 Ferrario, F., Beck, M. W., Storlazzi, C. D., Micheli, F., Shepard, C. C., & Airoldi, L. (2014). "The effectiveness of coral reefs for coastal hazard risk reduction and adaptation", ''Nature Communications'', volume 5, issue 1.]
 
*[https://www.fau.edu/research/magazine/2019/01/dor-robo-jelly-fish.php Florida Atlantic University (2019). "Bio-Inspired Robot at Home Under Water", ''Florida Atlantic University Website''. Retrieved in April 2020.]
 
*[https://fau.digital.flvc.org/islandora/object/fau%3A33675 Frame, J. (2016). "Self-Contained Soft Robotic Jellyfish with Water-Filled Bending Actuators and Positional Feedback Control", ''Florida Altlantic University Thesis paper''.]
 
*[https://www.nature.com/articles/s41467-019-13186-2?fbclid=IwAR2mivNWtUHZtsOXa8l0ng7rOnCyS5GmSLYoM_mWBC85rEuljJkmVvflcVg Gordon, T. A. C., Radford, A. N., Davidson, I. K., Barnes, K., Mccloskey, K., Nedelec, S. L., & Simpson, S. D. (2019). "Acoustic enrichment can enhance fish community development on degraded coral reef habitat", ''Nature Communications'', volume 10, issue 1.]
 
* Haereticus Environmental Laboratory (n.d.). "Protect Land + Sea Certification", ''Haereticus Website'', retrieved March 2020.
 
*Hawkins, J.P., & Roberts, C.M. (1997). "Estimating the carrying capacity of coral reefs for SCUBA diving", ''Proceedings of the 8th International Coral Reef Symposium'', volume 2, pages 1923–1926.
 
*[http://www.sciencedirect.com/science/article/pii/S0169534710001825 Hughes, T. P, Graham, N. A. J, Jackson, J. B. C., Mumby, P. J., & Steneck, R. S. (2010). "Rising to the challenge of sustaining coral reef resilience", ''Trends in Ecology & Evolution'', volume 25, issue 11, pages 633-642.]
 
*[https://doi.org/10.1038/ncomms14189 Jaffe, J. S., Franks, P. J. S., Roberts, P. L. D., Mirza, D., Schurgers, C., Kastner, R., & Boch, A. (2017). "A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics", ''Nature Communications'', volume 8, issue 1.]
 
*[https://doi.org/10.1109/joe.2004.835249 Kussat, N. H., Chadwell, C. D., & Zimmerman, R. (2005). "Absolute Positioning of an Autonomous Underwater Vehicle Using GPS and Acoustic Measurements", ''IEEE Journal of Oceanic Engineering'', volume 30, issue 1, pages 153–164.]
 
*[https://doi.org/10.1109/access.2016.2552538 Kaushal, H., & Kaddoum, G. (2016). "Underwater Optical Wireless Communication", ''IEEE Access'', volume 4, pages 1518–1547.]
 
*[https://ieeexplore.ieee.org/document/7742315 Khatib, O., Yeh, X., Brantner, G., Soe, B., Kim, B., Ganguly, S., Stuart, H., Wang, S., Cutkosky, M., Edsinger, A., Mullins, P., Barham, M., Voolstra, C. R., Salama, K. N., L'Hour, M., & Creuze, V.  (2016). "Ocean One: A Robotic Avatar for Oceanic Discovery", ''IEEE Robotics & Automation Magazine'', volume 23, issue 4, pages 20-29.]
 
*[http://robotsforroboticists.com/tethers/ Kohanbash, D. (2016, September 20). "Tether’s: Should Your Robot Have One?" Retrieved March 27, 2020]
 
*[https://science.sciencemag.org/content/359/6374/460 Lamb, J. B., Willis, B. L., Fiorenza, E. A., Couch, C. S., Howard, R., Rader, D. N., True, J. D., Kelly, L. A., Ahmad, A., Jompa, J., & Harvell, C. D. (2018) ."Plastic waste associated with disease on coral reefs", ''Science'', 26 Jan 2018, pages 460-462.]
 
*[https://asa.scitation.org/doi/abs/10.1121/1.2836780 Lammers, M. O., Brainard, R. E., & Wong, K. B. (2008). "An ecological acoustic recorder (EAR) for long-term monitoring of biological and anthropogenic sounds on coral reefs and other marine habitats", ''The Journal of the Acoustical Society of America'', volume 123, issue 3.]
 
*[https://www.newscientist.com/article/dn19467-robots-on-tv-aquapad-controls-robot-dive-buddy/ de Lange, C. (2010, September 21). "Robots on TV: “AquaPad” controls robot dive buddy". Retrieved March 20, 2020.]
 
*[https://doi.org/10.1016/j.cub.2007.02.054 Newton, K., Côté, I. M., Pilling, G. M., Jennings, S., & Dulvy, N. K. (2007). "Current and Future Sustainability of Island Coral Reef Fisheries", ''Current Biology'', volume 17, issue 7, pages 655-658.]
 
*[https://doi.org/10.1007/s00338-008-0426-z Nyström, M., Graham, N.A.J., Lokrantz, J., & Norström, A. V.(2008). "Capturing the cornerstones of coral reef resilience: linking theory to practice", ''Coral Reefs'', volume 27, issue 4, pages 795–809.]


*[https://openrov.dozuki.com/c/Building_from_a_kit OpenROV. (n.d.). "Assembling an OpenROV", ''OpenROV website''. Retrieved March 3, 2020.]


*[https://openrov.dozuki.com/Guide/Guide+0+-+Introduction/103?lang=en OpenROV. (n.d.). "Guide 0 - Introduction", "OpenROV website". Retrieved March 27, 2020.]


*[https://www.unep-wcmc.org/resources-and-data/pianc-dredging-and-port-construction-around-coral-reefs PIANC. (2010). "Dredging and Port Construction Around Coral Reefs (N°108)". Brussels: PIANC Secretariat General, UN Environment Programme World Conservation Monitoring Centre.]


*[https://www.plasticsoupfoundation.org/en/2018/01/plastic-is-making-coral-reefs-sick/ Plastic Soup Foundation (2018). "Plastic Is Making Coral Reefs Sick", ''Plastic Soup Foundation website''. Retrieved 19 February 2020.]


*[https://www.youtube.com/watch?v=7zjKTvj0lB4 Queensland University of Technology (2018). "RangerBot: The Robo Reef Protector", ''TheQUTube YouTube channel''. Retrieved on March 20, 2020]


*[https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4934316/ Qureshi, U., Shaikh, F., Aziz, Z., Shah, S., Sheikh, A., Felemban, E., & Qaisar, S. (2016). RF Path and Absorption Loss Estimation for Underwater Wireless Sensor Networks in Different Water Environments. Sensors, 16(6), 890. doi: 10.3390/s16060890]


*[https://ucsdnews.ucsd.edu/pressrelease/swarm_of_underwater_robots_mimics_ocean_life Reisewitz, A. (2017). "Swarm of Underwater Robots Mimics Ocean Life", ''UC San Diego News Center''. Retrieved March 16,  2020]


*[https://doi.org/10.1016/j.marpolbul.2007.03.022 Rios, L. M., Moore, C., Jones, P. R. (2007). "Persistent organic pollutants carried by synthetic polymers in the ocean environment", ''Marine Pollution Bulletin'', volume 54, issue 8, pages 1230-1237.]


*[https://scripps.ucsd.edu/projects/coralreefsystems/about-coral-reefs/biology-of-corals/ Scripps, Institution of Oceanography at the University of California San Diego. (n.d.). "Coral Reef Systems | About Coral Reefs", ''The Scripps Institution of Oceanography website''. Retrieved in February 2020]


*[https://www.unep-wcmc.org/resources-and-data/world-atlas-of-coral-reefs-2001 Spalding, M. D., Green, E. P., & Ravilious, C. (2001). "World Atlas of Coral Reefs (1st edition)", ''Berkeley, Los Angeles, London: University of California Press.'']


*[https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19980214915.pdf Spiess, F. N. (1997). "Program for Continued Development and Use of Ocean Acoustic/GPS Geodetic Techniques".]


*[https://tethys-robotics.ch/index.php/robots Tethys. (2019, September 30). Robots. Retrieved from https://tethys-robotics.ch/index.php/robots/]


*[https://www.bbc.com/news/technology-45574309 Turner, J. (2018). Interviewed by ''BBC News'' for the article "Jellyfish robots to watch over endangered coral reefs".]


* [https://www.mdpi.com/2073-4360/5/1/1 Webb, H.K., Arnott, J., Crawford, R.J., Ivanova, E.P. (2013). "Plastic Degradation and Its Environmental Implications with Special Reference to Poly(ethylene terephthalate)", ''Polymers'', volume 5, issue 1, pages 1-18.]


*[https://www.xbox.com/en-us/accessories/controllers/midnight-forces-2-special-edition Xbox. (n.d.). Xbox Wireless Controller – Midnight Forces II Special Edition: Xbox. Retrieved from https://www.xbox.com/en-us/accessories/controllers/midnight-forces-2-special-edition]


*[https://doi.org/10.1115/1.2018-OCT-2 Zeldovich, L. (October 2018). "The Starfish Terminator", ''Mechanical Engineering'', volume 140, issue 10, Pages 36-41.]


Project Goal

The Problem

Healthy vs dying corals. Image retrieved from https://phys.org/news/2015-08-high-seas-die.html

Due to the acidification and warming of the oceans caused by climate change, as well as pollution and destructive fishing practices (Ateweberhan et al., 2013), the amount and variety of coral reefs are declining. Understanding and tracking how these developments happen is key to figuring out ways to protect and help the coral reefs. Monitoring the reefs is labor-intensive and time-intensive work, which often involves having to dive down to take measurements and pictures. This kind of work is structured, often repeated, and takes place in environments humans are not suited to, so it could benefit greatly from robotic assistance. However, for scientific work in a fragile environment like the coral reefs, not just any robot will do. In this project, we will investigate how to design a specialized robot that will help coral reef researchers to sustain and improve the coral reefs.

Approach

To design a robot specialized for coral reef environments, we will investigate both the needs of coral reef researchers and the needs of the corals themselves. When a robot is used in an already threatened ecosystem, it is not acceptable for it to cause any harm to the corals. Each aspect of the robot will need to be assessed in light of these needs. Interviewing experts will give us insights into both of these categories, and we will also do our own research into the coral reefs. Since this project focuses on the specialized needs of robots that operate near coral reefs, not on underwater robot design in general, we will base much of our design on existing underwater robots and focus on what needs to be modified.

Deliverables

  • An in-depth exploration of current robots similar to what we envision and an assessment of how suitable they are for coral reefs.
  • Guidelines for a robot specialized for coral reefs.
  • A report on what coral reef researchers desire and how our guidelines meet those needs.

Problem Background

One of the great benefits of the coral reef is that in case of natural hazards, such as coastal storms, the reef on average can reduce wave energy by 97% (Ferrario, 2014). This means it can dampen storms and flooding and thus protect coastal inhabitants. Since roughly 40% of the world’s population is located within 100 km of the coast (Ferrario, 2014), protecting the coral reef will prevent a great amount of damage, not only to human lives but also to the environment. Coral reefs are also a source of income for many people, through the attraction of tourists but also as a source of fish. Coral reef fisheries contribute to the degradation of the reefs through overfishing, but they are also currently extremely important as a source of food and income for many people (Newton et al., 2007). Healthy coral reefs are important for people and the environment, which makes it such a problem that the coral reefs have been degrading for a long time, and their recovery is lacking.

Proper monitoring of the reefs, to track their development over time and identify factors that help and harm them, is the main task our research focused on. However, a specialized coral reef robot could do much more. The initial idea behind this project was to design a robot that would not assist the researchers working with the corals, but would rather directly help the reefs themselves. While this focus was shifted later on in the project to allow us to work more directly on the user aspect, potential other uses for a coral reef robot were always on our minds.

For instance, one of the factors that could prevent the degradation of a reef is the resistance of a reef, its ability to prevent permanent phase-shifts, together with the resilience of a reef, its ability to bounce back from phase-shifts (Nyström et al., 2008). These phase-shifts are undesirable because the reef ends up in a state from which it can no longer return to a coral-dominated state (Hughes et al., 2010). If the reef has better resilience, it will be able to bounce back more quickly. One of the ways to improve the resilience of the reef is increasing species richness and abundance through the use of acoustics (Gordon et al., 2019), which improves the reef’s resilience by giving it protection from macroalgae (Burkepile and Hay, 2008). For a coral reef to flourish, a wide biodiversity of animals is needed. Fish whose larvae settle on the corals are one of the essential components of a healthy reef ecosystem. However, once corals are dying, the fish no longer settle there and the whole system ends up in a negative cycle. By playing sounds with different frequencies, fish are tricked into believing that the corals are alive and come back with their larvae. This attracts other marine animals, which causes the entire system to flourish again (Gordon et al., 2019).

A good coral-reef-oriented robot with a modular design could be modified to go down near the reefs to place speakers for acoustic enrichment, which would allow the speakers to be placed faster and in areas potentially dangerous to humans.

Additionally, while our robot discussion is oriented towards the current standard coral reef research method, which is mainly photographing, there are other ways of monitoring reefs that a robot could assist in. Monitoring the sounds of a reef has been suggested as a way of analyzing the fauna in the reef (Lammers et al., 2008), and any number of other sensors could be attached to the robot. Specifications for a good coral-reef-oriented robot can therefore be applied to a wide variety of applications.

User Needs Research

Since the goal of this project is to design a robot that is especially adapted to the coral reefs, the robot can be used for many things. The users of this robot will be researchers who could use it to study coral reefs, interact with them and support them. The robot will be used by everybody who is trying to help the corals by making changes in the reefs or by researching them. Since the robot will have many different possibilities, the users and use cases can differ somewhat. The robot will have varying jobs and attachment modules, so its uses will also differ. The user could be a researcher who is trying to find out the current state of the coral reefs, but it could also be a member of an organization that is trying to help the corals through acoustic enrichment or by delivering larvae. All members of our user groups are passionate about the coral reefs. One of their priorities will therefore be the safety of the corals, so the robot must not harm its environment. The robot will also need to support a variety of attachment modules, since it will be used in many different use cases. We have made contact with a member of one of our user groups: an interview was held with researchers at a university to find out what their perspectives on this project are.

Interview

Example UIs we showed during the interview for feedback. Retrieved from https://www.kickstarter.com/projects/openrov/openrov-trident-an-underwater-drone-for-everyone

We interviewed Erik Meesters and Oscar Bos, both doctors at Wageningen University & Research. Dr. Meesters works with coral reefs and specifically studies the long-term development of the reefs in Bonaire and Curacao, and dr. Bos focuses on the preservation of biodiversity in the North Sea. They recently bought a robot that is helping them with their research. An interview was held with them to find user needs, in which we asked a series of questions about what they would want from such a robot.

We only managed to arrange this meeting further along in our project, when we had already started research into current underwater robots and coral reefs. This made the interview flow better, as we could refer to our findings during it and relate them to the information we already had.

While we intended to let the discussion flow naturally and get as much information as possible on what the researchers want, without pushing them in any particular direction with guided questions, we did prepare a list of questions to ask in case the discussion slowed down and to make sure we covered all relevant parts of our project:

  • What are issues you are facing while studying/ managing/ maintaining coral reefs?
    • How do you deal with them now?
    • How could a robot help?
    • Is an underwater robot which is specialized for coral reefs relevant?
  • Do you like the robot you bought recently?
    • Why/ why not?
    • What is the primary reason you chose to invest in this robot over the other robots on the market?
    • Which aspects of your robot should be carried over into our design? And are those aspects already good enough or do they need to be optimized further?
    • Is it easy to control the robot? If not, is the problem more in the UI or the input method?
    • Are you satisfied with the control method? (At this point, we can show the various control methods we have seen for existing robots and ask for their preferences)
    • Regarding the UI, what kind of information would you like to be visible on screen while you are controlling the robot?
  • What things should we keep in mind regarding useful and safe movement of the robot?
    • In videos of current drones moving around coral reefs, they usually float near the top; they do not go into grooves or move between the corals. Would such a function be helpful?
    • How much fluidity and freedom do you need in movement? (Here we can show the robots with very complex movement, like the Scubo, and those with more basic movement like the RangerBot)
    • Most underwater robots use thrusters, how do you feel about some alternative movement methods? (Here we can show the Aqua2 and the JenniFish)
    • What are your concerns about having a robot moving near the corals? Is it just collisions, or also the way it moves through the water?
    • How fast do you need the robot to move? Is fast movement necessary for your research? Is there potential danger in having fast-moving propellers?
    • Is changing the depth of the robot via buoyancy control a viable method in your opinion?
  • What materials would a robot need to be made from to be considered safe?
    • Are there certain chemicals to avoid specifically?
    • Are there certain battery types you would recommend avoiding?
    • Are there any parts of your current drone that concern you in this area?
  • What would you consider a reasonable price for a robot that could assist in your research?
    • If a robot was available at a reasonable price, what applications would you want to use it for, besides just research?

Interview Results

We found out that the primary application of the robot would be taking photographs of known reefs to monitor their development over time. This already happens, but robots would allow it to go faster and to cover larger areas. In the interview, dr. Meesters mentioned this being more worthwhile for researchers than single-application targeted robots like the RangerBot. This focus on research and data collection informs a lot of their needs.

Needs:

  1. Good lighting is needed for high-quality photographs. If the reefs are very near the surface of the water this is less of an issue, but ideally the robot would also be useful in deeper reefs and when it is less sunny.
  2. A positioning system is also important, since the location where a picture was taken should be recorded to make repeated studies of the same area valuable. The robot that dr. Meesters and dr. Bos described uses an acoustic positioning device that measures the relative distance to a pole that is hung into the water from a boat. Sonar or GPS were also suggested. This kind of positioning system can apparently be really expensive; it doubled the cost of their robot.
  3. A scale on the pictures would be very useful, so they can be compared more easily (see the sketch after this list).
  4. The topographic complexity of the reef (the variation in coral heights) is very important, so that information should also be recorded. 3D images would be one way to visualize it, but the time cost (of taking the pictures and, more importantly, processing them) is too high compared to their use. Instead, lasers or another distance-measuring tool, like a small chain dropped down from the robot, are much easier. A multibeam echosounder is also worth looking into.
  5. The closer the robot can get to the corals, the higher quality the pictures will be.
  6. The pictures should be consistent, so the camera should be steady and preferably not rotate. If it does rotate, the angle should be part of the image data.
  7. The main concern regarding protecting the corals is not bumping into them. Collision detection is highly recommended. It should not only detect the corals, but also what is ahead and preferably also what is above the robot, so that it does not get stuck under something. This is particularly important in applications where the robot is being controlled live instead of travelling along a pre-programmed route.
  8. If the robot is being controlled live, rather than travelling along a pre-programmed route, this control should be as simple as possible. The researchers will likely be on a boat as they are controlling the robot, and the movement of the boat and seasickness makes controlling the robot much harder. We cannot take it for granted that everyone is used to gamepad controllers or joysticks.
  9. For the UI, it should be as clean and simple as possible, with no unnecessary information on screen. However, things like current depth, map position and the angle and direction of the camera would be useful. Multiple screens could make the interface easier to understand and calmer on the eyes.
  10. The robot need not be fast, since it needs to take clear pictures anyway, which means stopping every so often or moving very slowly. However, the robot should be able to resist the current. We showed some of the more unusual movement systems (the Aqua2 and JenniFish), but their preference was for the traditional thrusters, which allow movement in multiple directions and are more flexible and precise.

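A common low-cost way to satisfy the scale requirement (need 3) is to project two parallel laser dots a fixed, known distance apart into every photo; the pixel distance between the dots then gives the scale of the image. The sketch below is only an illustration of that calculation: the laser spacing, dot positions and camera resolution are assumed example values, not figures from the interview.

<syntaxhighlight lang="python">
# Minimal sketch: estimating image scale from two parallel laser dots.
# All numbers are assumed example values for illustration only.

LASER_SPACING_M = 0.10  # physical distance between the two laser dots (assumed 10 cm)

def image_scale(dot_a_px, dot_b_px, laser_spacing_m=LASER_SPACING_M):
    """Return metres-per-pixel, given the pixel coordinates of the two laser dots."""
    dx = dot_a_px[0] - dot_b_px[0]
    dy = dot_a_px[1] - dot_b_px[1]
    pixel_dist = (dx * dx + dy * dy) ** 0.5
    return laser_spacing_m / pixel_dist

def image_footprint(scale_m_per_px, width_px, height_px):
    """Approximate area (m^2) covered by one photo, assuming a roughly flat scene."""
    return (scale_m_per_px * width_px) * (scale_m_per_px * height_px)

if __name__ == "__main__":
    scale = image_scale((812, 640), (1012, 646))   # assumed detected dot positions
    area = image_footprint(scale, 1920, 1080)      # assumed camera resolution
    print(f"scale: {scale * 100:.3f} cm/pixel, footprint: {area:.2f} m^2")
</syntaxhighlight>

With these assumed values the footprint comes out at roughly half a square metre, which is in the range dr. Meesters mentioned for survey pictures (see point 8 below).
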
Other points:

  1. There is not actually a need for the robot to go into the grooves of the reef, which was one of our concerns, since this would require the robot to be much smaller and more flexible and would increase the risk of bumping into the corals. Due to the degradation of the reefs, there are no deep grooves anymore, so floating above the reef is sufficient.
  2. If the robot kicks up sand when thrusting up near the ocean floor, this hurts the picture quality, and the sand could also get into the thrusters and damage them. They recommended making the robot slightly buoyant so that it can float a little bit away from the ocean floor before thrusting up.
  3. Unexpectedly for us, since we saw a lot of robots advertise their real-time controls, both interviewees were very interested in having the robot follow pre-planned routes. Since a lot of the work is monitoring known sites, it would be ideal to be able to tell the robot where to go and how to take the pictures, and have it do just that without further input. Real-time control is only needed for exploring unknown locations. In that scenario, more complicated movements are needed. Having some of these movements pre-programmed, rather than depending on advanced user control, is recommended to make control easier.
  4. A tether causes drag on the robot, which is not good. This problem could be addressed by putting a weight on the line, so that it can rest securely on the ocean floor and be connected to the robot from that point, instead of running directly up from the robot to the boat.
  5. The safety of the reefs was not a major concern when it came to materials; instead, the resistance of the material to salt water is the primary concern. If oil and/or grease is used, it should be biodegradable.
  6. Batteries can cause problems, not just in use but also in transport. Not all types of batteries are allowed on airplanes. Some batteries can also cause explosions when charged, so they are a much bigger concern than we anticipated.
  7. The budget for a robot like this could range anywhere from 15 thousand to 80 thousand euros. To make a robot more appealing for the university to invest in, a modular design that makes it useful in multiple different projects helps a lot.
  8. The surveys that dr. Meesters described doing in the Caribbean have 115 sites with lines of around 150 meters over the reefs, where divers have to dive down and take pictures along those lines. This is exactly the kind of work that robots could assist with. Each picture should show 0.5 to 1 m² of reef.

Literature Research

Coral Reefs

To design a robot for use in and around the coral reefs, we needed to research the reefs themselves, their vulnerabilities and general environments. This information is needed so we can make sure the robot does not damage the corals.

Collisions

It is important to consider the risks associated with having the robot move close to the corals. If these risks are manageable, then there might be value in making a smaller, more agile robot that can get up close to the reefs if needed. However, if the risks are so great that having the robot too near the reefs is unacceptable, that option can be ignored and the robot can be slightly larger and bulkier, which opens up other design opportunities. Direct contact between the robot and the corals could cause great damage to the corals. Corals are quite fragile, and bumping into them could easily cause them to break or damage their outer layer, which might make them more vulnerable to disease (Barker and Roberts, 2004). Branching corals especially are vulnerable to breaking, as their physical shape makes them very fragile (Hawkins and Roberts, 1997). Additionally, if the robot causes pieces of coral to break off and this is not recorded, this would not be good for research, as it would seem as if pieces of the corals were breaking off more often without clear cause. This makes it clear that physical contact between the robot and the corals should absolutely be avoided. This affects things like collision detection and how close the robot is allowed to get to the corals, but it also means that the robot's movement systems should be able to reliably resist the current, so that it is not pushed into the corals.

Movement

The damage might also be more indirect. The propellers could kick up sediment and cause turbidity in the water; if this persists too long, the light penetration to the corals is affected, which can cause them to die off. If these effects persist for a very long time, this could even change the coral reef's biodiversity, as the more tolerant corals survive and the sensitive corals die (PIANC, 2010). The scale of the environmental effects of the robot should therefore be kept in mind; it should not affect the environment in the long term.

Material

To keep the robot from damaging the reefs, the materials used should also be reef-safe. In discussions surrounding coral reefs, plastics are often mentioned as a problem. When they wrap around corals, they can seal them off from light and oxygen and release toxins which can make the corals ill (Plastic Soup Foundation, 2018; Lamb et al., 2018). However, these concerns are about loose pieces of plastic that might wrap around the corals, not a robot that will quickly leave again. Other frequent plastic concerns surround plastic pellets, which can spread organic pollutants (Rios, Moore and Jones, 2007), but this is a problem caused by the manufacturing of plastic materials, not the presence of an object made out of plastic in the ocean. Toxins are also not a great concern in this application, due to the extremely slow degradation of plastic (Webb et al., 2013). From all of this we can conclude that using plastic for the robot is not a problem for the health of the corals, as long as the robot is not present in the same place for a long stretch of time. This means that there should be an easy way to retrieve the robot should it break down, since that is the only scenario that would lead to it being stationary in one place for that long.

If waterproof coatings or paints are used, it is important that these do not contain chemicals that might damage the corals. The Haereticus Environmental Laboratory tests various products for chemical pollutants; they specifically recommend avoiding anything that contains:

  • Any form of microplastic sphere or beads.
  • Any nanoparticles like zinc oxide or titanium dioxide.
  • Oxybenzone
  • Octinoxate
  • 4-methylbenzylidene camphor
  • Octocrylene
  • Para-aminobenzoic acid (PABA)
  • Methyl Paraben
  • Ethyl Paraben
  • Propyl Paraben
  • Butyl Paraben
  • Benzyl Paraben
  • Triclosan

General Coral Reef Information

Finally, some general information on where the coral reefs are is useful, as it tells us in what environments our robot might be used. Most of our information comes from the World Atlas of Coral Reefs (1st edition), the first chapter of which provides a lot of general explanation of what the coral reefs are. There are many types of corals, but for the coral reefs it is the hermatypic (reef-building) corals that are important. They flourish in warm, shallow water. Hermatypic corals grow extremely slowly, just a few millimetres each year. Branching corals grow much faster, but even they only grow about 150 millimetres each year. Hermatypic corals lay down a skeleton of calcium carbonate, and these structures form the basis of the coral reefs, since other corals and organisms can then grow on them. This source provides much more information regarding the different types of reefs, their spread over the Earth and the organisms found in reefs, but this more specific information is not relevant at this stage of our project: if it turns out that the exact nature of the corals in a reef has a strong influence on the requirements of the robot, this is a topic worth revisiting, but otherwise this general information is enough.

According to the Scripps Institution of Oceanography at the University of California San Diego: “Coral reefs can be found at depths exceeding 91 m (300 ft), but reef-building corals generally grow best at depths shallower than 70 m (230 ft). The most prolific reefs occupy depths of 18–27 m (60–90 ft), though many of these shallow reefs have been degraded.” (Scripps, n.d.) Studying these degrading coral reefs will likely be one of the main applications of a research assisting robot, so the information that those coral reefs are largely close to the surface is useful.

General Underwater Robotics

Operating in water, particularly salt water, has a great impact on the design of a robot. It needs to be well adapted to this environment: it should be able to resist the corrosion of the salt water and be waterproof so that the electronics are not damaged. It should also be able to move freely through the water.

Movement

Many ROVs (Remotely Operated Vehicles) and AUVs (Autonomous Underwater Vehicles) use a wide variety of different methods and techniques to navigate underwater. Some employ biomimicry: these robots move through the water in ways that are inspired by nature. However, most robots that are oriented more at professional users, such as marine researchers, use a number of propellers to move in all directions underwater. This provides multidirectional movement. It is important that these thrusters are powerful enough to move against the ocean current.

Moving along the x- and y-axes is not a great problem: thrusters can be used to push the robot in a direction, or some kind of rudder or steering surface could be manipulated to allow turning. Moving up and down is a bit more complicated. Thrusters could also be used for this, but an underwater robot has an alternative option as well: using buoyancy. If the robot is naturally slightly buoyant, it will start floating up when there is no downward force acting on it; with this construction, thrusters are used for downward movement, and going up just means turning the thrusters off. Alternatively, the density of the robot could be made adjustable by having it take water into a tank to increase its density and move down, and push the water out again to return to its natural, slightly buoyant state and move up. If this movement system is chosen, it will affect the chosen materials, since the final robot will need to be less dense than salt water.
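
As a rough illustration of the ballast-tank variant described above, the sketch below shows a very simple control loop that pumps water in when the robot is too shallow and vents it when it is too deep, relying on the slight natural buoyancy to bring the robot back up. The sensor and pump functions are hypothetical placeholders for whatever drivers the actual robot would use, and a real vehicle would need a properly tuned controller rather than this bang-bang scheme.

<syntaxhighlight lang="python">
# Minimal sketch of buoyancy-based depth control via a ballast tank.
# read_depth(), pump_water_in() and vent_water_out() are hypothetical placeholders
# for whatever sensor and actuator drivers the robot actually uses.
import time

DEADBAND_M = 0.2  # assumed tolerance around the target depth

def hold_depth(target_depth_m, read_depth, pump_water_in, vent_water_out):
    """Very simple bang-bang ballast control: fill the tank to sink, vent to rise.

    The robot is assumed to be slightly positively buoyant with an empty tank,
    so venting always lets it drift back up, as suggested in the text above.
    """
    while True:
        depth = read_depth()
        error = target_depth_m - depth
        if error > DEADBAND_M:
            pump_water_in()    # too shallow: take in water to increase density and sink
        elif error < -DEADBAND_M:
            vent_water_out()   # too deep: push water out and let the natural buoyancy lift the robot
        # inside the deadband: do nothing and save energy
        time.sleep(0.5)
</syntaxhighlight>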

Sensors

It is likely we will want some form of obstacle avoidance, regardless of whether the movement of the robot is fully automated or controlled by the user. This system will make sure that no collision happens in case the user makes a wrong judgment. This is important since no harm should come to the corals. Sonar is frequently used for collision detection, but the resolution and reliability of sonar sensors degrade when close to objects. Since the robot will have to manoeuvre a lot in coral reefs, sonar will not work sufficiently well (Dunbabin, Dayoub, Lamont & Martin, 2019).

There are real-time vision-based perception approaches that make it possible to provide a robot in coral reefs with collision avoidance. To make obstacle avoidance possible, the challenges of real-time vision processing in coral reef environments need to be overcome. To do so, image enhancement, obstacle detection and visual odometry can be used. Cameras are not used frequently in underwater robots, but they work well in coral reefs since the reefs are quite close to the surface (Spalding, Green & Ravilious, 2001) and the water in these areas is very clear.
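
To give an impression of what the image enhancement step can look like in practice, the sketch below applies gray-world white balancing, one simple and widely used form of colour correction that counteracts the blue-green cast of underwater images. It is a generic example with an assumed input file name, not the specific enhancement pipeline used in the cited work.

<syntaxhighlight lang="python">
# Minimal sketch: gray-world colour correction for underwater images.
# This is a generic enhancement step, not the pipeline from the cited paper.
import cv2
import numpy as np

def gray_world_correction(image_bgr):
    """Rescale the B, G and R channels so that their means become equal."""
    img = image_bgr.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean of B, G, R
    gray_mean = channel_means.mean()
    gains = gray_mean / channel_means                  # per-channel gain
    return np.clip(img * gains, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    frame = cv2.imread("reef_frame.jpg")               # hypothetical example image
    if frame is not None:
        cv2.imwrite("reef_frame_corrected.jpg", gray_world_correction(frame))
</syntaxhighlight>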

Image enhancement is useful for making the processed images more valuable; to achieve this, different types of colour correction are applied. For detection, semantic monocular obstacle detection can be used. Dunbabin et al. (2019) explain that for shallow-water position estimation, visual odometry combined with operational methods to limit odometry drift was explored and evaluated in early work using a vision-only AUV. This already showed navigation performance errors of less than 8% of the distance travelled (Dunbabin et al., 2019). Therefore this visual odometry approach can be used for the robot.
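
To make the visual odometry idea more concrete, the sketch below estimates the relative camera motion between two consecutive frames using ORB features and the essential matrix in OpenCV. The camera matrix and frame file names are assumed example values; a full system would chain many of these steps together, recover the absolute scale from another sensor, and add the drift-limiting measures mentioned above.

<syntaxhighlight lang="python">
# Minimal sketch: one step of monocular visual odometry between two frames.
# The camera matrix K and the image file names are assumed example values.
import cv2
import numpy as np

K = np.array([[800.0,   0.0, 960.0],   # assumed focal length and principal point
              [  0.0, 800.0, 540.0],
              [  0.0,   0.0,   1.0]])

def relative_motion(frame1_gray, frame2_gray, camera_matrix=K):
    """Return (R, t) of frame 2 relative to frame 1; t is only known up to scale."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame1_gray, None)
    kp2, des2 = orb.detectAndCompute(frame2_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, camera_matrix,
                                   method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, camera_matrix, mask=mask)
    return R, t

if __name__ == "__main__":
    f1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
    f2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    R, t = relative_motion(f1, f2)
    print("rotation:\n", R, "\ntranslation direction:\n", t.ravel())
</syntaxhighlight>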

Localization

fig. 1 Illustrating how an AUV can be located by means of transducers sending out signals to transponders located on the seafloor. Note. Reprinted from “Absolute Positioning of an Autonomous Underwater Vehicle Using GPS and Acoustic Measurements”, by Kussat, N. H., Chadwell, C. D., & Zimmerman, R., 2005, IEEE Journal of Oceanic Engineering, 30(1), 153–164.

As a user, you want to know where your robot is, particularly if you are controlling it from a distance. For this, localization is needed.

GPS is known not to work below the surface and thus cannot be used on its own to determine the location of our underwater robot. It is, however, possible for GPS positions to be transferred underwater. As Kussat, Chadwell, & Zimmerman (2005, p. 156) state, the location of autonomous underwater vehicles (AUVs) can be determined by acquiring GPS ties before and after the dive and integrating the acceleration, velocity, and rotation of the vehicle while it is submerged. However, they go on to state that this method causes an error of 1% of the distance traveled, which means a 10 m error on an AUV track of 1 km. According to them, this error is caused by the quality of the inertial measurement unit (IMU).

Having such an error in a teleoperated underwater robot can create many issues when trying to relocate where the robot has been. The ocean is still a largely unknown and vast space with constant movement that can disturb the robot's position. This could be problematic if one wanted to do research in the same location again. For this reason, it is highly important to be able to determine the robot's location as accurately as possible, both above and under water. Kussat et al. (2005, p. 156) continue by explaining how much more precise localization for AUVs can be achieved by combining precise underwater acoustic ranging with kinematic GPS positioning.

To use such a method, precise measurement of the acoustic travel time is required. Travel-time data with a resolution of only a couple of microseconds can be acquired by improving the correlator system (Spiess, 1997). This can be done with fixed delay lines and cross-correlation of a coded signal in the transponders (Kussat et al., 2005, p. 156). Kussat et al. (2005, p. 156) explain that the method starts by determining the location of the transducers, usually aboard a ship, by means of kinematic GPS. In their method, transponders were placed on the seafloor to receive signals from the transducers; this was done so a coordinate frame could be globally referenced. As a second step, the autonomous underwater vehicle is located relative to these transponders by means of acoustic signals (see fig. 1).
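
To make the second step more tangible, the sketch below converts acoustic two-way travel times to three seafloor transponders into ranges and solves the resulting range equations for the vehicle position with a small Gauss-Newton iteration. The transponder coordinates, travel times and sound speed are assumed example values, and a real implementation would have to model sound-speed variation and timing errors far more carefully than this.

<syntaxhighlight lang="python">
# Minimal sketch: locating an AUV from acoustic ranges to seafloor transponders.
# Transponder positions, travel times and the sound speed are assumed example values.
import numpy as np

SOUND_SPEED = 1500.0   # assumed average speed of sound in sea water, m/s

# Transponder positions in a local x, y, z frame (z down, metres), assumed known
# from the kinematic-GPS / ship survey step described above.
transponders = np.array([
    [  0.0,   0.0, 50.0],
    [120.0,  10.0, 48.0],
    [ 60.0, 110.0, 52.0],
])

def ranges_from_travel_times(two_way_times_s):
    """Convert two-way acoustic travel times into one-way ranges (metres)."""
    return 0.5 * np.asarray(two_way_times_s) * SOUND_SPEED

def locate(ranges, beacons=transponders, initial_guess=(50.0, 50.0, 20.0), iterations=10):
    """Gauss-Newton solution of the range equations for the vehicle position."""
    x = np.array(initial_guess, dtype=float)
    for _ in range(iterations):
        diffs = x - beacons                    # vectors from each beacon to the vehicle
        dists = np.linalg.norm(diffs, axis=1)
        residuals = dists - ranges
        jacobian = diffs / dists[:, None]      # d(range)/d(position)
        step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        x -= step
    return x

if __name__ == "__main__":
    times = [0.0929, 0.1163, 0.1074]           # assumed measured two-way travel times (s)
    print("estimated position:", locate(ranges_from_travel_times(times)))
</syntaxhighlight>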

User Interaction and Communication

If the robot is sent out along a preprogrammed path, it can collect and store all its data while traveling and does not need to transfer it during its movement underwater. It would be good to have the robot send some signal to the users of roughly where it is, so that if something goes wrong the users know where to pick it up, but this information is quite simple and does not need to be updated constantly; regular updates of its current position would be enough. On the other hand, if we want the user to be able to control the robot in real time, the user should have access to real-time information about where the robot is and what its surroundings are. This will likely involve video footage and potentially the data from some other sensors as well, for instance close-range sonar data to see if the robot is about to bump into something in any of the directions the camera is not pointing. This is a lot of information that needs to be transmitted very quickly to the user, and the user should also be able to give commands to the robot that it responds to very quickly. "high-frequency EM signals cannot penetrate and propagate deep in underwater environments. The EM properties of water tend to resist their propagation and cause severe attenuation." (Qureshi et al., 2016). Because of this, a tether must be used for most communication with the robot in the live-control use case. While systems for underwater wireless communication exist, they are rare, pricey, and reserved for military and large-scale project use.
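
The short calculation below illustrates why live video pushes the design towards a tether: it compares an assumed bitrate for a compressed video stream plus telemetry against rough, order-of-magnitude throughput figures for an acoustic modem and a tethered Ethernet link. All numbers are illustrative assumptions, not specifications of any particular product.

<syntaxhighlight lang="python">
# Rough feasibility check: can the live-control data stream fit over a given link?
# All bitrates are order-of-magnitude assumptions for illustration only.

VIDEO_BITRATE = 3_000_000    # assumed compressed 720p video stream, bit/s
SENSOR_BITRATE = 50_000      # assumed telemetry (depth, heading, close-range sonar), bit/s

links = {
    "acoustic modem (wireless)": 10_000,        # assumed ~10 kbit/s
    "tethered Ethernet":         80_000_000,    # assumed ~80 Mbit/s
}

required = VIDEO_BITRATE + SENSOR_BITRATE
for name, capacity in links.items():
    verdict = "sufficient" if capacity >= required else "not sufficient"
    print(f"{name}: {capacity / 1e6:.3f} Mbit/s available, "
          f"{required / 1e6:.3f} Mbit/s needed -> {verdict}")
</syntaxhighlight>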

Robot Design Requirements

From our literature research and our user interview, we can conclude a number of important factors to keep in mind when looking at a robot for coral reef researchers. A brief summary of those points:

  • The robot's movement should not damage the reefs or the fish around the reefs.
    • This means that obstacle avoidance or some other way of preventing damage to the reefs in case of collision is required. According to dr. Meesters, obstacle avoidance is the way to go here.
    • The method of movement also should not damage the reefs: for instance, there should be no danger of fish or parts of the corals getting stuck in the thrusters, if thrusters are used. While we expected this to be important, our interview told us it was not a major concern, mainly since the robot will likely not be getting that close to the corals.
    • The movement of the robot should not cause permanent changes to the environment of the reef. If there are no collisions, this is less of a major concern. Kicking up sand when taking off is a problem to consider, but this is less important for the reef environment and more for the quality of the photos and the robot not getting damaged.
  • The materials that make up the robot should not be damaging to the coral reefs, even if the robot gets damaged itself.
    • Some way of locating and retrieving the robot in case of failure will be needed, to prevent the robot from getting lost and becoming waste in the ocean.
    • Batteries should be chosen so as to minimize damage in case of leakage, and they should also be safe to transport and charge.
    • Chemicals or materials that might damage the reefs should not be used.
  • The robot's physical design should suit the coral reefs.
    • Fish should not see it as prey or as a predator, to avoid the robot being damaged by fish or causing too great a disruption.
    • Depending on how close the robot moves to the corals, it should not have bits that stick out and might get caught in the corals. Based on our interview there will not be a major need for the robot to move that close to the corals, but a sleek design will still be useful for movement.
  • The robot should suit the needs of the researchers (our user group) in use.
    • The main task the robot will be assisting with is taking pictures for research into the state and development of the reefs. This means at least one good camera is needed, and the pictures should be taken in such a way that the information is consistent and useful. This means the camera should take all pictures at the same angle and should include information like the scale of the picture and the location the picture was taken at.
    • The user interface and control methods should be adapted to the needs of the user; simplicity is key here. The robot should not be too confusing or difficult to control, and more nuanced control might actually make it harder to get consistent results.
    • Specific attention should be paid to how the needs of coral reef researchers differ from those of hobbyists or general divers, since our research into existing robots will include robots aimed at different user bases.
  • The researchers should want to buy this robot.
    • Besides the usefulness of the robot, this involves researching what cost would be acceptable. The main point here seems to be that it is important to consider what features might make a university, research institute or organisation willing to pay for such a product: for instance, the robot would be more likely to be bought if it could be applied in multiple different ways.

Research into Current Underwater Drones

We are not the first people to think of using robots in underwater environments or for research, so it is worth considering what already exists on the market. This gives us insight into the standards for underwater robots and what might still be missing for our specific user group.

Scubo / Scubolino

Scubo 2.0. Image retrieved from https://tethys-robotics.ch/index.php/robots/

Scubo is an ROV built by Tethys Robotics, a robotics team from ETH Zurich (Tethys, 2019). Scubo's defining features are its omnidirectional movement and its modularity. Scubo uses 8 propellers that extrude from its main body, allowing the tethered robot to move with extreme agility underwater. The robot is built around a carbon cuboid, which features a hole through the middle for better water flow and for cooling of the electronic components. It is constructed to be neutrally buoyant, with depth controlled through the 8 propellers. On its body there are 5 universal ports for modularity. The robot's tether provides power for the onboard batteries and allows direct control from a computer outside the water. It is controlled with a SpaceMouse joystick. For processing, Tethys say they use an Arduino Due for the hard real-time tasks and an Intel NUC for high-performance calculations (Arduino Team, 2016).

LarvalBot / RangerBot

The RangerBot robot swimming above a coral reef. Image retrieved from https://good-design.org/projects/rangerbot/

Initially, the Queensland University of Technology (QUT) designed the COTSbot to deal with the dangerous crown-of-thorns starfish that threatens coral reefs. This design was improved quite a bit (most importantly, it was reduced in size and cost) to make the RangerBot, which has the same purpose as the COTSbot. This design was then adapted to make the LarvalBot, which is extremely similar to the RangerBot, but instead of killing starfish it is used to deliver coral larvae to promote the growth of new coral. All three robot designs were made specifically for coral reefs. Most sources on the robots focus on the video analysis capabilities above all, so that seems to be where the innovation of these robots lies, not the coral reef application. The RangerBot has some extra features as well, such as water quality sensors and the ability to collect water samples (Braun, 2018).

Both RangerBot and LarvalBot have 6 thrusters (Zeldovich, 2018) that allow full six-degrees-of-freedom control, including hover capabilities (Dunbabin et al., 2019). Both are controlled with an app that, according to the creators, takes just 15 minutes to learn (Dunbabin, August 2018). The LarvalBot only follows preselected paths at a constant altitude, with the release of the larvae being controlled by user input (Dunbabin, November 2018). It is unclear if the RangerBot also follows a pre-selected path, but this is likely since it operates fully automatically to dispatch the crown-of-thorns starfish, and there is no mention of automatic pathfinding in any of the reports.

RangerBot and LarvalBot have 2 stereo cameras (Zeldovich, 2018). The RangerBot uses these for obstacle avoidance, amongst other functions; it is not specified whether the LarvalBot has obstacle avoidance (Dunbabin, August 2018). RangerBot uses video for obstacle avoidance because the usual method, sonar, does not work in coral reef environments. Underwater use of sonar and ultrasound is difficult either way, and the degree of complexity in coral reefs makes it unmanageable. Video is usually not used for this purpose because deeper in the water it quickly gets dark and murky. However, coral reefs are quite close to the surface (or at least, the coral reefs we are concerned with are) and they are in clear water, so this environment is uniquely suited for cameras as a sensor (Dunbabin et al., 2019).

RangerBot weighs 15 kg and is 75 cm long (Dunbabin, August 2018). The LarvalBot is bigger because it carries the larvae on board, but the base robot has the same size. The RangerBot floats above the reefs and can only reach the easy-to-reach crown-of-thorns starfish (Zeldovich, 2018); that is why it can have a slightly bulky shape and a handle at the top without bumping into the reefs being a concern. There is unfortunately no information available online on what materials the robot is made of.

RangerBot and LarvalBot can both be untethered. However, the current version of LarvalBot is still tethered (Dunbabin, December 2018). It is not mentioned in the reports why, but it can be assumed this is because reliably real-time video is needed to tell the LarvalBot to drop the larvae at the right time. When operating without a tether, the robot can last 8 hours, and it has rechargeable batteries (Queensland University of Technology, 2018).

OpenROV v2.8

The purpose of the OpenROV robot kit is to bring an affordable underwater drone to the market. It costs about 800 euros to buy the kit and 1300 euros to buy the assembled robot (OpenROV, n.d.). The user is challenged to correctly assemble the robot, a fun and rewarding experience. Once the robot is built, the user is free to decide where and how to use it. The most common use is exploring the underwater world and making underwater videos.

The robot is remotely controlled and has a thin tether. Via a software program, which is provided and can be installed on a computer, the user is able to control the robot. The camera on the robot provides live feedback on the robot's surroundings and the way it reacts to direction changes.

Two horizontal and one vertical motor allow the robot to move smoothly through the water. When assembling the robot, one should pay special attention to the pitch direction of the propellers on the horizontal motors: as the horizontal motors are counter-rotating, the pitch direction of the propellers differs.

Other crucial parts are a Genius WideCam F100 camera, a BeagleBone Black processor, an Arduino, a tether, lithium batteries and extra weights.

All the components are held in place by a frame and stored in waterproof casings. The frame and multiple casings are made from laser-cut acrylic parts. The acrylic parts are glued together with either acrylic cement or super glue. Where wires are soldered together, heat shrink is used to secure a waterproof connection. Epoxy is used to fill up areas between wires to prevent leakage.

The design of the robot does not look very streamlined, but it moves surprisingly smoothly. A drawback of the design is that the lower part, which contains the batteries, is always visible in the camera feed. This can be annoying when watching the video.

The robot is relatively small and can be compared to the size of a toaster. This is ideal for moving through places such as shipwrecks and coral reefs.

The robot uses 26650 (26.5 mm × 65.4 mm) lithium batteries. The batteries need to be charged beforehand. The guide warns the user that the batteries can be dangerous and must be charged at the correct voltage, which is 3 V.

=== Aqua2 ===

The Aqua2. Image retrieved from https://auvac.org/configurations/view/179

The Aqua2 robot is designed to assist human divers. It is currently employed for monitoring underwater environments and for research into robotics and propulsion methods. The robot can dive to 30 meters and has a maximum speed of 1 m/s (AUVAC, n.d.).

The robot can operate autonomously or be controlled via a tablet. The tablet is waterproof, so divers can control the robot while underwater. When the user tilts the tablet in a certain direction, the robot moves in that direction (De Lange, 2010).

Aqua2 has six independently moving flippers. The fins move slowly, which causes little disturbance in the robot's surroundings. The robot can also operate on land, where it walks at a speed of 0.7 m/s.

The rectangular body of the Aqua2 robot is made from aluminum. The flippers are made from vinyl and have steel springs inside. These materials are resistant to seawater.

The robot is powered by two lithium batteries, each with a voltage of 28.8 V and a capacity of 7.2 Ah. After 5 hours of use the batteries must be recharged, which takes 8 hours.
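
As a quick sanity check on these figures (a back-of-the-envelope sketch, assuming both packs discharge together over the stated 5 hours; not a manufacturer calculation):

<syntaxhighlight lang="python">
# Rough endurance arithmetic for the Aqua2 battery figures quoted above.
# Assumes both packs discharge together and the 5 h figure is full-capacity runtime.
voltage_v = 28.8        # per pack
capacity_ah = 7.2       # per pack
packs = 2
runtime_h = 5

energy_wh = voltage_v * capacity_ah * packs   # ~415 Wh on board
avg_power_w = energy_wh / runtime_h           # ~83 W implied average draw
print(f"Stored energy: {energy_wh:.0f} Wh, implied average draw: {avg_power_w:.0f} W")
</syntaxhighlight>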

It is equipped with image collection and processing software, which allows the robot to operate autonomously. The image processing is built on the ROS development environment and the OpenCV vision library.
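
As an illustration only (this is not the Aqua2's actual vision pipeline), the kind of per-frame processing such an OpenCV-based stack enables can be sketched as follows; the camera index and the choice of edge detection are assumptions for the example:

<syntaxhighlight lang="python">
# Minimal OpenCV sketch: grab one camera frame and compute a simple edge map,
# standing in for the "features" an autonomous vision stack might extract.
import cv2

capture = cv2.VideoCapture(0)              # any camera index or video file path
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)    # simple edge detection on the frame
    cv2.imwrite("edges.png", edges)
capture.release()
</syntaxhighlight>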

=== Ocean One ===

The robot is designed for research purposes. Due to its design, the robot can take over tasks from human divers, such as exploring archaeological sites or collecting specimens. Moreover, it can access places where human divers cannot go, such as sites deeper than 50 meters.

The Ocean One has a high level of autonomy and is controlled via an intuitive, haptic interface. The interface "provides visual and haptic feedback together with a user command center (UCC) that displays data from other sensors and sources" (Khatib et al., 2016, p. 21). There are two haptic devices (sigma.7), a 3D display, foot pedals, and a graphical UCC. A relay station, which is connected to the controller, allows the robot to function without a tether.

“The body is actuated by eight thrusters, four on each side of the body. Four thrusters control the yaw motion and planar translations, while four others control the vertical translation, pitch, and roll. This thruster redundancy allows full maneuverability in the event of a single thruster failure” (Khatib et al., 2016, p.21).

It has an anthropomorphic shape because it should have the same capabilities as human divers. The design of the robot's hands allows "delicate handling of samples, artifacts, and other irregularly-shaped objects" (Khatib et al., 2016, p. 21). "The lower body is designed for efficient underwater navigation, while the upper body is conceived in an anthropomorphic form that offers a transparent embodiment of the human’s interactions with the environment through the haptic-visual interface" (Khatib et al., 2016, p. 21).

The relay station can be used as a nearby charging station.

In the paper, the authors hint that the robot is equipped with a collision detection system, but the hardware and software behind it are not explained.

=== M-AUE ===

An exploded view of the M-AUE. Image retrieved from https://www.nature.com/articles/ncomms14189/figures/2

The M-AUEs were developed to study plankton and environmental processes in the ocean (Reisewitz, 2017).

The idea of the M-AUEs is that they move just like plankton in the ocean: by adjusting their buoyancy (through programming) they move up and down while drifting with the current (Reisewitz, 2017). The author further states that the M-AUEs move vertically against the currents caused by internal waves, which they do by repeatedly changing their buoyancy.

“The big engineering breakthroughs were to make the M-AUEs small, inexpensive, and able to be tracked continuously underwater,” said Jaffe (inventor of the M-AUEs) in Reisewitz (2017).

The M-AUEs are designed not to go so deep that they come close to the seafloor. This means that, if used for coral research, they would not maneuver through the coral but hover above it. The M-AUEs are preprogrammed with a PID control algorithm: they sense their current depth and adjust their buoyancy towards the desired depth.
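
As an illustration of the kind of depth regulation described here (a minimal sketch, not the M-AUE's actual firmware; the gains, time step and buoyancy-command convention are invented for the example):

<syntaxhighlight lang="python">
# Minimal PID depth-control sketch: sensed depth in, buoyancy command out.
# Gains and the output convention are illustrative, not the M-AUE's real parameters.
class DepthPID:
    def __init__(self, kp=1.0, ki=0.05, kd=0.5, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_depth_m, measured_depth_m):
        error = target_depth_m - measured_depth_m
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Positive output = "become denser / sink", negative = "become lighter / rise".
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = DepthPID()
command = controller.update(target_depth_m=50.0, measured_depth_m=48.2)
print(command)
</syntaxhighlight>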

According to Reisewitz (2017), acoustic signals are used to keep track of the M-AUEs while submerged, since GPS does not work underwater. Three-dimensional location information is received every 12 seconds, showing exactly where the M-AUEs are in the ocean. Augliere (2019) adds that GPS-equipped moorings, floating on the surface of the ocean, send out sonar pings at this 12-second interval. These pings are received by the hydrophones of the M-AUEs (which sit around 50 meters below the surface), giving researchers data with which to localize them.
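
Conceptually, this localization amounts to converting ping travel times into ranges and intersecting those ranges from the known mooring positions. The following sketch uses a simple least-squares fit; the mooring layout, travel times and sound speed are illustrative assumptions, not the published processing pipeline:

<syntaxhighlight lang="python">
# Simplified acoustic localization sketch: ranges from ping travel times,
# then a least-squares fit for the receiver position. All numbers are illustrative.
import numpy as np
from scipy.optimize import least_squares

SOUND_SPEED_M_S = 1500.0  # rough speed of sound in seawater

# Assumed known mooring positions (x, y, z) in metres and measured travel times (s).
moorings = np.array([[0.0, 0.0, 0.0], [500.0, 0.0, 0.0],
                     [0.0, 500.0, 0.0], [500.0, 500.0, 0.0]])
travel_times = np.array([0.243, 0.285, 0.191, 0.243])
ranges = SOUND_SPEED_M_S * travel_times

def residuals(pos):
    # Difference between distances to each mooring and the measured ranges.
    return np.linalg.norm(moorings - pos, axis=1) - ranges

solution = least_squares(residuals, x0=np.array([250.0, 250.0, -50.0]))
print("Estimated position (m):", solution.x)
</syntaxhighlight>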

As stated by Augliere (2019), the M-AUEs have batteries and data storage that last for several days. “The system is powered by a battery pack made from 6 Tenergy 1.2 V, 5000 mA hour NiMH cells in a series configuration. The battery pack is fused with a 2 A fuse and is recharged at an optimal rate of 600 mA” (Jaffe et al., 2017).

The M-AUE is comparable in size to a grapefruit. It consists of two concentric shells made of syntactic foam (Jaffe et al., 2017), which slide over each other. Inside these shells the batteries, a piston and an antenna are housed, and on the bottom, on the outside of the shell, multiple sensors are mounted.

Regarding possible future work, Reisewitz (2017) states that adding a camera to the M-AUEs would provide images of the state of the ocean, making it possible to map coral habitats and the movement of larvae, or to do further research on ocean creatures such as plankton. Enhancing the robot with audio recording devices (hydrophones) would let the M-AUEs function as an 'ear', keeping track of the sounds in the ocean.

=== JenniFish ===

The JenniFish and its jellyfish-inspired movement. Image retrieved from Florida Atlantic University

Designed at Florida Atlantic University (Florida Atlantic University, 2019), this robot is based on the way jellyfish move through the water, offering an alternative to the standard underwater robot design of a hard body propelled by thrusters. Coral reefs were among the environments this robot is intended to be used in. The design is not well suited to our needs, as its erratic movement makes it unsuitable for taking pictures (Turner, 2018), and it is too small to offer the variability we want for a modular design. However, the robot does present an interesting alternative way of moving, and gives some insight into what kinds of soft materials might be used near the coral reefs, should we conclude that we want some kind of shock-absorbing material.

The JenniFish is slightly negatively buoyant, but can propel itself upwards by filling its "tentacles" with water and pushing this water out again. These tentacles are soft, which allows the JenniFish to squeeze through narrow spaces. Various plastics, including polyester, silicone and plexiglass, were used for this robot. Two 6 V submersible impeller pumps control the absorption and ejection of water, and a 9 V primary-cell alkaline battery powers the system (Frame, 2016). It should be noted that this description comes from an early prototype and the choice of battery was largely motivated by price and availability; it might not be the best choice for the final product.

== Research Conclusions ==

From our research into user needs, coral reef needs, underwater robotics in general and current underwater robots already used in practice, we can draw conclusions on what a good robot for coral reef researchers would need to look like.

=== Movement ===

Through interviewing the researchers, we gained insight into the range and method of movement the robot will need. First and foremost, the robot should have two operation modes: tethered live control (teleoperation) and tetherless pre-programmed routes. These two modes ensure that the robot can be used comfortably and for a wide range of uses; a robot that can be used for many different tasks is, after all, a more cost-effective and sustainable solution. The teleoperated mode would ideally be used to explore new sectors and find new sites to scan underwater. The tetherless operation is intended for scanning and photographing known sites, which eliminates the need for an expert operator in order to study those sites.

The researchers we interviewed were asked about biomimicry: robots that take their design and mode of operation from nature, such as the Aqua2 and the JenniFish. Our conclusion is that biomimicry is not the answer for most types of research done on reefs, because most forms of movement these robots use are not nearly as stable and precise as a thruster system can be. The way underwater animals move is inherently different from what this robot needs to do: provide stability for underwater pictures and precise positioning for measurements. We therefore decided that it is not a suitable movement method for our robot.

A multi-propeller system, such as that used by the Scubo 2.0, is much preferred, due to its agility and ability to move fluently in all directions. Furthermore, omnidirectional thrusters can be used to stabilize the robot while taking photographs or scans underwater, and steady, clear scans and photographs are, according to the researchers, among the most important features this robot can have. The thrusters must be powerful enough to counter reasonably strong currents (up to 0.5 knots) so that the robot can remain stable in the water.
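
To make the contrast with biomimetic propulsion concrete, the kind of multi-thruster control preferred here ultimately reduces to a mixing step from desired body motion to individual thruster commands. The sketch below uses an invented four-thruster layout purely for illustration; it is not the Scubo 2.0's actual thruster allocation:

<syntaxhighlight lang="python">
# Toy thruster-mixing sketch: desired body motion -> individual thruster commands.
# The 4-thruster layout and mixing matrix are invented for illustration only.
import numpy as np

# Columns: surge (forward), heave (up/down), yaw (turn). Rows: thrusters T1..T4.
# T1/T2 are horizontal (forward + yaw), T3/T4 are vertical (heave).
MIX = np.array([
    [1.0, 0.0,  1.0],   # T1: left horizontal
    [1.0, 0.0, -1.0],   # T2: right horizontal
    [0.0, 1.0,  0.0],   # T3: vertical
    [0.0, 1.0,  0.0],   # T4: vertical
])

def thruster_commands(surge, heave, yaw):
    """Map normalized motion commands (-1..1) to per-thruster outputs, clipped to -1..1."""
    return np.clip(MIX @ np.array([surge, heave, yaw]), -1.0, 1.0)

# Example: push forward against a current while yawing slightly to hold heading.
print(thruster_commands(surge=0.6, heave=0.0, yaw=0.2))
</syntaxhighlight>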

For controlling upward and downward movement, the method preferred by dr. Meesters and dr. Bos is to handle downward movement with thrusters, while upward movement is facilitated by building the robot to be slightly buoyant, so that it floats towards the surface when the thrusters are turned off. This affects the choice of materials. With this method the robot can float away from the ocean floor without kicking up sand, which could damage the thrusters and spoil the images taken.

=== UI and Control ===

The Sigma7 haptic feedback control device. Image retrieved from https://www.forcedimension.com/products/sigma-7/gallery/55
An exploration of a preferred user interface

The user interface (UI) and control method of an ROV are crucial: without an easy way to send instructions to the robot, the operator cannot carry out the required task efficiently, and in research use cases this could mean poor data collection and wasted time. Currently, most robots use one of a few kinds of input method via a computer interface. Most amateur-oriented robots use on-screen controls on a tablet or smartphone, while most professionally oriented robots use standard gamepads, joysticks, or more specialised systems for a more unique robot operation (such as the sigma.7 haptic controller and foot pedals used to control the Ocean One robot). A standard joystick or gamepad seems to be the most common control input, likely because these devices are widely available, easy to operate, and very cost-effective (a standard Xbox One controller costs 60 euros (Xbox)). On-screen controls seem to be missing from most serious professional ROVs. The user interfaces of these robots follow a similar feature list, though the on-screen organization varies slightly from system to system. Features commonly found on their interfaces include a live camera feed, speedometer, depth meter, signal strength, battery life, orientation of the robot, compass, current robot location, and other less relevant features on some systems.


The system used by the researchers we interviewed uses an Xbox One controller to navigate the robot; they had little to say on this matter, as neither of them is an avid gamer or has used this robot yet. This leads us to believe that the specific control input method is less relevant in a general sense; what matters is a standardized system such that any user can plug in the controller of their choice. This not only lets users work with their preferred hardware, but also lets them reuse existing hardware rather than spending resources on another item. Each person has their own preferred way of controlling the robot (which becomes harder when operating from a moving vessel on the water) and will therefore be more comfortable controlling it. Furthermore, we discussed the UI with the researchers, and the conclusion was that the less busy the display, the easier the system is to use. The most important features, we concluded, are the depth meter, the compass and orientation display, and the current location of the robot. According to the researchers, the most important thing when controlling a robot underwater is knowing where you are at all times (this is why underwater localization systems are so complex, expensive, and crucial to these operations). Finally, there should be some way of indicating the scale of a picture, i.e. how far away from the corals it was taken, so that their size can be assessed.
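
The "plug in any controller" idea sketched above can be supported by keeping the robot-side input layer controller-agnostic: whatever library reads the gamepad only has to deliver normalized axis values. A minimal sketch (the function names, deadzone and scaling are assumptions for illustration):

<syntaxhighlight lang="python">
# Sketch of a controller-agnostic input layer: any gamepad/joystick library that
# yields normalized axis values (-1..1) could feed this function.
def axes_to_motion(raw_surge, raw_heave, raw_yaw, deadzone=0.1, max_speed=0.5):
    """Apply a deadzone and scale stick axes to motion commands."""
    def shape(value):
        if abs(value) < deadzone:
            return 0.0
        # Rescale so the output ramps smoothly from 0 just outside the deadzone.
        sign = 1.0 if value > 0 else -1.0
        return sign * (abs(value) - deadzone) / (1.0 - deadzone) * max_speed
    return shape(raw_surge), shape(raw_heave), shape(raw_yaw)

# Example: small stick deflections inside the deadzone are ignored,
# which helps when operating from a moving vessel.
print(axes_to_motion(0.05, -0.8, 0.3))
</syntaxhighlight>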

In the movement section, two operation methods were discussed: tethered live control and tetherless pre-programmed operation. For the tetherless operation, the UI is just as crucial, for two distinct reasons. First, the user should be able to easily program the path and instructions the robot has to follow. Most marine researchers are not also programmers, so the path-programming software needs to be very intuitive and should primarily let the user set the area of operation and the type and size of the photographs (or other readings) the robot will take. Second, this software would be used for monitoring purposes: it should ideally have the same features as the tethered operation setting, minus the live control, with the small addition of a progress meter to indicate how far along the route the robot is.
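
A sketch of how such a pre-programmed survey might be captured as structured data that a simple form-based tool could fill in (all field names and values are illustrative assumptions, not a specification):

<syntaxhighlight lang="python">
# Illustrative mission description for a pre-programmed reef survey.
from dataclasses import dataclass

@dataclass
class SurveyMission:
    site_name: str
    corner_sw: tuple          # (lat, lon) of the survey area's south-west corner
    corner_ne: tuple          # (lat, lon) of the north-east corner
    depth_m: float            # altitude band to hold above the reef
    photo_interval_m: float   # distance between photographs along the path
    photo_footprint_m: float  # approximate side length covered by each photo
    notes: str = ""

mission = SurveyMission(
    site_name="Example reef transect",
    corner_sw=(12.0000, 93.0000),
    corner_ne=(12.0010, 93.0010),
    depth_m=5.0,
    photo_interval_m=2.0,
    photo_footprint_m=1.5,
)
print(mission)
</syntaxhighlight>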

=== Materials and Shape ===

Before the user interview, we had assumed that the shape of the robot would be very important in letting it safely move near the reefs. After the interview, it became clear that the robot will not be getting that up close and personal with the corals, so the shape does not need to be adapted to this. Instead, the ease of movement and the option of putting various components, like extra cameras or sensors, on the robot is much more important. A simple, squarish shape like that of the Scubo or the OpenROV is good enough. When it comes to specific guidelines for coral reef research robots, shape is not a major concern, so there is no strong recommendation to give here.

For the materials, the main points are that they should stand up to salt water and that the final robot should be naturally slightly buoyant, allowing it to float up slowly without needing to use its thrusters near the ocean floor. This limits the use of metal: stainless steel is quite resistant to salt water, but its density (about seven times that of salt water) means it cannot be used in large quantities. The use of plastics, with the specific kind depending on the hardness and flexibility required in each part of the robot, is therefore more viable for large parts of the robot; the exact choice of plastic is no longer part of the guidelines specific to coral reef research robots.
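
As a small worked example of the density argument above (a rough sketch; the component list and masses are invented, and the densities are textbook approximations):

<syntaxhighlight lang="python">
# Quick buoyancy check sketch: is a candidate material mix slightly positively buoyant?
SEAWATER_DENSITY = 1025.0  # kg/m^3

# (mass in kg, density in kg/m^3) for each assumed component of the hull.
components = [
    (2.0, 7900.0),   # stainless steel fasteners and frame parts
    (6.0, 1150.0),   # ABS-like plastic hull sections
    (1.8, 400.0),    # syntactic foam for extra lift
]

total_mass = sum(m for m, _ in components)
total_volume = sum(m / rho for m, rho in components)
average_density = total_mass / total_volume
print(f"Average density: {average_density:.0f} kg/m^3 "
      f"({'floats' if average_density < SEAWATER_DENSITY else 'sinks'})")
</syntaxhighlight>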

In general, the shape and materials of the robot turned out to not need to be changed that much for the coral reef application, since the safety of the corals will largely be covered by the obstacle avoidance systems. Some of the more unique designs, like the JenniFish and the Ocean One, are very interesting but do not give any benefits for the functions that our users want.

=== Power Source ===

In the robots we researched, power was supplied either by batteries, a tether, or a combination of both. There are different kinds of tethers: some only provide power, others transfer data, and some do both (Kohanbash, 2016). The most commonly used batteries are lithium-based. As mentioned in the online guides from OpenROV, one should be cautious when using lithium batteries as they can be dangerous (OpenROV, n.d.). During the interview, the researchers mentioned that something once went wrong while charging a battery and caused a fire on a ship.

As described in the movement chapter, the robot we are designing operates in two settings. In the first, the user programs a path via software on a laptop, uploads the path, and the robot autonomously moves along it. In this case, batteries alone suffice to provide power. Without a tether, the robot can move closer to the reefs without the risk of getting entangled. In the second setting, the user controls the robot via a controller and real-time video feedback. In that case, a tether is needed for the data transfer, but power can still be supplied by the batteries alone.

During the interview, the possibility of incorporating both settings was discussed. The researchers got very excited about this idea and the opportunities it provides for research. As mentioned in the movement chapter, having both functions would make the robot unique.

=== Additional Features ===

Collision detection is an important feature to keep the robot safe around the coral reefs. As described in the general underwater robots research section and demonstrated by the RangerBot, obstacle avoidance for obstacles in the direction the camera is facing can be done with visual odometry. Even with a separate camera for movement control in addition to the one taking the pictures, cameras still cannot cover all sides of the robot; particularly in the tethered, exploratory use of the robot this could be a problem. In the interview it was suggested that sonar could be used for simple obstacle detection when the goal is merely to avoid getting stuck directly under something, so sonar could be used for detecting obstacles directly above the robot.
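
A minimal sketch of this "don't get stuck under something" check, using a single upward-facing sonar range reading (the clearance threshold and command names are assumptions for illustration):

<syntaxhighlight lang="python">
# Illustrative overhead-clearance check based on an upward-facing sonar range.
MIN_OVERHEAD_CLEARANCE_M = 1.0

def overhead_check(upward_range_m):
    """Return a simple avoidance command based on the upward sonar range (metres)."""
    if upward_range_m is not None and upward_range_m < MIN_OVERHEAD_CLEARANCE_M:
        return "stop_and_descend"
    return "continue"

print(overhead_check(0.6))   # too close to an overhang -> "stop_and_descend"
print(overhead_check(4.2))   # clear water above -> "continue"
</syntaxhighlight>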

The M-AUEs use a localization method very similar to the one described in the general underwater robots research section. The difference is that Kussat et al. (2005, p. 156) use transponders located on the seafloor, whereas the M-AUEs receive the acoustic signal with their built-in hydrophones and respond to floating moorings. The moorings send out 8–15 kHz signals at a time interval of 2 seconds, localizing the M-AUEs with an accuracy of ± 0.5 km (Jaffe et al., 2017), whereas the method using transponders on the seafloor achieves an accuracy of ± 1 m (2-σ) in the horizontal positioning of the AUVs (Kussat et al., 2005, p. 156). Which method is best depends on the type of research the robot will be used for. When globally analyzing the state of coral reefs, for example, the exact location where images were taken is less important than when repeated research is being done in one area, since in that scenario you need to know exactly where each picture was taken.

A feature that is not mentioned for any of the robots, and also did not come up in our general research on underwater robots, is a sensor for measuring the topographic complexity of the coral reefs. In our interview, it became clear that this is a much-desired feature. Because its importance was only discovered later in the project, we could not research it in detail, and the implementation of such a sensor would need to be investigated further. The chain-and-tape method that dr. Meesters described in the interview (where a chain is laid over the corals so that it follows the grooves between them, and the total distance is then measured with a tape) would likely not work for the basic robot we have outlined so far, since it would need ways of manipulating the chain, marking where the measurement starts and ends, and retrieving the chain in such a way that the measurement can be completed. While this might be a feasible module to add onto the robot, it could not be part of the basic design.

Alternative options are giving the robot a way to detect the highest and lowest point within an area, for which it would only need to measure the distance to a single point; this could be accomplished with sonar or laser distance measurement. A final option is to use a technique similar to the multibeam echosounder that ships use to map the topography of the ocean floor. The validity of these options, and how they are affected by having a robot that can move close to the reefs, would need to be investigated further.
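
One way to make these alternative options concrete: if the robot can record a depth profile along a transect (for example with downward-facing sonar), a digital analogue of the chain-and-tape rugosity index can be computed directly. This is a sketch under those assumptions, with made-up sample data, not a validated method:

<syntaxhighlight lang="python">
# Sketch of a digital chain-and-tape analogue: a rugosity index computed from a
# depth profile measured along a transect. Sample data and spacing are illustrative.
import math

depths_m = [3.0, 3.4, 3.1, 2.8, 3.6, 3.9, 3.2, 3.0]  # depth below robot along the transect
spacing_m = 0.5                                       # horizontal distance between samples

# "Chain" length: sum of the slanted segment lengths following the profile.
chain = sum(math.hypot(spacing_m, depths_m[i + 1] - depths_m[i])
            for i in range(len(depths_m) - 1))
# "Tape" length: straight-line horizontal distance.
tape = spacing_m * (len(depths_m) - 1)

print(f"Rugosity index: {chain / tape:.2f}")  # 1.0 = flat, higher = more complex
</syntaxhighlight>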

=== Overview Table ===

[[File:16.jpg | 1000px]]

== Final Results ==

=== Guidelines ===

The main result of our project is an understanding of what coral reef researchers need from a robot to help in their research, and of where those needs are not yet being met by the robots currently on the market. The results of this work could be used by coral researchers who want to know where the robots they might invest in are lacking, but also by the people developing these robots, to see which features and attributes are worth investing in to make such a robot more appealing and useful. As this project was very much built on collecting and organising information, not all of the useful insights we have written down can be translated into concrete development guidelines, but the main takeaways can be summarised as follows.

  1. For the robot to be able to move around a fragile environment like the coral reefs, it must have reliable obstacle avoidance methods built in. It is worth considering the use of cameras and visual obstacle detection methods for this, since coral reefs are uniquely suited to this method of obstacle avoidance.
  2. If the robot uses thrusters to move, it should not kick up sand when moving up and away from the ocean floor. One of the ways of avoiding this is by making the robot slightly naturally buoyant, which will allow it to float up if the engines are turned off. Therefore, the density of the materials chosen to build the robot should be slightly below the density of salt water, though the materials should also be resistant to salt water corrosion.
  3. The main application the robot will likely be used for is taking pictures of coral reefs that are being studied to track their development over time. A camera and lights are therefore needed to ensure the pictures are of high quality. The pictures must be consistent and useful, so the angle of the camera should not change, and the scale at which the pictures are taken must be recorded, as well as the precise location where they were taken. Ideally, there would also be a system for tracking the height of the corals, so that the topographic complexity can be studied.
  4. If the robot is only going to be moving along known sites of study, it does not need live control functionality. It can follow pre-programmed paths with a few set parameters, such as how often it should take a picture and how big the area covered in each picture should be. This method of control is preferred to real-time user control in this application.
  5. However, the additional option of real-time user control is a very desirable additional feature. The robot will need to be tethered in this case so that live video footage is possible, and users will likely want more detailed environmental information like simple sonar-based obstacle detection above the robot. This would allow the robot to be used to explore unknown environments.
  6. When the robot is being used for research, it is important to know the robot's location so it is known where the images have been taken. Since GPS does not work underwater, acoustic signals should be used instead to determine where the robot is.
  7. For ease of use, particularly while the user is on a boat, the interface should be as clean and simple as possible. There should be a live camera feed if the robot is being controlled live, as well as depth meters, the compass and orientation display, and the current location of the robot.
  8. The robot will be bought by universities and organizations, not individuals. To make the robot more appealing to these buyers, it is better to give it some basic functionalities and the ability to add modular components to give it extra applications, rather than build it from the ground up for just a single application.

=== Infographic ===

[[File:Infog5.png | 1000px]]

== Discussion ==

Although there are already quite a few underwater robots out there, a robot built according to our guidelines will be one that is suitable for research, and more specifically for research on coral reefs. Getting in touch with two experts during this project allowed us to take the perspective of the user and understand what needs to be considered when designing such an underwater robot. Since there is only so much one can find online about the functionality of products, collecting this information by interviewing the users was essential. To gather as much valuable information as possible from the interview, it was important to do extensive research beforehand; by knowing what already exists and by being able to envision ourselves interacting with the robots, we could ask higher-quality questions.

The timing of this work, however, was somewhat underestimated. As the project lasted only 8 weeks and time was spent at the start finding a concrete research question, the interview took place later in the process than initially hoped, and we were only able to interview two researchers, which limits the generality of our observations. This, together with the sudden need to change the form of our research due to COVID-19, took away the opportunity to realise our findings in a working robot, which had been our original intention for validating the design, and a lot of early effort had gone into planning how we would build this physical prototype. However, the research that was eventually conducted resulted in much more thorough guidelines for how such a robot should be made. These guidelines and the corresponding research can serve as a basis for future work.

== Process ==

Since the project changed direction a few times, here is an overview of the process. Due to shifting priorities over the course of the project, not everything we did ended up being relevant to our final research.

=== Phase 1 ===

  • Discussed our strengths, weaknesses and interests in order to find out what we want to do during the project
  • Decided that we wanted to focus on coral reefs, a topic with real global importance and that would hopefully be specific enough that we could focus on good design for a specific environment
  • Wanted to make (parts of) a physical robot

=== Phase 2 ===

  • Did a lot of research into coral reefs, what threatens them and how they are being helped. Some of this research ended up in the problem background and general coral reef information sections, but most was never developed in enough detail to be added to the wiki. It was an exploration of the topic
  • Came across the topic of acoustic enrichment; a lot of research was done into how it works exactly and how we could use a robot to assist with it
  • Decided to make a robot which contains a speaker

Robot which contains a speaker.

  • Specified the parts of the robot we would be working on: movement and motor system, design and user interface

Robot which contains a speaker.

  • Made sketches to iterate on the shape and design

Shape sketches.

  • Did research on user interfaces and controllers
  • Built a small physical prototype of the way the movement of the robot could be controlled

Shape sketches.

=== Phase 3 ===

This phase is elaborated, discussed and explained on this wiki.

  • The acoustic enrichment focus was let go when it became clear that work in that area would mean we would have to focus more on collecting and analyzing audio databases, which is not what interested us or where our specialties lie. Instead, we decided to focus on making a robot suitable to the coral reefs, with the application still being somewhat open. The researcher can add modules according to their needs
  • Decided to step away from actually building the robot and focus more on making guidelines on what the robot should be able to do and which components it needs. This decision was taken due to the narrow time frame (ordering components and waiting for their delivery would take up about a week). Moreover, coming together to work on the prototype was no longer possible due to the COVID-19 virus and restrictions from the TU/e and government
  • Did a lot of research on current robots. Individual robots were studied based on their valuable aspects.
  • Did research into collision detection and localization, since there was a need to really understand how they work before drawing conclusions about these areas.
  • Did user research. An interview with researchers was done to get user needs.
  • Finally, combined the research into general underwater robots and coral reefs from earlier in the project with the information we had received from the interview and research into existing underwater robots to make guidelines for robots useful for coral reef researchers

== References ==

  • Haereticus Environmental Laboratory (n.d.). "Protect Land + Sea Certification", Haereticus Website, retrieved March 2020.
  • Hawkins, J.P., & Roberts, C.M. (1997). "Estimating the carrying capacity of coral reefs for SCUBA diving", Proceedings of the 8th International Coral Reef Symposium, volume 2, pages 1923–1926.