PRE2024 3 Group4
|-
|Jarno Peters
|1899198
|-
|Simon B. Wessel
|1755956
|-
|Javier Basterreche Blasco
|1700189
|-
|Matei Razvan Manaila
|1833006
|}
== '''Introduction''' ==
For this project, we attempted to develop, step by step, a product made to improve swimming technique. The project has been divided into stages that logically build on each other. We start by defining the problem statement and our objective, which helps us outline the goals for the project together with the approach and milestones to that end. After this, the most important question needs to be raised: is there a demand for this product? To begin with, a few competitor products are investigated together with different user groups to identify our target group. From this point, we carry out interviews and conduct surveys to get a thorough insight into the user needs. The requirements are then determined quantitatively, seeing as they are largely based on the user research; a detailed discussion of this is given. Once we know what the product should look like and what it should be able to do, we can start quantifying what 'good and bad technique' is and, accordingly, start working on the code. We end the project with an outline of the feedback system.
== '''Start of project''' ==
=== '''Problem statement and objective''' ===
Ever since the start of the Covid pandemic, the demand for online coaching has grown rapidly<ref name=":82">Analytica, A. (2023, 7 March). ''Sports Coaching Platforms Market Size, Share | Report, 2031''. <nowiki>https://www.astuteanalytica.com/industry-report/sports-coaching-platforms-market</nowiki></ref>. One of the main factors behind this rise is the flexibility that online coaching provides, enabling technique feedback to swimmers at any point throughout the day<ref name=":82" />. The trade-off, however, is that these online coaching platforms are not exactly cheap. For instance, the biggest online swimming community, MySwimPro, charges a base monthly rate of $297 for online coaching<ref>MySwimPro. (n.d.). ''Swimming strategy call''. <nowiki>https://academy.myswimpro.com/apply</nowiki></ref>, and this is not a stand-alone case. Other companies such as Train Daly or SwimSmooth charge $200 and €167 respectively for just one hour of video feedback<ref>''SWIM SMOOTH | 1-to-1 Video Analysis & Stroke Correction for Adults | Swim Smooth''. (n.d.). Swim Smooth. <nowiki>https://www.swimsmooth.com/videoanalysis</nowiki></ref><ref>''Virtual Swim Video Technique Analysis - Dan Daly''. (n.d.). Calendly. <nowiki>https://calendly.com/daniel-daly/swim-video-technique-session-at-your-pool-clone?month=2025-04</nowiki></ref>. Our objective is to counter this through the design of a motion capture suit that provides an affordable alternative while retaining the flexibility of online coaching.
The suit should be able to accurately track the swimmer's motions, use this data to identify errors, and provide clear feedback to the swimmer.
=== Approach ===
We started by researching which kinds of sensors our suit should have, and decided that accelerometers and gyroscopes are the best option, both in price and in practicality. These sensors are placed at the shoulders, elbows and wrists for the arms, and at the hips, knees and ankles for the legs; lastly, one sensor is placed on the back. With the help of the data these sensors deliver, a skeleton can be created that is used to analyse the posture and movement of the swimmer.

Since the actual physical development of such a product would reach beyond the time we have in this quartile, we decided to make a theoretical product, but with working software. To obtain the data we would get from the suit, we borrowed a motion tracking device from the university.
=== '''Milestones and deliverables''' ===
These were the original milestones and deliverables established at the start of the project.
Due to the time frame and the scope of the course, a full body suit is likely not feasible. To have something to show at the end of the course, a prototype will be built for one arm. There are also multiple swimming strokes; for this project, the focus will be on the front crawl.
# Construct a program that can differentiate between correct and wrong technique. Some technique errors may be distinguished manually; others might require some simple applications of AI. One method would be gathering data for wrong and correct arm motion, extracting simple features from the data, like the minimal and maximal angle of the elbow joint, and training a simple decision tree.
# (bonus) If there is some time left, it may be possible to also write a program for a different swimming stroke, like the backstroke or the butterfly.
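The decision-tree idea in the first milestone can be illustrated with a small sketch. The code below (in Python, while our actual scripts are written in matlab) trains a depth-1 decision tree (a 'decision stump') on a single feature, the minimal elbow angle per stroke; all feature values and labels are invented for illustration, not measured data.

```python
# Sketch: classify strokes as correct/incorrect from one simple feature,
# the minimal elbow angle over a stroke (degrees). The data below is
# made up; a real system would train on labelled recordings.

def train_stump(samples):
    """Learn the threshold on a single feature that best separates the
    two classes (equivalent to a depth-1 decision tree)."""
    best = None
    values = sorted({f for f, _ in samples})
    for i in range(len(values) - 1):
        thr = (values[i] + values[i + 1]) / 2
        # predict 'correct' above the threshold and count mistakes,
        # allowing the polarity to be flipped
        errors = sum((f > thr) != label for f, label in samples)
        errors = min(errors, len(samples) - errors)
        if best is None or errors < best[1]:
            best = (thr, errors)
    return best[0]

# toy data: (min elbow angle over the stroke, stroke was correct)
data = [(85, True), (95, True), (100, True),
        (60, False), (55, False), (70, False)]
threshold = train_stump(data)   # strokes above this angle count as correct
```

A full implementation would use more features (e.g. maximal angle, stroke duration) and a deeper tree, but the splitting logic stays the same.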
However, we quickly realised that the technology for underwater motion capture suits already exists. We therefore decided to work on something that hasn't been explored: how to create a feedback algorithm based on the data generated by the suit. Our goal is to design a feedback algorithm that, given certain data, can identify errors in the swimmer's technique and give understandable feedback on how to improve. To emphasise that our product is not purely theoretical and could actually be used in a real-world application, a design is also made for a motion capture suit that could generate the data our software analyses.
== '''The users''' ==
The product will be aimed at individual amateur swimmers. These could be swimmers who have never been part of a swim team, have quit their swim team, or are part of a swim team but feel the coaching they receive is inadequate (according to our own survey, discussed later, many swimmers fall into this category). To make proper use of the product, it is important that the swimmer has some experience in swimming and understands the basic concepts; the product is not meant to replace beginner swimming lessons.
Some requirements/considerations for a system aimed at these swimmers would be:
-The system should be affordable, at least relative to online coaching
-The system should be accurate in tracking the swimmer's motion, but doesn't require millimeter precision
-The feedback given by the system should be easily understandable, and available as soon as possible
-The system should be comfortable and not hinder the swimmer's natural swimming motion
-The motion capture suit should be a sensor system and not an optical system, as individuals can't simply put up cameras in a swimming pool
-Most current motion capture systems require a laptop for data processing. Would swimmers be comfortable bringing a laptop to the pool?
To gain more insight into the specific user requirements, two interviews were conducted and a survey was spread online. The full transcripts of the interviews and the exact survey can be found in the appendix; the main results are summarized below.
=== '''User interviews''' ===
Summary interview 1: This interview was done with a very experienced swimmer who is already part of a swimming team (so not necessarily our target audience). However, the swimmer did indicate a feeling of not getting adequate feedback at his team, and indicated an interest in trying out our product if available. In terms of how to receive feedback, the swimmer prefers a video to watch after he is done swimming, paired with some form of real-time feedback, preferably haptic. The swimmer finds comfort and a low price the most important, while ease of use is not an issue. The swimmer would be willing to pay between 50 and 100 euros, but acknowledges that this might not be realistic.
Summary interview 2: This interview was done with a relatively experienced swimmer, who is already part of a swimming team. The swimmer indicates that there is a lack of personalized feedback at his club. For the feedback, the swimmer would like an animation after he is done swimming, and an earpiece for real-time feedback. The swimmer considers comfort the most important factor for such a system, as having to swim differently due to the sensors would go against the purpose of the system. The swimmer would be willing to pay 150 euros for the system.
=== '''User survey''' ===
After reaching out to multiple Instagram accounts in the hope that some of them would share the survey with their audience, the account 'SwimDepth' ended up sharing it. In total, 25 people responded to the survey. The respondents are almost all very motivated swimmers, with 92% swimming at least 3 times a week, and 56% even swimming more than 5 times a week. The respondents are also quite experienced: 56% have swum for more than 5 years, while the other 44% have been swimming for between 1 and 5 years. 92% of swimmers have made use of in-person coaching (which probably just means that they are part of a swim club); only 8% have made use of personalised online coaching. The main challenges encountered in coaching were inadequate feedback (13 votes), high cost (11 votes), and it being hard to find good trainers (9 votes). The most important factors influencing the respondents' decision to buy the suit were comfort (19 votes), accuracy of feedback (18 votes), and price (17 votes). Durability and ease of use were found less important, with 11 and 10 votes respectively. As to the preferred method of feedback, 52% picked visual feedback, while audio and haptic feedback received 32% and 12% of the votes. The respondents were split on the maximum price for the product, with 52% willing to pay between $100 and $500, and 40% only willing to pay below $100. 64% of respondents indicated that having to bring a laptop to the pool would not be a dealbreaker for them. There doesn't seem to be a relation between how often the swimmers swim and the price they are willing to pay for the system.
== '''Requirements''' ==
Based on the user survey and the interviews, we can further specify the user requirements. First off, comfort was mentioned as the most important factor when designing the motion capture system. To ensure the system is comfortable, the sensors must be as light as possible and must be attached to the swimmer's joints in a way that does not restrict the natural swimming motion. Accuracy was also found to be important, so the sensors used must be of relatively high quality. Ease of use was found to be less important, meaning the calibration time of the system can be fairly long. In terms of feedback, there was a preference for an animation that can be watched back after swimming, paired with some form of real-time feedback. The maximum price point for the system seemed to be around $500, but a significant number of swimmers would prioritize an even lower price. These user requirements lead to the following technical requirements:
{| class="wikitable"
| -
|'x' can be any number
|-
|Robustness
|-
|Price
|Max. 400 euro
| +-25
|Based upon user research and similar products/production cost
|}
'''<u>References specifications:</u>'''
<u>Dimensions</u>: If the 'reference' sensor is placed on the back (the steadiest part of the body while swimming), it may not extend too far from the body, because we do not want to interfere with the flow of water around the swimmer to any great extent. (We assume the specification of 'shape' to be incorporated here already.) The other two dimensions may be extended more because of their reduced effect on the streamline, but there is a limit to this, which is correlated with the weight. We want the device to 'perfectly float' in the water, so that it is practically weightless in use. To calculate the required volume of the device given its mass, we do the calculation shown in figure 1. We conclude that if the mass is 0.5 kg and the height 3 cm, then length*width = 0.0167168 m^2 lets the device perfectly float in water. As explained in 'Shape', it is desirable to reduce the width (W) as much as possible.
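As a worked check of the calculation in figure 1: the device floats 'weightlessly' when it displaces its own mass in water, so volume = mass / ρ. Using ρ ≈ 997 kg/m^3 for water reproduces the length*width value quoted above.

```python
# Neutral-buoyancy sizing check for the back sensor housing.
rho_water = 997.0    # kg/m^3, density of water (approximate)
mass = 0.5           # kg, assumed device mass from the text
height = 0.03        # m, chosen device height from the text

volume = mass / rho_water    # m^3 displaced at neutral buoyancy
area = volume / height       # remaining length * width, in m^2
```

This yields area ≈ 0.0167 m^2, matching the figure in the text.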
<u>Shape</u>: The shape of the device is crucial if we want to minimise the drag added to the swimmer. The general formula for the drag force is F<sub>d</sub> = (1/2)*ρ*v<sup>2</sup>*c<sub>d</sub>*A, with ρ the density of water, v the flow velocity relative to the object, c<sub>d</sub> the drag coefficient and A the frontal surface area. The frontal surface area can be reduced further by minimising W in the equation under 'Dimensions'. The only variable left that the design has influence over is the drag coefficient. For a 'general' swimmer it is roughly 1.1-1.3<ref name=":7">https://swimmingtechnology.com/measurement-of-the-active-drag-coefficient/<nowiki/>, Swimming Technology Research, Inc., Measuring the Active Drag Coefficient</ref>. We want our product to be at least as streamlined as a swimmer, so we have chosen C<sub>d</sub> to be less than 1.2. The device therefore increases the total drag at most proportionally to that of the swimmer without the device.
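The drag formula can be evaluated for an example configuration; the swimming speed and frontal area below are assumed example values, not measurements.

```python
# Illustrative evaluation of F_d = (1/2)*rho*v^2*c_d*A.
rho = 1000.0   # kg/m^3, density of water
v = 2.0        # m/s, a brisk freestyle pace (assumed example value)
c_d = 1.2      # the upper design limit chosen for the device
A = 0.003      # m^2 frontal area (assumed, e.g. 10 cm x 3 cm)

F_d = 0.5 * rho * v**2 * c_d * A   # drag force on the device, in newtons
```

For these example numbers the device adds a drag force of about 7 N at race pace, which shows why W (and hence A) should be kept small.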
<u>Calibration</u>: The calibration should be able to be completed alone and within 1 minute after putting on the sensors.
|111,38 | |111,38 | ||
|}
{| class="wikitable"
|+Budget for 250 units at 400 euro apiece
!subject
!cost
!income
!subject
|-
|development
|26330
|5000
|sponsoring/start-up
|-
|workspace
|4403
|82644
|product cost
|-
|work
|25826
|
|
|-
|materials
|23012
|
|
|-
|contingencies
|7957
|
|
|-
|Profit
|114
|
|
|}
For 250 units sold at 400 euro each in one year, this product would break even. From then on, the profit margin would increase linearly.
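Summing the columns of the budget table confirms the break-even claim: income and costs cancel to within a couple of euros of the listed profit (the small remaining difference is rounding in the table itself).

```python
# Cross-check of the budget table at 250 units: costs vs. income.
costs = {"development": 26330, "workspace": 4403, "work": 25826,
         "materials": 23012, "contingencies": 7957}
income = {"sponsoring/start-up": 5000, "product sales": 82644}

total_costs = sum(costs.values())
balance = sum(income.values()) - total_costs   # should be roughly zero
```

The balance comes out at just over a hundred euros on a turnover of roughly 87k, i.e. the project breaks even in year one, as stated.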
=== Calibration ===
https://base.movella.com/s/article/Body-Dimensions?language=en_US#Offline</ref>. For our suit we will now focus on this partially adopted 'fast' method. Human body proportions are not exactly the same for everyone; however, there are tables with the relative lengths of body parts<ref>Drillis, R., Contini, R., & Bluestein, M. (1964). Body Segment Parameters: A Survey of Measurement Techniques.</ref><ref>Harless (1860), The Static Moments of the Component Masses of the Human Body. <nowiki>https://apps.dtic.mil/sti/tr/pdf/AD0279649.pdf</nowiki></ref>.
The proportions mentioned in table 1 of this paper can be used: only the total height of the person has to be measured, and the other important lengths can be approximated from it. The angles between the sensors also have to be calibrated. This can be done by agreeing on a (convenient) pose where the angles are known (for example: arms straight up with hands above the shoulders -> elbow angle = 180, shoulder angle wrt the xy plane = 0, shoulder angle wrt the xz plane = 0; if the feet are placed right under the hips: knee angle = 180, hip angle wrt the xy plane = 0, hip angle wrt the xz plane = 0, shoulder angle wrt the xy plane = 0). We might, just as the Xsens suit does, include an option to insert all 11 measurements (2x lower leg, 2x upper leg, 2x lower arm, 2x upper arm, hip width, shoulder width, hip-shoulder height).
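The 'fast' method can be sketched as a lookup of segment lengths from the measured total height alone. The ratios below are approximate illustrations in the spirit of the cited body-proportion tables, not values copied from them; a real implementation should take the exact ratios from the paper.

```python
# Sketch of the 'fast' calibration: approximate segment lengths from
# total height using fixed body-proportion ratios. The ratios here are
# illustrative placeholders, not the exact values from the cited table.
PROPORTIONS = {
    "upper_arm": 0.186, "forearm": 0.146,
    "thigh": 0.245, "shank": 0.246,
    "shoulder_width": 0.259, "hip_width": 0.191,
}

def segment_lengths(height_m):
    """Return approximate segment lengths (m) for a given body height."""
    return {name: ratio * height_m for name, ratio in PROPORTIONS.items()}

lengths = segment_lengths(1.80)   # e.g. a 1.80 m tall swimmer
```

The full 11-measurement option would simply override these estimates with the user's own measured lengths.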
== '''State Of The Art: Hardware''' ==
The following is a summary of the state of the art of motion capture systems for swimming applications. There are two types of motion capture systems that can be used: a camera-based system or a sensor-based system. Both will be discussed.
=== Camera ===<!-- Probably either a mocap system with passive optical elements on critical joints (requires suit) or without any optical elements (no special suit needed).
Challenges camera system: bubbles & splashes, moving with swimmer (rail or multiple cameras), identifying joints and critical locations and separating them (e.g. don't confuse elbow with wrist), multiple angles (above and side view for 3d position). -->
Conclusion cameras: If a camera system is used, it will definitely be a markerless one. The fact that companies are already selling this technology commercially indicates that there is not much for us to add on the hardware side of things.
=== Sensors ===<!-- probably an inertial sensing system, using accelerometers and gyroscopes.
Challenges sensors: drift (for inertial sensors), need to calibrate distance among each other before starting, waterproofing (less relevant), -->
=== Conclusion ===
For both the sensor- and camera-based mocap system there is not much for us to do on the hardware side. For both methods there are companies that have low-error systems that could be used to feed the coordinate data into our software. In the context of a swimming technique suit in swimming pools, the sensor idea is a better fit, as it does not require fixing many cameras over the length of the pool to acquire the data. Given that we will likely focus on the software side of this idea, our choice of hardware does not matter a great deal, as we will assume a (near-)perfect data influx for our software to handle.
== Data Transmission ==
[[File:ESP32circuit.png|left|thumb|263x263px]] | |||
[[File:Blank diagram transmission.png|thumb|371x371px]] | |||
The MPU sensor continuously captures real-time motion data: three-axis accelerometer and gyroscope measurements, representing acceleration and angular velocity respectively. This sensor data is read by the ESP32 microcontroller over an I²C interface at regular intervals. The ESP32 then formats the data into UDP packets and transmits them wirelessly to a UDP server hosted directly on a smartphone connected to the same local network. The smartphone, running the UDP server, listens for incoming packets, decodes them, and applies the computations. Based on the computation's outcome, the program generates an audio response through its built-in text-to-speech engine, delivering direct audio feedback via a Bluetooth headset paired to the phone. At the same time, the data is saved and can be rewatched as a 3D animation after the session.
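A minimal sketch of the phone-side UDP server is given below (in Python for illustration). The packet layout, six little-endian floats (ax, ay, az, gx, gy, gz), is an assumption made for this sketch; it must match whatever the ESP32 firmware actually packs, and the port number is arbitrary.

```python
# Sketch of the phone-side UDP receiver. Packet format is assumed:
# six little-endian 32-bit floats, 3 accelerometer + 3 gyroscope values.
import socket
import struct

PACKET_FMT = "<6f"   # assumed layout: ax, ay, az, gx, gy, gz

def decode_packet(payload):
    """Unpack one sensor sample from a raw UDP payload."""
    ax, ay, az, gx, gy, gz = struct.unpack(PACKET_FMT, payload)
    return {"accel": (ax, ay, az), "gyro": (gx, gy, gz)}

def serve(host="0.0.0.0", port=5005):
    """Listen for samples and hand them to the analysis pipeline."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        payload, _addr = sock.recvfrom(struct.calcsize(PACKET_FMT))
        sample = decode_packet(payload)
        # ...feed `sample` into the error analysis / text-to-speech step...
```

On the real device the decoded samples would be buffered for the 3D replay as well as analysed live.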
== '''Perfect swimming technique''' ==
To be able to give accurate feedback, the algorithm needs some 'perfect' baseline to compare the swimmer's technique with. What follows is a summary of some of the most important factors that determine whether someone's technique is correct or not.
The freestyle stroke can generally be divided into 4 phases<ref>Jailton G. Pelarigo, Benedito S. Denadai, Camila C. Greco,
-Entry & Catch (note that these are sometimes defined as 2 separate phases):
During the entry, the hand enters the water fingertips first - with an angle of 45 degrees with respect to the water surface<ref name=":5">Biskaduros, P. W. (2024, 26 September). ''How to swim freestyle with perfect technique''. <nowiki>https://blog.myswimpro.com/2019/06/06/how-to-swim-freestyle-with-perfect-technique/#comments</nowiki></ref> - around half a meter in front of the shoulder<ref name=":6" />. The hand should enter at around shoulder width (imagine your head indicates '12' on a clock; your arms should then enter at around '11' and '1'<ref name=":6">Fares Ksebati. (2021, 17 March). ''How to Improve Your Freestyle Pull & Catch'' [Video]. YouTube. <nowiki>https://www.youtube.com/watch?v=nD2dZVsrBq4</nowiki></ref>). After entering, the hand should reach forwards and slightly down till the arm is fully extended, staying between 10 and 25 cm below the surface of the water<ref>Swim360. (2018, 10 October). ''Freestyle hand entry - find the sweet spot to speed up'' [Video]. YouTube. <nowiki>https://www.youtube.com/watch?v=3yReSXt9_8Q</nowiki></ref>. While extending the arm, it is also important to rotate towards the same side. The shoulders should rotate somewhere between 32 and 40 degrees<ref>Ford, B. (2018, 27 July). ''How to avoid over rotation in freestyle''. Effortless Swimming.
<nowiki>https://effortlessswimming.com/how-to-avoid-over-rotation-in-freestyle/#:~:text=One%20of%20the%20key%20aspects,degrees%20during%20a%20breathing%20stroke</nowiki></ref>, up to 45 degrees on a breathing stroke. During this rotation, and the entire stroke for that matter, the head should stay stationary and pointed forwards at a 45-degree angle<ref>Petala, A. (2022, 8 November). ''Everything YOU need to know about Freestyle Rotation''. <nowiki>https://blog.tritonwear.com/everything-you-need-to-know-about-freestyle-rotation</nowiki></ref>.
After fully extending the arm, the catch begins. Here, the fingertips are 'pressed down' and the elbow is bent such that the fingertips and forearm point towards the bottom of the pool. It generally holds that the earlier in the stroke you can set up this 'early vertical forearm', the better. It is extremely important that this is done while maintaining a 'high elbow', meaning that if one were to draw a straight line between your fingertips and your shoulder, the elbow would sit above that line. When you are done setting up the catch, the angle of your elbow should be between 90 and 120 degrees<ref>Jerszyński D, Antosiak-Cyrak K, Habiera M, Wochna K, Rostkowska E. Changes in selected parameters of swimming technique in the back crawl and the front crawl in young novice swimmers. J Hum Kinet. 2013 Jul 5;37:161-71. doi: 10.2478/hukin-2013-0037. PMID: 24146717; PMCID: PMC3796834.</ref>.
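The 90-120 degree window for the catch can be checked directly from three joint positions. The sketch below (Python for illustration) computes the angle at the elbow from shoulder, elbow and wrist coordinates; the example points are made up.

```python
# Sketch: elbow angle from three joint positions, checked against the
# 90-120 degree catch window mentioned above. Example coordinates only.
import math

def angle_deg(a, b, c):
    """Angle at point b (degrees) formed by the segments b-a and b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def catch_angle_ok(shoulder, elbow, wrist):
    return 90 <= angle_deg(shoulder, elbow, wrist) <= 120

# a 90-degree bend at the elbow: inside the window
ok = catch_angle_ok((0, 0, 0), (0.3, 0, 0), (0.3, -0.3, 0))
```

The same helper can evaluate the other angle criteria (shoulder rotation, knee angle) from the suit's coordinate data.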
== '''Matlab script & visualization''' ==
Two different pieces of software were created for this project: a matlab script that gives feedback based on input data, and a Unity-based game that allows the user to visually inspect their swimming technique. Both programs take a comma-separated table of data (usually a csv file) as input, following this template:
xyz coordinates/vector elements (in that order) for: [right shoulder, right elbow, right wrist, left shoulder, left elbow, left wrist, mid back, right hip, right knee, right ankle, left hip, left knee, left ankle, e_x back, e_y back, e_z back]
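A sketch of reading one such row into named joint positions (Python for illustration; the marker order follows the template above, while the file layout and units are assumed):

```python
# Sketch: parse one comma-separated data row into named xyz triples,
# following the 16-marker template described in the text (48 columns).
MARKERS = [
    "r_shoulder", "r_elbow", "r_wrist", "l_shoulder", "l_elbow", "l_wrist",
    "mid_back", "r_hip", "r_knee", "r_ankle", "l_hip", "l_knee", "l_ankle",
    "e_x_back", "e_y_back", "e_z_back",
]

def parse_row(line):
    vals = [float(v) for v in line.split(",")]
    assert len(vals) == 3 * len(MARKERS), "expected 48 columns per row"
    return {m: tuple(vals[3 * i:3 * i + 3]) for i, m in enumerate(MARKERS)}

# toy row with values 0..47, just to show the column-to-marker mapping
row = parse_row(",".join(str(i) for i in range(48)))
```

The last three triples (e_x, e_y, e_z of the back) form the back sensor's orientation frame rather than a joint position.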
Here is a link to the drive where our matlab files, the used datasets, the feedback audio fragments and some videos are stored, for those interested: https://drive.google.com/drive/folders/1Nkat2n8IxX_gXgsPh2etX_Fn2ceAWQ00
The robotsreal script in the drive is the final matlab script, and the final visualization is in the swimcapapp folder (swimcapappios.app for macOS). Since the collected data contains no real leg data, another script called robotsrealnolegs is in the drive for the demonstration. This script is identical to the robotsreal script, but, as the name suggests, the analysis of all leg errors is removed.
=== Final matlab script ===
The matlab script in its final form takes the given data as input, divides it into periods based on maxima in the x-position of the wrist joint for the arms and maxima in the y-position of the ankle joint for the legs, and checks for technique errors. It does this by analyzing certain angles and normalized lengths of the arms and legs in certain parts (or the whole) of the stroke period. It also checks some general errors, like the phase difference between arms and legs or the overall tilt of the swimmer. After normalizing the occurrence of errors, it picks the most frequent one, or the one with the highest priority if multiple errors are tied. It then plays a sound based on the specific error made, giving appropriate feedback to the swimmer. If the error is improved upon, a positive sound effect plays. However, if the error is not improved upon for more than 5 times in a row, the suggestion is made to look at the visualization.
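The period-splitting step can be sketched as follows. This is illustrative Python rather than the actual matlab implementation; the real script applies the same idea to the wrist x-position for arms and the ankle y-position for legs:

```python
import numpy as np

def split_into_periods(x):
    """Split a 1-D signal into stroke periods at its local maxima."""
    x = np.asarray(x, dtype=float)
    # Interior local maxima: strictly above the previous sample,
    # at least as high as the next one.
    peaks = [i for i in range(1, len(x) - 1) if x[i] > x[i - 1] and x[i] >= x[i + 1]]
    # A period is the index range between two consecutive maxima.
    return list(zip(peaks, peaks[1:]))

# Three full sine periods give three maxima, hence two complete
# maximum-to-maximum stroke periods.
t = np.linspace(0, 6 * np.pi, 300)
periods = split_into_periods(np.sin(t))
print(len(periods))  # 2
```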
=== Feedback to users ===
We have established 10 different sorts of swimming mistakes that can be made: 4 general, 4 arm-specific and 2 kick-specific. We have asked a trainer at the local swim club De Brabantse Dolfijnen (DBD), who has 5 years of experience in coaching, to order these mistakes from most important (a) to least important (j). We have given each error a short but understandable audio feedback file (Tip1-Tip10). If a user improves on the previous mistake, we can play sound (Top5). If a user makes the same mistake too often, we can play sound (Tip11), which asks the user to look back at their visualised swimming on the computer, which might help to correct the error. If a user is performing some strokes correctly, we can play one of the files (Top1-Top4), which tells the user that the stroke is executed correctly. The audio files can be found here: https://drive.google.com/drive/folders/1IJC0moxTQrO2zlHn6FKTA1iyOh_b4twc?usp=drive_link
{| class="wikitable"
!Name
!Audio sentence
!Priority number
!File name (without .mp3)
|-
|Tip1
|Try keeping your arm strokes in phase
|a
|Tip1
|-
|Tip2
|Try keeping your leg strokes in phase
|d
|Tip4
|-
|Tip3
|Try using more kicks
|i
|Tip9
|-
|Tip4
|Try reducing rotation in the shoulders
|h
|Tip8
|-
|Tip5
|Try performing the entry at shoulder width
|b
|Tip2
|-
|Tip6
|Try keeping your elbow stretched during the entry phase
|f
|Tip6
|-
|Tip7
|Try keeping your wrist at shoulder width during the pull and push phase
|e
|Tip5
|-
|Tip8
|Try reaching further forward during the catch phase
|g
|Tip7
|-
|Tip9
|Try keeping your legs straight during kicks
|j
|Tip10
|-
|Tip10
|Try keeping your kicks vertical
|c
|Tip3
|-
|Tip11
|Maybe it is wise to watch the animation back to get a better visualisation of the error
|
|Tip11
|-
|Top1
|You are doing great
|
|Tip0 (convenient for code)
|-
|Top2
|You are demonstrating true mastery
|
|Top2 (unused)
|-
|Top3
|You are executing this with remarkable finesse
|
|Top3 (unused)
|-
|Top4
|Every stroke you take is a dance of precision and grace
|
|Top4 (unused)
|-
|Top5
|*Short sound effect*
|
|Top5
|}
=== Discussion on error analysis ===
In order to ensure that the script actually works with real data, measurements were performed at the OptiTrack lab on Atlas 9. This is a camera-based method rather than a sensor-based approach, but for the purpose of gathering data this does not matter. Seven markers were placed on our swimmer: 3 on each arm and 1 on the lower back. The swimmer performed some good and some bad takes, and based on this data the performance of the script was improved greatly. The arm error analysis was tweaked so that no significant error values arise anymore for GoodTake2, while they do arise for the rest of the takes (GoodTake1 turned out to be not so good after all).
No testing was done for the legs and their motion, so the leg error detection part of the script remains untested. Before any physical prototype is built, this should be done in a similar fashion to the testing of the arms. One of the errors (Tip4) is that the swimmer wobbles too much while swimming. To detect this error, the gyroscope data from the motion sensors is used. This data is not available for the optical system we used and the way we used it, so the detection of this potential error remains untested as well. For both the leg errors and the wobble error, the method used to calculate the error is most likely sound, but the boundary between acceptable and unacceptable values requires tweaking that can only be done with real data.
One feature of the script needs to be tweaked by performing user testing after attaching the script to some hardware: the Tip11 message that plays when an error is made many times in a row. In the final version of the script, it plays after an arbitrarily chosen 5 repetitions of the same error. User testing can show whether this is a good number of repetitions, or whether it should be higher or lower. The specific error messages can also be tested for how helpful they are, and whether different wording might convey the message better.
=== Final visualization ===
The final visualization is a unity-based app in which you can fly around freely in a 'world' where two models are performing the swimming motion from the data. This way, you can look at the motion from all angles. When opening the app, an input field appears on screen, where the user needs to insert the file path of the data they want to see (the copy-as-path function in Windows makes this easy, although the quotation marks should be removed). When the file path is in the input field, the user presses the start button, causing the input field and the button to disappear and the two models to start performing the swimming motion from the data.
[[File:A snapshot of the game.png|center|900x900px|alt=]]As can be seen in the above image, the two models are performing a swimming motion. The model on the left is a barebones model, consisting only of spheres representing the measured joints and thin cylinders connecting them. The model on the right is a free, pre-made model taken from the unity asset store<ref>Free model used for the visualisation. https://assetstore.unity.com/packages/3d/characters/humanoids/character-pack-lowpoly-free-221766</ref>. The model on the right was added first, but the data would sometimes stretch out the model, distorting it. In order to guarantee a clear visualization, the model on the left was made. Both are still in the app so the user can decide which one they prefer to look at.
Some features and controls:
* Slider: The slider can be moved manually to any frame of the data.
* Auto slide: This function can be toggled on and off with the "auto slide" toggle button in the bottom left (see image). When on, it automatically moves the slider forward and moves the models through the frames of the data. It also automatically resets the slider to the beginning when the end is reached.
* Wobble correction: This function can be toggled on and off with the "wobble correction" toggle button in the bottom left (see image). If the swimmer wobbles their body a lot while swimming, the wobble can be corrected for by a coordinate transformation. For this, the base direction vectors of the central back sensor are used. If wobbling is one of the errors made, this may allow the user to see other errors more clearly.
* Moving the camera: The WASD keys are used to move the camera around. The space key can be used to keep the vertical coordinate constant while moving, which can be convenient.
* Mouse control: The mouse can be used to move the camera, or to interact with the toggles and the slider. To switch between these two mouse 'modes', use the F key.
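The wobble correction amounts to a change of basis into the back sensor's frame. A minimal Python sketch (the app itself is unity/C# and the script matlab; names are ours), assuming the three unit vectors form a right-handed orthonormal frame:

```python
import numpy as np

def wobble_correct(point, back_pos, e_x, e_y, e_z):
    """Express a joint position in the back sensor's own frame.

    The columns of R are the sensor's unit direction vectors, so R.T
    rotates world coordinates into the sensor frame after translating
    by the sensor position.
    """
    R = np.column_stack([e_x, e_y, e_z])
    return R.T @ (np.asarray(point, dtype=float) - np.asarray(back_pos, dtype=float))

# With the sensor yawed 90 degrees, the world y-axis lies along the
# sensor's local x-axis.
p = wobble_correct([0, 1, 0], [0, 0, 0],
                   e_x=[0, 1, 0], e_y=[-1, 0, 0], e_z=[0, 0, 1])
print(np.round(p, 6))  # [1. 0. 0.]
```

Applying this transform per frame removes the swimmer's roll and yaw from all joint positions, which is exactly why other errors become easier to see.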
Some features could be added to the visualization in future development, the most important one being a connection between the feedback script and the visualization app. It may be useful for users to see which errors the software detected in which segments of the data they are viewing. For the current script, it would have been possible to simply copy it into the app and display its error array output for each data segment. This is possible because the current error detection algorithm is quite simple. If more complex algorithms are selected, this method may become too computationally expensive, and it may be better to have the script output a tweaked version of the data, where one or more columns are added to pass on the values of the error array. The visualization app can then be modified to extract the error data from the tweaked data.
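The suggested hand-over of error values through extra data columns could look like this (an illustrative Python sketch; the column count and function name are assumptions):

```python
import numpy as np

def append_error_columns(data, segments, errors):
    """Append one column that tags every frame with its segment's error code.

    data:     (n_frames, n_cols) array of joint data
    segments: list of (start, end) frame index pairs
    errors:   one error code per segment (0 = no error)
    """
    col = np.zeros((data.shape[0], 1))
    for (a, b), e in zip(segments, errors):
        col[a:b, 0] = e
    return np.hstack([data, col])

data = np.zeros((10, 48))  # 10 frames of 48-column joint data
out = append_error_columns(data, [(0, 5), (5, 10)], [3, 0])
print(out.shape, out[2, 48], out[7, 48])  # (10, 49) 3.0 0.0
```

The visualization would then only need to read the last column(s) to know which tip applies to the segment currently on screen.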
== '''Outlook''' ==
Before embodying the software into a physical sensor suit, the leg error detection should be tested using real user data, similar to the way it was done for the arms. It may also be wise to redo the arm error detection based on data from an Olympic (or at least higher-level) swimmer instead of an amateur. After this is done, the script can be embodied in a sensor suit for user testing, either using one that already exists, like the ones from Xsens<ref>Home page of Xsens, a Movella brand of motion capture suits https://www.movella.com/products/xsens</ref>, or building your own using inertial sensors. The latter option is likely cheaper in terms of materials, but the former is easier to implement in a testing stage.
As mentioned previously, the visualization should be connected more closely to the script by using the error data the script generates. Having this data present while looking at a 3d model performing your stroke can be helpful to users, as they may otherwise not know exactly what to look at. The current version of the app is a barebones alpha version, where things are built to display basic functionality, not for ease of use or appearance. It should of course be made prettier and more user-friendly before being released alongside the sensor suit. For example, the file path input at the start should be replaced with a file selection dialog, where the app opens the explorer so the user can easily find and select their file.
Some additional improvements would be to give the algorithm the capability to recognize different strokes and give appropriate feedback for each, and to track the user's stroke rate, something many respondents indicated they would want the system to do. Lastly, another possible feature would be for the system to not only indicate the mistakes you are making, but also recommend specific exercises to fix those mistakes (e.g., if a swimmer consistently drops their elbow during the catch, the system would recommend certain sculling drills to work on this).
So far, the Swimcap only works for Android users who run a local UDP server and are connected to the same Wi-Fi network as the suit. Future improvements could include a better microcontroller that performs the computations itself and transmits the saved data to the phone after the session. Another option is to run a global server that users can connect to without having to host it themselves; this would make the system much easier to use, since the user would not need external Wi-Fi at the swimming pool. The latter option would also not increase the price, weight or complexity of the device.
Overall, a lot of work still needs to be done before Swimcap is ready to hit the market, but a good start was made with this project.
== '''Appendix''' ==
=== Tasks done per person ===
{| class="wikitable"
|+Tasks done per person with duration
!Name
!Total hours
!Tasks
|-
|Simon B. Wessel
|14h
|Attended beginning lecture (2h), group meeting (2h), research literature (8h), summarise articles and papers (2h)
|-
|Javier Basterreche Blasco
|-
|Simon B. Wessel
|14h
|Attended feedback session & group meeting (2h), looked into existing projects using MPUs (4h), downloading software and installing drivers for ESPs (4h), programming ESP (4h)
|-
|Javier Basterreche Blasco
|11h
|Attended meeting (2h), researched project state of the art and potential areas of improvement (4h), researched viable programming languages (5h)
|-
|Matei Manaila
|-
|Simon B. Wessel
|12h
|Attended feedback session & group meetings (2h), researched components needed and their specifications (4h), making virtual circuits with ESP and MPU (3h), thinking of an approach for wireless communication between ESP and UDP server and making a block diagram (3h)
|-
|Javier Basterreche Blasco
|8h
|Looked into potential hardware applications and their data processing (5h), read existing code (3h)
|-
|Matei Manaila
|-
|Simon B. Wessel
|10h
|Feedback session & group meetings (2h), research on component weights and specifications (4h), trying to get hold of a motion tracking system (2h), matlab programming (2h)
|-
|Javier Basterreche Blasco
|12h
|Attended group meeting (2h), attempted to coordinate scheduling of a session with a measurement system (3h), researched past coding implementations (4h), read existing code (3h)
|-
|Matei Manaila
|-
|Jarno Peters
|18h
|Attended feedback session & group meeting (2h), measuring in the OptiTrack lab (2h), processing measurement data (1h), matlab coding (5h), unity coding (6h), documenting progress (2h)
|-
|Simon B. Wessel
|14h
|Attended feedback session & group meeting (2h), investigated OptiTrack software (4h), measuring in the OptiTrack lab (2h), processing measurement data (1h), working on wiki (1h), writing cost function (4h)
|-
|Javier Basterreche Blasco
|11h
|Attended group meeting (2h), measurement taking (2h), investigated OptiTrack software (4h), caught up with code (3h)
|-
|Matei Manaila
|-
|Max van Aken
|10h
|Made mp3 files of feedback sounds (tips+tops) (4h), thought of other helpful/motivational audio cues (tip11+tops) (2h), polished wiki (3h), wrote introduction (1h)
|-
|Bram van der Pas
|11h
|Group meeting (2h), interpreting user survey (3h), justifying made decisions (6h)
|-
|Jarno Peters
|-
|Simon B. Wessel
|14h
|Feedback session & group meeting (2h), understanding coding process (2h), work on wiki (4h), start with presentation (2h), try to get a UDP server to run (4h)
|-
|Javier Basterreche Blasco
|6h
|Group meeting (2h), worked on final wiki deliverable suggestions (4h)
|-
|Matei Manaila
|-
|Max van Aken
|10h
|Feedback session & group meeting (2h), attending presentations (3h), requirements (2h), rewriting wiki (3h)
|-
|Bram van der Pas
|10h
|Feedback session & group meeting (2h), attending presentations (3h), justifying decisions (3h), rewriting wiki (2h)
|-
|Jarno Peters
|17h
|Feedback session & group meeting (2h), finalizing matlab script (4h), preparing demonstration (3h), wiki editing (5h), attending presentations (3h)
|-
|Simon B. Wessel
|16h
|Feedback session & group meeting (2h), making presentation (6h), attending final presentation (3h), making a clean drive (2h), making peer review template (2h)
|-
|Javier Basterreche Blasco
|12h
|Feedback session & group meeting (2h), preparing presentation (7h), attending/giving final presentation (3h)
|-
|Matei Manaila
|7h
|Group meeting (1h30), presentation (3h), presentation prep (2h30)
|}
{| class="wikitable"
!Name
!Hours week 8
!Total hours
!Tasks week 8
|-
|Max van Aken
|5h
|82h
|Wiki writing (5h)
|-
|Bram van der Pas
|4h
|83h
|Rewriting wiki (4h)
|-
|Jarno Peters
|8h
|130h
|Cleaning up the drive (1h), wiki editing (7h)
|-
|Simon B. Wessel
|5h
|99h
|Cleaning up wiki (5h)
|-
|Javier Basterreche Blasco
|6h
|76h
|Rewriting wiki (6h)
|-
|Matei Manaila
|0h
|80h
| -
|}
=== Code progress per week === | |||
Below is the progress of the matlab script and the unity-based visualization per week. Old versions of the matlab script can be found in the old scripts folder inside the matlab folder. There are no old versions of the unity project, but there are some videos of early versions on the drive.
==== Week 3 progress ==== | |||
Currently there are 3 data files in the drive folder. Book1 is a small dataset with random numbers I came up with on the fly to get the basics of importing the data under control. This dataset was also used to get the basic code for calculating angles and distances done correctly. BookSin contains a sinusoidal wave with 2 periods to configure a basic system that selects 1 full period of the data by determining a minimum in the absolute velocity in the x-direction. x is a sine wave, y is a cosine wave, and z is a constant (1). This is the same for all 3 joints, so the distances and angles calculated will yield nonsense when using this dataset. The basic plot also doesn't work. BookCos is similar to BookSin in that it is also periodic, but now the values are chosen to also give a decent plot. The shoulder joint is placed at the origin, the elbow at x=sin(t), y=cos(t), z=0.5 and the wrist joint at x=2sin(t), y=1.3cos(t), z=1 (this book also contains 2 periods in week 3, but in week 4 this will be updated to 12).
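A BookCos-style test set as described above can also be generated programmatically. The following is a Python sketch under our own assumptions (100 samples per period, output file name BookCosLike.csv); the original file was made differently:

```python
import csv
import math

# Shoulder at the origin, elbow and wrist on the circular/elliptical
# paths from the text; 2 periods of 100 samples each.
# Column order: shoulder xyz, elbow xyz, wrist xyz.
rows = []
for k in range(200):
    t = 2 * math.pi * k / 100
    shoulder = (0.0, 0.0, 0.0)
    elbow = (math.sin(t), math.cos(t), 0.5)
    wrist = (2 * math.sin(t), 1.3 * math.cos(t), 1.0)
    rows.append([*shoulder, *elbow, *wrist])

with open("BookCosLike.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
print(len(rows), len(rows[0]))  # 200 9
```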
Right now, the robots1.m script can select one period of data (so one period of the swimming stroke) and isolate the position vectors of the (currently) 3 joints in the data: the elbow, wrist and shoulder joints. It calculates the distances between these 3 joints and several relevant angles. It also makes a simple 3d plot that takes one frame of the data (configurable in the script) and "models" the arm using 2 straight lines. After these functions were added to the script, a matlab app (robots.mlapp) was made from the script with the same 3d plot. This app has a slider, so the user can select a frame. You can also toggle a switch and let the program automatically go through all the frames by moving the slider for you.
==== Week 4 progress ==== | |||
In week 4 the script gets expanded to 2 arms, and some basic error identification is established. To accommodate this, the BookCos csv file is expanded from 2 to 12 periods. It also receives twice the number of columns, with the original 9 being for the right arm and the 9 new ones for the left arm. The 9 new ones are also half a period behind the first 9, just like in an optimal front crawl. For further expansion, a new dataset is used, called BookBody1. It contains the same 18 columns as the BookCos file, but also a new set of coordinates for a central sensor on the back, and 18 new columns for the legs, with each leg having a sensor at the hip, knee and ankle. Beyond this, the normalized direction vectors (the e_x, e_y and e_z unit vectors) of the central back sensor are located in 9 columns. This is mainly useful for the back sensor, as these tell us something about the general tilt of the swimmer (we can also identify whether the swimmer wobbles a lot while swimming). In the BookBody1 file, the vector e_x is normal, but there is a small wobble present in the yz-plane. BookBody2 is the same as BookBody1, but with a more significant wobble in the yz-plane. For both the BookBody1 and BookBody2 data files, the coordinate distribution over the columns is as follows:
xyz coordinates\vector elements (in that order) for: right shoulder, right elbow, right wrist, left shoulder, left elbow, left wrist, mid back, right hip, right knee, right ankle, left hip, left knee, left ankle, e_x back, e_y back, e_z back | |||
The expanded scripts for this week are the robots2.m and the (newest) robotlegs1.m script. Some global errors are identified in the script, and the calculation of relative vectors and angles now occurs in a for loop, so the 'smaller' errors that can be deduced from this data can be counted for every individual period. These smaller errors are, of course, also counted for each arm individually. In the robotlegs1 script, a second for loop for data selection and mistake identification is added for the legs (the first one was for the arms). Legs and arms are handled in different loops, as the leg periodicity generally differs from the arm periodicity.
The visualization app, robots.mlapp, is also expanded to the whole body. A small video segment is in the drive to display the current plot. A slightly different plotting method is used than before, which makes the result look much nicer and more like a representation of a person. An optional wobble correction term is also added. This corrects for the swimmer's wobbles by using a coordinate transformation to the unit vectors of the back sensor. In my created data, the joint coordinates are those of a non-wobbling swimmer while the back sensor direction is wobbling, meaning that the wobble correction works in reverse compared to how it would work with real data. For the effect of the wobble correction function to be truly visible, BookBody2 needs to be loaded into the app, as this dataset has a larger wobbling effect.
==== Week 5 progress ==== | |||
We finally managed to get some real measurement data from the system, and with it some practical errors could be removed from the script. It is also useful for fine-tuning the error detection system. The data only contains arm data, but this is fine, as the arm movement is the more important and complex of the two. robotsreal.m is the newest version of the script.
One error that would occur is that one maximum sometimes appeared in the data as multiple points. This can be seen when the script has run on real data and a matrix like TFi or TFj is loaded in the command window. These arrays store the indices of local maxima, and in one case an array contained the points 793, 795, 1227, 1229 and 1235. This is a problem, since a period of the stroke is defined as the data points between the indices of the maxima (these points). To solve this problem, a period of data is skipped in the error detection for loops if it is shorter than 10 data points. This is done separately for left and right, as this may occur on the left and right at different moments. Although it is not confirmed that this also happens with the leg data, it likely does, so the same if statements are added to the leg error detector.
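The same effect can be sketched as filtering out maxima that follow the previously kept maximum within 10 data points. This is illustrative Python; the actual matlab script instead skips such short periods inside its error detection loops:

```python
def drop_short_periods(peak_indices, min_len=10):
    """Keep only maxima at least min_len samples after the last kept one.

    Mirrors skipping stroke periods shorter than 10 data points, which
    arise when one physical maximum shows up as several nearby points.
    """
    kept = []
    for i in peak_indices:
        if not kept or i - kept[-1] >= min_len:
            kept.append(i)
    return kept

# The problematic case from the text: 795, 1229 and 1235 are duplicates
# of the maxima at 793 and 1227 and get discarded.
print(drop_short_periods([793, 795, 1227, 1229, 1235]))  # [793, 1227]
```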
Another coding mistake was that the errors of the right and left arm were evaluated in the same for loop. The number of repetitions of the for loop depends on whichever side has the lowest number of detected periods. For the perfect data used until this point this was fine, but in one of the real datasets the code detects 5 maxima that are very close together (<10 data points apart) on the right, while it doesn't do this on the left. Because the left now has 4 fewer detected maxima, the final 4 data points on the right are not evaluated, meaning most of the useful data is cut off. Separating the for loops for left and right solves this problem.
A final thing that was worked on is a new program for visualization. Until now, the only thing we have is the robots.mlapp, which is good, but it could be better. I attempted to port my visualisation code to unity (this took a little while) to get a 3d model to follow the motion from the datasheet. It is not finished, as the motion still looks a little weird, but a video named unityrobot is in the google drive.
==== Week 6 progress ==== | |||
The robotsreal.m script is finalized. A hierarchy is added that selects which error needs correcting the most. It first selects a group: general errors first, then arm errors, and finally leg errors. After this, it selects the error with the largest (normalized) value to give feedback on. The selection is done by a number called n, and a string is produced called "Tip<n>.mp3". The file connected to this string is played. After this, the error from the run is stored in the errmp and errp variables. The next time the script runs, it compares the new value of the appropriate error matrix with the previous error for which a sound was played. If the value in the error matrix is lower than in the previous run, the "Top5.mp3" sound is played.
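The selection and praise logic can be sketched as follows. This is a hypothetical Python sketch of the step described above: the group hierarchy (general, then arm, then leg errors) is omitted for brevity, and all names are ours:

```python
def pick_feedback(errors, prev=None):
    """Pick a sound file name for this run.

    errors: {n: normalized error value}, where n indexes the Tip sounds.
    prev:   (n, value) of the error reported on the previous run, if any.

    If the previously reported error's new value is lower, play the
    praise sound; otherwise report the largest remaining error.
    """
    if prev is not None:
        n_prev, v_prev = prev
        if errors.get(n_prev, 0.0) < v_prev:
            return "Top5.mp3"
    n = max(errors, key=errors.get)  # error with the largest normalized value
    return f"Tip{n}.mp3"

print(pick_feedback({2: 0.8, 7: 0.3}))                 # Tip2.mp3
print(pick_feedback({2: 0.2, 7: 0.3}, prev=(2, 0.8)))  # Top5.mp3
```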
Since there is no real leg data, the script keeps giving leg errors for good takes. Therefore, for the demonstration, a script called robotsrealnolegs.m can be used. This script is exactly the same as robotsreal, but with everything leg-related removed.
The unity-based visualization is finally finished, and it looks (in my humble opinion) quite awesome. Some videos are available in the drive, as is a folder called swimcapapp (or swimcapappios for ios) containing the working app. It has all the functions of the robots.mlapp script (auto slide, manual slide, wobble correction), but you can now also move freely in the room with the 2 models. Both models go through the same motion, but the used mesh is different. The first model is a pre-made humanoid model, but this model can seem a bit weird and deformed sometimes, especially with wobble correction turned on. The second model fixes this by being made of a collection of spheres and interconnecting cylinders, so there is not much texture to deform.
To test and use the visualization, a new dataset was made, called GoodTake2b.csv. I added the (fabricated) leg data from BookBody2 to GoodTake2, as the real data does not contain any leg data.
==== Week 7 progress ==== | |||
Some finishing touches were made in week 7. The hierarchy is changed to match the priority given by our experienced trainer (see Feedback to users). Furthermore, if the error from the previous measurement is improved upon, the Top5 sound plays (this was already in place in week 6). A feature is now added so that if the error is not improved upon 5 times in a row, the Tip11 sound plays, recommending the user to use the visualization after the swimming session. This 5 is an arbitrary number, and user testing should be done to see what a better value might be. The same holds for the leg error detection in the script.
In this week, another script named robotsrealnolegs was made, which is the same as the final robotsreal script but with all leg-related error detection removed, as this is not relevant without real leg data. This script is primarily used for the demonstration.
=== Interview transcripts === | |||
Interview 1: | |||
Interviewer: How often do you swim per week? | |||
Response: 2-3 times per week | |||
I: For how long have you been swimming? | |||
R: Almost my whole life | |||
I: Are you also part of a swimming team? | |||
R: Yes, I’m part of DBD | |||
I: Do you also compete in swim meets? | |||
R: Yes, I do | |||
I: Have you ever had the feeling that you did not get enough technique feedback, or at least that you could use some more? | |||
R: Yes, I feel like we are doing a lot of technique exercises, but we don’t really get personal tips | |||
I: Have you ever made use of online services like online coaches or online videos to improve? | |||
R: No, not really | |||
I: Is there a specific reason for that? | |||
R: No, I’ve just never done it. | |||
I: We are working on a motion capture system that would be able to give technique tips. It would work by attaching sensors to certain joints and based on the relative position of those joints, be able to give feedback, like that you are pointing your arm too much to the side, or not bending it enough, etc. If such a system were to exist, would you be interested in it or do you feel like it wouldn’t have much added benefit for you? | |||
R: That sounds pretty cool, I would definitely be interested. | |||
I: If such a system were to exist, what would be your preferred method of feedback? For example, a video that you can watch back later that explains what to improve, an earpiece that gives automatic feedback while you are swimming, something like pressure on your joint indicating you need to move, or maybe you have your own idea. | |||
R: I would say a combination of the video, and some feedback while you are actually swimming. I would then probably say the pressure on joints idea. | |||
I: Why did you pick those? | |||
R: The video is nice to be able to see your own swimming afterwards, but I don’t think that alone will be enough, so some direct feedback to make you feel what you are doing would also be important. | |||
I: Are there other things you would like the system to be able to track, like stroke frequency as an example? | |||
R: If possible, something like stroke frequency or breathing frequency would be nice. | |||
I: Which of the following things would you consider the most important in such a motion capture system: Low price, comfort, accuracy, ease of use, or durability. | |||
R: Firstly low price and comfort, then accuracy and durability. Ease of use I don’t find important. | |||
I: With a lot of motion capture systems, a laptop is needed to interpret the data and give feedback. Would having to bring a laptop to the pool be a deal-breaker for you? | |||
R: It wouldn’t be ideal, but not necessarily a deal-breaker. However having some laptop cover to prevent it from accidental splashing would be nice. | |||
I: Lastly, if the system were to actually exist, and be helpful in improving technique, how much money would you be willing to pay for it? | |||
R: It would depend on how well it would work, but probably around 50 to 100 euros, although I know this won’t be a very realistic price for such a product. | |||
I: Thanks for the interview, do you have any additional thoughts regarding the product? | |||
R: No, nothing right now. | |||
Summary: This interview was done with a very experienced swimmer who is already part of a swimming team (so not necessarily our target audience). However, the swimmer did indicate a feeling of not getting adequate feedback at his team, and indicated an interest in trying out our product if available. In terms of how to receive feedback, the swimmer prefers a video to watch after he is done swimming, paired with some form of real-time feedback, preferably haptic. The swimmer finds comfort and low price the most important, while ease of use is not an issue. The swimmer would be willing to pay between 50 and 100 euros, but acknowledges that this might not be realistic. | |||
Interview 2: | |||
I: How often do you swim per week? | |||
R: I swim 3 times per week | |||
I: For how long have you been swimming? | |||
R: Around 5 years I believe | |||
I: Are you also part of a swimming team? | |||
R: Yes | |||
I: Do you also compete in swim meets? | |||
R: Yes | |||
I: How often? | |||
R: Around 5 times a year | |||
I: Have you ever had the feeling that the coaching you received at the swim team wasn’t enough? | |||
R: I guess so | |||
I: How come? | |||
R: I feel like the trainers might have given up on giving feedback | |||
I: Do you feel like there is too little focus on technique in general, or that there are not enough personal tips? | |||
R: Coaches will usually make these general remarks like ‘focus on your technique’ so in that sense there is attention given to it, but too little personal tips. | |||
I: Have you ever made use of online services like coaching or videos to improve your technique? | |||
R: I have, but that was quite a while ago | |||
I: What did you use specifically? | |||
R: Online videos | |||
I: Did you feel like they helped you? | |||
R: Not really | |||
I: Do you have any idea why? | |||
R: No | |||
I: We are working on a motion capture system that would be able to give technique tips. It would work by attaching sensors to certain joints and based on the relative position of those joints, be able to give feedback, like that you are pointing your arm too much to the side, or not bending it enough, etc. If such a system were to exist, would you be interested in it or do you feel like it wouldn’t have much added benefit for you? | |||
R: That would sound fun to use | |||
I: If such a system were to exist, what would be your preferred method of feedback? For example, a video to watch back after you are done swimming, an earpiece giving you real-time feedback, some pressure based feedback so you feel when your technique is off? | |||
R: A video or an earpiece | |||
I: Why do you pick those? | |||
R: A video because I think it’s nice to be able to watch your stroke back, and I think this can help a lot. The earpiece, if it can give real-time feedback, would be really nice because you don’t usually get that. The pressure idea sounds like it would be irritating. | |||
I: For the video, what would be the maximal time it could take for the video to be ready after you are done swimming? Does it have to be finished instantly, or can it take a little while? | |||
R: It’s not a problem if it takes a couple of minutes | |||
I: Are there more factors, like stroke rate, that the system would need to track? | |||
R: No, not really. | |||
I: Which of the following aspects would you consider the most important in such a motion capture system: Low price, comfort, accuracy, ease of use, or durability. | |||
R: Comfort; it’s not of much use if you have to change the way you swim. | |||
I: With a lot of motion capture systems it’s necessary to bring a laptop to the pool for the analysis, would this be a deal-breaker for you? | |||
R: Not a deal-breaker, but it’s not ideal. | |||
I: Lastly, if this system would work and actually be effective in improving your technique, what would be the maximum price you would pay for it? | |||
R: Around 150 euros. | |||
Summary: This interview was done with a relatively experienced swimmer who is already part of a swimming team. The swimmer indicates that there is a lack of personalized feedback at his club. For the feedback, the swimmer would like an animation to watch after he is done swimming, and an earpiece for real-time feedback. The swimmer considers comfort the most important factor for such a system, as having to swim differently due to the sensors would go against the purpose of the system. The swimmer would be willing to pay 150 euros for the system. | |||
=== User survey === | |||
Link to the User survey: https://docs.google.com/forms/d/e/1FAIpQLSfRmrGkf-iRCDJQGpLv0SILlQEPYvAIomGOxuePE_pGPYbJzA/viewform?usp=sharing | |||
== Literature Summaries ==
'''<u>Wearable motion capture suit with full-body tactile sensors<ref>Y. Fujimori, Y. Ohmura, T. Harada and Y. Kuniyoshi, "Wearable motion capture suit with full-body tactile sensors," 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 2009, pp. 3186-3193, doi: 10.1109/ROBOT.2009.5152758. https://ieeexplore.ieee.org/abstract/document/5152758</ref></u>'''
Latest revision as of 22:15, 10 April 2025
Name | Student number |
---|---|
Max van Aken | 1859455 |
Bram van der Pas | 1893637 |
Jarno Peters | 1899198 |
Simon B. Wessel | 1755956 |
Javier Basterreche Blasco | 1700189 |
Matei Razvan Manaila | 1833006 |
Introduction
For this project, we set out to develop, step by step, a product made to improve swimming technique. The project is divided into stages that logically build on each other. We start by defining the problem statement and our objective, which helps us outline the goals of the project together with the approach and milestones to that end. After this, the most important question needs to be raised: is there a demand for this product? To answer it, a few competitor products are investigated together with different user groups to identify a suitable target group. From there, we carry out interviews and conduct surveys to get a thorough insight into the user needs. The requirements are then determined quantitatively, seeing as they are largely based on this user research, and discussed in detail. Once we know what the product should look like and what it should be able to do, we can start quantifying what 'good and bad technique' is and, accordingly, start working on the code. We end the project with an outline of the feedback system.
Start of project
Problem statement and objective
Ever since the start of the Covid pandemic, the demand for online coaching has grown rapidly[1]. One of the main factors behind this rise is the flexibility that online coaching provides, enabling technique feedback to swimmers at any point throughout the day[1]. The trade-off, however, is that these online coaching platforms are not exactly cheap. For instance, the biggest online swimming community, MySwimPro, charges a base monthly rate of $297 for online coaching[2], and this is not a standalone case. Other companies such as Train Daly or SwimSmooth charge $200 and €167 respectively for just one hour of video feedback[3][4]. Our objective is to counter this setback through the design of a motion capture suit that provides an affordable alternative while retaining the flexibility of online coaching.
The suit should be able to accurately track the swimmer's motions, use this data to identify errors, and provide clear feedback to the swimmer.
Approach
We started by researching which kind of sensors our suit should have and decided that accelerometers and gyroscopes are the best option, both price-wise and in practicality. These sensors are placed at the shoulders, elbows and wrists for the arms, and at the hips, knees and ankles for the legs. Lastly, one sensor is placed on the back. With the help of the data that these sensors deliver, a skeleton can be created that is used to analyse the posture and movement of the swimmer.
Since the actual physical development of such a product would reach beyond the time we have in this quartile, we decided to design a theoretical product with working software. To obtain the data we would get from the suit, we borrowed a motion tracking device from the university.
Milestones and deliverables
These were the original milestones and deliverables established at the start of the project.
Due to the time frame and the scope of the course, a full body suit is likely not feasible. To be able to have something to show at the end of the course, a prototype will be built for one arm. There are also multiple ways of swimming, for this project the focus will be on the front crawl.
The milestones for the construction of the arm suit would be as follows:
- Build the sleeve (for now without any sensors yet). Keep the type of sensor to be used in mind when creating a sleeve.
- Build a functional prototype, either by attaching sensors, or by making a construction with external cameras. The prototype should be able to send position data for each joint to a computer.
- Convert the raw position data to usable coordinates, likely with angles and distances between joints.
- Construct a program that can differentiate between correct and wrong technique. Some technique errors may be distinguished manually; others might require simple implementations of AI. One method would be gathering data for wrong and correct arm motion, extracting simple features from it, like the minimal and maximal angle of the elbow joint, and training a simple decision tree.
- (bonus) if there is some time left, it may be possible to also write a program for a different way of swimming, like backstroke or the butterfly.
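The "usable coordinates" milestone above (angles between joints from position data) can be illustrated with a short sketch. This is a Python illustration under our own assumptions (the project's scripts are MATLAB, and the function name is made up): the angle at a joint follows from the dot product of the two limb vectors meeting there.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, e.g.
    shoulder-elbow-wrist for the elbow angle. Points are (x, y, z)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = math.sqrt(sum(p * p for p in v1))
    n2 = math.sqrt(sum(q * q for q in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A straight arm gives 180 degrees, a right-angle bend gives 90.
straight = joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))
bent = joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0))
```

Features such as the minimal and maximal elbow angle over a stroke, as mentioned in the milestone, would then be simple aggregates of this value over time.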
However, we quickly realised that the technology for underwater motion capture suits already exists. We therefore decided to work on something that hasn't been explored, namely how to create a feedback algorithm based on the data generated by the suit. Our goal is therefore to design a feedback algorithm that, given certain data, can identify errors in the swimmers' technique and give understandable feedback on how to improve. However, to emphasise that our product is not purely theoretical, and could actually be used in a real-world application, a design is also made for a motion capture suit that could generate the data our software analyses.
The users
The product will be aimed at individual amateur swimmers. These could be swimmers that have never been part of a swim team, have quit their swim team, or are part of a swim team but feel like the coaching they receive is inadequate (according to our own survey, which will be discussed later, many swimmers fall into this category). To properly make use of the product, it is important that the swimmer does have a bit of experience in swimming and understands the basic concepts. The product is not meant to replace beginner swimming lessons.
Some requirements/considerations for a system aimed at these swimmers would be:
- The system should be affordable, at least relative to online coaching.
- The system should be accurate in tracking the swimmer’s motion, but doesn’t require millimeter precision.
- The feedback given by the system should be easily understandable, and available as soon as possible.
- The system should be comfortable and not hinder the swimmer’s natural swimming motion.
- The motion capture suit should be a sensor system and not an optical system, as individuals can’t simply put up cameras in a swimming pool.
- Most current motion capture systems require a laptop for data processing. Would swimmers be comfortable bringing a laptop to the pool?
To gain more insight into the specific user requirements, two interviews were conducted and a survey was spread online. The full transcript of the interviews, and the exact survey, can be found in the appendix but the main results will be summarized below.
User interviews
Summary interview 1: This interview was done with a very experienced swimmer who is already part of a swimming team (so not necessarily our target audience). However, the swimmer did indicate a feeling of not getting adequate feedback at his team, and indicated an interest in trying out our product if available. In terms of how to receive feedback, the swimmer prefers a video to watch after he is done swimming, paired with some form of real-time feedback, preferably haptic. The swimmer finds comfort and low price the most important, while ease of use is not an issue. The swimmer would be willing to pay between 50 and 100 euros, but acknowledges that this might not be realistic.
Summary interview 2: This interview was done with a relatively experienced swimmer who is already part of a swimming team. The swimmer indicates that there is a lack of personalized feedback at his club. For the feedback, the swimmer would like an animation to watch after he is done swimming, and an earpiece for real-time feedback. The swimmer considers comfort the most important factor for such a system, as having to swim differently due to the sensors would go against the purpose of the system. The swimmer would be willing to pay 150 euros for the system.
User survey
After reaching out to multiple Instagram accounts in the hope that some would share the survey with their audience, the account ‘SwimDepth’ ended up sharing it. In total, 25 people responded to the survey. The respondents are almost all very motivated swimmers, with 92% swimming at least 3 times a week, and 56% even swimming more than 5 times a week. The respondents are also quite experienced: 56% have swum for more than 5 years, while the other 44% have been swimming for somewhere between 1 and 5 years. 92% of swimmers have made use of in-person coaching (which probably just means they are part of a swim club); only 8% have made use of personalised online coaching. The main challenges encountered in coaching were inadequate feedback (13 votes), high cost (11 votes), and difficulty finding good trainers (9 votes). The most important factors influencing the respondents’ decision to buy the suit were comfort (19 votes), accuracy of feedback (18 votes), and price (17 votes). Durability and ease of use were found less important, with 11 and 10 votes respectively. As to the preferred method of feedback, 52% picked visual feedback, while audio and haptic feedback received 32% and 12% of the votes. The respondents were split on the maximum price for the product, with 52% willing to pay between $100-$500, and 40% only willing to pay below $100. 64% of respondents indicated that having to bring a laptop to the pool would not be a dealbreaker for them. There doesn’t seem to be a relation between how often the swimmers swim and the price they are willing to pay for the system.
Requirements
Based on the user survey and the interviews, we can further specify the user requirements. First off, comfort was mentioned as the most important factor when designing the motion capture system. To ensure the system is comfortable, the sensors must be as light as possible and must be attached to the swimmers' joints in a way that does not restrict the swimmer from performing his/her natural swimming motion. Accuracy was also found to be important, so the sensors used must be of relatively high quality. Ease of use was found to be less important, meaning the calibration time of the system can be fairly long. In terms of feedback, there seemed to be a preference for an animation that can be watched back after swimming, paired with some form of real-time feedback. The maximum price point for the system seemed to be around $500, but a significant number of swimmers would prioritize an even lower price. These user requirements lead to the following technical requirements:
Attribute | Specification | Tolerance | Note |
---|---|---|---|
Weight | 1.241-2.5 kg | - | Subject to change, correlated with suit size |
Dimensions | L*W = 0.0167168 m^2 | +-5% | Actual numbers may change slightly. |
Shape | Cd < 1.2 | +10% | The lower the better |
Calibration | <5 minutes | - | According to the interviews, ease of use was not a very important factor. |
Accessibility | Real-time feedback. Computation time: <2 s | - | Visualisation of data is desirable. |
Waterproof | IPx8 | - | 'x' can be any number |
Robustness | Resist fall of: E => 9.81 J | - | - |
Colour | HSL: (x,<50,<30), matte finish | HSL: (x,+-10,+-10) | - |
Price | Max. 400 euro | +-25 | Based upon user research and similar products / production cost |
Reference for the specifications:

Weight: The weight of the suit can be calculated with the following values:

ESP32 board: 8-12 g

MPU6050 sensor: 3-5 g

Battery (Li-Po 3.7 V, 1000-2000 mAh): 20-50 g

Wiring: 6 g

Waterproofing: 20-40 g

Suit, 1 mm (ultra-thin, rash-guard style): 0.5-1 kg

The weight of a single sensor unit could vary from 57 g to 113 g. The biggest variation is in battery size; a middle ground between weight and battery life needs to be found. The weight of the full suit including all sensors also depends on the weight of the suit itself, which depends on its size. The total would vary between 1.241-2.5 kg depending on suit and battery size.
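As a sanity check, the quoted ranges can be reproduced by summing the component weights. This is a Python sketch; the count of 13 sensor units is taken from the cost table later on this page.

```python
# Reproduce the per-unit and full-suit weight ranges from the listed
# component weights (grams).
components = {
    "ESP32 board": (8, 12),
    "MPU6050 sensor": (3, 5),
    "LiPo battery": (20, 50),
    "wiring": (6, 6),
    "waterproofing": (20, 40),
}
unit_min = sum(lo for lo, hi in components.values())   # lightest sensor unit
unit_max = sum(hi for lo, hi in components.values())   # heaviest sensor unit

n_units = 13                      # sensor units per suit (from the cost table)
suit_g = (500, 1000)              # 1 mm rash-guard style suit, 0.5-1 kg
total_min_kg = (n_units * unit_min + suit_g[0]) / 1000
total_max_kg = (n_units * unit_max + suit_g[1]) / 1000
# → unit range 57-113 g, suit total roughly 1.241-2.47 kg
```

This matches the 57-113 g per unit and the 1.241 kg lower bound stated above; the quoted 2.5 kg upper bound is this 2.47 kg figure rounded up.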
Dimensions: If the 'reference' sensor is placed on the back (the steadiest part while swimming), it may not extend too far away from the body, because we do not want to interfere with the flow of water/streamline of the swimmer to a great extent. (We assume the specification of 'Shape' to be already incorporated here.) The other two dimensions may be extended more due to their reduced effect on the streamline, but there is a limit to this, which is correlated with the weight. We want the device to float neutrally in the water, so that it is practically weightless in use. To calculate the required volume of the device given the mass, we do the calculation shown in figure 1. We conclude that if the mass is 0.5 kg and the height 3 cm, then length*width = 0.0167168 m^2 lets the device float neutrally in water. As explained in 'Shape', it is desired to reduce the width (W) as much as possible.
Shape: The shape of the device is crucial if we want to disturb the streamline of the swimmer as little as possible. The general formula for drag force is Fd = (1/2)*ρ*v^2*Cd*A, with ρ the density of water, v the flow velocity relative to the object, Cd the drag coefficient and A the frontal surface area. The frontal surface area can be reduced further by minimising W in the equation under 'Dimensions'. The only remaining variable the design has influence over is the drag coefficient. For a 'general' swimmer it is roughly 1.1-1.3[5]. We want our product to be at least as streamlined as a swimmer, so we have chosen Cd to be less than 1.2. The device then increases the total drag no more than proportionally to the frontal area it adds.
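The drag formula can be illustrated with a short sketch. The speed and frontal areas below are made-up example values, not measurements; the point is only that, at equal Cd, the device's drag contribution scales with its small frontal area.

```python
# Drag force: Fd = 0.5 * rho * v^2 * Cd * A
RHO_WATER = 1000.0  # kg/m^3 (fresh water)

def drag_force(v, cd, area):
    """Drag force (N) for flow speed v (m/s), drag coefficient cd,
    and frontal area (m^2)."""
    return 0.5 * RHO_WATER * v ** 2 * cd * area

# Illustrative values at a 1.5 m/s swim speed, both at the Cd = 1.2 limit:
device = drag_force(v=1.5, cd=1.2, area=0.0005)   # small device face
swimmer = drag_force(v=1.5, cd=1.2, area=0.1)     # order-of-magnitude body area
```

With these assumed areas the device adds well under 1% to the swimmer's drag, which is the intent behind the Cd < 1.2 requirement.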
Calibration: The calibration should be able to be completed alone and within 1 minute after putting on the sensors.
Accessibility: The calibration button should be easily reached, but not unintentionally hit; it would therefore be better as a digital, in-app button. The feedback must be real-time to give the user the opportunity to optimise the stroke while swimming. Doing this visually is hard: you cannot watch your wrist all the time, so you may miss some feedback. Feedback as speech through a set of earplugs seems ideal. The computation time from the finish of the stroke until the moment of feedback must be at most one 'general stroke time'.[6] For Olympic swimmers we can analyse this video[7] and get roughly 100 strokes per minute. This means the computation may take at most 60/100 = 0.6 seconds. The feedback should be stored so that it can be visualised afterwards.
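The latency-budget argument above is simple enough to write down explicitly (Python sketch of the arithmetic only):

```python
# At ~100 strokes per minute, feedback on a stroke must arrive before
# the next stroke finishes, so the budget is one stroke time.
strokes_per_minute = 100
stroke_time_s = 60 / strokes_per_minute      # seconds per stroke
max_computation_time_s = 1 * stroke_time_s   # at most one stroke time
# → 0.6 s budget for the feedback computation
```

For slower recreational swimmers the stroke rate is lower, so this 0.6 s Olympic-pace budget is the strictest case.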
Waterproof: The first numeral is not important, as it indicates protection against solid foreign objects. The second numeral refers to water; we decided it is necessary to have '8' for the main device, meaning it should be protected against continuous immersion in water[8]. The IP rating, however, does not take any corrosion effects into account.
There are 2 parts that need to be waterproof: the units containing the electronics, and the wires connecting them.
For the limb-mounted units, the connection point of the shell can be sealed with 2 commercially available O-rings, along with plastic screws, rivets, or even sealed shut with silicone before sale, since they never need to be accessed. If using O-rings, we found EPDM to be resistant to both deformation over time and the chemicals often present in pools. With these, we are aiming for around 20% compression, which is on the lower end of the usable range. This is in order not to exert too much stress on the plastic shell while maintaining water-tightness; the length of the screw/rivet should be designed accordingly. If we were to use a sealant to permanently glue the unit shut, silicone or any commercially available option from the pool industry could be chosen.
The unit containing the battery, however, needs to be opened frequently, so we decided it would be more secure if the product was designed such that the battery can be removed, charged, placed back in the compartment, and the entire unit sealed before reaching the pool. This was done to maximize the lifespan of the product and minimize the risk of water damage, as products featuring waterproof USB covers often tend to fail in these areas over time, since they use friction-based waterproofing, which is not suitable for our type of application. We then decided to equip the battery compartment with a clip-on top for ease of use, while maintaining the same seal design, so there is no compromise on reliability.
All power and data transfer to and from the base unit will be done through wires, ensuring the user never needs to access any compartment other than the one containing the battery (for example to change sensor batteries). This also makes it easier for the user to check whether the critical parts are sufficiently dry before accessing the inner components, so there is no risk of water damage with proper use.
As the suit is supposed to mold to the body perfectly, we decided it would be best for the wires between units to be sewn on a snaking path along the suit. This gives the wires enough extra length to allow flexibility, as mobility would be severely limited otherwise. The wires could attach to the units via commercially available waterproof connections.
To further increase the reliability of our product, we need to reduce the complexity of its physical components as much as possible. One of the ways we achieve this is by moving towards a completely digital user interface. With our current design, the only part of the suit that would require interaction anywhere near a pool will be the power button. Waterproof buttons are widely available commercially, to multiple standards, thus not compromising the overall integrity of our product. Once turned on, the user will not be required to interact with any physical part of the suit again until they want to turn it off, as all other controls are done through the associated app.
Corrosion proofing: A water basin is allowed to have certain upper (or lower) limits of chlorine-based particles in a specified volume: free chlorine => 0.5 mg/L, combined chlorine =< 0.6 mg/L, chloride =< 1000 mg/L, chlorate =< 30 mg/L[9], so the device must be protected against these substances. PVC is not affected by chlorine or salt; its lifetime is then roughly 15 years[10].
Robustness: We do not want the device to break when accidentally dropped, so we calculate the energy generated when falling from 2 meters, which corresponds to the maximum height it could realistically fall from: E = mass*height*gravitational acceleration = 0.5*2*9.81 = 9.81 J.
Colour: We do not want the device to unintentionally reflect the pool lights into the eyes of the user. There is no academic source on how reflective a colour is; however, we know that darker objects reflect less light, and a matte finish decreases this reflectance even further. Using the HSL (hue, saturation, lightness)[11] model, we can give intuitive limits for the saturation and lightness.
Price: The material cost can be calculated from the price of all components combined.
component | amount | price pp | total |
---|---|---|---|
ESP | 13 | 0.98€ | 12.09€ |
MPU | 13 | 0.93€ | 12.09€ |
Batteries | 13 | 0.93€ | 12.09€ |
Charging module | 13 | 0.93€ | 12.09€ |
Wires | - | 0.93€ | 0.93€ |
Waterproofing | 13 | 0.93€ | 12.09€ |
Wet suit | 1 | 50€ | 50€ |
Total | 67 | 55.58€ | 111.38€ |
subject | cost | income | subject |
---|---|---|---|
development | 26330 | 5000 | sponsoring / start-up |
workspace | 4403 | 82644 | product cost |
work | 25826 | | |
materials | 23012 | | |
contingencies | 7957 | | |
Profit | 114 | | |
For 250 units sold at 400 euros a piece in one year, this product would break even. From then on, the profit margin would increase linearly.
Calibration
The purpose of calibration is to give the software the actual distances and angles between the various sensors at a given time, so that after calibration we can track the locations of the body parts (sensors). The expensive Xsens suit also requires a calibration before use, with an 'accurate' and a 'fast' method[12]. For our suit we will focus on a partially adopted version of this 'fast' method. Human body proportions are not exactly the same for everyone, but there are tables with relative lengths of body parts[13][14]; the proportions in table 1 of this paper can be used. Only the total height of the person has to be measured, and the other important lengths can be approximated from it. The angles between the sensors also have to be calibrated. This can be done by agreeing on a (convenient) pose where the angles are known (for example: arms straight up with hands above shoulders -> elbow angle = 180°, shoulder angle w.r.t. the xy plane = 0°, shoulder angle w.r.t. the xz plane = 0°; if the feet are placed right under the hips, then knee angle = 180°, hip angle w.r.t. the xy plane = 0°, hip angle w.r.t. the xz plane = 0°). We might, just like the Xsens suit, include an option to insert all 11 measurements (2x lower leg, 2x upper leg, 2x lower arm, 2x upper arm, hip width, shoulder width, hip-shoulder height).
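The "fast" height-based part of this calibration can be sketched as follows. The ratios below are the commonly cited Drillis-Contini anthropometric values, used here as an assumption; the table referenced in the text may use slightly different numbers, and the function name is illustrative.

```python
# Estimate body segment lengths from total height using standard
# anthropometric ratios (Drillis-Contini values, assumed here).
SEGMENT_RATIOS = {
    "upper_arm": 0.186,
    "forearm": 0.146,
    "thigh": 0.245,
    "shank": 0.246,
    "shoulder_width": 0.259,
    "hip_width": 0.191,
}

def estimate_segments(height_m):
    """Estimate body segment lengths (m) from total height (m)."""
    return {name: round(r * height_m, 3) for name, r in SEGMENT_RATIOS.items()}

segments = estimate_segments(1.80)   # e.g. forearm ≈ 0.263 m for a 1.80 m swimmer
```

Users wanting more accuracy could override these estimates with the 11 manual measurements mentioned above, as the Xsens workflow allows.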
State Of The Art: Hardware
The following is a summary of the state of the art of motion capture systems for swimming applications. There are two types of motion capture systems that can be used: camera-based systems and sensor-based systems. Both will be discussed.
Camera
A recent literature review on motion capture for 3D animation mentioned 5 main techniques that are used [15]: passive optical markers, active optical markers, no markers, inertial motion sensors, and surface electromyography. In the context of swimming, either passive optical markers or no markers are the most promising. Passive markers are preferred over active ones, as active markers emit their own light, meaning they need electricity, making passive markers easier to incorporate in a swimsuit. Using no markers at all is also easier compared to active markers, as this would require either no special swimsuit at all or a plain wetsuit. One paper mentioned that the placement of optical markers can take a long time, and that the markers can affect the subject's movement [16]. Moreover, it can sometimes be a challenge to identify which marker is which, meaning an elbow marker might be confused with a shoulder marker in certain positions. Another paper that studied the swimming behavior of horses using optical markers for motion capture also had problems with noise, as the markers were occasionally difficult to track [17]. For these reasons, a markerless optical system is the best camera-based option for our idea: it is less invasive and generally yields better and less noisy data. The paper in [16] mentions a neural network approach called OpenPose [18] as a method to identify joints and motion in the footage obtained by the cameras.
When looking into the cutting edge of markerless motion capture companies are found that implement this technology commercially with low errors. For example, a company called Theia has a 3d camera based mocap system that can reach errors of less than 1 cm and less than 3 degrees [19].
Conclusion cameras: If a camera system is used, it will definitely be a markerless one. The fact that companies are already selling this technology commercially indicates that there is not much for us to add on the hardware side of things.
Sensors
Almost all current papers on motion capture systems with sensors use inertial sensing technologies. Some systems are very cheap as well, such as [20], where the individual sensors are in the range of 10-20 euros. The exact model used in that paper (MPU-9250) is no longer being produced, but its successor, the MPU-9255, costs 13.50 euros per piece and is practically identical, according to TinyTronics. In [20], 15 sensors were used, meaning the sensors cost a total of 202.50 euros. Of course, the sensor suit consists of more electronics than just the sensors, but this would leave the suit in the price range of a few hundred euros. This is doable for swimming clubs if the suits can become one-size-fits-all, as a handful of suits would then be enough per club. Another advantage of the sensors is that they are usually quite small and light, meaning they do not inhibit the motion of the swimmer much.
The biggest challenge for sensors is that the positions they give are integrated from the acceleration vectors they measure, meaning that the beginning position with respect to the other sensors needs to be calibrated, and that a small error in acceleration or direction gets integrated over time, so the error grows quickly. The initial calibration is a problem that has been tackled: a company called Xsens, which sells motion capture suits to researchers, even has a well-documented calibration procedure [21]. One thing that could be exploited for drift compensation is the fact that swimming is generally a periodic motion. In the front crawl, for example, the arms move forward and backward periodically, with the left and right arm in opposite phase. At the end of each forward or backward phase there is a moment without velocity, which could perhaps be used as a reset point (we came up with this before consulting the drift literature, so the idea is not strongly backed scientifically). Xsens has minimized drift in their motion capture suits by using machine learning and artificial intelligence to create an optimal algorithm [22], which, according to them, almost entirely eliminates drift. They also offer the option to combine the sensor suit with a camera tracking system for situations where no drift at all is acceptable, such as movies. Other research also uses various algorithms to negate drift [23][24][25].
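The turnaround-based reset idea sketched above corresponds to what inertial navigation literature calls a zero-velocity update (ZUPT): whenever a sample is known to be (nearly) stationary, the integrated velocity is reset to zero, discarding accumulated drift. A minimal 1-D Python illustration (the project itself uses matlab), assuming stationary samples have already been flagged by some detector:

```python
def integrate_with_zupt(accel, dt, stationary):
    """Integrate 1-D acceleration to velocity, resetting the velocity to
    zero at samples flagged as stationary (zero-velocity updates).

    accel:      list of acceleration samples (m/s^2)
    dt:         sample interval (s)
    stationary: list of bools, True where the limb is known to be still,
                e.g. at the turnaround points of a periodic arm stroke
    """
    v = 0.0
    velocities = []
    for a, still in zip(accel, stationary):
        v += a * dt
        if still:
            v = 0.0  # discard the drift accumulated since the last reset
        velocities.append(v)
    return velocities
```

With a constant sensor bias, the velocity error grows linearly between resets but is clamped back to zero at every flagged turnaround, so it never accumulates across strokes.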
Another minor challenge that is swimming specific for the sensor suit is that the sensor suit has to be waterproof enough to endure the usually wet conditions of a swimming pool. The movella DOT sensor is IP68, meaning it is sufficiently waterproof for under water use [26].
Conclusion for sensors: sensor suits are a feasible option for a swimming mocap suit, and all major challenges already have a solution, to the point where there is not much for us to add as a group (drift etc).
Conclusion
For both the sensor- and camera-based mocap system there is not much for us to do on the hardware side. For both methods there are companies that have low error systems that could be used to feed the coordinate data into our software. In the context of a swimming technique suit in swimming pools, the sensor idea is a better fit, as it does not require fixing many cameras over the length of the pool to acquire the data. Given that we will likely focus on the software side of this idea, our choice of hardware does not matter a great deal, as we will assume a (near-)perfect data influx for our software to handle.
Data Transmission
The MPU sensor continuously captures real-time motion data, including three-axis accelerometer and gyroscope measurements, representing acceleration and angular velocity respectively. This sensor data is then read by the ESP32 microcontroller over an I²C interface at regular intervals. The ESP32 then formats this data into UDP packets and transmits them wirelessly to a UDP server hosted directly on a smartphone that is connected to the same local network. The smartphone, running a UDP server, listens for incoming packets, decodes them, and applies computations. Based on the computation’s outcome, the program generates an audio response through its built-in text-to-speech engine, delivering direct audio feedback via a Bluetooth headset paired to the phone. At the same time, the data is saved and can be rewatched in a 3D animation after the session.
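A minimal sketch of the receiving end of this pipeline, in Python for illustration. The packet layout assumed here (six little-endian 32-bit floats: accelerometer x, y, z followed by gyroscope x, y, z) is our own assumption for the example, not the format actually implemented on the ESP32:

```python
import socket
import struct

# Assumed packet layout: six little-endian 32-bit floats,
# ax, ay, az (m/s^2) followed by gx, gy, gz (deg/s).
PACKET_FMT = "<6f"

def encode_packet(ax, ay, az, gx, gy, gz):
    """Pack one sensor sample the way the ESP32 side might send it."""
    return struct.pack(PACKET_FMT, ax, ay, az, gx, gy, gz)

def decode_packet(data):
    """Unpack one 24-byte UDP payload into accel and gyro triples."""
    ax, ay, az, gx, gy, gz = struct.unpack(PACKET_FMT, data)
    return {"accel": (ax, ay, az), "gyro": (gx, gy, gz)}

def serve_once(port=5005):
    """Receive and decode a single packet (blocking); the real server
    would loop, feed the computations and trigger audio feedback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    data, addr = sock.recvfrom(1024)
    return decode_packet(data)
```

The values chosen for the packing happen to be exactly representable in 32-bit floats, so the round trip is lossless; in general a float64-to-float32 conversion loses precision.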
Perfect swimming technique
To be able to give accurate feedback, the algorithm needs some 'perfect' baseline to compare the swimmer's technique with. What follows is a summary of some of the most important factors that determine whether someone's technique is correct or not.
The freestyle stroke can generally be divided into 4 phases[27]:
- Entry & Catch (note that these are sometimes defined as two separate phases):
During the entry, the hand enters the water fingertips first - at an angle of 45 degrees with respect to the water surface[28] - around half a meter in front of the shoulder[29]. The hand should enter at around shoulder width (imagine your head indicating '12' on a clock; your arms should enter at around '11' and '1'[29]). After entering, the hand should reach forwards and slightly down until the arm is fully extended, staying between 10 and 25 cm below the surface of the water[30]. While extending the arm, it is also important to rotate towards the same side. The shoulders should rotate somewhere between 32 and 40 degrees[31], up to 45 degrees on a breathing stroke. During this rotation, and the entire stroke for that matter, the head should stay stationary, pointed forwards at a 45-degree angle[32].
After fully extending the arm, the catch begins. Here, the fingertips are 'pressed down' and the elbow is bent such that the fingertips and forearm point towards the bottom of the pool. It generally holds that the earlier in the stroke this 'early vertical forearm' is set up, the better. It is extremely important that this is done while maintaining a 'high elbow', meaning that if one were to draw a straight line between the fingertips and the shoulder, the elbow would sit above that line. Once the catch is set up, the elbow angle should be between 90 and 120 degrees[33].
- Pull: The catch smoothly transitions into the pull phase, where the arm is pulled straight back (the pull is straight, not the arm!) with the elbow above the hand for as long as possible[28].
- Push (or exit): Around the point where the hand reaches the hips, the arm extends and pushes out the last bit of water, transitioning smoothly into the recovery.
- Recovery: The recovery is the movement of the hand back over the water. It is important that during the recovery the elbow stays above the hand[29] and leads the recovery, while the hand and forearm are relaxed[34].
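Several of the criteria above reduce to checking a joint angle computed from three tracked positions. A small Python sketch (illustrative only; the project code is in matlab) for the 90-120 degree elbow window of the catch:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by 3-D points a-b-c,
    e.g. shoulder-elbow-wrist gives the elbow angle."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def catch_angle_ok(shoulder, elbow, wrist):
    """Check the 90-120 degree elbow-angle window for the catch[33]."""
    return 90.0 <= joint_angle(shoulder, elbow, wrist) <= 120.0
```

For example, a fully straight arm (180 degrees at the elbow) correctly fails this check, while a right-angle bend passes.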
Matlab script & visualization
Two different pieces of software were created for this project: a matlab script that gives feedback based on input data, and a unity-based game that lets the user visually inspect their swimming technique. Both programs take a comma-separated table of data (usually a csv file) as input, following this template:
xyz coordinates\vector elements (in that order) for: [right shoulder, right elbow, right wrist, left shoulder, left elbow, left wrist, mid back, right hip, right knee, right ankle, left hip, left knee, left ankle, e_x back, e_y back, e_z back]
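For illustration, this column layout could be parsed as follows (a Python sketch; the actual parsing happens in the matlab script and the Unity app, and the item names below are our own shorthand):

```python
import csv
import io

# The 16 tracked items, in the column order given above; each occupies
# three consecutive columns (x, y, z) for 48 columns per frame.
ITEMS = [
    "r_shoulder", "r_elbow", "r_wrist",
    "l_shoulder", "l_elbow", "l_wrist",
    "mid_back",
    "r_hip", "r_knee", "r_ankle",
    "l_hip", "l_knee", "l_ankle",
    "ex_back", "ey_back", "ez_back",
]

def parse_frame(row):
    """Map one CSV row (48 numbers) to {item: (x, y, z)}."""
    vals = [float(v) for v in row]
    assert len(vals) == 3 * len(ITEMS), "expected 48 columns"
    return {name: tuple(vals[3 * i:3 * i + 3]) for i, name in enumerate(ITEMS)}

def parse_csv(text):
    """Parse a whole comma-separated dataset into a list of frames."""
    return [parse_frame(row) for row in csv.reader(io.StringIO(text)) if row]
```

Keeping the mapping in one table like this makes it easy to verify that, say, the mid-back coordinates really sit in columns 19-21.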
Here is a link to the drive where our matlab files, the datasets used, the feedback audio fragments and some videos are stored, for whoever is interested: https://drive.google.com/drive/folders/1Nkat2n8IxX_gXgsPh2etX_Fn2ceAWQ00
The robotsreal script in the drive is the final matlab script, and the final visualization is in the swimcapapp folder (swimcapappios.app for macOS). Since the collected data contains no real leg data, another script called robotsrealnolegs is in the drive for the demonstration. This script is identical to the robotsreal script, but, as the name suggests, the analysis of all leg errors is removed.
Final matlab script
The matlab script in its final form takes the given data as input, divides it into periods based on maxima in the x-position of the wrist joint for the arms and maxima in the y-position of the ankle joint for the legs, and checks for technique errors. It does this by analyzing certain angles and normalized lengths of the arms and legs in certain parts (or the whole) of the stroke period. It also checks some general errors, like the phase difference between arms and legs or the overall tilt of the swimmer. After normalizing the occurrence of errors, it picks the most frequent one, or the one with the highest priority if multiple errors are tied. It then plays a sound based on the specific error made, giving appropriate feedback to the swimmer. If the error is improved upon, a positive sound effect plays. However, if the error is not improved upon more than 5 times in a row, the suggestion is made to look at the visualization.
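The period-splitting step can be sketched as follows, in Python for illustration. The minimum period length of 10 samples mirrors the fix for spurious double maxima in noisy data described in the week 5 progress:

```python
def find_local_maxima(x):
    """Indices of strict local maxima in a 1-D signal,
    e.g. the x-position of the wrist joint over time."""
    return [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]

def split_periods(x, min_len=10):
    """Split a signal into stroke periods between successive maxima,
    skipping 'periods' shorter than min_len samples: in noisy real data
    one true maximum can show up as several nearby local maxima."""
    peaks = find_local_maxima(x)
    periods = []
    for start, end in zip(peaks, peaks[1:]):
        if end - start >= min_len:
            periods.append((start, end))
    return periods
```

Each returned (start, end) index pair then delimits one stroke period over which the angle and length checks are evaluated.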
Feedback to users
We have established 10 different sorts of swimming mistakes that can be made: 4 general, 4 arm-specific and 2 kick-specific. We asked a trainer at the local swim club De Brabantse Dolfijnen (DBD), who has 5 years of coaching experience, to order these mistakes from most important (a) to least important (j). We have given each error a short but understandable audio feedback file (Tip1-Tip10). If a user improves on the previous mistake, we play sound Top5. If a user makes the same mistake too often, we play sound Tip11, which asks the user to look back at their visualised swimming on the computer, which might help in correcting the error. If a user performs some strokes correctly, we can play one of the files Top1-Top4, which tells the user that the stroke is executed correctly. The audio files can be found here: https://drive.google.com/drive/folders/1IJC0moxTQrO2zlHn6FKTA1iyOh_b4twc?usp=drive_link
Tip/Top number | Error | Audio sentence | Priority | File name (without .mp3) |
---|---|---|---|---|
Tip 1 | Too big phase difference between the arms | Try keeping your arm strokes in phase | a | Tip1 |
Tip 2 | Too big phase difference between the legs | Try keeping your leg strokes in phase | d | Tip4 |
Tip 3 | Insufficient leg usage | Try using more kicks | i | Tip9 |
Tip 4 | Too much wobbling in the shoulders | Try reducing rotation in the shoulders | h | Tip8 |
Tip 5 | Arms not straight when coming forward | Try performing the entry at shoulder width | b | Tip2 |
Tip 6 | Elbow bent when the arm is forward | Try keeping your elbow stretched during the entry phase | f | Tip6 |
Tip 7 | Wrist not at shoulder width when moving the arm backward | Try keeping your wrist at shoulder width during the pull and push phase | e | Tip5 |
Tip 8 | Entry not far enough forward | Try reaching further forward during the catch phase | g | Tip7 |
Tip 9 | Movement from the knees | Try keeping your legs straight during kicks | j | Tip10 |
Tip 10 | Legs not straight w.r.t. the moving direction | Try keeping your kicks vertical | c | Tip3 |
Tip 11 | - | Maybe it is wise to watch back the animation to get a better visualisation of the error | - | Tip11 |
Top 1 | - | You are doing great | - | Tip0 (convenient for code) |
Top 2 | - | You are demonstrating true mastery | - | Top2 (unused) |
Top 3 | - | You are executing this with remarkable finesse | - | Top3 (unused) |
Top 4 | - | Every stroke you take is a dance of precision and grace | - | Top4 (unused) |
Top 5 | - | *Short sound effect* | - | Top5 |
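The selection logic described above (most frequent error first, trainer priority as a tie-breaker) can be sketched as follows, in Python for illustration and with hypothetical error names:

```python
def pick_feedback(error_counts, priority):
    """Choose which error to give feedback on: the most frequent error
    wins; for equal counts, the trainer's priority letter breaks the tie
    ('a' = most important, 'j' = least important).

    error_counts: {error_name: normalized occurrence over the period}
    priority:     {error_name: priority letter 'a'..'j'}
    """
    # Compare by (count, -ord(letter)): a higher count wins outright,
    # and among ties a smaller letter (higher priority) wins.
    return max(error_counts,
               key=lambda e: (error_counts[e], -ord(priority[e])))
```

The chosen error name would then be mapped to its audio file (Tip1-Tip10) and played back over the Bluetooth headset.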
Discussion on error analysis
In order to ensure that the script actually works with real data, measurements were performed at the OptiTrack lab on Atlas 9. This is a camera-based method rather than a sensor-based approach, but for the sake of gathering data this does not matter. Seven markers were placed on our swimmer: 3 on each arm and 1 on the lower back. The swimmer performed some good and some bad takes, and based on this data the performance of the script was improved greatly. The arm error analysis was tweaked so that no significant error values arise for GoodTake2, while they do arise for the rest of the takes (GoodTake1 turned out to be not so good after all).
No testing was done for the legs and their motion, so the leg error detection part of the script remains untested. Before any physical prototype is built, this should be done in a similar fashion to the testing of the arms. One of the errors (Tip4) is that the swimmer wobbles too much while swimming. In order to detect this error, the gyroscope data from the motion sensors is used. For the optical system we used and the way we used it, this data is not available, so the error detection for this potential error remains untested as well. For both the leg errors and the wobble error the method used to calculate the error is most likely sound, but the boundary for which values are acceptable and which ones are not requires tweaking that can only be done with real data.
One feature of the script would need to be tweaked by performing user testing after attaching the script to some hardware. This is the tip11 message that plays when an error is made many times in a row. In the final version of the script here it plays after an arbitrarily chosen 5 repetitions of the same error. With user testing, it can be tested whether this is a good amount of repetitions, or if this number should be higher or lower. Also, the specific error messages can be tested for how helpful they are, and if maybe a different way of wording it might be better at conveying the message.
Final visualization
The final visualization is a unity-based app in which you can fly around freely in a 'world' where two models perform the swimming motion from the data. This way, you can look at the motion from all angles. When opening the app, an input field is on screen, where the user needs to insert the file path of the data they want to see (the 'Copy as path' function in Windows makes this easy, although the quotation marks should be removed). When the file path is in the input field, the user presses the start button, causing the input field and the button to disappear, and the two models start performing the swimming motion from the data.
As can be seen in the above image, the two models are performing a swimming motion. The model on the left is a barebones model, consisting only of spheres representing the measured joints and thin cylinders connecting them. The model on the right is a free, pre-made model taken from the unity asset store [35]. The model on the right was added first, but the data would sometimes stretch out the model, distorting it. In order to guarantee a clear visualization, the model on the left was made. Both are still in the app so the user can decide which one they prefer to look at.
Some features and controls:
- Slider: The slider can be moved manually to any frame of the data.
- Auto slide: This function can be toggled on and off by the "auto slide" toggle button in the bottom left (see image). When on, it will automatically move the slider forward and move the models through the frames of the data. It also automatically resets the slider to the beginning if the end is reached.
- Wobble correction: This function can be toggled on and off by the "wobble correction" toggle button in the bottom left (see image). If the swimmer is swimming while wobbling their body a lot, the wobble can be corrected for by a coordinate transformation. For this, the base direction vectors of the central back sensor are used. If wobbling is one of the errors made, this may allow the user to see other errors more clearly.
- Moving the camera: The WASD keys are used to move the camera around. The space key can be used to keep the vertical coordinate constant while moving, which can be convenient.
- Mouse control: The mouse can be used to move the camera, or to interact with the toggles and slider. To switch between these two mouse 'modes', use the F-key.
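The wobble correction described in the feature list amounts to re-expressing every joint position in the coordinate frame of the central back sensor, whose direction vectors e_x, e_y and e_z are part of the data. A minimal Python sketch (the app itself does this in Unity/C#), assuming those three vectors are orthonormal:

```python
def to_back_frame(point, origin, ex, ey, ez):
    """Transform a world-space point into the back sensor's frame.

    point:  joint position (x, y, z) in world coordinates
    origin: position of the central back sensor
    ex, ey, ez: the sensor's orthonormal direction vectors (assumed)
    """
    d = [point[i] - origin[i] for i in range(3)]
    # Project the relative position onto each basis vector; this is a
    # rotation by the transpose of the sensor's orientation matrix.
    return tuple(sum(d[i] * e[i] for i in range(3)) for e in (ex, ey, ez))
```

Applied frame by frame, any tilt or wobble of the torso is removed from the displayed motion, which can make other errors easier to spot.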
Some features could be added to the visualization in future development, with the most important one being a connection between the feedback script and the visualization app. It may be useful to users to see which errors the software detected in which segments of the data they are seeing. For the current script it would have been possible to simply copy it into the app and have its error array output be put on display for each data segment. This is possible, as the current error detection algorithm is quite simple. If more complex algorithms are selected, this method may become too computationally expensive, and it may be better to have the script give a tweaked version of the data as output, where 1 or multiple columns are added to pass on the values of the error array. The visualization app can then be modified to extract the error data from the tweaked data.
Outlook
Before embodying the software in a physical sensor suit, the leg error detection should be tested using real user data, similar to the way it was done for the arms. It may also be wise to redo the arm error detection based on data from an Olympic (or at least higher-level) swimmer instead of an amateur. After this is done, the script can be embodied in a sensor suit for user testing, either using one that already exists, like the ones from Xsens[36], or building one from inertial sensors. The latter option is likely cheaper in terms of materials, but the former is easier to implement in a testing stage.
As mentioned previously, the visualization should be connected more to the script by using error data the script generates. Having this data present while looking at a 3d model performing your stroke can be helpful to users, as they may otherwise not exactly know what to look at. The current version of the app is a barebones alpha version, where things are built for the display of basic functionality, not for ease of use or appearance. It should of course be made prettier and more user-friendly before being released alongside the sensor suit. For example, the file path input at the start should be replaced with a file selection dialog, where the app opens the explorer so the user can easily find and select their file.
Some additional improvements would be to give the algorithm the capability to recognize different strokes and give appropriate feedback for each. Another improvement would be to track the user's stroke rate, something many respondents indicated they would want the system to do. Lastly, another possible feature would be for the system to not only indicate the mistakes being made, but also recommend specific exercises to fix them (e.g., if a swimmer consistently drops their elbow during the catch, the system would recommend certain sculling drills to work on this).
So far, the Swimcap only works for Android users who run a local UDP server and are connected to the same wifi network as the suit. Future improvements could include a better microcontroller that performs the computations itself and transmits the saved data to the phone after the session. Another option would be to run a global server that users can connect to without having to host it themselves; this would make the system a lot easier to use, since the user would not need external wifi at the swimming pool. The latter option would also not raise the price, weight and complexity of the device.
Overall, a lot of work still needs to be done before Swimcap is ready to hit the market, but a good start was made with this project.
Appendix
Tasks done per person
Name | Total hours | Week 1 |
---|---|---|
Max van Aken | 10h | Attended beginning lecture+meeting (2h), Read sources (5h), Summarised sources (3h) |
Bram van der Pas | 13h | Attended lecture (2h), Read sources (8h), Summarized sources (3h) |
Jarno Peters | 12h | Attended beginning lecture (2h), group meeting (2h), created start of project chapter (3h), summarized 5 sources for literature summary (5h) |
Simon B. Wessel | 14h | Attended beginning lecture (2h), group meeting (2h), research literature (8h), summarise articles and papers (2h) |
Javier Basterreche Blasco | 10h | Attended beginning lecture (2h), group meeting (2h), literature acquisition (4h) summarizing research (2h) |
Matei Manaila | 9h | Researched and read sources (7h), summarise sources (2h) |
Name | Total hours | Week 2 |
---|---|---|
Max van Aken | 6h | Investigated paper+dataset 'xyz dataset'(3h), Searched for 'perfect swim data' (3h) |
Bram van der Pas | 14h | Attend meeting(2h), Research different possible users(10h), summarize findings(2h) |
Jarno Peters | 15h | Attended feedback session & group meeting (2h), identify challenges for sensor & camera systems (2h), research state of the art (10h), write camera vs sensors section (3h) |
Simon B. Wessel | 14h | Attended feedback session & group meeting (2h), look into existing projects using MPUs (4h), Downloading software and installing drivers for ESPs (4h), Programming ESP (4h) |
Javier Basterreche Blasco | 11h | Attended meeting (2h), researched project state of the art and potential areas of improvement (4h), researched viable programming languages (5h) |
Matei Manaila | 14h | Attend meeting(2h), research of technologies and design brainstorming (12h) |
Name | Total hours | Week 3 |
---|---|---|
Max van Aken | 15h | Researched specifications + references (15h) |
Bram van der Pas | 12h | Attend feedback session and group meeting(2h), prepare survey and interviews(7h), hold interviews(1h), work out interviews(1h), try to spread survey(1h) |
Jarno Peters | 17h | Attended feedback session & group meetings (2h), work on matlab script and app (13h), document matlab script progress (2h) |
Simon B. Wessel | 12h | Attended feedback session & group meetings (2h), Research components needed and their specifications (4h), Making virtual circuits with ESP and MPU (3h), Thinking of approach for wireless communication between ESP and UDP server and making block diagram (3h) |
Javier Basterreche Blasco | 8h | Looked into potential hardware applications and its data processing (5h), read existing code (3h) |
Matei Manaila | 12h | group meeting (2h), start learning matlab and understand existing code(10h) |
Name | Total hours | Week 4 |
---|---|---|
Max van Aken | 10h | 2h help Jarno with matlab/possible errors in swimming, 8h calibration research/elaboration |
Bram van der Pas | 8h | Attend meeting(2h), get survey to be spread out(4h), work on user requirements(2h) |
Jarno Peters | 20h | attend feedback session & group meetings (2h), matlab coding (15h), documenting matlab progress (3h) |
Simon B. Wessel | 10h | feedback session & group meetings (2h), Research on components weights and specifications (4h), trying to get a hold of a motion tracking system (2h), Matlab programming (2h) |
Javier Basterreche Blasco | 12h | Attended group meeting (2h), attempted to coordinate scheduling of a session with a measurement system (3h), researched past coding implementations (4h), read existing code (3h) |
Matei Manaila | 10h | group meeting (2h), catch up with code (3h), work on legs code (5h) |
Name | Total hours | Week 5 |
---|---|---|
Max van Aken | 6h | Feedback session+meeting (2h), 2h feedback priority discussion, 2h start on feedback sounds |
Bram van der Pas | 11h | attending meetings(2h), getting survey spread(4h), researching perfect swimming technique(5h). |
Jarno Peters | 18h | attended feedback session & group meeting (2h), measuring with Opti track lab (2h), processing measurement data (1h), matlab coding (5h), unity coding (6h), documenting progress (2h) |
Simon B. Wessel | 14h | attended feedback session & group meeting (2h), investigated Opti track software (4h), measuring with Opti track lab (2h), processing measurement data (1h), working on wiki (1h), writing cost function (4h) |
Javier Basterreche Blasco | 11h | Attended group meeting (2h), measurement taking (2h), investigated Opti track software (4h), caught up with code (3h) |
Matei Manaila | 10h | group meeting (2h), polished wiki (6h), caught up with code (2h) |
Name | Total hours | Week 6 |
---|---|---|
Max van Aken | 10h | Make mp3 files of feedback sounds (tips+tops) 4h, think of other helpful/motivational audio cues (tip11+tops) 2h, polished wiki (3h), written introduction (1h) |
Bram van der Pas | 11h | group meeting(2h), interpreting user survey(3h),justifying made decision(6h) |
Jarno Peters | 23h | feedback session & group meeting (2h), adding legs to real data for visualization testing(1h), matlab coding (3h), unity coding (15h), wiki editing (2h) |
Simon B. Wessel | 14h | feedback session & group meeting (2h), understanding coding process (2h), work on wiki (4h), start with presentation(2h), try to get a UDP server to run(4h) |
Javier Basterreche Blasco | 6h | group meeting (2h), worked on final wiki deliverable suggestions (4h) |
Matei Manaila | 18h | presentation draft (8h), waterproofing research+design(5h), waterproofing wiki update(1h), Photoshop visuals for presentation (4h) |
Name | Total hours | Week 7 |
---|---|---|
Max van Aken | 10h | feedback session & group meeting (2h),attending presentations (3h), Requirements (2h), rewriting wiki(3h) |
Bram van der Pas | 10h | feedback session & group meeting (2h),attending presentations (3h), justifying decisions(3h), rewriting wiki(2h) |
Jarno Peters | 17h | feedback session & group meeting (2h), finalizing matlab script (4h), preparing demonstration (3h), wiki editing (5h), attending presentations (3h) |
Simon B. Wessel | 16h | feedback session & group meeting (2h), making presentation (6h), attending final presentation(3h), making a clean drive (2h), make peer review template (2h) |
Javier Basterreche Blasco | 12h | feedback ses. & group meeting (2h), preparing presentation (7h), attending/giving final presentation (3h) |
Matei Manaila | 7h | group meeting(1h30), presentation(3h), presentation prep(2h30) |
Name | Total hours week 8 | Total hours | Week 8 |
---|---|---|---|
Max van Aken | 5h | 82h | wiki writing (5h) |
Bram van der Pas | 4h | 83h | rewriting wiki(4h) |
Jarno Peters | 8h | 130h | cleaning up the drive (1h) wiki editing (7h) |
Simon B. Wessel | 5h | 99h | clean up wiki (5h) |
Javier Basterreche Blasco | 6h | 76h | rewriting wiki (6h) |
Matei Manaila | 0h | 80h | - |
Code progress per week
Below is the progress of the matlab script and the unity-based visualization per week. Old versions of the matlab script can be found in the old scripts folder inside the matlab folder. There are no old versions of the old unity project, but there are some videos of early versions on the drive.
Week 3 progress
Currently there are 3 data files in the drive folder. Book1 is a small base with random numbers I came up with on the fly to get the basics of importing the data under control. This database was also used to get the basic code for calculating angles and distances done correctly. BookSin contains a sinusoidal wave with 2 periods to configure a basic system that selects 1 full period of the data by determining a minimum in the absolute velocity in the x-direction. x is a sin wave, y is a cos wave, and z is a constant (1). This is for all 3 joints so the distances and angles calculated will yield nonsense when using this dataset. The basic plot also doesn't work. BookCos is similar to BookSin in that it is also periodic, but now the values are taken to also give a decent plot. The shoulder joint is taken in the origin, the elbow at x=sin(t),y=cos(t),z=0.5 and the wrist joint at x=2sin(t),y=1.3cos(t),z=1 (This book also contains 2 periods in week 3, but in week 4 this will be updated to 12).
Right now, the robots1.m script can select one period of data (so a period from the swimming stroke) and isolate the position vectors of the (now) 3 joints in the data: the elbow, wrist and shoulder joint. It calculates the distances between these 3 joints and calculates several relevant angles. It also makes a simple 3d plot that takes one frame of the data (can be configured in the script) and "models" the arm using 2 straight lines. After these functions were added to the script, a matlab app (robots.mlapp) was made from the script with the same 3d plot. This app now has a slider, so the user can select a frame. You can also toggle a switch and let the program automatically go through all the frames by toggling the slider for you.
Week 4 progress
In week 4 the script was expanded to 2 arms and some basic error identification was established. To accommodate this, the BookCos csv file was expanded from 2 to 12 periods. It also received double the number of columns, with the original 9 being for the right arm and the 9 new ones for the left arm. The 9 new columns are also half a period behind the first 9, just like in an optimal front crawl. For further expansion, a new dataset is used, called BookBody1. It contains the same 18 columns as the BookCos file, but also a new set of coordinates for a central sensor on the back, and 18 new columns for the legs, with each leg having a sensor at the hip, knee and ankle. Beyond this, the normalized direction vectors (the e_x, e_y and e_z unit vectors) of the central back sensor are located in 9 columns. This is mainly useful for the back sensor, as it tells us something about the general tilt of the swimmer (we can also identify whether the swimmer wobbles a lot while swimming). In the BookBody1 file, the vector e_x is normal, but there is a small wobble present in the yz-plane. BookBody2 is the same as BookBody1, but with a more significant wobble in the yz-plane. For both the BookBody1 and BookBody2 data files the coordinate distribution over the columns is as follows:
xyz coordinates\vector elements (in that order) for: right shoulder, right elbow, right wrist, left shoulder, left elbow, left wrist, mid back, right hip, right knee, right ankle, left hip, left knee, left ankle, e_x back, e_y back, e_z back
The expanded scripts for this week are robots2.m and the (newest) robotlegs1.m script. Some global errors are identified in the script, and the calculation of relative vectors and angles now occurs in a for loop, so the 'smaller' errors that can be deduced from this data can be counted for every individual period. These smaller errors are also counted for each arm individually, of course. In the robotlegs1 script, a second for loop for data selection and mistake identification is added for the legs (the first loop was for the arms). Legs and arms are handled in different loops, as the leg periodicity generally differs from the arm periodicity.
The visualization app, robots.mlapp, is also expanded to the whole body. A small video segment is in the drive to display the current plot. A slightly different plotting method is used than before, which makes the visualization look much nicer and more like a representation of a person. An optional wobble correction term is also added. This corrects for the swimmer's wobble by using a coordinate transformation to the unit vectors of the back sensor. In my created data, the joint coordinates are data without a wobbly swimmer, while the back sensor direction is wobbling, meaning that the wobble correction works in reverse to how it would work with real data. For the effect of the wobble correction function to be truly visible, BookBody2 needs to be loaded into the app, as this dataset has a larger wobbling effect.
=== '''Week 5 progress''' ===
We finally managed to get some real measurement data from the system, and with it some practical errors can be removed from the system. It is also useful for fine-tuning the error detection. The data only contains arm data, but this is fine, as the arm movement is the more important and complex of the two. robotsreal.m is the newest version of the script.
One error that would occur is that one maximum sometimes appeared in the data as multiple points. This can be seen when the script has run on real data and a matrix like TFi or TFj is inspected in the command window. These arrays store the indices of local maxima, and in one case an array contained the points 793, 795, 1227, 1229 and 1235. This is a problem, since a period of the stroke is defined as the data points between the indices of two consecutive maxima. To solve this, a period of data is skipped in the error-detection for-loops if it is shorter than 10 data points. This is done separately for left and right, as the issue may occur on either side at different moments. Although it is not confirmed that this also happens with the leg data, it likely does, so the same if-statements are added to the leg error detector.
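The guard described above can be sketched as follows, using the example indices from the real data. Again a Python illustration of the logic, not the MATLAB if-statements themselves:

```python
def valid_periods(maxima_idx, min_len=10):
    """Pair consecutive maxima into periods, skipping any period shorter
    than `min_len` samples (one physical peak detected as several nearby
    points would otherwise create bogus, tiny periods)."""
    periods = []
    for start, end in zip(maxima_idx[:-1], maxima_idx[1:]):
        if end - start < min_len:
            continue  # same guard as the if-statement added to the script
        periods.append((start, end))
    return periods

# The indices from the real-data example: each cluster is one physical peak.
TFi = [793, 795, 1227, 1229, 1235]
print(valid_periods(TFi))   # -> [(795, 1227)]: only the genuine period survives
```

The cutoff of 10 samples is the value chosen in the script; it only needs to be shorter than any plausible real stroke period.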
Another coding mistake was that the errors of the right and left arm were evaluated in the same for-loop. The number of iterations of that loop depends on whichever dataset has the lowest number of periods. For the perfect data used until this point this was fine, but in one of the real datasets the code detects 5 maxima that are very close together (<10 data points apart) on the right, while it doesn't on the left. Because the left now has 4 fewer maxima detected, the final 4 periods on the right are not evaluated, meaning most of the useful data is cut off. Separating the for-loops for left and right solves this problem.
A final thing that was worked on was a new program for visualization. Until now the only thing we had was robots.mlapp, which is good, but it could be better. I attempted to port my visualization code to Unity (this took a little while) to get a 3D model to follow the motion from the datasheet. It is not finished, as the motion still looks a little strange, but a video named unityrobot is in the Google Drive.
=== '''Week 6 progress''' ===
The robotsreal.m script is finalized. A hierarchy is added that selects which error needs correcting the most. It first selects a group: general errors first, then arm errors, and finally leg errors. Within that group, it selects the error with the largest (normalized) value to give feedback on. The selection is represented by a number n, and a string "Tip<n>.mp3" is produced; the sound file matching this string is played. After this, the error from the run is stored in the errmp and errp variables. The next time the script runs, it compares the new value of the appropriate error matrix with the previous error for which a sound was played. If the value in the error matrix is lower than in the previous run, the "Top5.mp3" sound is played.
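The group-then-largest-value selection can be sketched like this. The error numbers and values below are hypothetical, and the actual selection lives in robotsreal.m; this Python sketch only mirrors the described hierarchy:

```python
def pick_feedback(general, arm, leg):
    """Select the error to give feedback on.  Groups are checked in
    priority order (general, then arms, then legs); within the first
    group containing a nonzero error, the largest normalized value wins.
    The returned number n corresponds to the sound file 'Tip<n>.mp3'."""
    for group in (general, arm, leg):
        active = {n: v for n, v in group.items() if v > 0}
        if active:
            n = max(active, key=active.get)
            return n, "Tip{}.mp3".format(n)
    return None, None  # nothing to correct

# Hypothetical normalized error values for one run.
general_errors = {1: 0.0, 2: 0.0}
arm_errors = {3: 0.4, 4: 0.9}
leg_errors = {7: 1.5}
print(pick_feedback(general_errors, arm_errors, leg_errors))  # -> (4, 'Tip4.mp3')
```

Note that the arm error wins here despite the leg error being numerically larger: the group hierarchy takes precedence over the magnitude comparison.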
Since there is no real leg data, the script keeps giving leg errors even for good takes. Therefore, for the demonstration, a script called robotsrealnolegs.m can be used. This script is exactly the same as robotsreal, but with everything leg-related removed.
The Unity-based visualization is finally finished and it looks (in my humble opinion) quite awesome. Some videos are available in the drive, as is a folder called swimcapapp (or swimcapappios for iOS) containing the working app. It has all the functions of the robots.mlapp script (auto-slide, manual slide, wobble correction), but you can now also move freely around the room with two models. Both models go through the same motion, but the mesh used is different. The first model is a pre-made humanoid model, but this model can look a bit strange and deformed at times, especially with wobble correction turned on. The second model fixes this by being made of a collection of spheres and interconnecting cylinders, so there is not much texture to deform.
To test and use the visualization, a new dataset is made, called GoodTake2b.csv. I added the (fabricated) leg data from BookBody2 to GoodTake2, as the real data does not contain any leg data.
=== '''Week 7 progress''' ===
Some finishing touches are applied in week 7. The hierarchy is changed to match the priority given by our experienced trainer (see feedback to users). Furthermore, if the error from the previous measurement is improved upon, the Top5 sound plays (already implemented in week 6). A new feature is added: if the error is not improved upon 5 times in a row, the Tip11 sound plays, recommending the user to use the visualization after the swimming session. This value of 5 is arbitrary, and user testing should be done to find a better value. The same holds for the leg error detection in the script.
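The improvement/stagnation bookkeeping across runs can be sketched as a small tracker. A Python sketch of the described behavior (the real implementation is MATLAB; the class and its names are illustrative):

```python
class FeedbackTracker:
    """Track one error value across runs: 'Top5.mp3' plays when the value
    improves on the best so far, and 'Tip11.mp3' plays after `limit` runs
    without improvement, recommending the visualization.  The limit of 5
    is the arbitrary value from the script; user testing should tune it."""
    def __init__(self, limit=5):
        self.limit = limit
        self.best = None
        self.stale = 0

    def update(self, value):
        if self.best is None:          # first run: nothing to compare with
            self.best = value
            return None
        if value < self.best:          # improved on the previous value
            self.best = value
            self.stale = 0
            return "Top5.mp3"
        self.stale += 1
        if self.stale >= self.limit:   # stuck for `limit` runs in a row
            self.stale = 0
            return "Tip11.mp3"
        return None

tracker = FeedbackTracker()
runs = [0.8, 0.5, 0.6, 0.6, 0.7, 0.9, 0.6]
print([tracker.update(v) for v in runs])
# -> [None, 'Top5.mp3', None, None, None, None, 'Tip11.mp3']
```

Making `limit` a parameter keeps the arbitrary "5 times" easy to change once user testing suggests a better value.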
Also in this week, the robotsrealnolegs script is brought in line with the final robotsreal script, keeping everything identical except that all leg-related error detection is removed, as this is not relevant without real leg data. This script is primarily used for the demonstration.
== '''Interview transcripts''' ==
'''Interview 1:'''
Interviewer: How often do you swim per week?
Response: 2-3 times per week
I: For how long have you been swimming?
R: Almost my whole life
I: Are you also part of a swimming team?
R: Yes, I’m part of DBD
I: Do you also compete in swim meets?
R: Yes, I do
I: Have you ever had the feeling that you did not get enough technique feedback, or at least that you could use some more?
R: Yes, I feel like we are doing a lot of technique exercises, but we don’t really get personal tips
I: Have you ever made use of online services like online coaches or online videos to improve?
R: No, not really
I: Is there a specific reason for that?
R: No, I’ve just never done it.
I: We are working on a motion capture system that would be able to give technique tips. It would work by attaching sensors to certain joints and based on the relative position of those joints, be able to give feedback, like that you are pointing your arm too much to the side, or not bending it enough, etc. If such a system were to exist, would you be interested in it or do you feel like it wouldn’t have much added benefit for you?
R: That sounds pretty cool, I would definitely be interested.
I: If such a system were to exist, what would be your preferred method of feedback? For example, a video that you can watch back later that explains what to improve, an earpiece that gives automatic feedback while you are swimming, something like pressure on your joint indicating you need to move, or maybe you have your own idea.
R: I would say a combination of the video, and some feedback while you are actually swimming. I would then probably say the pressure on joints idea.
I: Why did you pick those?
R: The video is nice to be able to see your own swimming afterwards, but I don’t think that alone will be enough, so some direct feedback to make you feel what you are doing would also be important.
I: Are there other things you would like the system to be able to track, like stroke frequency as an example?
R: If possible, something like stroke frequency or breathing frequency would be nice.
I: Which of the following things would you consider the most important in such a motion capture system: Low price, comfort, accuracy, ease of use, or durability.
R: Firstly low price and comfort, then accuracy and durability. Ease of use I don’t find important.
I: With a lot of motion capture systems, a laptop is needed to interpret the data and give feedback. Would having to bring a laptop to the pool be a deal-breaker for you?
R: It wouldn’t be ideal, but not necessarily a deal-breaker. However, having some laptop cover to protect it from accidental splashing would be nice.
I: Lastly, if the system were to actually exist, and be helpful in improving technique, how much money would you be willing to pay for it?
R: It would depend on how well it would work, but probably around 50 to 100 euros, although I know this won’t be a very realistic price for such a product.
I: Thanks for the interview, do you have any additional thoughts regarding the product?
R: No, nothing right now.
Summary: This interview was done with a very experienced swimmer who is already part of a swimming team (so not necessarily our target audience). However, the swimmer did indicate a feeling of not getting adequate feedback at his team, and indicated an interest in trying out our product if available. In terms of how to receive feedback, the swimmer prefers a video to watch after he is done swimming, paired with some form of real-time feedback, preferably haptic. The swimmer finds comfort and a low price the most important, while ease of use is not an issue. The swimmer would be willing to pay between 50 and 100 euros, but acknowledges that this might not be realistic.
'''Interview 2:'''
I: How often do you swim per week?
R: I swim 3 times per week
I: For how long have you been swimming?
R: Around 5 years I believe
I: Are you also part of a swimming team?
R: Yes
I: Do you also compete in swim meets
R: Yes
I: How often?
R: Around 5 times a year
I: Have you ever had the feeling that the coaching you received at the swim team wasn’t enough?
R: I guess so
I: How come?
R: I feel like the trainers might have given up on giving feedback
I: Do you feel like there is too little focus on technique in general, or that there are not enough personal tips?
R: Coaches will usually make these general remarks like ‘focus on your technique’ so in that sense there is attention given to it, but too little personal tips.
I: Have you ever made use of online services like coaching or videos to improve your technique?
R: I have, but that was quite a while ago
I: What did you use specifically?
R: Online videos
I: Did you feel like they helped you?
R: Not really
I: Do you have any idea why?
R: No
I: We are working on a motion capture system that would be able to give technique tips. It would work by attaching sensors to certain joints and based on the relative position of those joints, be able to give feedback, like that you are pointing your arm too much to the side, or not bending it enough, etc. If such a system were to exist, would you be interested in it or do you feel like it wouldn’t have much added benefit for you?
R: That would sound fun to use
I: If such a system were to exist, what would be your preferred method of feedback? For example, a video to watch back after you are done swimming, an earpiece giving you real-time feedback, some pressure based feedback so you feel when your technique is off?
R: A video or an earpiece
I: Why do you pick those?
R: A video, because I think it’s nice to be able to watch your stroke back and I think this can help a lot. The earpiece, if it can give real-time feedback, would be really nice because you don’t usually get that. The pressure idea sounds like it would be irritating.
I: For the video, what would be the maximal time it could take for the video to be ready after you are done swimming? Does it have to be finished instantly, can it take a little while?
R: It’s not a problem if it takes a couple of minutes
I: Are there more factors, like stroke rate, that the system would need to track?
R: No, not really.
I: Which of the following aspects would you consider the most important in such a motion capture system: low price, comfort, accuracy, ease of use, or durability?
R: Comfort; it’s not of much use if you have to change the way you swim.
I: With a lot of motion capture systems it’s necessary to bring a laptop to the pool for the analysis; would this be a deal-breaker for you?
R: Not a deal-breaker, but it’s not ideal.
I: Lastly, if this system would work and actually be effective in improving your technique, what would be the maximum price you would pay for it?
R: Around 150 euros.
Summary: This interview was done with a relatively experienced swimmer who is already part of a swimming team. The swimmer indicates that there is a lack of personalized feedback at his club. For feedback, the swimmer would like a video to watch after he is done swimming, and an earpiece for real-time feedback. The swimmer considers comfort the most important factor for such a system, as having to swim differently due to the sensors would go against the purpose of the system. The swimmer would be willing to pay 150 euros for the system.
== '''User survey''' ==
Link to the User survey: https://docs.google.com/forms/d/e/1FAIpQLSfRmrGkf-iRCDJQGpLv0SILlQEPYvAIomGOxuePE_pGPYbJzA/viewform?usp=sharing
== '''Literature Summaries''' ==
Wearable motion capture suit with full-body tactile sensors[37]
This article discusses a suit with not only motion sensors but also tactile sensors, which detect whether a part of the suit is touching something. The motion sensors consist of an accelerometer, several gyroscopes, and multiple magnetometers. The data from these sensors is processed on a local CPU and subsequently sent to a central computer, to decrease processing time and ensure real-time calculations. The goal of the suit is to give researchers in the fields of sports and rehabilitation more insight into human motion and behavior, as before this, no real motion capture suit with both motion sensors and tactile sensors had been implemented.
Motion tracking: no silver bullet, but a respectable arsenal[38]
This article goes over the different principles of motion sensing and the methods that exist: mechanical, inertial, acoustic, magnetic, optical, and radio and microwave sensing.
Mechanical sensing: Provides accurate data for a single target, but generally has a small range of motion. These sensors generally work by detecting mechanical stress, which is not a desirable approach for this project.
Inertial sensing: Using gyroscopes and accelerometers, the sensor can determine its own orientation and acceleration. By compensating for gravity and double-integrating the acceleration, the position can be determined. One downside is that these sensors are quite sensitive to drift and errors: a small error integrated over time yields massive errors in the final position. For our project this would be very useful, as having sensors determine their position with respect to each other is difficult when their orientation is unknown.
Acoustic sensing: These sensors transmit a small ultrasonic pulse and time how long it takes to return. This method has multiple challenges, such as the fact that it can only measure relative changes in distance, not absolute distance. It is also very noise-sensitive, as the sound wave can reflect off multiple surfaces; those reflections arrive back at the sensor at different times, causing all sorts of problems. To mitigate this, the sensor can be programmed to only consider the first reflection and ignore the rest, as the first reflection is generally the one to be measured.
Magnetic sensing: These sensors rely on magnetometers for static fields and on a change in induced current for changing magnetic fields. One creative way to use this is to have a double coil produce a magnetic field at a given location and estimate the sensor's position and orientation based on the field it measures.
Optical sensing: These sensors consist of two components: a light source and a sensor. The article discusses these further, but since water and air have different refractive indices, and our sensors will move in and out of the water constantly, these sensors are useless for us.
Radio and microwave sensing: Based on what the article has to say, this is generally used for long-range position determination, such as GPS. This is likely not useful for this project.
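The drift problem flagged under inertial sensing is easy to demonstrate numerically: a constant accelerometer bias, double-integrated, grows quadratically into a position error (analytically 0.5·bias·t²). A small Python sketch with illustrative values:

```python
# A minimal illustration of inertial drift: a tiny constant accelerometer
# bias, double-integrated at 100 Hz, becomes a large position error.
dt = 0.01      # sampling interval in seconds (100 Hz)
bias = 0.01    # m/s^2, a small constant measurement error
v = x = 0.0
for _ in range(1000):   # 10 seconds of data
    v += bias * dt      # first integration: velocity error grows linearly
    x += v * dt         # second integration: position error grows quadratically
print("drift after 10 s: about %.2f m" % x)   # roughly half a metre
```

This is why purely integrating inertial data is unreliable over long takes, and why drift compensation (as in the Xsens references below) matters.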
The Use of Motion Capture Technology in 3D Animation[15]
This article reviews literature about motion capture in 3D animation, aiming to identify the strengths and limitations of different methods and technologies. It starts by describing different motion capture systems and later draws conclusions about the accessibility, ease of use, and future of motion capture in general. Although this last part is not very interesting for us, the descriptions of the different systems are.
Active & passive optical motion capture: The basic idea is that an object or person wears a suit with either active or passive optical elements. Passive elements only reflect external light, and their position is measured using external cameras, generally multiple cameras from several directions; the material is usually selected to reflect infrared light. Active markers, on the other hand, emit their own light, again generally in the infrared part of the spectrum, and their position is likewise measured using cameras.
Inertial motion capture: This system uses inertial sensors (described in [38]) to determine the position of key joints and body parts. It does not depend on lighting and cameras, increasing the freedom of motion. A widely used inertia-based system is the Xsens MVN system.
Markerless motion capture: In this case no markers or sensors are used; the motion is simply recorded with one or multiple cameras, and software interprets the data and turns it into something usable for animators. For us this approach is not very useful.
Surface electromyography: This method is generally used to detect fine motions in the face using sensors that detect the electrical currents produced by contracting muscles. Again not very useful for us.
Musculoskeletal model-based inverse dynamic analysis under ambulatory conditions using inertial motion capture[39]
This article discusses the use of inertial motion sensors from Xsens, which is currently part of Movella. The specific model used here is the Xsens MVN Link. The researchers constructed a suit using these sensors and let test subjects perform different movements. The root-mean-square error between the determined and the real joint angles was found to lie between about 3 and 8 degrees, depending on the body part measured. If we can reach these kinds of values for our prototype, that would be sufficient. Since this article is from 2019, the current state-of-the-art technology might be even better.
Sensor network oriented human motion capture via wearable intelligent system[20]
This article uses 15 wireless inertial motion sensors placed on human limbs and other important locations to capture the motion of the person. The researchers focused on a lightweight design with small sensors and a low impact on behavior. The specific sensors used are MPU9250 models, which cost only about 12 euros. The researchers transform the coordinates obtained from the sensors and achieve an error of about 1.5% in the determined displacement.
Development of a non-invasive motion capture system for swimming biomechanics[40]
I still need to work this one out
Swimming Stroke Phase Segmentation Based on Wearable Motion Capture Technique[41]
In this source, a sensor-based motion capture system is used to distinguish the different phases occurring during a swim stroke. To determine the body position, measurement nodes are placed at different points of the body. Each measurement node contains a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer. An orientation-estimation algorithm was used to determine the posture from the information gathered by the nodes, and supervised learning was used to create an algorithm capable of separating the different phases of the stroke. The accuracy was compared to an optical motion capture system, and the system was found to be comparatively accurate, although there is quite a large error for rapidly changing joint angles.
Development of a Methodology for Low-Cost 3D Underwater Motion Capture: Application to the Biomechanics of Horse Swimming[17]
In this article the motion capture is done using an optical marker-based system. This usually involves spherical markers glued to important body parts and cameras to track them, but in this specific case color pigments were used instead of markers. The article mentions that in general there are three types of motion capture: sensor-based, optical with markers, and optical without markers, of which optical with markers is considered the gold standard. It also mentions as a downside of sensor-based systems that the size of the sensors can cause additional drag. Six cameras were used to capture videos of the horses swimming. The videos were then preprocessed, which included calibrating the different camera angles and tracking the different markers. Lastly, algorithms were used to determine the horse's movements. The accuracy of the system was in the order of centimeters for segments and in the order of degrees for angles. Downsides of the method are that the preprocessing requires time-intensive tasks like manually tracking the markers; the article therefore recommends training an algorithm for these tasks. It also mentions problems occurring when certain anatomical points lack overlap between different cameras.
Markerless Optical Motion Capture System for Asymmetrical Swimming Stroke[16]
In this article a markerless optical motion capture system is used that only uses one underwater camera to record the swimming motion. The article mentions that optical motion capture with markers is actually the most used approach due to its relatively high accuracy, but notes downsides like the markers affecting the swimmer's movement, the long time it takes to properly place the markers, and issues discriminating the different markers from each other. The system involved creating a human body model, which was then matched to the images of the swimmer to track their movement. An additional algorithm was required to distinguish the left and right sides of the body from each other. The model seemed to match the images quite well. SWUM software was then used for dynamic analysis, like analyzing the force exerted on the water by the swimmer.
SwimXYZ: A large-scale dataset of synthetic swimming motions and videos[42]
According to the article, most limitations of motion capture systems are due to a lack of data: the systems usually struggle with interpreting aquatic data as they are not trained on it. Research has shown that synthetic data could complement or even replace real images in training the computer models. This paper published a database of 3.4 million synthetic frames of swimming strokes that can be used for training algorithms. Using their database, the researchers also created their own stroke-classifier algorithm that seemed to provide accurate results. Limitations of the database are the lack of diversity in the subjects (gender, body shape) and the lack of diversity in the pool background.
I have downloaded and watched the freestyle data from this paper, but I think the videos are too short and of too low quality to accurately track the posture of the (animated) person. Therefore I think tracking a few videos of (Olympic) swimmers and (manually?) tracking their movements is the most achievable option to obtain reliable data and give constructive feedback to the user.
Usable videos: https://www.youtube.com/watch?v=vFqX_KxqWtE
Current swimming techniques: the physics of movement and observations of champions[43]
This paper outlines how former champions swam and what their strokes looked like, which we could use as perfect data. It does this for multiple strokes, for both males and females. The paper also dives a bit into what contributes most to drag for swimmers, and thus how a stroke can be improved. It additionally has experimental data on the drag coefficient of swimmers with respect to swimming depth.
Research on distance measurement method based on micro-accelerometer[44]
This paper shows a possible way to go from accelerometer data to distance measurements. We will probably need this to accurately calculate the positions and angles of the swimmer's body parts.
Experiments on human swimming : passive drag experiments and visualization of water flow[45]
This paper investigates the drag force on a swimmer in streamline position. This is done by towing a person (so the person doesn't move any body parts). The same is done with a sphere to better understand how the water flows.
Smart Swimming Training: Wearable Body Sensor Networks Empower Technical Evaluation of Competitive Swimming[46]
This paper proposes a body-area sensor network for detailed analysis of swimming posture and movement phases. Wearable inertial sensors placed on 10 body parts capture motion data, which is processed using a motion intensity-based Kalman filter for improved accuracy. Deep learning models combine temporal and graph-based networks for automatic phase segmentation, achieving over 97% accuracy. The system accurately tracks joint angles and body posture, providing valuable feedback for swimming performance enhancement.
Using Wearable Sensors to Capture Posture of the Human Lumbar Spine in Competitive Swimming[47]
This study introduces a motion capture system based on wearable inertial sensors for tracking lumbar spine posture in competitive swimming. Using a multi-sensor fusion algorithm aided by a biomechanical model, the system reconstructs swimmers' posture with high accuracy (errors between 1.65° and 3.66°). Experiments validate its reliability compared to an optical tracking system. Kinematic analysis across four swimming styles (butterfly, breaststroke, freestyle, and backstroke) reveals distinct lumbar movement patterns, helping coaches and athletes optimize performance. The system offers a practical alternative to video-based methods.
3D Orientation Estimation Using Inertial Sensors[48]
This paper discusses methods for 3D orientation estimation using data from accelerometers, gyroscopes and magnetometers. The study focuses on upper limb movements for patients with neural diseases. Additionally, it explores techniques for 2D and 3D position tracking by considering joint links and kinematic constraints.
Motion Tracker With Arduino and Gyroscope Sensor[49]
This project focuses on making a 3D motion-tracking device using an ESP and an MPU. The advantage of this approach is that it is wireless and could be mounted as a wearable.
Building a skeleton-based 3D body model with angle sensor data[50]
In this paper a method is presented for constructing a full-body motion model using data from wearable sensors.
Using Gyroscopes to Enhance Motion Detection [51]
In this article, the author explores the role of gyro sensors in improving motion detection and their functionality. The article also provides insight into how gyroscopes can be used to enhance the performance of various motion-sensitive technologies.
Wearable inertial sensors in swimming motion analysis: a systematic review[52]
With "the aim of describing the state of the art of current developments in this area", the paper presents a comprehensive review of the existing literature on swimming motion analysis, as well as an exhaustive exploration of current sensing capacities, both inertial and magnetic. The research focuses more on physical methods than digital ones, as the latter were found to introduce large delays and require too much set-up to be practical for consumers. Like most sources so far, the paper indicates inertial sensors to be the most reliable.
Development of a swimming motion display system for athlete swimmers’ training using a wristwatch-style acceleration and gyroscopic sensor device[53]
This is an older paper, so advancements in sensing technology may have surpassed the sensor units used here. Despite this, the methods described may still prove useful: the paper details methods of logging swimmers' motions using an accelerometer and a three-axis gyroscopic sensor device, where each sensor measures its respective axis.
SmartPaddle as a new tool for monitoring swimmers' kinematic and kinetic variables in real time [54]
This paper explores a more modern sensing toolkit by means of the SmartPaddle device, which features two pressure sensors and a 9-axis Inertial Measurement Unit in a small device that is strapped to both of the swimmer's palms. This way, the force of each stroke can be measured and graphed with the pressure sensors to allow comparisons in force between hands, and the motion of the hand during the stroke can be visually represented in three swimming-direction graphs, namely a side view, a top view, and a back view. This product may be useful in our project, as it comes as a standalone sensing unit from which we could build a larger assembly to capture the motion of the entire body.
Digital Analysis and Visualization of Swimming Motion[55]
Unlike the last three articles, this paper focuses on digital methods of motion tracking and analysis. A solid framework is laid out for digitizing videos of swimmers, using a module that takes a 2D video and outputs a 3D representation of the swimmer's body. The swimmer needs to upload a picture of themselves standing in a T-position so the module can calculate limb lengths, and must manually point out joint locations on the recorded video. From this, a 3D digitization is produced that can be further used to visualize and analyze the swimmer's motions.
- ↑ Jump up to: 1.0 1.1 Analytica, A. (2023, 7 maart). Sports Coaching Platforms Market Size, share| Report, 2031. https://www.astuteanalytica.com/industry-report/sports-coaching-platforms-market
- ↑ MySwimPro. (z.d.). Swimming strategy call. https://academy.myswimpro.com/apply
- ↑ SWIM SMOOTH | 1-to-1 Video Analysis & Stroke Correction for Adults | Swim Smooth. (z.d.). Swim Smooth. https://www.swimsmooth.com/videoanalysis
- ↑ Virtual Swim Video Technique Analysis - Dan Daly. (z.d.). Calendly. https://calendly.com/daniel-daly/swim-video-technique-session-at-your-pool-clone?month=2025-04
- ↑ https://swimmingtechnology.com/measurement-of-the-active-drag-coefficient/,Swimming Technology Research, Inc., Measuring the Active Drag Coefficient
- ↑ https://www.theendurancestore.com/blogs/the-endurance-store/stroke-rate-v-stroke-count-and-why-its-critical-for-swim-performance, The Endurance Store, Stroke Rate V Stroke Count, and why it's critical for swim performance
- ↑ https://www.youtube.com/watch?v=D-7YP5BGswA, Olympics, 🏊♂️ The last five Men's 100m freestyle champions 🏆
- ↑ https://www.iec.ch/ip-ratings Internation Electrotechnical Commision, Ingress Protection (IP) ratings.
- ↑ https://wetten.overheid.nl/BWBR0041330/2025-01-01#Hoofdstuk15, Wetten.overheid.nl, Besluit activiteiten Leefomgeving (BaL)
- ↑ https://www.researchgate.net/publication/285642648_Degradation_of_plasticized_PVC, Yvonne Shashoua, Inhibiting the deterioration of plasticized poly(vinyl chloride) – a museum perspective
- ↑ https://global.canon/en/imaging/picturestyle/editor/matters05.html, Canon. Inc, Understanding the "HSL" color expression system
- ↑ A detailed explanation of the body dimensions which can be inserted in the configuration window in MVN software. https://base.movella.com/s/article/Body-Dimensions?language=en_US#Offline
- ↑ Body Segment Parameters1 A Survey of Measurement Techniques,Rudolf s Drillis, PH.D.,2 Renato Contini , B.S.,3 AND Maurice Bluestein, M.M.E.4, 1964
- ↑ THE STATIC MOMENTS OF THE COMPONENT MASSES OF THE HUMAN BODY,https://apps.dtic.mil/sti/tr/pdf/AD0279649.pdf,Dr. Harless,1860
- ↑ Jump up to: 15.0 15.1 WIBOWO, Mars Caroline; NUGROHO, Sarwo; WIBOWO, Agus. The use of motion capture technology in 3D animation. International Journal of Computing and Digital Systems, 2024, 15.1: 975-987. https://pdfs.semanticscholar.org/9514/28e966feece961d7100448d0caf17a8b93ec.pdf
- ↑ Jump up to: 16.0 16.1 16.2 erryanto, F., Mahyuddin, A. I., & Nakashima, M. (2022). Markerless Optical Motion Capture System for Asymmetrical Swimming Stroke. Journal of Engineering and Technological Sciences, 54(5), 220503. https://doi.org/10.5614/j.eng.technol.sci.2022.54.5.3
- ↑ Giraudet, C., Moiroud, C., Beaumont, A., Gaulmin, P., Hatrisse, C., Azevedo, E., Denoix, J.-M., Ben Mansour, K., Martin, P., Audigié, F., Chateau, H., & Marin, F. (2023). Development of a Methodology for Low-Cost 3D Underwater Motion Capture: Application to the Biomechanics of Horse Swimming. Sensors, 23(21), 8832. https://doi.org/10.3390/s23218832
- ↑ Z. Cao, G. Hidalgo, T. Simon, S.-E. Wei and Y. Sheikh, "OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 43, no. 1, pp. 172-186, Jan. 2021, doi: 10.1109/TPAMI.2019.2929257. https://ieeexplore.ieee.org/document/8765346
- ↑ Web page of Theia Markerless, a company that sells markerless motion capture technology. https://www.theiamarkerless.com/
- ↑ Qiu S, Zhao H, Jiang N, et al. Sensor network oriented human motion capture via wearable intelligent system. Int J Intell Syst. 2022; 37: 1646-1673. https://doi.org/10.1002/int.22689
- ↑ Article on the calibration of Xsens MVN motion capture suits, from the Movella website. https://base.movella.com/s/article/Calibration-MVN-2021-0-and-older?language=en_US
- ↑ Article on the methods used for drift compensation in Xsens motion capture suits, from the Movella website. https://www.movella.com/resources/blog/mocap-without-limits-how-xsens-solved-the-drift-dilemma
- ↑ H. T. Butt, M. Pancholi, M. Musahl, P. Murthy, M. A. Sanchez and D. Stricker, "Inertial Motion Capture Using Adaptive Sensor Fusion and Joint Angle Drift Correction," 2019 22nd International Conference on Information Fusion (FUSION), Ottawa, ON, Canada, 2019, pp. 1-8, doi: 10.23919/FUSION43075.2019.9011359. https://ieeexplore.ieee.org/abstract/document/9011359
- ↑ I. Weygers et al., "Drift-Free Inertial Sensor-Based Joint Kinematics for Long-Term Arbitrary Movements," in IEEE Sensors Journal, vol. 20, no. 14, pp. 7969-7979, 15 July 2020, doi: 10.1109/JSEN.2020.2982459. https://ieeexplore.ieee.org/abstract/document/9044292
- ↑ N. O-larnnithipong and A. Barreto, "Gyroscope drift correction algorithm for inertial measurement unit used in hand motion tracking," 2016 IEEE SENSORS, Orlando, FL, USA, 2016, pp. 1-3, doi: 10.1109/ICSENS.2016.7808525. https://ieeexplore.ieee.org/abstract/document/7808525
- ↑ Movella DOT data sheet. https://www.xsens.com/hubfs/Xsens%20DOT%20data%20sheet.pdf
- ↑ Jailton G. Pelarigo, Benedito S. Denadai, Camila C. Greco, Stroke phases responses around maximal lactate steady state in front crawl, Journal of Science and Medicine in Sport, Volume 14, Issue 2, 2011, Pages 168.e1-168.e5, ISSN 1440-2440, https://doi.org/10.1016/j.jsams.2010.08.004.
- ↑ Biskaduros, P. W. (2024, September 26). How to Swim Freestyle With Perfect Technique. MySwimPro Blog. https://blog.myswimpro.com/2019/06/06/how-to-swim-freestyle-with-perfect-technique/#comments
- ↑ Fares Ksebati. (2021, March 17). How to Improve Your Freestyle Pull & Catch [Video]. YouTube. https://www.youtube.com/watch?v=nD2dZVsrBq4
- ↑ Swim360. (2018, October 10). Freestyle hand entry - find the sweet spot to speed up [Video]. YouTube. https://www.youtube.com/watch?v=3yReSXt9_8Q
- ↑ Ford, B. (2018, July 27). How to avoid over rotation in freestyle. Effortless Swimming. https://effortlessswimming.com/how-to-avoid-over-rotation-in-freestyle/#:~:text=One%20of%20the%20key%20aspects,degrees%20during%20a%20breathing%20stroke
- ↑ Petala, A. (2022, November 8). Everything You Need to Know About Freestyle Rotation. TritonWear Blog. https://blog.tritonwear.com/everything-you-need-to-know-about-freestyle-rotation
- ↑ Jerszyński D, Antosiak-Cyrak K, Habiera M, Wochna K, Rostkowska E. Changes in selected parameters of swimming technique in the back crawl and the front crawl in young novice swimmers. J Hum Kinet. 2013 Jul 5;37:161-71. doi: 10.2478/hukin-2013-0037. PMID: 24146717; PMCID: PMC3796834.
- ↑ SwimGym. (2024, March 9). Freestyle arm recovery common mistakes [Video]. YouTube. https://www.youtube.com/watch?v=zN7zZztMFiY
- ↑ Free model used for the visualisation. https://assetstore.unity.com/packages/3d/characters/humanoids/character-pack-lowpoly-free-221766
- ↑ Home page of Xsens, a Movella brand of motion capture suits. https://www.movella.com/products/xsens
- ↑ Y. Fujimori, Y. Ohmura, T. Harada and Y. Kuniyoshi, "Wearable motion capture suit with full-body tactile sensors," 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 2009, pp. 3186-3193, doi: 10.1109/ROBOT.2009.5152758. https://ieeexplore.ieee.org/abstract/document/5152758
- ↑ G. Welch and E. Foxlin, "Motion tracking: no silver bullet, but a respectable arsenal," in IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 24-38, Nov.-Dec. 2002, doi: 10.1109/MCG.2002.1046626. https://ieeexplore.ieee.org/abstract/document/1046626
- ↑ Angelos Karatsidis, Moonki Jung, H. Martin Schepers, Giovanni Bellusci, Mark de Zee, Peter H. Veltink, Michael Skipper Andersen, Musculoskeletal model-based inverse dynamic analysis under ambulatory conditions using inertial motion capture, Medical Engineering & Physics, Volume 65, 2019, Pages 68-77, ISSN 1350-4533, https://doi.org/10.1016/j.medengphy.2018.12.021
- ↑ Ascendo, G. (2021). Development of a non-invasive motion capture system for swimming biomechanics [Doctoral thesis]. Manchester Metropolitan University.
- ↑ J. Wang, Z. Wang, F. Gao, H. Zhao, S. Qiu and J. Li, "Swimming Stroke Phase Segmentation Based on Wearable Motion Capture Technique," in IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 8526-8538, Oct. 2020, doi: 10.1109/TIM.2020.2992183.
- ↑ Guénolé Fiche, Vincent Sevestre, Camila Gonzalez-Barral, Simon Leglaive, and Renaud Séguier. 2023. SwimXYZ: A large-scale dataset of synthetic swimming motions and videos. In Proceedings of the 16th ACM SIGGRAPH Conference on Motion, Interaction and Games (MIG '23). Association for Computing Machinery, New York, NY, USA, Article 22, 1–7. https://doi.org/10.1145/3623264.3624440
- ↑ Brent S. Rushall, Ph.D. (2013, August 21). https://coachsci.sdsu.edu/swim/bullets/Current44.pdf
- ↑ Yonglei Shi, Liqing Fang, Deqing Guo, Ziyuan Qi, Jinye Wang, and Jinli Che. https://sci-hub.se/10.1063/5.0054463
- ↑ Custers, G. Eindhoven University of Technology student thesis. https://research.tue.nl/en/studentTheses/experiments-on-human-swimming-passive-drag-experiments-and-visual
- ↑ J. Li et al., "Smart Swimming Training: Wearable Body Sensor Networks Empower Technical Evaluation of Competitive Swimming," in IEEE Internet of Things Journal, vol. 12, no. 4, pp. 4448-4465, 15 February 2025, doi: 10.1109/JIOT.2024.3485232.
- ↑ Z. Wang et al., "Using Wearable Sensors to Capture Posture of the Human Lumbar Spine in Competitive Swimming," in IEEE Transactions on Human-Machine Systems, vol. 49, no. 2, pp. 194-205, April 2019, doi: 10.1109/THMS.2019.2892318.
- ↑ Bai, L. (2022). 3D orientation estimation using inertial sensors. Journal of Electrical Technology UMY, 6(1), 1–8.
- ↑ Instructables. (2024, September 3). Motion tracker with arduino and gyroscope sensor. Instructables. https://www.instructables.com/Motion-Tracker-With-Arduino-and-Gyroscope-Sensor/
- ↑ Wang, Q., Zhou, G., Liu, Z., & Ren, B. (2021). Building a skeleton-based 3D body model with angle sensor data. Smart Health, 19, 100141. https://doi.org/10.1016/j.smhl.2020.100141
- ↑ Meyer, A. (2020). Using gyroscopes to enhance motion detection. Engineering Student Trade Journal Articles, (6). https://scholar.valpo.edu/stja/6/
- ↑ Magalhaes, Fabricio Anicio, et al. “Wearable inertial sensors in Swimming Motion Analysis: A systematic review.” Journal of Sports Sciences, vol. 33, no. 7, 30 Oct. 2014, pp. 732–745, https://doi.org/10.1080/02640414.2014.962574.
- ↑ Nakashima, Motomu, et al. “Development of a swimming motion display system for athlete swimmers’ training using a wristwatch-style acceleration and gyroscopic sensor device.” Procedia Engineering, vol. 2, no. 2, June 2010, pp. 3035–3040, https://doi.org/10.1016/j.proeng.2010.04.107.
- ↑ Marinho, Daniel A., et al. “Smartpaddle® as a new tool for monitoring swimmers’ kinematic and kinetic variables in real time.” The Open Sports Sciences Journal, vol. 15, no. 1, 14 Dec. 2022, https://doi.org/10.2174/1875399x-v15-e221026-2022-11.
- ↑ Kirmizibayrak, Can, et al. “Digital Analysis and visualization of Swimming Motion.” International Journal of Virtual Reality, vol. 10, no. 3, 1 Jan. 2011, pp. 9–16, https://doi.org/10.20870/ijvr.2011.10.3.2817.