PRE2024 3 Group18
Members
Name | Student Number | Division |
---|---|---|
Bas Gerrits | 1747371 | B |
Jada van der Heijden | 1756710 | BPT |
Dylan Jansen | 1485423 | B |
Elena Jansen | 1803654 | B |
Sem Janssen | 1747290 | B |
Approach, milestones and deliverables
Week | Milestones |
---|---|
Week 1 | Project orientation, brainstorming, defining deliverables and limitations, SotA research |
Week 2 | Deepened SotA research; identifying and determining specifications (hardware, software, design; MoSCoW prioritization); UX Design: user research; create bill of materials |
Week 3 | UX Design: user interviews; Wiki: specifications, functionalities (sensors/motors used) |
Week 4 | Prototyping cycle: building and mock-ups; UX Design: processing interviews; order needed items |
Week 5 | Evaluating and refining final design |
Week 6 | Demo/scenario testing, fine-tuning; UX Design: second round of interviews (evaluation); Wiki: testing results |
Week 7 | Presentation/demo preparations; Wiki: finalize and improve (e.g. Conclusion, Future work, Discussion) |
Name | Responsibilities |
---|---|
Bas Gerrits | Arduino control |
Jada van der Heijden | Administration, UX design, Wiki |
Dylan Jansen | Code, Electronics, SotA |
Elena Jansen | Designing |
Sem Janssen | Hardware |
Problem statement and objectives
Problem Statement
In an environment as ever-changing and unpredictable as traffic, it is important for every traffic participant to have as much information about the situation as possible, for safety reasons. Most roads are already covered in guiding materials: traffic lights, crosswalks and level crossings use visual, tactile and auditory signals to relay as much information to road users as possible. Unfortunately, some user groups are more dependent on certain types of signals than others, for example due to disabilities. Not every crossing or road has all of these sensory cues, so it is important to find a solution for the user groups that struggle with this lack of information and therefore feel less safe in traffic. Specifically, we are looking at visually impaired people, and at creating a system that will aid them in crossing roads in most traffic situations, even in the absence of external sensory cues.
Main objectives:
- The design should be able to aid the user in crossing a road, regardless of external sensory cues already put in place, with the purpose of giving more autonomy to the user
- The design must have audible, or otherwise noticeable, alerts for the user, that are easily understood by said user
- The design must have a reliable detection system
- The design must be 'hidden': it should not disrupt the user's quality of life (QoL) and should ideally be barely noticeable as technology
- Ex. hidden in the user's clothing
An extended list of all features can be found in the MoSCoW section.
State of the Art Literature Research
Existing visually-impaired aiding materials
Many aids for visually impaired people already exist today, and some of them can also help with crossing the road. The most common aids when crossing are audible traffic signals and tactile pavement. Audible traffic signals play a tone when it is safe to cross the road. Tactile pavement consists of textured patterns on the sidewalk that alert visually impaired people to hazards or to important locations such as crosswalks. These aids are already widely deployed, but come with the drawback that they are only available at dedicated crosswalks. This means visually impaired people may still be unable to cross at the locations they would like to, which does not benefit their autonomy.
Another option is smartphone apps. There are currently two types of apps that visually impaired people can use. The first uses a video call to connect a visually impaired person to a sighted volunteer who guides them through the phone's camera; examples are Be My Eyes and Aira. The second type uses AI to describe scenes captured by the phone's camera; an example is Seeing AI by Microsoft. The reliability of this second type of app remains a major open question.
There have also been attempts to make guidance robots. These robots guide autonomously, avoid obstacles, stay on a safe path, and help the user reach their destination. Glidance is one such robot, currently in the testing stage. It promises obstacle avoidance, the ability to detect doors and stairs, and a voice that describes the scene. Its demonstration also shows the ability to navigate to and across crosswalks: it approaches a nearby crosswalk, slows down and comes to a standstill before the pavement ends, and takes the traffic into account. It also gives the user subtle tactile feedback to communicate certain events. Such robots could in some ways take over the tasks of guide dogs. There are also projects that try to make the robots dog-like, even though this may make the implementation harder than it needs to be; the dog shape seems intended to make the robot feel more like a companion.

There also exist some wearable/accessory options for blind people:
- OrCam MyEye: a device that attaches to glasses and helps visually impaired users by reading text aloud, recognizing faces, and identifying objects in real time.
- eSight Glasses: electronic glasses that enhance vision for people with legal blindness by showing a real-time video feed on special lenses.
- Sunu Band (no longer available): a wristband equipped with sonar technology that gives haptic feedback when obstacles are near, helping users detect objects and navigate around them.
While these devices can all technically assist in crossing the road, none of them is specifically designed for that purpose. The OrCam MyEye might help identify oncoming traffic, but may not be able to judge its speed. The eSight Glasses are unfortunately not applicable to all types of blindness. And the Sunu Band would most likely not react fast enough to fast-moving cars. Lastly, there are some smart canes with features such as haptic feedback or GPS that can guide users to the safest crossing points.
About road aids
??
User Experience Design Research
For this project, we will employ a process similar to UX design:
- We will contact the stakeholders or target group, which in this case is visually impaired people, to understand what they need and what our design could do for them
- Based on their insight and literature user research, we will further specify our requirements list from the USE side
- Combined with requirements formed via the SotA research, a finished list will emerge with everything needed to start the design process
- From then we build, prototype and iterate until needs are met
Below are the USE related steps to this process.
USEr research
Target Group
Primary Users:
- People with affected vision that would have sizeable trouble navigating traffic independently: ranging from visually impaired to fully blind
Secondary Users:
- Road users: any moving being or thing on the road will be in contact with the system.
- Fellow pedestrians: the system must consider other people when moving. This is a separate user category, as the system may have to interact with these users in a different way than, for example, oncoming traffic.
Users
What Do They Require?
The main takeaway here is that visually impaired people are still very capable people, and have developed different strategies for how to go about everyday life.
!!From user research we will probably find that most blind people are elderly. We can conclude somewhere here then that we will focus on elderly people, and that that will have implications for the USEr aspects of the design as elderly people are notorious for interacting and responding differently to technology.!!
Society
How is the user group situated in society? How would society benefit from our design?
People who are visually impaired are treated somewhat differently in society. In general, there are plenty of guidelines and forms of help accessible to this group, considering that globally at least 2.2 billion people have a near or distance vision impairment (https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment). In the Netherlands, this number is more like x.
While aids like white canes exist, visually impaired people are realistically dependent on their environment and its consistency. Markings on the road or the locations of buttons on a payment machine are essential for making their lives easier. Some visually impaired people have also noted that they would not like to be treated differently because of their condition, and that they are just as capable as anyone else.
As society stands right now, existing aids do help visually impaired people in their day-to-day life, but do not give them the security and confidence needed to feel more independent and on an equal footing with people without impaired vision. To close this gap, the design should focus on giving these people a greater sense of independence and self-reliance.
Enterprise
What businesses and companies are connected to this issue?
Stakeholder Analysis
To come into closer contact with our target groups, we reached out to multiple associations by email. The following questions were asked (in Dutch) as a way to gather preliminary information:
Following these questions, we got answers from multiple associations that were able to give us additional information.
From this, we were able to set up interviews via Peter Meuleman, researcher and founder of the Videlio Foundation. This foundation focuses on enhancing daily life for the visually impaired through personal guidance, influencing governmental policy and stimulating innovation in the field of assistive tools. Peter himself is very involved in the research and technological aspects of these tools, and was willing to be interviewed for this project. He also provided us with interviewees via his foundation and his personal circle.
Additionally, via personal contacts, we were able to visit an art workshop at StudioXplo on x meant for the visually impaired, where we conducted some interviews with the intended users on site.
(how do we cross the road vs. hard of seeing)
Creating the solution
MoSCoW Requirements
Prototypes/Design
Below are several design versions of what we think could be a solution; we will pick one to work on.
To safely cross a road with a speed limit of 50 km/h using distance detection, we need to calculate the minimum distance the sensor must be able to measure.
Assuming a standard two-lane road width of 7.9 meters to be on the safe side, the middle of the second lane lies at 5.8 meters, giving a lateral offset h of 5.8 meters.
The average walking speed is 4.8 km/h, which converts to 1.33 m/s, so crossing the full 7.9 m road takes approximately 5.925 seconds.
The distance d can now be calculated. A car traveling at 50 km/h moves at 13.88 m/s, so:
d=13.88 * 5.925 = 82.24 meters
Applying the Pythagorean theorem, the radius r is:
r = sqrt(d^2 + h^2) = sqrt(82.24^2 + 5.8^2) ≈ 82.44 meters
This means the sensor needs to measure a minimum distance of approximately 82.44 meters to ensure safe crossing.
Sensors capable of this cost more than 2000 euros, which is outside the budget for this course, so the prototype will be a small-scale version to prove the concept. The test speed then depends on the distance the sensor can measure, keeping the following ratio:
speed / distance = 50 / 82.44
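The calculation above can be sketched as follows; all figures are the report's assumptions, and the 4 m sensor range used to illustrate the scaled-down prototype is a hypothetical value, not a chosen component.

```python
import math

# Report's assumptions: 7.9 m road, 4.8 km/h walking speed, 50 km/h cars,
# 5.8 m lateral offset to the middle of the second lane.
road_width_m = 7.9
walk_speed_ms = 4.8 / 3.6      # ~1.33 m/s
car_speed_ms = 13.88           # 50 km/h, rounded as in the text
h = 5.8

crossing_time_s = road_width_m / walk_speed_ms   # 5.925 s
d = car_speed_ms * crossing_time_s               # ~82.24 m along the road
r = math.hypot(d, h)                             # required sensor range, ~82.44 m

# Scaled-down prototype: keep the speed/distance ratio 50 / 82.44.
sensor_range_m = 4.0           # hypothetical range of an affordable sensor
scaled_speed_kmh = 50 * sensor_range_m / r
print(f"range: {r:.2f} m, scaled test speed: {scaled_speed_kmh:.2f} km/h")
```

Running this reproduces the 5.925 s crossing time and the ~82.44 m minimum range derived above.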
The first design uses a servomotor to rotate the distance sensor. The sensor measures distances while the motor sweeps 180 degrees, measures again while the motor sweeps back, and the two sweeps are compared. A change between the sweeps means there is movement in the environment, in which case the device tells the user that it is not safe to cross the road.
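The comparison logic of this first design could look like the following minimal sketch; the 0.5 m change threshold is an assumption, and real sweeps would come from the servo-mounted sensor rather than hard-coded lists.

```python
# Design 1 sketch: compare two distance sweeps (readings at the same angles,
# in metres) taken while the servo rotates forward and then back. Any
# significant change between matching angles suggests movement.
CHANGE_THRESHOLD_M = 0.5   # assumed minimum change that counts as movement

def movement_detected(sweep_a, sweep_b, threshold=CHANGE_THRESHOLD_M):
    """True if any matching pair of readings differs by more than threshold."""
    return any(abs(a - b) > threshold for a, b in zip(sweep_a, sweep_b))

# Example: the reading at one angle dropped by 3 m between sweeps -> movement.
forward = [10.0, 12.5, 80.0, 15.0]
backward = [10.0, 12.5, 77.0, 15.0]
print("not safe to cross" if movement_detected(forward, backward) else "no movement")
```

On real hardware the same comparison would run on the Arduino after each back-and-forth sweep.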
The second design uses a better distance sensor with a longer range, and calculates the speed of road users from consecutive measurements. Depending on this speed, the device tells the user whether it is safe to cross the road.
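The decision rule of this second design can be sketched as below; the helper names, the 0.5 s sampling interval and the example readings are assumptions, while the ~5.9 s crossing time follows the report's figures.

```python
# Design 2 sketch: estimate an approaching vehicle's speed from two distance
# readings taken dt seconds apart, then check whether the vehicle arrives
# later than the crossing takes.

def approach_speed(d1_m, d2_m, dt_s):
    """Speed toward the sensor in m/s (positive = approaching)."""
    return (d1_m - d2_m) / dt_s

def safe_to_cross(distance_m, speed_ms, crossing_time_s=5.925):
    """Safe if the vehicle's time to arrival exceeds the crossing time."""
    if speed_ms <= 0:              # stationary or receding vehicle
        return True
    return distance_m / speed_ms > crossing_time_s

v = approach_speed(d1_m=80.0, d2_m=73.0, dt_s=0.5)   # 14 m/s ~ 50 km/h
print(safe_to_cross(distance_m=73.0, speed_ms=v))    # prints False: 73/14 ≈ 5.2 s
```

A real implementation would average several readings to smooth out sensor noise before deciding.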
Idea 1
Idea 2
Idea 3
Implementation
Evaluation
Conclusion
Discussion
Appendix
Timesheets
Week | Names | Breakdown | Total hrs |
---|---|---|---|
Week 1 | Bas Gerrits | Meeting & brainstorming (3h), SotA research (4h), working out concepts (1h), sub-problem solutions (2h) | 10 |
 | Jada van der Heijden | Meeting & brainstorming (3h), planning creation (2h), wiki cleanup (2h), SotA research (3h) | 10 |
 | Dylan Jansen | Meeting & brainstorming (3h), SotA research (3h), working out concepts (2h), looking for reference material (2h) | 10 |
 | Elena Jansen | Meeting & brainstorming (3h) | |
 | Sem Janssen | Meeting & brainstorming (3h) | |
Week 2 | Bas Gerrits | Meeting (2h), User research (2h), MoSCoW (2h), design (2h), design problem solving (4h) | |
 | Jada van der Heijden | Meeting (2h), Editing wiki to reflect subject change (2h), Contacting institutions for user research (1h), Setting up user research (interview questions) (1h), User research (2h) | 7 |
 | Dylan Jansen | Meeting (2h), Contacting robotics lab (1h), SotA research for new subject (4h), Looking for reference material (2h), Updating Wiki (1h), Prototype research (2h) | 12 |
 | Elena Jansen | Meeting (2h) | |
 | Sem Janssen | Meeting (2h) | |
Week 3 | Bas Gerrits | Meeting (3h), Looking at sensors and such to see what is possible (making a start on the program) (3h), Theoretical situation conditions (3h) | |
 | Jada van der Heijden | Meeting (3h), Editing User Research wiki (4h), Refining interview questions (2h) | 9 |
 | Dylan Jansen | Meeting (3h), Updating Wiki (1h), uploading and refining SotA (4h), Ideas for prototype (2h), Bill of materials for prototype (2h) | 12 |
 | Elena Jansen | Meeting (3h), Making interview questions for online, updating user research wiki | |
 | Sem Janssen | Meeting (3h), Looking at different options for relaying feedback (design) | |
Week 4 | Bas Gerrits | | |
 | Jada van der Heijden | Meeting (3h), Editing User Research wiki (xh), Refining interview questions (2h), Interviewing (xh) | |
 | Dylan Jansen | | |
 | Elena Jansen | | |
 | Sem Janssen | | |
Week 5 | Bas Gerrits | | |
 | Jada van der Heijden | | |
 | Dylan Jansen | | |
 | Elena Jansen | | |
 | Sem Janssen | | |
Week 6 | Bas Gerrits | | |
 | Jada van der Heijden | | |
 | Dylan Jansen | | |
 | Elena Jansen | | |
 | Sem Janssen | | |
Week 7 | Bas Gerrits | | |
 | Jada van der Heijden | | |
 | Dylan Jansen | | |
 | Elena Jansen | | |
 | Sem Janssen | | |
140 hours per person, 700 hours total, ~20 hours a week