PRE2024 3 Group18


Members

Name Student Number Division
Bas Gerrits 1747371 B
Jada van der Heijden 1756710 BPT
Dylan Jansen 1485423 B
Elena Jansen 1803654 B
Sem Janssen 1747290 B

Approach, milestones and deliverables

Project Planning
Week Milestones
Week 1 Project orientation, brainstorming, defining deliverables and limitations, SotA research
Week 2 Deepened SotA research; identifying and determining specifications (hardware, software, design; MoSCoW prioritization); UX design: user research; creating the bill of materials
Week 3 UX design: user interviews; wiki: specifications, functionalities (sensors/motors used)
Week 4 Prototyping cycle: building and mock-ups; UX design: processing interviews; ordering needed items
Week 5 Evaluating and refining the final design
Week 6 Demo/scenario testing, fine-tuning; UX design: second round of interviews (evaluation); wiki: testing results
Week 7 Presentation/demo preparations; wiki: finalizing and improving (e.g. conclusion, future work, discussion)

Project Roles
Name Responsibilities
Bas Gerrits Arduino control
Jada van der Heijden Administration, UX design, Wiki
Dylan Jansen (Options) Code, Electronics, CAD model, Construction, Control documentation
Elena Jansen Designing
Sem Janssen Hardware

Problem statement and objectives

Problem Statement

When encountering an environment as ever-changing and unpredictable as traffic, it is important for every traffic participant to have the widest possible range of information about the situation available to them, for safety reasons. Most roads are already covered in guiding materials: traffic lights, crosswalks and level crossings have visual, tactile and auditory signals that relay as much information to traffic participants as possible. Unfortunately, some user groups are more dependent on certain types of signals than others, for example due to disabilities. Not every crossing or road has all of these sensory cues, so it is important to find a solution for the user groups that struggle with this lack of information and therefore feel less safe in traffic. Specifically, we are looking at visually impaired people, and creating a system/robot/design that will aid them in crossing roads in most traffic situations, even in the absence of external sensory cues.

Main objectives:

  • The design should be able to aid the user in crossing a road, regardless of the external sensory cues already in place, with the purpose of giving the user more autonomy
  • The design must have audible, or otherwise noticeable, alerts that are easily understood by the user
  • The design must have a reliable detection system (a first illustrative sketch is given below)
  • The design must be 'hidden': it should not disrupt the user's quality of life and should be barely noticeable as technology
    • E.g. hidden in the user's clothing

An extended list of all features can be found in the MoSCoW section.
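As a first, very rough illustration of how the "reliable detection" and "noticeable alert" requirements could be combined on the Arduino platform we plan to use, below is a minimal sketch. It assumes a hypothetical HC-SR04 ultrasonic sensor (trigger on pin 9, echo on pin 10) and a piezo buzzer on pin 8; all pin assignments and thresholds are placeholders for illustration, not part of the final design.

// Minimal detection + audible alert sketch (illustrative only).
// Assumes an HC-SR04 ultrasonic sensor (TRIG on pin 9, ECHO on pin 10)
// and a piezo buzzer on pin 8; pins and threshold are placeholders.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int BUZZER_PIN = 8;
const long ALERT_DISTANCE_CM = 150;  // alert when an object is closer than this

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
}

long readDistanceCm() {
  // Send a 10 us trigger pulse, then time the returning echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout if no echo
  if (duration == 0) return -1;                    // nothing detected
  // Speed of sound is ~0.034 cm/us; halve because the sound travels out and back.
  return duration * 0.034 / 2;
}

void loop() {
  long distance = readDistanceCm();
  if (distance > 0 && distance < ALERT_DISTANCE_CM) {
    tone(BUZZER_PIN, 1000);  // 1 kHz alert tone while an object is near
  } else {
    noTone(BUZZER_PIN);
  }
  delay(100);  // poll roughly ten times per second
}

In the real design, the single buzzer tone would have to be replaced by the richer, easily understood alerts from the requirements above, and the detection will need to handle fast-moving traffic rather than static obstacles.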

State of the Art Literature Research

Existing visually-impaired aiding materials

Today, many aids for visually impaired people already exist, and some of them can also help with crossing the road. The most common aids at crossings are audible traffic signals and tactile pavement. Audible traffic signals provide tones when it is safe to cross the road. Tactile pavement consists of patterns on the sidewalk that alert visually impaired people to hazards or important locations such as crosswalks. These aids are widely applied, but come with the drawback that they are only available at dedicated crosswalks. This means visually impaired people may not be able to cross at the locations they would like to, which limits their autonomy.

Another option is smartphone apps. There are currently two types of apps that visually impaired people can use. The first connects visually impaired people via a video call to a sighted person who can guide them through the phone's camera; examples are Be My Eyes and Aira. The second type uses AI to describe scenes captured by the phone's camera; an example is Seeing AI by Microsoft. The reliability of the AI-based apps is, of course, a major question.

There have also been attempts to make guidance robots. These robots autonomously guide the user, avoid obstacles, stay on a safe path, and help the user reach their destination. Glidance is one such robot, currently in the testing stage. It promises obstacle avoidance, the ability to detect doors and stairs, and a voice that describes the scene. Its demonstration also shows the ability to navigate to and across crosswalks: it navigates to a nearby crosswalk, slows down and comes to a standstill before the pavement ends, and takes the surrounding traffic into account. It also gives the user subtle tactile feedback to communicate certain events. Robots like these could in some ways replace guide dogs. There are also projects that try to make the robots dog-like, even though this may make the implementation harder than it needs to be; the dog shape seems intended to make the robot feel more like a companion.

[Figure: Glidance, a prototype guidance robot for visually impaired people]


There are also some wearable/accessory options for blind people, for example:

  • OrCam MyEye: a device attachable to glasses that helps visually impaired users by reading text aloud, recognizing faces, and identifying objects in real time.
  • eSight Glasses: electronic glasses that enhance vision for people with legal blindness by providing a real-time video feed displayed on special lenses.
  • Sunu Band (no longer available): a wristband equipped with sonar technology that provides haptic feedback when obstacles are near, helping users detect and navigate around objects (the principle is illustrated in the sketch below).

While these devices can all technically assist in crossing the road, none of them are specifically designed for that purpose. The OrCam MyEye might help identify oncoming traffic, but may not be able to judge its speed. The eSight Glasses are unfortunately not applicable to all types of blindness. And the Sunu Band would most likely not react fast enough to fast-driving cars. Lastly, there are some smart canes with features such as haptic feedback or GPS that can help guide users to the safest crossing points.
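To make the haptic-feedback principle behind devices like the Sunu Band more concrete, here is a minimal sketch of our own (not the product's actual implementation). It assumes an analog IR distance sensor on pin A0, whose raw reading grows as an object gets closer, and a small vibration motor on PWM pin 6; the pin choices and threshold are illustrative assumptions.

// Illustrative haptic feedback in the spirit of the Sunu Band
// (our own sketch, not the product's actual implementation).
// Assumes an analog IR distance sensor on A0 (closer object = higher
// reading) and a small vibration motor on PWM pin 6.

const int SENSOR_PIN = A0;
const int MOTOR_PIN = 6;
const int NEAR_THRESHOLD = 300;  // raw ADC value (0..1023); tune per sensor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);  // 0..1023
  if (reading > NEAR_THRESHOLD) {
    // Closer obstacle -> stronger vibration (PWM duty cycle 0..255).
    int strength = map(reading, NEAR_THRESHOLD, 1023, 60, 255);
    analogWrite(MOTOR_PIN, strength);
  } else {
    analogWrite(MOTOR_PIN, 0);  // nothing nearby: motor off
  }
  delay(100);
}

Mapping the raw sensor reading to the PWM duty cycle means that a closer obstacle produces a stronger vibration, which is the same distance-to-intensity idea such wristbands use. As noted above, this kind of feedback would likely be too slow for fast-driving cars, so it could at best be one component of our design.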

About road aids

??

User Experience Design Research

For this project, we will employ a process similar to UX design:

  • We will contact the stakeholders/target group, in this case visually impaired people, to understand what they need and what our design could do for them
  • Based on their insights and user research from the literature, we will further specify our requirements list from the USE side
  • Combined with the requirements formed via the SotA research, this results in a finished list with everything needed to start the design process
  • From there, we build, prototype and iterate until the needs are met

Below are the USE-related steps of this process.

USEr research

Target Group

Primary Users:

  • People with impaired vision who would have considerable trouble navigating traffic independently, ranging from visually impaired to fully blind

Secondary Users:

  • Road users: any moving being or vehicle on the road may come into contact with the system.
  • Fellow pedestrians: the system must consider other people when moving. This is a separate user category, as the system may have to interact with these users differently than with, for example, oncoming traffic.

Users

What Do They Require?

!!From user research we will probably find that most blind people are elderly. We can then conclude here that we will focus on elderly people, which will have implications for the USEr aspects of the design, as elderly people are known to interact with and respond to technology differently.!!

Society

How is the user group situated in society? How would society benefit from our design?

Enterprise

What businesses and companies are connected to this issue?

Stakeholder Analysis

To come into closer contact with our target groups, we reached out to multiple associations via e-mail. The following questions were asked (in Dutch) as a way to gather preliminary information:

Following these questions, we received answers from multiple associations that were able to provide additional information.

From this, we were able to set up interviews via Peter Meuleman, researcher and founder of the Videlio Foundation. This foundation focuses on enhancing daily life for the visually impaired through personal guidance, influencing governmental policies, and stimulating innovation in the field of assistive tools. Peter himself is closely involved in the research and technological aspects of these tools, and was willing to be interviewed for this project. He also provided us with interviewees via his foundation and his personal circle.

Additionally, via personal contacts, we were able to visit an art workshop for the visually impaired at StudioXplo on the 7th of February. There, we conducted some interviews with the intended users on site.

Creating the solution

MoSCoW Requirements

Prototypes/Design

Here we will present several different design concepts for possible solutions, from which one will be chosen to develop further.

Idea 1

Idea 2

Idea 3

Implementation

Evaluation

Conclusion

Discussion

Appendix

Timesheets

Week Names Breakdown Total hrs
Week 1 Bas Gerrits Meeting & brainstorming (3h), SotA research (4h), working out concepts (1h), sub-problem solutions (2h) 10
Jada van der Heijden Meeting & brainstorming (3h), planning creation (2h), wiki cleanup (2h), SotA research (3h) 10
Dylan Jansen Meeting & brainstorming (3h), SotA research (3h), working out concepts (2h), looking for reference material (2h) 10
Elena Jansen Meeting & brainstorming (3h)
Sem Janssen Meeting & brainstorming (3h)
Week 2 Bas Gerrits Meeting (2h)
Jada van der Heijden Meeting (2h), Editing wiki to reflect subject change (2h), Contacting institutions for user research (1h), Setting up user research (interview questions) (1h), User research (2h) 7
Dylan Jansen Meeting (2h)
Elena Jansen Meeting (2h)
Sem Janssen Meeting (2h)
Week 3 Bas Gerrits Meeting (3h), Looking at sensors and such to see what is possible (making a start on program) 
Jada van der Heijden Meeting (3h), Editing User Research wiki (), Refining interview questions ()
Dylan Jansen Meeting (3h), Updating Wiki, uploading and refining SotA
Elena Jansen Meeting (3h), Making interview questions for online, updating user research wiki
Sem Janssen Meeting (3h), Looking at different options for relaying feedback (design)
Week 4 Bas Gerrits
Jada van der Heijden
Dylan Jansen
Elena Jansen
Sem Janssen
Week 5 Bas Gerrits
Jada van der Heijden
Dylan Jansen
Elena Jansen
Sem Janssen
Week 6 Bas Gerrits
Jada van der Heijden
Dylan Jansen
Elena Jansen
Sem Janssen
Week 7 Bas Gerrits
Jada van der Heijden
Dylan Jansen
Elena Jansen
Sem Janssen

Target workload: 140 hours per person, 700 hours total, ~20 hrs per person per week.