HvA Digital Life
November 2017
  • Konstantinos
  • Pinar
  • Roselinde
My Role
  • Literature Review
  • Research Analysis
  • User Journeys
  • Concepting
  • Interaction Design
Helping visually impaired people navigate urban environments using common technologies.

This brief came from a research group at HvA Digital Life. The group's objective was to create a solution using technologies that visually impaired people already use, or that are currently on the market or soon will be. To that end, the group had already committed to smartphones, smartwatches and beacons. They asked us: 'How can a smartwatch-plus-smartphone wayfinding solution guide visually impaired people through an urban environment?'

We started off by analysing the interviews.

The client had already conducted user research through interviews and our first step was to analyse these and generate key findings.

Sense of surroundings

Visually impaired people (VIPs) like having a sense of their surroundings and understanding the kind of space they are walking through, for example a busy intersection, a train station or a park.


Personal landmarks

VIPs tend to have their own little landmarks that other people don't notice; the smell of a cafe, for example, can serve as a landmark. VIPs would like the ability to customise their route to register such landmarks.

Not the fastest route

Most navigation apps guide people along the fastest route, but these routes are often confusing and stress-inducing. Research shows that VIPs prefer being guided along the safest or easiest route.


Route confirmation

VIPs often need confirmation that they are on the right path. The products and services they most commonly use do not let them ask the system whether they are on the correct path.


Reluctance to use a phone

VIPs are afraid to use their phone for navigation because of the risk of theft. Smartphones are also awkward to hold, rely on a visual interface and are awkward to talk to in public.

Free hand

VIPs often hold a cane or a guide dog's harness in one hand. They need the other hand free to hold a railing while climbing stairs, or for other purposes.

A visually impaired person (VIP) is not a blind person.

VIPs have visual impairments that affect their ability to see clearly, but they can still distinguish a road from a sidewalk and do not need the extremely high-accuracy solutions that blind people would.

Based on our research analysis, we created personas.


< 30% vision

Navigation aids

Cane, smartphone apps, smartwatch


Goals

Navigate to her friend's house independently
Rely less on her friends
Raise her self-respect and self-confidence


Frustrations

Navigating causes stress
Apps show the fastest route instead of the safest or easiest
No indication of which side of the road the final destination or building is on
She can't ask the system whether she is on the correct route
Preparing her route takes a lot of time and is stressful
Using the app causes stress


Needs

Knowing which side of the road her final destination is on.
Knowing what kind of environment she is going through.


< 10% vision

Navigation aids

Cane, guide dog, smartphone


Goals

Navigate more easily
Be more independent
Be more confident going outside


Frustrations

Feels disoriented and less mobile
Apps show the fastest route instead of the safest or easiest
Her favourite apps don't tell her whether crossings have audible signals (clickers), which areas are more or less crowded, or where obstacles are
Has to avoid crowded areas at rush hour
Selecting her preferred route in an app is so difficult that she sticks to the default route
Using the app causes stress


Needs

Knowing which side of the road her final destination is on.
Knowing what kind of environment she is going through.

Next, we created a user journey to identify pain points.

The journey was split into two phases: the preparation phase, in which users planned their route at home, and the navigation phase, in which users were out in the world navigating to their destination.

Guiding people along the safest or easiest path using smartphones, smartwatches and beacons, with audio as the means of instruction, vibration as confirmation of actions, and visuals as backup information.

The system is flexible and recommends different routes to different users based on their needs and preferences. For example, some users would be guided along a path with more obstacles because they have a guide dog to help, while others would be recommended a path with fewer obstacles.
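As a rough illustration of this flexibility, route selection could weigh obstacles and crowds differently per user profile. This is a hypothetical sketch; the field names, weights and scoring approach are my assumptions, not the group's actual algorithm.

```python
# Hypothetical sketch: score candidate routes per user profile (lower is better).
# Field names and weights are illustrative assumptions.

def route_cost(route, profile):
    cost = route["length_m"]
    # A guide dog helps with obstacles, so weight them less for those users.
    obstacle_weight = 20 if profile["has_guide_dog"] else 100
    cost += obstacle_weight * route["obstacles"]
    if profile["avoid_crowds"]:
        cost += 150 * route["crowded_segments"]
    return cost

def recommend(routes, profile):
    """Pick the route with the lowest cost for this user."""
    return min(routes, key=lambda r: route_cost(r, profile))

routes = [
    {"name": "fastest", "length_m": 800, "obstacles": 4, "crowded_segments": 2},
    {"name": "easiest", "length_m": 1100, "obstacles": 1, "crowded_segments": 0},
]
# A user without a guide dog who avoids crowds gets the easier route,
# even though it is longer.
print(recommend(routes, {"has_guide_dog": False, "avoid_crowds": True})["name"])  # easiest
```

The same candidate set yields the shorter "fastest" route for a user with a guide dog, which is the flexibility described above.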

But how did we get here?

We discussed and defined the elements of the route and what the navigation phase would look like. We also designed the audio instructions and created a user journey for the new concept, made sitemaps, user flows, wireframes and prototypes.

First, we understood and defined the elements and components of a route.

These are the things, points, spaces and other elements that a VIP would encounter along a route.


Checkpoints

Imaginary points along a route that break it into easy-to-understand sections, providing a sense of progress and of the surroundings the user is walking through.


Obstacles

An obstacle is anything on the road that blocks a person's straight passage, e.g. construction sites or road closures.

Decision points

Decision points are any points that offer multiple directions on a route or require a decision to be made. E.g. a junction or a traffic signal.

Next, we defined what the navigation phase would look like for a VIP.

We tried to understand what the navigation phase would look like. What elements would the user encounter, and how would they interact with them? How would instructions be provided, and when?

Timing of instructions

Instructions about the turn at a decision point would be given 5 m before reaching it. This ensures that VIPs don't stop for a long time at the top of an escalator or at a traffic signal and block others.
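The 5 m rule reduces to a simple distance check. A minimal sketch, assuming distance to the decision point comes from the beacon/phone positioning (how that distance is obtained is not specified here):

```python
# Minimal sketch: announce the upcoming turn exactly once, when the user
# first comes within 5 m of the decision point. Positioning source (beacons,
# GPS) is assumed, not part of this sketch.
TRIGGER_DISTANCE_M = 5.0

def should_announce(distance_to_point_m, already_announced):
    """True only the first time the user is within trigger range."""
    return distance_to_point_m <= TRIGGER_DISTANCE_M and not already_announced

print(should_announce(12.0, False))  # False: still too far away
print(should_announce(4.2, False))   # True: within 5 m, not yet announced
print(should_announce(4.2, True))    # False: already announced, stay quiet
```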

User actions at decision point

The user stops when the watch vibrates, then turns her body to orient herself towards the next instruction. As soon as her body (and watch) faces the right direction, the watch vibrates again to confirm. She then walks in that direction.

Using checkpoints

Checkpoints give a sense of how much of the route has been completed and an idea of the surroundings. Users can also add their own checkpoints. For example, a cafe the user smells daily could be added as a custom checkpoint, so that she isn't confused on a day the cafe is closed and there is no smell.
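The mix of system and user-added checkpoints can be sketched as a small data structure. The field names here are illustrative assumptions, not the project's actual data model:

```python
from dataclasses import dataclass, field

# Illustrative sketch: a route holds system checkpoints plus user-added
# landmarks (e.g. a cafe's smell), each announced at a position along the route.

@dataclass
class Checkpoint:
    label: str           # what the watch announces
    position_m: float    # distance along the route
    user_added: bool = False

@dataclass
class Route:
    checkpoints: list = field(default_factory=list)

    def add_custom_checkpoint(self, label, position_m):
        """Register a personal landmark and keep checkpoints in route order."""
        self.checkpoints.append(Checkpoint(label, position_m, user_added=True))
        self.checkpoints.sort(key=lambda c: c.position_m)

route = Route([Checkpoint("Park entrance", 400.0)])
route.add_custom_checkpoint("Cafe on the corner", 250.0)
print([c.label for c in route.checkpoints])  # ['Cafe on the corner', 'Park entrance']
```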

We finished off the navigation phase by designing the audio instructions.

Based on our literature review and the recommendations of the Wayfindr open standard, we tried to create the best possible instructions for VIPs.

Once we were done with the navigation phase, we designed the smartphone app for the preparation phase.

We worked out user flows, wireframes and prototypes to get to the final product.

User flow

I created a user flow to map how users would move through the app and how the pages would work in the prototype.


The Final Product

Home screen

The home screen lets users enter their destination by voice or touch, whichever they prefer. A follow-up screen confirms whether the system detected the correct destination.

Push directions

Once users have selected their destination and route and are ready to leave the house, they simply push the directions from the smartphone to the smartwatch.

Navigation phase (smartwatch) screen

The smartwatch primarily uses haptics and audio to guide the user, as shown in the concept video. The visuals are only a backup for users who prefer them.