
Navigation for sight-impaired people with haptics (Part 1)

Apple say their goal is to help people and make lives easier. For example, their devices allow us to monitor how our bodies behave, such as with heart rate and sleep tracking apps. In addition, they offer software solutions that are classed as Accessibility features.


One such feature is VoiceOver. It allows sight-impaired people to use the iPhone. How does it work? While a user moves their finger over the screen, iOS checks whether it is pointing at a UI component. If so, it reads out all relevant information about that component. Simple and amazing, isn't it?

Not long ago, I worked on an iOS Accessibility project. What did I want to achieve? I wanted to create an application that would allow sight-impaired people to navigate using vibrations. What's more, the user would be able to put the device in their pocket and be informed about navigation progress by means of an external device.

In this article I'd like to share with you how I dealt with one of the biggest problems in this application: allowing the user to put the device away while navigating. This part is not strictly related to the navigation process itself; I will cover that topic in Part 2. For now we'll focus on the preprocessing required to make navigation work.


As we know, during the navigation process our device is gathering and analyzing a lot of data. It can usually tell the direction in which it's pointing; in technical terms this is called the heading. That's very important information, because it lets us verify that we're pointing in the right direction. The next very important thing is knowing where exactly we are: our current location's coordinates.

Let's focus on the first type of data: the heading. The problem is that the heading's value may vary a lot, especially when we put the device in a pocket. When the device changes its tilt (relative to the Earth's surface), it may even show us the opposite heading to the one we're pointing at (a jumping heading). That was the next problem.

But the main problem for me was getting the position offset between the current device position (e.g. in the pocket) and the base position. The offset would be used in the navigation calculations to determine whether the user is moving in the right direction. To determine the offset I needed the previously mentioned heading.


We had to determine in which position the user would keep the device during navigation. Because it was an MVP, we decided that for now the user wouldn't be able to change the position of the device during navigation. I needed to create a process that would allow the application to receive GPS data as stably as possible, so I could then calculate the offset for that position. It required me not only to work on the software but also to determine how the user should behave to make everything work correctly. I came to the conclusion that I needed calibration! A calibration that would require the user to perform specific actions to generate the offset. I had to keep in mind that the process must cater for the specific group of people who would use it.


The process consisted of 3 steps:

  1. Finding North
  2. Putting the iPhone away
  3. Moving

Obviously, once the user completes the third step, the device is properly calibrated and ready to start navigation.
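The three steps above can be modeled as a small state machine that only advances when the current step succeeds. This is a hypothetical sketch of such a model, not the app's actual code:

```swift
// Hypothetical model of the calibration flow; names are my own, not the app's.
enum CalibrationStep: Int, CaseIterable {
    case findingNorth
    case puttingAway
    case moving

    /// The step that follows this one, or nil once calibration is complete.
    var next: CalibrationStep? {
        CalibrationStep(rawValue: rawValue + 1)
    }
}

var step: CalibrationStep = .findingNorth
step = step.next ?? step   // advance to .puttingAway once North is found
```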

In the next sections I will go through these steps and explain them from the perspective of both the user and the programmer.

Finding North

The first step of calibration: the main goal is to adjust the device and the user's body position in order to begin the calculations.

Finding North — User perspective


At the end of this step I want the user to face North, holding the device in the base position. The base position requires the device to:

  • be parallel to the Earth's surface,
  • have its screen pointing to the sky,
  • be kept in front of the user's chest.

I use this position as the base point in all calculations. What does the user need to do in this step? First, they need to listen to the instructions (I had configured VoiceOver in the application). Once the user is familiar with this step, they should adjust the device's position to match the base one as closely as possible. Then the START button is pressed to trigger vibrations. The user needs to turn around in place until the device stops vibrating. In addition to stopping the vibration, the application gives an audio confirmation that the user is pointing North.

Finding North — What’s behind this

When it comes to the logic side, this step is quite easy. Here I analyze only the heading value, to determine whether the user points North. Why is the base position of the device so important? After testing how the device behaves in different positions, e.g. parallel or perpendicular to the Earth's surface, it turned out that the parallel position is the most stable.

The heading takes values from 0° to 360°. It is virtually impossible for the user to set the position to exactly 0° (North), so I've widened the accepted range: from 350° to 10°. This assumption makes it easier for the user to find the desired direction.

struct Constants {
    static let northStartAngle = 350.0
    static let northEndAngle = 10.0
    /// These constants will be used in the next snippets
    static let north = 360.0
    static let south = 180.0
}
To inform the user whether they are pointing North, I used a remote haptic device. The goal for this step was to increase the haptic volume as the user turns away from North. Manipulating the vibration volume allows the user to tell whether they need to change their rotation direction. I wrote a method to check whether the user points in the right direction.

func isPointingToTheNorth(heading: CLLocationDirection) -> Bool {
    return heading.isAngleBetweenOrEqual(start: Constants.northStartAngle,
                                         end: Constants.northEndAngle)
}
If the heading is correct, the remote device stops vibrating and the iPhone gives an audio confirmation to the user.
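The `isAngleBetweenOrEqual` helper used above is not shown in the article. One possible implementation (my assumption, not the author's code) needs to handle the segment that wraps around 0°/360°:

```swift
extension Double {
    /// Returns true when the angle lies inside the segment from `start` to
    /// `end`, measured clockwise. Handles segments crossing 0°/360°, e.g.
    /// start: 350, end: 10 accepts 355°, 0° and 5° but rejects 180°.
    func isAngleBetweenOrEqual(start: Double, end: Double) -> Bool {
        let angle = truncatingRemainder(dividingBy: 360)
        if start <= end {
            return angle >= start && angle <= end
        } else {
            // The segment wraps around North (0°/360°).
            return angle >= start || angle <= end
        }
    }
}

let facingNorth = (355.0).isAngleBetweenOrEqual(start: 350, end: 10)  // true
```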

The situation changes when the heading is not North. In this case I needed to calculate the haptic volume to inform the user how far off North they are.

let volume = differenceBetweenAngles(heading, Constants.north) / Constants.south
// Sending information to the device with volume

In the code above we start by calculating the smaller angle between the current heading and North. We then divide that angle by the South angle. This division is necessary to get a volume value between 0.0 and 1.0. Why do we divide by the South angle? Because the maximum volume corresponds to 180°, i.e. facing directly away from North.
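The `differenceBetweenAngles` function isn't shown in the article either. A sketch of what I assume it does, returning the smaller of the two angles between headings:

```swift
/// Returns the smaller angle (0°...180°) between two headings.
/// Assumed behavior; the article does not show the original implementation.
func differenceBetweenAngles(_ a: Double, _ b: Double) -> Double {
    let diff = abs(a - b).truncatingRemainder(dividingBy: 360)
    return diff > 180 ? 360 - diff : diff
}

// A heading of 90° (East) is a quarter turn away from North (360°/0°),
// so the haptic volume comes out at half strength.
let volume = differenceBetweenAngles(90.0, 360.0) / 180.0   // 0.5
```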

Putting the iPhone away

The second step is more informative. It explains to the user what the application will require of them in the next step. After the user is familiar with all the instructions, they start a countdown (by pressing START), which gives them time to put the iPhone away before the calculations begin.


Moving

The final and most important step: at the end of it, the application should have gathered all the necessary data and calculated the position offset.

Moving — User perspective

What I want to achieve as a programmer is to make the device detect the motion and heading used in the calculations. At the beginning of this step the user should keep the device in the position it will be in during navigation. It can be in the hand, in a pocket, in any position bar a few I will describe later. In this step the user has quite a few options:

  • gently shake the iPhone
  • tilt its body
  • make a step forward and backward

The main purpose of these actions is to make the device detect motion. The crucial thing here is to move in a straight line ⚠️. This ensures that the adjusted base position of the device and the user's body won't change in a way it shouldn't. The user needs to continue with one of these movement options until they get feedback from the iPhone about the calibration result.
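One way to detect that the user has shaken, tilted, or stepped with the device (a sketch, not necessarily the app's implementation) is to check the magnitude of the user acceleration reported by Core Motion's `CMDeviceMotion.userAcceleration`. The threshold check itself can be a pure function; the 0.05 g threshold here is an illustrative guess, not a tuned value:

```swift
/// Pure check applied to each Core Motion sample: returns true when the
/// user-acceleration magnitude (in g) suggests a shake, tilt or step.
/// The default threshold is an illustrative assumption.
func indicatesMotion(x: Double, y: Double, z: Double,
                     threshold: Double = 0.05) -> Bool {
    (x * x + y * y + z * z).squareRoot() > threshold
}

let atRest = indicatesMotion(x: 0.0, y: 0.0, z: 0.0)     // false
let shaken = indicatesMotion(x: 0.2, y: 0.1, z: 0.0)     // true
```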

Moving — What’s behind this

Check the image below for an example of the device position (heading) and the offset we need to calculate.

[Image: device heading and the position offset to be calculated]

Movement ensures we get values for the device heading. We can't rely on just one heading value; remember that heading noise may appear. To overcome this I decided to gather a number of headings and calculate an average, which is used as the final device heading. I also tested other options: instead of an average I could use, for example, a low-pass filter to determine the final value. But in this case the average works perfectly fine.
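The gathering-and-averaging pattern could look like the sketch below. The sample count is an arbitrary choice of mine, and note that a plain mean misbehaves for headings straddling 0°/360°; as the article says, a simple average was good enough for this use case:

```swift
/// Collects a fixed number of heading samples and averages them to smooth
/// out compass noise. Sketch only; the sample count is an assumption.
struct HeadingAverager {
    private var samples: [Double] = []
    let requiredCount = 50

    mutating func add(_ heading: Double) {
        samples.append(heading)
    }

    /// The averaged heading, available once enough samples were gathered.
    var average: Double? {
        guard samples.count >= requiredCount else { return nil }
        return samples.reduce(0, +) / Double(samples.count)
    }
}

var averager = HeadingAverager()
(0..<50).forEach { _ in averager.add(123.0) }
// averager.average is now 123.0
```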

Once we have the average device heading, we can move on to calculating the position offset. It's super easy! Check the image above again. We need to calculate the angle between North (which is 0°) and the device heading. Since the heading is a positive value measured from 0°, that angle is simply the heading itself. That means that by calculating the average, we've already calculated our offset.

The question is, have we solved all our problems?

When the device changes its tilt (relative to the Earth's surface), it may even show us the opposite heading to the one we're pointing at (a jumping heading).

We didn't solve this one. What's more, it may quite often be a problem, because we don't know how the user will put the device into their pocket. I'm not going to go into detail about this; you can check it for yourself. Open any maps application that shows your heading and try rotating the device in any dimension. You will notice the heading jump to the opposite direction at certain tilts relative to the Earth's surface. That's our problem. It's impossible to solve, but it is possible to prevent the calculations from proceeding when the iPhone is in one of those positions.

I used the iPhone's built-in gyroscope to receive momentary tilts. We get them by using the startDeviceMotionUpdates method of CMMotionManager.

func startMonitoringMotion() {
    guard manager.isDeviceMotionAvailable else {
        fatalError("Device motion is not available")
    }

    manager.startDeviceMotionUpdates(to: queue) { motion, error in
        if let tilt = motion?.attitude.pitch.radiansToDegrees {
            /// Pass the new tilt to the calculator
        }
    }
}

extension Double {
    /// Custom helper used above: converts radians to degrees
    var radiansToDegrees: Double { self * 180 / .pi }
}
Then I repeated the same storing pattern for tilts as for headings. Remember the noise! This is very important. In this case, the pattern was extended with one more piece of functionality: I periodically check whether the buffer contains critical tilts that make the heading jump. If it does, calibration stops and the application informs the user that they should change the device's position.
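A sketch of that tilt buffer with the critical-tilt check might look like this. The ±70° limit and the buffer size are assumptions of mine; the article does not specify which tilts it treated as critical:

```swift
/// Keeps recent tilt (pitch) samples and flags "critical" tilts at which
/// the compass heading is known to jump. Thresholds are illustrative.
struct TiltMonitor {
    private var tilts: [Double] = []
    let criticalTilt = 70.0
    let bufferSize = 50

    mutating func add(_ tiltDegrees: Double) {
        tilts.append(tiltDegrees)
        if tilts.count > bufferSize { tilts.removeFirst() }
    }

    /// True when any buffered tilt would make the heading unreliable,
    /// meaning calibration should stop and the user should reposition.
    var containsCriticalTilt: Bool {
        tilts.contains { abs($0) >= criticalTilt }
    }
}

var monitor = TiltMonitor()
monitor.add(10)     // near-flat device: fine
monitor.add(-80)    // device nearly vertical: calibration should stop
```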

Both processes (observing headings and tilts) work in parallel. This allows the application to check everything in real time and stop the calibration as soon as possible if necessary. If there are no problems with tilts, the device should calibrate successfully. Now we're ready to start navigation!


I managed to create a calibration process which solved a big problem. Its result will be used in the navigation, which I will cover in Part 2. In my opinion the solution turned out not to be very complicated, but it definitely wasn't quick to build.

What's most important, in my opinion? Ease and safety of use. The user doesn't need to perform any actions that might be dangerous. Taking a step forward and backward is perfectly fine in this situation.

What's the thing I like most about this? You don't need a good GPS signal, which means you can calibrate the device inside a building. It's not a problem if the compass shows North in a slightly different direction each time; it will always have North at 0°, and that's the only thing we needed.

Now you can move on to Navigation for sight-impaired people with haptics (Part 2) to learn about the other things related to the navigation process.