Super Lidar: An app for exploring your environment
Super Lidar is a free iOS app that makes use of the Lidar scanner built into the 2020 iPad Pro, iPhone 12 Pro and Pro Max, and iPhone 13 Pro and Pro Max. Lidar (light detection and ranging) is similar to radar but uses light instead of radio waves to detect objects.
The Super Lidar app boasts the ability to detect the distance of obstacles (depth description), detect people and how far away they are (people detection), and detect whether they are wearing a mask (mask detection).
The app is developed in Boston, USA by Mediate. This software house is also responsible for developing the Supersense app.
Richard, one of our Digital Enablement Officers, tested this app. Here is what he had to say:
App layout overview
On launching, you are taken to the main screen with the detection mode enabled. The app can be used straight away with default settings.
Most of the screen is taken up with the camera view. There is a button in the top left that takes you to the settings menu and a start/stop button in the same position as the camera button, in the centre at the bottom of the screen.
People detection
According to the developers, this feature detects people and gives information on how far away each person is.
This feature worked quite well, although it sometimes detected more people than were actually present. It turned out that the extra person detected was a reflection of a person in a picture frame.
Unfortunately, there was no spoken indication of how far away the person was, although the distance was implied by the pitch of the audible tones.
Mask detection
According to the developers, this feature tells you whether or not a person is wearing a mask.
This feature worked very well, although there was still the occasional inaccuracy. The one time it incorrectly identified someone as wearing a mask was when it detected me as a reflection in a picture frame.
There were also a couple of occasions where it didn't detect a mask when one was being worn.
Depth description
According to the developers, this feature creates a virtual 3D map of your surroundings, then interprets obstacles.
Feedback is given via sound and, on iPhone, haptics – iPads do not support haptic feedback. The more intense the haptics and the higher the tone, the closer the obstacle.
This was incredibly accurate. The range of the hardware is five metres and the audio feedback created a very accurate portrayal of the physical surroundings.
However, the detection cone (a hardware limitation) meant that anything below the level at which the phone was held was not detected, although the detection area did extend upwards to about seven feet. Use with a cane or guide dog is highly recommended!
Unfortunately, there was no haptic feedback, even when I put my phone against a wall.
Another feature is the ability to detect specific objects. This appears limited to the objects specified by Apple in the operating system: ceiling, door, floor, seat, table, wall and window. On testing the app, the object identification was very accurate.
The settings menu allows you to swap between the modes of detection – people, environment or both at the same time.
There is another section where you can change the audio feedback settings. There is a mute option to silence all sound, a reverse audio option, which inverts the pitch of the feedback tone, and an option for rhythmic or continuous feedback. The continuous feedback sounds like a metal detector being waved over metal; the rhythmic feedback sounds like the rhythm of a train, but with a tone. The higher the pitch of the tone, the closer the object.
The rest of the menu offers instructions for use, the option to request a call for someone to talk you through how to use the app, the opportunity to rate the app, information about the app and a shortcut to the Supersense app if you have it installed.
Dynamic fonts worked only for the menu subheadings, not the menu items. The contrast levels were good throughout the app.
VoiceOver worked well. Every item that needed to be focusable was.
The screens only contained what was required for the app functionality, which meant swiping was kept to a minimum. Every focusable item was described accurately.
Richard’s final thoughts
This is a great example of the potential of Lidar in apps. The hardware limits on what Lidar will pick up – the angle of capture and the range – can only be improved by Apple.
The selection of objects identified is again down to Apple, although there is scope for the developers to use their own artificial intelligence (AI) system.
I suspect the biggest limitation for the app, though, is the requirement to hold your phone against your chest. On that note, however, it did work remarkably well when placed in my shirt pocket!
Find out more
We do not currently have a video demonstrating Super Lidar. However, if you want to see Lidar in action, you can watch our video demonstrating the People Detection feature of the iOS Magnifier app, which is another great example of the possibilities of Lidar.
For more information about Super Lidar, visit the Super Lidar website.
If you would like more information about Super Lidar, and other solutions to enable you to access visual information, and you live in Greater Manchester, get in touch with our Digital Enablement Team. You can call us on 0300 222 5555 or email email@example.com.
You can also visit our Knowledge Village packed with blogs, videos and eBooks on many aspects of living with a visual impairment.
We can't do it without you
Henshaws rely on voluntary donations; our work just wouldn’t be possible without people like you. Your support empowers local people living with sight loss and a range of other disabilities to increase their independence, achieve their dreams, and go beyond expectations.