
Dot Go
Dot, 2022
The first object interaction app for the visually impaired.
For blind and visually impaired people, entering unfamiliar environments is challenging. Object detection apps can identify objects but fail when it comes to interacting with them in a meaningful way.
Dot Go not only detects objects in the environment but also connects them to actions. These actions can be internal, such as sounds and vibrations, or external, reaching into other apps, websites, and even smart home devices.

Logic
Following the material conditional logic used in programming ("if x, then y"), Dot Go is built on a simple principle: any object can be connected to any action. For example, a bus stop sign could trigger the public transportation app to open and buy the right ticket.
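The "if x, then y" principle above can be sketched as a rule table that maps detected object labels to actions. All names here (RULES, handle_detection, the deep link URLs) are hypothetical illustrations, not Dot Go's actual internals:

```python
# Minimal sketch of the "if x then y" principle: any detected
# object label (x) can be bound to any action (y).
# Every name and URL here is a made-up example.

def open_deep_link(url: str) -> str:
    # Stand-in for handing a deep link to the operating system.
    return f"opened {url}"

def play_sound(name: str) -> str:
    # Stand-in for an internal action (sound feedback).
    return f"played {name}"

def vibrate(pattern: str) -> str:
    # Stand-in for an internal action (haptic feedback).
    return f"vibrated {pattern}"

# The rule table: object label -> action to run.
RULES = {
    "bus_stop_sign": lambda: open_deep_link("transitapp://buy-ticket"),
    "door": lambda: play_sound("chime"),
    "crosswalk": lambda: vibrate(pattern="double"),
}

def handle_detection(label: str) -> str:
    """If a rule exists for the detected object (x), run its action (y)."""
    action = RULES.get(label)
    return action() if action else "no rule"
```

For instance, `handle_detection("bus_stop_sign")` would hand the (hypothetical) ticket-buying deep link to the system, while an unbound label simply falls through.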
Use Cases
Instead of fixed use cases, anyone can create presets for specific objects, locations, and needs. Nobody has to start from scratch, as they can download existing presets from a growing library curated by users, institutions, and brands.
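A downloadable preset could, in principle, be nothing more than a named, serialized bundle of object-to-action bindings. The format below is a hypothetical sketch, not Dot Go's real schema:

```python
import json

# Hypothetical preset format: a shareable bundle of
# object -> action bindings (not Dot Go's actual schema).
preset = {
    "name": "Public Transit",
    "author": "community",
    "rules": [
        {"object": "bus_stop_sign", "action": "deeplink", "target": "transitapp://buy-ticket"},
        {"object": "ticket_machine", "action": "sound", "target": "chime"},
    ],
}

def load_preset(raw: str) -> dict:
    """Parse a downloaded preset and index its rules by object label."""
    data = json.loads(raw)
    return {rule["object"]: rule for rule in data["rules"]}

# Simulate downloading a preset from the library and loading it.
rules = load_preset(json.dumps(preset))
```

Because a preset is just data, institutions and brands could curate and publish them without writing any code, which is what makes a growing library plausible.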

UI Design
Dot Go combines open-source computer vision models with simple automation (system shortcuts and deep links). By letting people assign actions to objects, the app acts as an accessibility toolkit.
Since visual impairment spans many levels, the app was designed with options to customize it to one's needs, be it low vision, light sensitivity, or colour blindness.

Accessories
For enhanced usability and performance in special situations like sports, Dot Go works with adaptive accessories such as lanyards and shirts to enable hands-free use.

BTS
Here’s a behind-the-scenes look into our production process.



Credits:
Serviceplan Innovation, Serviceplan Korea, Hyperinteractive, Albert Coon, Paulus, Niklas May, Inês Ayer, Saliya Kahawatte, DamianDamian