WORK

Dot Go
Dot, 2022
The first object interaction app for the visually impaired.
For blind and visually impaired people, entering unfamiliar environments is challenging. Object detection apps can identify objects, but they fail when it comes to interacting with them in a meaningful way.
Dot Go not only detects objects in the environment but also connects them to actions. These actions can be internal, like sounds and vibrations, or external, in other apps, on websites, and even on smart home devices.

Following the simple conditional logic used in programming ("if x, then y"), Dot Go lets users create rules based on one principle: any object can be connected to any action. For example, a bus stop sign could open the public transportation app to buy the right ticket.
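In code, that principle amounts to a lookup from detected object labels to actions. The sketch below is a rough Swift illustration under assumed names: the Rule and Action types and the transitapp:// deeplink are hypothetical, not Dot Go's actual data model.

```swift
import Foundation

// Illustrative sketch of the "if x, then y" idea: every rule pairs a
// detected object label with an action. Type names are assumptions.
enum Action {
    case playSound(name: String)   // internal feedback
    case vibrate                   // haptic feedback
    case openURL(URL)              // deeplink into another app or website
}

struct Rule {
    let objectLabel: String        // e.g. "bus stop sign"
    let action: Action
}

// Example: a bus stop sign opens a (hypothetical) transit app's ticket screen.
let rules = [
    Rule(objectLabel: "bus stop sign",
         action: .openURL(URL(string: "transitapp://tickets")!)),
    Rule(objectLabel: "door handle", action: .vibrate)
]

// Given a label from the object detector, return the first matching action.
func action(for detectedLabel: String) -> Action? {
    rules.first { $0.objectLabel == detectedLabel }?.action
}
```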
Instead of offering fixed use cases, Dot Go lets anyone create presets for specific objects, locations, and needs. Nobody has to start from scratch: existing presets can be downloaded from a growing library curated by users, institutions, and brands.

Dot Go combines open-source computer vision models with simple automation (system shortcuts and deeplinks). By allowing people to assign actions to objects, the app acts as an accessibility toolkit.
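Concretely, a detected label could be resolved to a deeplink or to an Apple Shortcuts automation (run via the shortcuts:// URL scheme) and opened with UIKit. The mapping and URLs below are illustrative assumptions, not the app's real configuration.

```swift
import UIKit

// Rough sketch: map detected object labels to external actions, either a
// deeplink into another app or a user-made shortcut. Entries are examples.
let externalActions: [String: URL] = [
    "bus stop sign": URL(string: "transitapp://tickets")!,                        // hypothetical deeplink
    "washing machine": URL(string: "shortcuts://run-shortcut?name=Start%20Wash")! // runs a Shortcuts automation
]

func trigger(for detectedLabel: String) {
    guard let url = externalActions[detectedLabel] else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        if !success {
            // Fall back to internal feedback (e.g. a sound or vibration)
            // if the target app or shortcut is not available.
            print("Could not open \(url)")
        }
    }
}
```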
Since visual impairment spans many levels, the app was designed with options to customize it to one's needs, be it low vision, light sensitivity, or colour blindness.

For enhanced usability and performance in special situations like sports, Dot Go works with adaptive accessories like lanyards and shirts to enable hands-free use.

BTS
xxxxxxxx



Credits:
Serviceplan Innovation, Serviceplan Korea, Hyperinteractive, Albert Coon, Paulus, Niklas May, Inês Ayer, Saliya Kahawatte, DamianDamian