AI projected interface concept

Wayfinding

10/18/2023

What if maps could be more personal, providing just the right level of guidance by seeing what you see?

We are fascinated by the possibilities presented by AI-powered wearable devices, envisioning their ability to transform the way we engage with our world. Drawing upon more natural interactions unlocked by AI, as well as the context provided by a camera that you wear, we believe the future of maps is smarter, more personal, and more precise than what's ever been possible before.

How could AI-powered hardware go beyond phone or watch navigation?

Sees what you see. Receive more personal, precise directions based on your actual surroundings. By seeing exactly what you see, the device could guide you through even the most confusing spaces without you ever having to reach into your pocket.

More conversational. Today's maps direct you using generic instructions based on distances and unfamiliar street names. What if you could receive more tailored directions based on your familiarity with an area, the same way a friend might provide guidance?

Proactively surfaced. Rather than asking for directions to a destination, what if the device could surface relevant guidance on the fly — using physical context clues like your surroundings or digital content like calendar events, travel history, and messages?

Wander freely. Getting directions today is a rigid experience, but we don’t always need step-by-step navigation or alerts. What if an intelligent map could stay out of your way while you’re exploring, and remain easily accessible when you’re ready for guidance?

Smarter directions

Next-Generation Navigation

Sometimes, you might need a bit more help getting to your destination. When you're not as familiar with a neighbourhood or when you're on a tight timeline, we think there's room for walking directions to be smarter and more precise.


We imagine a map interface could provide intelligent, context-based directions generated by seeing what you see. Raise your hand to see written instructions, alongside a full map interface and compass arrow. Paired with the device's camera, Navigation enables more precise and tailored guidance based on what you see or hear. A few examples of directions you could expect to see here:


“Turn right just past the Starbucks”

“Follow the sound of the live music”

“Your destination is just ahead, past the blue mailbox”


We also imagine that you could ask for additional context while en route and receive informative answers back, the same kinds of questions you might ask a friend:

“Is that café I went to with Carli last week on the way and still open?”

“Take the route with fewer stairs, so I can navigate it on crutches.”

(While pointing) “Which building does Minh live in? This one or that one?”

“Can you let Karan know that I’ll be a few minutes late?”
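To make the idea concrete, here is a minimal, hypothetical sketch of how a landmark-grounded instruction could be composed from an ordinary routing step plus whatever the camera recognizes nearby. The types and helper names are assumptions for illustration, not a real device API; in practice the recognition and phrasing would come from the device's vision and language models.

```typescript
// Hypothetical sketch: composing a landmark-grounded instruction from a
// conventional routing step plus objects the wearable's camera recognizes.
// None of these types or helpers are a real device API.

interface RouteStep {
  maneuver: "turn-left" | "turn-right" | "continue" | "arrive";
  streetName: string;
  distanceMeters: number; // distance from the wearer to the maneuver point
}

interface SeenLandmark {
  label: string;          // e.g. "Starbucks", "blue mailbox"
  distanceMeters: number; // estimated distance from the wearer
}

// Prefer a recognized landmark that sits near the maneuver point.
function landmarkNearManeuver(step: RouteStep, seen: SeenLandmark[]): SeenLandmark | undefined {
  return seen
    .filter(l => Math.abs(l.distanceMeters - step.distanceMeters) < 15)
    .sort((a, b) => a.distanceMeters - b.distanceMeters)[0];
}

// Compose a friendlier instruction, falling back to the generic step when nothing useful is in view.
function composeInstruction(step: RouteStep, seen: SeenLandmark[]): string {
  const landmark = landmarkNearManeuver(step, seen);
  if (!landmark) {
    return `In ${Math.round(step.distanceMeters)} m, ${step.maneuver.replace("-", " ")} onto ${step.streetName}.`;
  }
  switch (step.maneuver) {
    case "turn-left":  return `Turn left just past the ${landmark.label}.`;
    case "turn-right": return `Turn right just past the ${landmark.label}.`;
    case "arrive":     return `Your destination is just ahead, past the ${landmark.label}.`;
    default:           return `Keep going, past the ${landmark.label}.`;
  }
}
```

The point of the sketch is simply that a generic step plus what is actually in view yields the kind of friendly, precise instruction shown in the examples above.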

Personal, conversational guidance. When you're ready to head to your destination, contextually relevant directions are generated and displayed in real time, based on what you see in front of you. Traditional maps can only offer predefined instructions.

Recall-based navigation. AI-powered hardware's long-term memory could learn from your chats and journeys, enhancing search and navigation through natural dialogue. Forget a place's name? Simply describe it, and receive directions.
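As an illustration of the recall idea, here is a small sketch that matches a spoken description against remembered places using embedding similarity. The embed() function, the memory store, and the similarity threshold are all assumptions made for the sake of the example.

```typescript
// Hypothetical sketch: recalling a previously visited place from a natural
// description. Assumes some embed() text-embedding function; the memory store
// and the 0.6 threshold are illustrative, not a real system.

interface RememberedPlace {
  name: string;
  note: string;            // e.g. "café near the park, went with Carli last Tuesday"
  lastVisited: Date;
  embedding: number[];
}

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// "That café I went to with Carli last week" -> best-matching remembered place, if any.
async function recallPlace(
  description: string,
  memory: RememberedPlace[],
  embed: (text: string) => Promise<number[]>
): Promise<RememberedPlace | undefined> {
  const query = await embed(description);
  return memory
    .map(place => ({ place, score: cosine(query, place.embedding) }))
    .sort((a, b) => b.score - a.score)
    .filter(m => m.score > 0.6)[0]?.place; // threshold is arbitrary for the sketch
}
```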

Flexible wayfinding

Wander

If you've tried getting walking directions on your phone, you know that the experience is rigid. Navigation today does not tolerate straying from your route or exploring along the way. If you decide to check out somewhere off your path, your phone will loudly try to bring you back to its preferred route using notifications and haptics. It doesn't consider your preferences, nor does it have any awareness of how urgently you're navigating or what you're looking at. This doesn't mesh well with human nature, and often results in you silencing directions altogether. People love to explore, and we believe wayfinding should accommodate that desire, not penalize you for it.


We've imagined a new wayfinding experience called Wander. If you're en route to a destination, but not in a rush to get there, Wander will calmly guide you with a simple compass arrow that always points towards your destination. An ETA, displayed in minutes, shows how far away you are. There's no need to ask for directions or to manually summon Wander. We believe an AI device could proactively surface relevant information for you using context intelligently gathered from your surroundings. It could also draw on information from your digital life, such as calendar events, emails containing lunch plans, or your favourite takeout restaurants. We believe Wander should recede, with no haptics or sounds to distract you while you're carving your own path.


When you need it, simply raise your hand to see the compass instantly projected on your palm. We believe this could be a more human way to navigate, allowing you to go at your own pace.

A clear, glanceable interface. A compass arrow continuously points to your destination, letting you confirm at a glance that you're on the right path while offering the freedom and flexibility to wander and discover along the way.

Move at your own pace. Guidance today is an all-or-nothing proposition. Sometimes you want to travel without always-on, turn-by-turn navigation. Wander gives you the freedom to go at your own speed and the reassurance you need on your way.
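To ground the compass-and-ETA readout described above, here is a minimal sketch of the two values the palm projection would need: the arrow angle relative to your heading, and a rough walking ETA. The coordinates and the 1.4 m/s walking speed are illustrative assumptions, not product decisions.

```typescript
// Hypothetical sketch of the Wander readout: bearing for the compass arrow
// (relative to the wearer's heading) and a rough walking ETA in minutes.
// The 1.4 m/s walking speed is an illustrative assumption.

interface LatLng { lat: number; lng: number; }

const toRad = (d: number) => (d * Math.PI) / 180;
const toDeg = (r: number) => (r * 180) / Math.PI;

// Great-circle bearing from the wearer to the destination, in degrees from north.
function bearingTo(from: LatLng, to: LatLng): number {
  const dLng = toRad(to.lng - from.lng);
  const y = Math.sin(dLng) * Math.cos(toRad(to.lat));
  const x =
    Math.cos(toRad(from.lat)) * Math.sin(toRad(to.lat)) -
    Math.sin(toRad(from.lat)) * Math.cos(toRad(to.lat)) * Math.cos(dLng);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Haversine distance in metres.
function distanceMeters(from: LatLng, to: LatLng): number {
  const R = 6371000;
  const dLat = toRad(to.lat - from.lat);
  const dLng = toRad(to.lng - from.lng);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(from.lat)) * Math.cos(toRad(to.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// What the projected palm display would show: arrow angle and ETA in minutes.
function wanderReadout(position: LatLng, headingDegrees: number, destination: LatLng) {
  const arrowDegrees = (bearingTo(position, destination) - headingDegrees + 360) % 360;
  const etaMinutes = Math.ceil(distanceMeters(position, destination) / 1.4 / 60);
  return { arrowDegrees, etaMinutes };
}
```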

Glanceable guidance

Transit

Travelling by transit can be confusing, especially in an unfamiliar city. The pressure only increases during rush hour, when your train could depart at any moment and you may not feel confident you're boarding the right one. In these moments, pulling out your phone to check for directions can feel slow and cumbersome. We need tools that are proactively aware of our context and available to assist instantly.


We believe the instant availability of guidance — summoned by voice or gesture, or displayed via projection — could make AI wearables the best way to get to your destination when using transit.


Since the device would be proactively aware of your plans, preferences, and context, we believe you’d be able to simply point at the world and ask questions. When you’re rushing to make a train or a bus seconds before it leaves, you could confirm you’re going in the right direction by quickly glancing at your hand and seeing a reassuring arrow projected in a fraction of a second. Precise, time-sensitive voice and haptic feedback could also help you get where you’re going when your hands are full.


When timing matters most, navigating Transit with this technology could remove the friction of unlocking your phone to reference a map. With a device that understands you and your surroundings, you'd receive only the most relevant information, delivered in the way that best fits the moment.

Calm and confident guidance. While in transit, we imagine you could quickly glance at your hand to see a focused view of your trip presented in natural language. A simple tap would display an overview of your journey to help you confidently plan ahead.

This one or that one? With a system that can see your surroundings, we imagine you’d be able to ask questions simply by gesturing, the same way you’d ask a friend. If you’re struggling to find your way, you could just point and ask your device for help.
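As a sketch of what "proactively surfaced, glanceable" could mean in practice, here is a hypothetical check for when a transit leg should be surfaced, alongside the one-line summary a quick glance would show. The shapes of the calendar event and transit leg, and the ten-minute lead time, are assumptions made for illustration.

```typescript
// Hypothetical sketch: when to proactively surface a transit leg, and the
// one-line summary a quick glance at the palm would show. Data shapes and the
// ten-minute lead time are illustrative assumptions.

interface CalendarEvent {
  title: string;
  start: Date;
  location: string;
}

interface TransitLeg {
  line: string;      // e.g. "Line 1 toward Vaughan"
  boardAt: string;   // stop or platform name
  departure: Date;
  arrivesBy: Date;
}

// Surface the leg only when departure is imminent and it still gets you to the event on time.
function shouldSurface(leg: TransitLeg, event: CalendarEvent, now: Date, leadMinutes = 10): boolean {
  const minutesUntil = (leg.departure.getTime() - now.getTime()) / 60000;
  return minutesUntil > 0 && minutesUntil <= leadMinutes && leg.arrivesBy.getTime() <= event.start.getTime();
}

// The focused, natural-language view shown on a glance; a tap could expand to the full journey.
function glanceSummary(leg: TransitLeg, now: Date): string {
  const minutesUntil = Math.max(0, Math.round((leg.departure.getTime() - now.getTime()) / 60000));
  return `${leg.line} from ${leg.boardAt}, departs in ${minutesUntil} min`;
}
```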

Why we built this

We're motivated by an intense curiosity around this new wearable form factor, and excited about how augmented intelligence could be used to power simpler, more contextually aware experiences. Rather than waiting for the device to be fully revealed, we decided to imagine how it could work for ourselves, in an area we believe would be completely transformed by this new technology.


We imagine that AI-powered hardware could provide more personal help along your route by factoring in what you’re seeing, and guiding you using natural language. As it stands today, directions you receive on your phone are prescriptive, and don't take into account what you see in front of you. Your phone is impersonal, and it isn’t tailored to your needs — if you put your phone beside anyone else’s and asked them both to navigate to a destination, you’d get the exact same guidance.


Our solution adapts to each individual's needs and desires, with accessibility as a core consideration for guidance. Whether you're travelling in a wheelchair or living with a recent injury, an ideal wayfinding interface should automatically suggest routes suited to your mobility while making intelligent, real-time decisions that factor in your environment. These could include sidewalk conditions, street safety, stairs, and other important factors.
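One way to picture that adaptation, as a rough sketch only: score each path segment against a simple mobility profile, so routes with stairs or missing curb ramps are penalized or ruled out entirely. The attributes and weights here are invented for illustration, not real map data.

```typescript
// Hypothetical sketch: scoring a path segment against a traveller's mobility
// profile so that inaccessible routes are avoided. Attributes and weights are
// illustrative assumptions.

interface Segment {
  lengthMeters: number;
  hasStairs: boolean;
  hasCurbRamps: boolean;
  sidewalkCondition: "good" | "uneven" | "obstructed";
}

interface MobilityProfile {
  avoidStairs: boolean;   // e.g. wheelchair, crutches, stroller
  needsCurbRamps: boolean;
}

// Effective routing cost of a segment; Infinity means "never route this way".
function segmentCost(segment: Segment, profile: MobilityProfile): number {
  if (profile.avoidStairs && segment.hasStairs) return Infinity;
  if (profile.needsCurbRamps && !segment.hasCurbRamps) return Infinity;
  const penalty =
    segment.sidewalkCondition === "good" ? 1 :
    segment.sidewalkCondition === "uneven" ? 1.5 : 3;
  return segment.lengthMeters * penalty;
}
```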


We understand that trips don't always require constant guidance with step-by-step directions. There may be occasions when we're familiar with an area and only need nudges to help us along our way. Oftentimes, we might want to take our time and explore on our way to a destination, needing only reassurance that we're headed the right way. Getting bombarded with constant map notifications can feel intrusive and may even hinder our experience, distracting us from the world around us. With Wander mode, you get a light level of guidance that keeps you present in the world while helping you stay on track.

Through creating this concept, we've learned a lot about how ambient, intelligent computing could become part of our everyday lives. We've created something both of us genuinely want in our own lives, which has us excited about what the future will bring.

Imagined by Zane Liu and Michael Mofina.

We'd love to hear your feedback. Send us a message to chat, or visit either of our websites to learn more.