Project Overview

Olfactory Passage: Multimodal Spatial UI
Engineering a multimodal spatial interface translating physical proximity into tactile, reactive digital environments.

[Role]

Interaction Designer, Experience Prototyper (Solo)

[Duration]

7 Weeks, 2025

[Tech Stack]

Arduino Uno, TouchDesigner, Ultrasonic & Capacitive Sensors

Spatial Computing

Multimodal UI

Hardware Prototyping

The Challenge & Logic

Engineering Intentionality Beyond the Screen

Digital interactions are increasingly confined to flat screens, stripping away the richness of human senses. My primary challenge was building an interface that responds directly to physical presence rather than traditional clicks.

To bridge the gap between raw sensor noise and seamless UX, I implemented a hysteresis loop in the system's logic. By defining separate enter and exit thresholds, the system distinguishes casual passersby from intentional users: a reading that jitters around a single cutoff can no longer toggle the state. This prevents visual flickering, ensuring transitions feel stable, deliberate, and human-centered.
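The hysteresis logic above can be sketched as a small state update. This is a minimal illustration, not the project's actual code: the threshold values and function name are assumptions.

```python
# Illustrative hysteresis thresholds (the project's real values are not
# published). ENTER_CM must be smaller than EXIT_CM for the dead band to work.
ENTER_CM = 60.0   # a user must come closer than this to activate
EXIT_CM = 90.0    # an engaged user must move farther than this to deactivate

def update_engagement(engaged: bool, distance_cm: float) -> bool:
    """Return the new engagement state given the latest distance reading.

    Because the enter and exit thresholds differ, sensor jitter around
    either boundary cannot flip the state back and forth, which is what
    eliminates visual flicker.
    """
    if engaged:
        return distance_cm <= EXIT_CM   # stay engaged until clearly gone
    return distance_cm < ENTER_CM       # engage only on a clear approach

# A noisy reading between the two thresholds leaves the state unchanged:
state = False
for reading in [120.0, 55.0, 75.0, 88.0, 95.0]:
    state = update_engagement(state, reading)
```

The key design choice is the dead band between 60 cm and 90 cm: inside it, the system simply keeps its previous answer.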

Execution & Craft

A Full-Stack Sensory Pipeline

This project required building a custom sensory bridge between hardware, code, and chemistry. I established real-time serial communication between an Arduino Uno and TouchDesigner, using ultrasonic and capacitive sensors to track user proximity.
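On the TouchDesigner side, the serial handoff amounts to parsing and sanity-checking each incoming line. The sketch below is a hedged illustration: the line format (one distance in cm per line), the valid range, and the operator names in the comment are assumptions, not the project's documented protocol.

```python
# Parse one line of Arduino serial output into a usable distance value.
# Assumed wire format: "<distance_cm>\n", e.g. "42.5\n".

def parse_distance(line: str):
    """Return the distance in cm as a float, or None if the line is noise."""
    try:
        value = float(line.strip())
    except ValueError:
        return None  # discard malformed or partially transmitted lines
    # Ultrasonic modules such as the HC-SR04 are only reliable at roughly
    # 2-400 cm, so readings outside that window are treated as noise.
    if not 2.0 <= value <= 400.0:
        return None
    return value

# Inside TouchDesigner, a Serial DAT callback could feed lines into this
# parser and publish the result for the visual network (operator name
# 'proximity' is hypothetical):
#
# def onReceive(dat, rowIndex, message, byteData):
#     d = parse_distance(message)
#     if d is not None:
#         op('proximity').par.value0 = d
```

Filtering at the parse step keeps garbage values out of the visual pipeline before any smoothing or hysteresis is applied.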

The proximity data dynamically drives the velocity and density of generative visuals, which are projection-mapped onto a hand-sculpted physical canvas. To ground this experience, I integrated a custom scent module that triggers on physical approach, transforming passive viewing into an active, multimodal exchange.
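The proximity-to-visuals mapping can be sketched as a normalization plus two response curves. The ranges, constants, and easing below are illustrative assumptions; the point is only to show how one sensor value drives several generative parameters at once.

```python
def clamp01(x: float) -> float:
    """Clamp a value into the [0, 1] range."""
    return max(0.0, min(1.0, x))

def visual_params(distance_cm: float, near: float = 30.0, far: float = 150.0):
    """Map a distance reading to (velocity, density): closer = faster, denser.

    near/far define the active window; readings outside it are clamped.
    All constants here are hypothetical tuning values.
    """
    closeness = clamp01((far - distance_cm) / (far - near))  # 0 = far, 1 = near
    velocity = 0.2 + 1.8 * closeness           # linear speed multiplier
    density = int(200 + 4800 * closeness ** 2) # squared easing: density ramps late
    return velocity, density
```

Easing the density quadratically while keeping velocity linear makes the visuals quicken on approach before they visibly thicken, which reads as a two-stage response to the visitor.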

Impact & Takeaway

Materializing the Digital Realm

Olfactory Passage shows that digital content becomes far more compelling when it adapts to physical geometry and human senses. By shifting from a passive screen to an active spatial interface, this project demonstrates how multimodal inputs can create deeply immersive environments.

Ultimately, this project solidified my core design philosophy: the future of AI-native products lies not just in screen-bound software, but in the interplay between complex digital logic and the tangible material world, transforming invisible data into human-centered experiences.
