Journeyman

Augmenting Personal Realities · AR Projection · Spatial Mapping · Hackathon

🥈 2nd Place Overall · MIT Media Lab HARD MODE Hackathon · 🏆 K2 Think V2 Sponsor Winner · 48 Hours

Project Overview

Journeyman is a wearable AR projection-based guidance system for accelerated skill acquisition, from learning piano to assembling machinery and operating industrial equipment. Instead of augmenting reality through isolated personal displays like traditional AR glasses, Journeyman projects guidance directly onto physical surfaces via a helmet-mounted projector, making learning visible, intuitive, and collaborative.

Built in a single hackathon weekend at the MIT Media Lab, the system orchestrates real-time spatial mapping, voice control, and synchronized haptic feedback in one unified pipeline. Piano was our demo, but the architecture is fully domain-agnostic.

Key Stats
  • Award: 2nd place overall + K2 Think V2 sponsor prize, MIT Media Lab Hard Mode Hackathon
  • Demo: Real-time projected piano guidance onto a physical keyboard
  • Hardware: Helmet-mounted projector + camera, Raspberry Pi, ESP32, INMP441 mic
  • Localization: AprilTag-based spatial mapping with pixel-accurate coordinate transforms

Demo Video

Embed video here: replace with <iframe src="..."> or <video> tag

System Architecture

The full pipeline runs from surface detection to projected guidance to physical feedback in real time.

Spatial Mapping & Projection

A helmet-mounted camera detects AprilTag markers to localize and map target surfaces, computing precise projector coordinate transforms for pixel-accurate overlays. This lets the system know exactly where to project each cue relative to the physical object in front of the user.

Session State Machine

A central orchestrator manages the lesson loop โ€” tracking which step the learner is on, when to advance, and how to recover from errors. The state machine is designed to be fully domain-agnostic, so the same logic that guides a piano learner through a chord progression can guide a technician through a machinery assembly sequence.
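A minimal sketch of such a domain-agnostic orchestrator might look like the following, where steps are opaque payloads (piano chords, assembly instructions) and the intent names are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class LessonSession:
    """Illustrative lesson state machine: tracks position, not domain content."""
    steps: list
    index: int = 0
    errors: int = 0

    @property
    def done(self):
        return self.index >= len(self.steps)

    @property
    def current(self):
        # The payload to project right now, or None once the lesson ends.
        return None if self.done else self.steps[self.index]

    def handle(self, intent):
        """Advance the session in response to a voice or sensor intent."""
        if intent == "next" and not self.done:
            self.index += 1
        elif intent == "repeat":
            pass  # stay on the current step; re-project the same cue
        elif intent == "error":
            self.errors += 1  # e.g. wrong key pressed; hold and re-cue
        elif intent == "stop":
            self.index = len(self.steps)
        return self.current
```

The same loop drives any sequence of steps, which is what makes swapping piano chords for an assembly checklist a data change rather than a code change.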

Voice Control

An INMP441 microphone and Raspberry Pi handle voice command recognition, dispatching intents (next, repeat, stop) to the central orchestrator so learners can navigate lessons hands-free.
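The dispatch step can be sketched as a simple phrase-to-intent lookup over the recognized transcript; the phrase table below is hypothetical, standing in for whatever the real recognizer emits:

```python
# Hypothetical phrase table; actual recognition runs on the Raspberry Pi.
INTENT_PHRASES = {
    "next": ("next", "continue", "advance"),
    "repeat": ("repeat", "again", "once more"),
    "stop": ("stop", "pause", "end lesson"),
}


def parse_intent(transcript):
    """Map a recognized utterance to a lesson intent, or None if unmatched."""
    text = transcript.lower().strip()
    for intent, phrases in INTENT_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None
```

Keeping the mapping in a table means new commands can be added without touching the orchestrator.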

Haptic Feedback

An ESP32 microcontroller delivers haptic feedback synchronized with each projected step, reinforcing visual cues with a physical signal so the learner's body and hands stay engaged with the task.
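One way to keep host and microcontroller in sync is a tiny binary command sent over serial or UDP each time a step fires. The wire format below is a hypothetical illustration, not the project's actual protocol:

```python
import struct

# Hypothetical 4-byte haptic command:
#   opcode 'H' | duration_ms (uint16, little-endian) | intensity (uint8)


def pack_haptic_pulse(duration_ms, intensity):
    """Serialize a haptic pulse command for transport to the ESP32."""
    if not (0 <= duration_ms <= 0xFFFF and 0 <= intensity <= 0xFF):
        raise ValueError("duration or intensity out of range")
    return struct.pack("<cHB", b"H", duration_ms, intensity)
```

A fixed-size frame like this is trivial to parse on the ESP32 side and cheap enough to send on every projected step.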

Tech Stack
Raspberry Pi · ESP32 · AprilTags · INMP441 Mic · Helmet-Mounted Projector · Spatial Mapping · Voice Recognition · Haptic Feedback · Python · OpenCV

Applications

While piano was our hackathon demo, Journeyman's architecture was designed from the start to generalize to any hands-on skill where guidance can be projected directly into the world.

Progress Photos

As is generally the case with hackathons, it was a bit chaotic but tons of fun!

Team

Built over a single weekend at the MIT Media Lab Hard Mode Hackathon.

Mohammed Aamir · Keira Boone · Nick Nim · Chris Um · Cecilia Xu · Rachael Yang · Anna Zhang · Gloria Zhu