Context
Lens was built at the Apple Developer Academy by a five-person team. We followed Scrum, iterating on a highly sensory experience where small interaction details mattered as much as screen layout.
Problem
Phone photography tends to reward volume over attachment: endless shots, little sense of occasion. The product question was how to restore anticipation, scarcity, and emotional weight to a single frame without feeling gimmicky.
Solution
Lens borrows from instant cameras: capture is only part of the story. Sound, haptics, texture, motion, and 3D elements shape a slower, more intentional loop. After shooting, the user develops the image and places it on a digital gallery wall — memory as an object, not a file in an endless roll.
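A minimal sketch of how that capture → develop → place loop could be modeled in SwiftUI; the names (`ShotPhase`, `GalleryModel`) and the development delay are illustrative assumptions, not the shipped implementation.

```swift
import SwiftUI
import Combine

/// Illustrative phases of the capture → develop → place loop.
enum ShotPhase {
    case framing          // live camera view, before the single frame is taken
    case developing(Date) // image hidden while it "develops"
    case placed           // pinned to the gallery wall
}

struct Shot: Identifiable {
    let id = UUID()
    let image: Image
    var phase: ShotPhase = .developing(Date())
}

final class GalleryModel: ObservableObject {
    @Published var wall: [Shot] = []
    /// Hypothetical wait standing in for the "development" period.
    let developmentDuration: TimeInterval = 90

    func capture(_ image: Image) {
        wall.append(Shot(image: image))
    }

    func develop(_ shot: Shot) {
        guard case .developing(let start) = shot.phase,
              Date().timeIntervalSince(start) >= developmentDuration,
              let index = wall.firstIndex(where: { $0.id == shot.id }) else { return }
        wall[index].phase = .placed
    }
}
```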
My role & impact
I was the UI/UX designer, focused on interaction quality and coherence between the nostalgic promise and the actual feel of the app.
- Sensory UX: Helped define how feedback (audio, haptics, animation) reinforced emotion and pacing; not decoration, but meaning. A rough sketch of this pairing follows the list.
- Interface & IA: Kept navigation and states legible while the experience stayed atmospheric; clarity where users need control, restraint where mood does the work.
- Product coherence: Guarded consistency between the “instant camera” metaphor and every screen users touch.
- Build & AI: Supported targeted implementation work; contributed AI-related ideas in team discussions where they could strengthen the product story.
Process & stack
- Design: Figma for structure and UI; Keynote for narrative.
- Collaboration: Scrum, Airtable, Zoom; tight feedback loops on a detail-heavy product.
- Development: Swift and SwiftUI, reaching for UIKit where native components or integration demanded it.
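A typical case where UIKit earned its place: SwiftUI has no native camera preview, so a bridge like the sketch below wraps `AVCaptureVideoPreviewLayer` in `UIViewRepresentable`. The capture session setup is assumed to live elsewhere; this shows the bridging pattern, not the app's exact code.

```swift
import SwiftUI
import AVFoundation

/// SwiftUI ↔ UIKit bridge for a live camera preview,
/// which has no first-party SwiftUI equivalent.
struct CameraPreview: UIViewRepresentable {
    let session: AVCaptureSession

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspectFill
        layer.frame = view.bounds
        view.layer.addSublayer(layer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Keep the preview layer sized to the SwiftUI-managed view.
        uiView.layer.sublayers?.first?.frame = uiView.bounds
    }
}
```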