
Iris AI

1st place

How we shaped a new wearable layer of intelligence in 24 hours

In just 24 hours, our team of 2 product designers and 2 business strategists redefined how AI could move beyond screens and become an ambient, proactive intelligence woven into everyday life.

This work took place during the 2026 AI & Startup Design Hackathon, which brought together students from Parsons School of Design at The New School and the University of Arizona. Fourteen teams and 44 participants were challenged to imagine products that could compete in the AI era and in the world of 2035.
The result was Iris - an agentic AI platform for smart glasses designed to deliver proactive, contextual assistance through an ambient interface woven into daily life.
Scope
  • Problem framing
  • Product concept
  • UX flow design
  • Interface design
  • Prototype
Role
  • Product Designer
Credits
  • Parsons School of Design
  • University of Arizona
  • Designpreneurs
Reframing the original idea
Early in the hackathon, our team explored BandJam, a mobile app concept proposed by the business strategists to help musicians match, meet, and play together. However, the idea still relied on traditional screen-based interface patterns.
At that point, as the product designer driving UX vision and concept direction, I proposed reframing the problem entirely. Instead of designing another app or dashboard, I asked a different question: What if intelligence could live around you rather than inside your phone?
That shift led to the core vision behind Iris - an agentic AI system that operates as an ambient interface layer through smart glasses, delivering contextual assistance without requiring users to constantly return to a screen.
Problem framing
A few years ago, the promise of technology was that it would save us time.
Instead, most of us spend more time managing technology than benefiting from it.
Despite having more powerful tools than ever before, everyday life often feels fragmented, reactive, and mentally exhausting.
Understanding “the now”
To understand why this happens, our team looked at 3 layers of today's technology landscape: hardware, AI assistants, and apps.
Insight 01
Wearables bring AI closer to the real world, but remain constrained by immature interfaces. The hardware is there, but the intelligence layer is still shallow.
Insight 02
AI Assistants are powerful but rely on explicit prompts. They react to requests but rarely understand context.
Insight 03
Most tools solve single tasks. Users rely on multiple apps to manage everyday activities.
Vision
What if intelligence didn’t live inside apps at all?
Instead of constantly summoning AI through prompts, we imagined a system that understands context and assists proactively throughout the day.
The future of AI is present - woven into everyday life, not waiting to be summoned.
Concept
Our solution was Iris - an agentic AI platform for smart glasses that learns how you live and proactively assists you throughout the day.
Instead of acting like a chatbot on your face, waiting for prompts and responding in isolated moments, Iris functions as a discreet, personalized agent that builds context over time, asks follow-up questions when needed, and creates personalized workflows for each user.
Why glasses?
Smart glasses provide continuous access to the user’s visual context while keeping interaction hands-free. Unlike phones, they remain present throughout the day without requiring constant attention.
This makes them uniquely suited for ambient AI systems that assist without interrupting everyday life.
How Iris Works
Our system reflects a broader shift in human-computer interaction. Instead of navigating software through screens, buttons, and menus, people interact with AI agents that operate software behind the scenes.
To demonstrate how Iris works in practice, we designed several everyday scenarios.
Contextual Navigation
Our user starts at the airport, moving fast and already mentally overloaded. Instead of stopping to search crowded boards, Iris instantly finds the gate and checks the timing for him.
When he lands, Iris helps him get to the right baggage claim carousel without adding one more layer of friction to an already tiring travel day.
Real-time Activity Tracking
Later, he heads to the gym. Iris shifts with him, tracking reps, sets, and workout progress in real time, then helping him locate the exact machine he needs.
Design Principles
Peripheral Information
The Iris interface is organized into distinct visual zones so information appears based on urgency and relevance, while the center remains completely clear to avoid obstructing vision or creating distraction.
Simple design system
We built a minimal design system on simple shapes, bold colors, and signal-based elements to ensure maximum visibility and efficient use of each zone.
AI-based Accessibility
Iris analyzes the real-world background in the user’s field of view and dynamically adjusts opacity and color contrast to maintain optimal legibility and visual clarity.
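This adaptive-contrast behavior can be grounded in the standard WCAG relative-luminance and contrast-ratio formulas. The sketch below is an illustrative assumption of how such a check might work, not the actual Iris implementation; the function names and the 4.5:1 threshold are ours.

```python
# Illustrative sketch (hypothetical, not the Iris codebase): choosing an
# overlay text color against a sampled real-world background, using the
# WCAG 2.x relative-luminance and contrast-ratio formulas.

def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color with 0-255 channels."""
    def channel(c):
        c /= 255.0
        # Linearize the sRGB channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def pick_text_color(background_rgb, min_ratio=4.5):
    """Pick white or black text, whichever contrasts better; flag if
    even the better choice misses the minimum legibility ratio."""
    white, black = (255, 255, 255), (0, 0, 0)
    white_ratio = contrast_ratio(white, background_rgb)
    black_ratio = contrast_ratio(black, background_rgb)
    color = white if white_ratio >= black_ratio else black
    meets_minimum = max(white_ratio, black_ratio) >= min_ratio
    return color, meets_minimum
```

A real system would sample luminance per display zone from the camera feed and also adjust overlay opacity, but the legibility check itself reduces to this kind of ratio test.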
Impact
Iris shifts the paradigm from basic chat-based assistance to longitudinal AI that supports users throughout everyday life.
Our team believes this shift will define the next major category of intelligent interfaces, and Iris is designed to help shape it.
Lesson learned
A strong product vision beats polished visuals. When you step back from existing patterns and rethink the problem itself, better ideas emerge.
Team Iris at the Top
My team took home first place at the AI & Startup Design Hackathon hosted by Designpreneurs, bringing together students from Parsons School of Design and the University of Arizona.
Team Iris
  • Evgenii Astapov - Product Design, Motion
  • Zarah Yaqub - Product Design, Brand
  • Kian Sadat - Business Strategy
  • Safiya Tarazi - Business Strategy