FoodFrame
Mixed-Reality Nutrition Coach
Hackathon Winner (Cologne)
FoodFrame turns food tracking into a spatial, hands-free experience. Instead of breaking the flow of eating to search, type, and estimate portions, it brings macros, coaching, and logging directly into view.
The long-term vision is an always-on AR companion that recognizes foods around you, overlays trustworthy nutrition labels, and nudges you toward your goals at home, at work, or while shopping.
What It Does
On Quest 3, FoodFrame combines live food recognition, instant macro overlays, a one-line coach tip, and one-tap or voice logging to an external tracker. Spatial widgets such as daily rings, a coach card, and a recent log panel persist across sessions using anchors.
- Live food tracking: Detects common foods, shows per-item and meal totals for calories, carbs, fats, and protein, and can optionally surface fiber and water metrics.
- Transparent portions: Shows portion assumptions clearly, falls back to ranges when confidence is low, and asks for quick user confirmation on edge cases.
- Goal-aware coaching: Offers simple suggestions based on diet profile and daily progress, such as adding protein to reach a target or suggesting lighter alternatives when high-calorie items are scanned.
- Seamless logging: Supports logging by voice or tap and syncs nutrition data to a connected tracking app with meal context such as time of day or workout timing.
- Spatial interface: Keeps floating labels near dishes and compact progress widgets in the environment so the experience stays glanceable and hands-free.
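The per-item and meal-total macros above can be sketched as a small aggregation step. The type names and per-100 g convention here are illustrative assumptions, not the project's actual data model:

```typescript
// Hypothetical shapes for a scanned item and its macro values.
interface Macros {
  calories: number;
  carbs: number;   // grams
  fats: number;    // grams
  protein: number; // grams
}

interface ScannedItem {
  name: string;
  portionGrams: number; // estimated portion size
  per100g: Macros;      // macros per 100 g from the nutrition database
}

// Scale an item's per-100 g macros by its estimated portion.
function itemMacros(item: ScannedItem): Macros {
  const f = item.portionGrams / 100;
  return {
    calories: item.per100g.calories * f,
    carbs: item.per100g.carbs * f,
    fats: item.per100g.fats * f,
    protein: item.per100g.protein * f,
  };
}

// Sum per-item macros into the meal total shown on the overlay.
function mealTotals(items: ScannedItem[]): Macros {
  return items.reduce(
    (total, item) => {
      const m = itemMacros(item);
      return {
        calories: total.calories + m.calories,
        carbs: total.carbs + m.carbs,
        fats: total.fats + m.fats,
        protein: total.protein + m.protein,
      };
    },
    { calories: 0, carbs: 0, fats: 0, protein: 0 }
  );
}
```

The same totals structure can also feed the daily rings, since a day is just a sum over logged meals.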
Inspiration
Traditional nutrition tracking interrupts the act of eating. FoodFrame was inspired by the idea that mixed reality can remove that friction: instead of forcing users into manual logging flows, the system brings the right information and the next useful action directly into the moment.
How It Works
- Engine and MR stack: Built in Unity for Quest 3 using Passthrough, Scene Understanding, Spatial Anchors, and the Interaction SDK for a hands-first mixed-reality workflow.
- Food detection: Lightweight on-device models enable low-latency item detection, followed by mapping detected labels to canonical foods.
- Portion estimation: Heuristics and size references estimate serving volume, with user confirmation when the system is uncertain.
- Nutrition mapping: Standardized food IDs are linked to a nutrition database with normalized units such as grams and milliliters.
- Inference and sync: The main inference path stays local for latency, with optional cloud assistance for harder cases and REST- or WebSocket-based sync to external trackers and a PWA mirror.
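The label-to-canonical-food step above can be sketched as a normalization plus two table lookups. The tables, IDs, and function names here are illustrative assumptions; the real mapping and database are larger:

```typescript
// Hypothetical mapping from raw detector labels to canonical food IDs.
const LABEL_TO_FOOD_ID: Record<string, string> = {
  "granny smith": "apple",
  "red apple": "apple",
  "banana": "banana",
};

interface NutritionEntry {
  per100g: { calories: number; carbs: number; fats: number; protein: number };
}

// Hypothetical nutrition table keyed by canonical food ID, per-100 g values.
const NUTRITION_DB: Record<string, NutritionEntry> = {
  apple: { per100g: { calories: 52, carbs: 14, fats: 0.2, protein: 0.3 } },
  banana: { per100g: { calories: 89, carbs: 23, fats: 0.3, protein: 1.1 } },
};

// Normalize the label, resolve a canonical ID, then fetch nutrition data.
// Returning null lets the UI fall back to asking the user for confirmation.
function lookupNutrition(rawLabel: string): NutritionEntry | null {
  const id = LABEL_TO_FOOD_ID[rawLabel.trim().toLowerCase()];
  return id !== undefined ? NUTRITION_DB[id] ?? null : null;
}
```

Keeping the detector labels and the canonical IDs separate is what allows several labels ("granny smith", "red apple") to share one nutrition entry.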
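The portion-estimation step, including the low-confidence fallback to ranges described under "Transparent portions", can be sketched as follows. The density-times-volume conversion, the spread formula, and the 0.7 threshold are illustrative assumptions, not the project's tuned values:

```typescript
// Hypothetical portion estimate: convert an estimated volume to mass via a
// per-food density, and widen the result into a range when confidence is low.
interface PortionEstimate {
  grams: number;            // point estimate
  lowGrams: number;         // lower bound shown to the user
  highGrams: number;        // upper bound shown to the user
  needsConfirmation: boolean;
}

const CONFIRM_THRESHOLD = 0.7; // below this, ask the user to confirm

function estimatePortion(
  volumeMl: number,
  densityGPerMl: number,
  confidence: number // 0..1 from the vision model
): PortionEstimate {
  const grams = volumeMl * densityGPerMl;
  // Widen the range as confidence drops: +/-10% at full confidence,
  // up to +/-50% at zero confidence (illustrative numbers).
  const spread = 0.1 + 0.4 * (1 - confidence);
  return {
    grams,
    lowGrams: grams * (1 - spread),
    highGrams: grams * (1 + spread),
    needsConfirmation: confidence < CONFIRM_THRESHOLD,
  };
}
```

Surfacing the range rather than a single number is what makes the portion assumption visible, so a confirmation tap only corrects the estimate instead of re-entering it.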
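The sync step can be sketched as a single serialized log message that works over both transports. The message shape, `type` tag, and endpoint names are illustrative assumptions about the backend, not its documented API:

```typescript
// Hypothetical log-entry payload pushed from the headset to the tracker
// backend and mirrored to the PWA dashboard.
interface LogEntry {
  foodId: string;
  grams: number;
  calories: number;
  mealContext: "breakfast" | "lunch" | "dinner" | "snack";
  loggedAt: string; // ISO 8601 timestamp
}

// One serialized shape for both transports keeps headset and mirror in sync.
function buildLogMessage(entry: LogEntry): string {
  return JSON.stringify({ type: "log.create", payload: entry });
}

// Usage (network calls omitted; endpoint and socket are illustrative):
//   socket.send(buildLogMessage(entry));                                // WebSocket path
//   fetch("/api/logs", { method: "POST", body: buildLogMessage(entry) }); // REST fallback
```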
Challenges
- Hands-only UX: Reliable pinch-and-hold interactions had to feel intentional without cluttering the view.
- Portion accuracy: Lighting variation and occlusion made exact estimates difficult, so confidence ranges and confirmations became part of the design.
- Anchor robustness: Spatial widgets had to remain stable across rooms and sessions to feel like a trustworthy persistent interface.
What We Achieved
- Built an end-to-end MR loop from scan to label, total, coach tip, and final log.
- Created a persistent spatial UI that reappears where the user left it.
- Enabled near-real-time headset-to-mobile synchronization for rings and recent logs.
What We Learned
- In mixed reality, short and timely coaching is more effective than dense advice.
- Showing assumptions explicitly builds trust and speeds up confirmations.
- Clear module boundaries across detection, mapping, coaching, and logging make iteration faster.
What's Next
- Depth-aware portioning to improve volume-to-mass estimation.
- Ingredient-level parsing for mixed dishes and home-cooked meals.
- Buffet-mode warnings and quantified swap suggestions.
- Restaurant menu import, barcode support, and stronger offline resilience.
- Privacy-respecting shared meals and collaboration flows with dieticians.
Built With
Unity, Quest 3, Meta XR Passthrough, Scene Understanding, Spatial Anchors, Interaction SDK, C#, lightweight vision models, a nutrition database, a REST/WebSocket backend, and a PWA dashboard.