
Digital Meal Companion XIN
Digital Meal Companion XIN is an AI digital human designed to support healthier eating habits through emotional companionship and personalised nutritional guidance. By combining conversational AI, food recognition, and embodied interaction, the project explores how everyday mealtime routines can become moments of care, reflection, and behaviour change.
Category
Experience / Interactive Design
Duration
10 Weeks
Year
2025


I. The "At a Glance" Summary
| Project Title | Team Details | Duration & Context | Key Deliverables | Methods & Tools Used |
|---|---|---|---|---|
| Digital Meal Companion XIN | Individual project | One academic semester (10 weeks), February to June 2025 | Functional web-based prototype of XIN; demonstration video | Ready Player Me, MetaHuman/MetaPerson, Mixamo, Coze Bot, large language models, GitHub, Vercel |
II. The Deep Dive: Process and Rationale
A. Discovery & Research
This project began with an investigation into everyday eating behaviours, particularly among individuals who eat alone, lack nutritional literacy, or struggle with maintaining healthy routines. Initial research highlighted that while many nutrition apps focus on data tracking, they often fail to address the emotional and social dimensions of eating, such as loneliness, motivation, and accountability.

Target Users and Setting
Through scenario mapping and user profiling, I identified an opportunity to reframe nutritional guidance not as a rigid monitoring tool, but as a supportive presence embedded within daily meals. This insight led to the central question:
How might an AI companion gently influence eating behaviour through conversation, empathy, and contextual feedback rather than instruction alone?
B. Define & Ideate
The concept was defined as a digital meal companion, a virtual human that combines the roles of a nutrition advisor, emotional supporter, and behavioural guide. Instead of replacing human interaction, XIN was positioned as a lightweight, non-judgmental presence that accompanies users during meals.

Key functions of XIN
Key design principles guided ideation:
- Emotional first, functional second: prioritising tone, encouragement, and relatability before nutritional accuracy.
- Personalisation over prescription: adapting responses based on user habits, food choices, and emotional cues.
- Embodied interaction: using a visible digital human to increase trust, warmth, and engagement compared to text-only interfaces.
Early ideation explored personality, visual style, conversational flow, and interaction scenarios such as eating alone at home, managing portion control, or seeking gentle reminders during busy workdays.
C. Develop, Prototype & Refine
I independently developed the system across three parallel layers: digital human embodiment, conversational AI logic, and interaction interface.
On the embodiment side, I initially designed and rigged the digital human using tools such as Ready Player Me, MetaHuman/MetaPerson, and Mixamo to establish a controllable base model. In the later stages, I intentionally shifted to AI-assisted visual generation to refine the final appearance, softening proportions, expressions, and overall styling to create a cuter, more approachable, and emotionally inviting character. Rather than focusing on complex physical actions, I explored a range of facial expressions and emotional states to support more natural, empathetic conversations during mealtime interactions.

Visual design of XIN
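To illustrate how conversational emotion could drive the character's face, the sketch below maps a small set of emotional states to expression presets. The emotion labels, preset names, and the applyExpression() hook are illustrative assumptions, not the project's actual rig or blendshape setup.

```typescript
// Minimal sketch: mapping a detected conversational emotion to a facial
// expression preset. Labels, preset names, and applyExpression() are
// illustrative assumptions; XIN's actual rig may differ.

type Emotion = "neutral" | "encouraging" | "concerned" | "celebratory";

// Each hypothetical preset names a facial pose and an intensity in 0..1,
// to be blended onto the base model.
const EXPRESSION_PRESETS: Record<Emotion, { pose: string; intensity: number }> = {
  neutral: { pose: "softSmile", intensity: 0.3 },
  encouraging: { pose: "warmSmile", intensity: 0.7 },
  concerned: { pose: "gentleFrown", intensity: 0.5 },
  celebratory: { pose: "wideSmile", intensity: 0.9 },
};

// Placeholder for the renderer call that actually drives the avatar's face.
function applyExpression(pose: string, intensity: number): void {
  console.log(`apply ${pose} at intensity ${intensity}`);
}

// Choose and apply an expression for the emotion inferred from a dialogue turn.
export function expressEmotion(emotion: Emotion): void {
  const preset = EXPRESSION_PRESETS[emotion];
  applyExpression(preset.pose, preset.intensity);
}
```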
For intelligence and interaction, I prototyped conversational AI logic using Coze Bot and large language models to simulate supportive, context-aware dialogue. Voice interaction was implemented through button-activated voice input and text-to-speech output, prioritising conversational clarity and reducing unintended interruptions. While not fully hands-free, this interaction model ensured cleaner exchanges and better user control during meals.
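A minimal sketch of this button-activated voice loop is shown below, assuming the browser Web Speech API (SpeechRecognition for input, speechSynthesis for output) and a hypothetical sendToXin() call to the dialogue backend; the project's real wiring may differ.

```typescript
// Sketch of button-activated voice input and spoken replies, assuming the
// browser Web Speech API. sendToXin() and the /api/chat route are hypothetical.

async function sendToXin(userText: string): Promise<string> {
  // Forward the transcript to the conversational backend and return XIN's reply.
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: userText }),
  });
  const data = await res.json();
  return data.reply as string;
}

function speak(text: string): void {
  // Text-to-speech via the browser's speechSynthesis interface.
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = 0.95; // slightly slower pace for a calmer, mealtime-friendly tone
  window.speechSynthesis.speak(utterance);
}

export function startListening(): void {
  // SpeechRecognition is vendor-prefixed in some browsers, hence the fallback.
  const SpeechRecognitionCtor =
    (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
  const recognition = new SpeechRecognitionCtor();
  recognition.lang = "en-US"; // could be switched to "zh-CN" for Chinese input
  recognition.interimResults = false;

  recognition.onresult = async (event: any) => {
    const transcript = event.results[0][0].transcript;
    const reply = await sendToXin(transcript);
    speak(reply);
  };

  recognition.start(); // triggered by the talk button, so listening stays intentional
}

// Example: wire the push-to-talk button.
document.getElementById("talk-button")?.addEventListener("click", startListening);
```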
A lightweight web-based interface was deployed using GitHub and Vercel, enabling iterative testing of dialogue flow, emotional tone, and user experience. Refinements focused on conversational pacing, emotional responsiveness, and maintaining a calm, non-intrusive presence that aligns with everyday dining routines.
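As one possible shape for that deployment, the sketch below shows a Vercel-style serverless chat endpoint that forwards the user's message to an upstream LLM service with a system prompt encoding XIN's calm, supportive tone. The environment variable names, upstream URL, and request/response schema are assumptions for illustration; the actual Coze Bot integration may look different.

```typescript
// Sketch of a Vercel serverless chat endpoint. CHAT_API_URL, CHAT_API_KEY,
// and the upstream request/response shape are hypothetical.

import type { VercelRequest, VercelResponse } from "@vercel/node";

// System prompt shaping the calm, supportive, non-judgmental tone described above.
const SYSTEM_PROMPT =
  "You are XIN, a gentle meal companion. Encourage, never lecture. " +
  "Offer brief, personalised nutritional suggestions and warm conversation.";

export default async function handler(req: VercelRequest, res: VercelResponse) {
  if (req.method !== "POST") {
    res.status(405).json({ error: "Method not allowed" });
    return;
  }

  const { message } = req.body as { message: string };

  // Forward to a generic chat-completion style endpoint (assumed URL and schema).
  const upstream = await fetch(process.env.CHAT_API_URL as string, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.CHAT_API_KEY}`,
    },
    body: JSON.stringify({
      messages: [
        { role: "system", content: SYSTEM_PROMPT },
        { role: "user", content: message },
      ],
    }),
  });

  const data = await upstream.json();
  // Assumed response shape; adapt to the actual provider's schema.
  res.status(200).json({ reply: data.reply ?? data.choices?.[0]?.message?.content });
}
```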
III. Final Product & Reflection
The final outcome is a functional prototype of XIN, a digital meal companion capable of engaging users through voice and text (in both English and Chinese), responding to food-related inputs, and offering personalised, emotionally attuned guidance during mealtimes. Rather than acting as a strict nutrition tracker, XIN demonstrates how AI can operate as a relational interface, shaping behaviour through presence and dialogue.

Quick demonstration video of the interaction with XIN
Value Creation
This project proposes a shift in how health technologies are designed: from efficiency-driven tools to care-oriented systems. XIN shows potential value in supporting individuals who eat alone, lack nutritional confidence, or need gentle behavioural nudges, bridging emotional wellbeing and dietary awareness within a single, accessible interaction.
Future Vision
Future iterations could integrate real-time emotion recognition, adaptive gesture libraries, and deeper nutritional databases to enhance responsiveness and credibility. Beyond nutrition, the system framework could extend to other daily routines, positioning digital companions as long-term behavioural partners rather than short-term productivity tools.
