JAi Robot Dog

JAi is an embodied AI robot dog designed to support joint attention development in children with Autism Spectrum Disorder (ASD) through adaptive outdoor interactions. By combining behavioral science, robotics, and experience design, the project reimagines early intervention beyond clinics, bringing structured yet empathetic training into everyday environments.

Category

Service / Experience Design

Duration

15 Weeks

Year

2025


I. The "At a Glance" Summary

Project Title

JAi Robot Dog

Team Details

A 3-person team: Yao Yuan, Xu Hanzhi, Liu Xinxin

Duration & Context

One academic semester (16 weeks, March to June 2025), studio design project

Key Deliverables

  • Final physical product
  • Concept presentation
  • Physical product design

Methods & Tools Used

  • Figma (for documentation & presentation)
  • CAD/rendering software (for physical product)

II. The Deep Dive: Process and Rationale

A. Discovery & Research

Joint Attention (JA) is a foundational social-cognitive skill that underpins language acquisition, social interaction, and emotional understanding, typically emerging between 9–18 months of age. Children with ASD often exhibit persistent deficits in JA, both in responding to and initiating shared attention, resulting in cascading developmental delays when these skills fail to generalise beyond structured therapy settings.

Through literature review and precedent analysis, we identified key limitations in existing JA interventions: high dependence on professional therapists, predominantly indoor and highly structured settings, and weak transfer of learned skills to real-world contexts. At the same time, research suggests that non-anthropomorphic, animal-like robots are more effective at capturing attention, reducing social anxiety, and eliciting gaze-shifting behaviors in children with ASD, making them promising mediators for joint attention training in natural environments.

B. Define & Ideate

Building on these insights, we reframed the challenge as a design opportunity:

How might we support joint attention development in children with ASD in unstructured, everyday outdoor settings, without increasing caregiver burden or overstimulating the child?

User interviews with autism education specialists highlighted several critical design principles: predictable and repeatable interaction flows, exaggerated yet friendly cues, gradual prompt fading, and strong positive reinforcement. Children showed strong individuality and preference-based engagement, while caregivers struggled to balance training, emotional support, and real-time assessment of progress.

During early ideation, I led concept exploration and form studies, investigating how a robot dog could act as a social bridge, guiding attention between child, object, and caregiver. We mapped behavioral strategies (prompt hierarchies, gaze cues, motion-sound pairing) alongside system-level thinking, ensuring the concept remained both developmentally grounded and technically feasible.

C. Develop, Prototype & Refine

The concept evolved into JAi Robot Dog, a quadruped robot system leveraging the existing Lite3 model from Deep Robotics while integrating AI-driven perception, adaptive interaction logic, and embodied feedback. I was responsible for the initial ideation, the UI design, and the 3D CAD development of the robot's head, including its proportions, internal component layout, and custom connectors to accommodate displays, sensors, and motors while maintaining a soft, approachable appearance.

We selected Lite3 for its mobility stability, open API, and sensor compatibility, all critical for outdoor interaction testing. The physical prototype combined custom 3D-printed parts with a modular hardware architecture, enabling expressive eye movements, head gestures, and responsive locomotion.

At a system level, JAi Dog operates through layered modules: environmental sensing (vision, gesture, audio), intelligent task generation (object recognition and prompt selection), and adaptive reinforcement (movement, voice, and visual feedback). Interaction follows a structured yet flexible prompt hierarchy, progressively fading assistance as the child succeeds, while caregivers can monitor, adjust, and review training sessions through a companion app. Multiple rounds of motor, display, and system integration testing ensured coordinated behavior across hardware and AI components.
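
To make this interaction logic concrete, the sketch below models the prompt hierarchy and fading behaviour in Python. The level names, thresholds, and trial structure are hypothetical placeholders rather than the software running on JAi; they simply illustrate how assistance could step down after repeated successes and step back up when the child does not respond.

```python
from dataclasses import dataclass, field
from enum import IntEnum


class PromptLevel(IntEnum):
    """Assumed prompt hierarchy, from most to least supportive."""
    FULL_PHYSICAL_CUE = 3   # robot approaches the object, nudges it, and vocalises
    GESTURE_AND_SOUND = 2   # head turn paired with motion-sound cue
    GAZE_ONLY = 1           # eye/head orientation toward the target object
    NO_PROMPT = 0           # child initiates joint attention unaided


@dataclass
class PromptFader:
    """Steps assistance down after consecutive successes, back up after a failure."""
    level: PromptLevel = PromptLevel.FULL_PHYSICAL_CUE
    successes_to_fade: int = 2          # hypothetical fading threshold
    _streak: int = field(default=0, repr=False)

    def record_trial(self, child_responded: bool) -> PromptLevel:
        if child_responded:
            self._streak += 1
            if self._streak >= self.successes_to_fade and self.level > PromptLevel.NO_PROMPT:
                self.level = PromptLevel(self.level - 1)   # fade assistance one step
                self._streak = 0
        else:
            self._streak = 0
            if self.level < PromptLevel.FULL_PHYSICAL_CUE:
                self.level = PromptLevel(self.level + 1)   # reinstate support
        return self.level


if __name__ == "__main__":
    fader = PromptFader()
    # Simulated session: two successes, one missed response, then two more successes.
    for outcome in [True, True, False, True, True]:
        print(fader.record_trial(outcome).name)
```

A production version would also need to coordinate this state with the sensing and reinforcement layers and surface it in the caregiver app described above.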

III. Final Product & Reflection

Final concept demonstration video

JAi Robot Dog demonstrates how embodied AI can extend therapeutic principles into real-world contexts, transforming parks and everyday outdoor spaces into supportive learning environments. By balancing structure with playfulness, the system encourages children to actively initiate and respond to shared attention cues, while easing caregivers’ cognitive and emotional load through automation, adaptability, and clear feedback loops.

From a design perspective, this project strengthened my ability to translate behavioral research into tangible systems, bridging interaction design, service logic, and physical prototyping. Leading the early ideation, UI design, and 3D CAD development allowed me to shape both the conceptual direction and the embodied experience, ensuring that form, behavior, and technology worked cohesively to support vulnerable users.

Value Creation
- For children with ASD: Engaging, low-pressure joint attention training embedded in play and everyday environments.
- For caregivers: Reduced intervention burden through adaptive prompts, real-time feedback, and structured training flows.
- For therapy ecosystems: A scalable, hybrid model that complements professional intervention while improving skill generalisation beyond clinics.

Future Vision
Looking ahead, JAi could evolve into a deployable intervention platform for schools, community parks, and family use, supporting longitudinal progress tracking and multi-child personalisation. With further refinement, the system could integrate broader social-emotional training modules, enabling embodied AI companions to play a meaningful role in inclusive, human-centered care ecosystems.