Danyal Khorami

Embodied perception, sensor fusion, and computational media systems.

I build low-cost motion capture, sensing platforms, and AI-driven media installations to study how embodied systems perceive and act in the physical world.

Research Themes

My work spans computer vision, sensor fusion & IoT, XR/motion capture, and computational media, with a focus on embodied and agentic AI.

Embodied Perception & Sensor Fusion

Heterogeneous sensors (IMUs, cameras, environmental data) fused into robust real-time perception.

Low-Cost Motion Capture & XR Tools

Affordable motion and volumetric capture using ESP32 nodes and smartphones.

Temporal Models & Sequences

Temporal models for noisy, mixed-frequency streams such as financial or sensor data.

AI & Computational Media

LLMs and perception models in installations exploring emotion, memory, and environment.

Featured Projects

Mesquite MoCap – Open-Source Wearable 6-DoF Motion Capture

Open-source, wireless full-body 6-DoF motion capture using low-cost IMU nodes.

~32 FPS • <15 ms latency • ~99.7% packet delivery
sensor-fusion · embedded-systems · motion-capture · ESP32-C3 · XR

2024–2025 · Co-developer – firmware, sensor fusion, benchmarking

To Wilt – MFA Thesis Installation

Four-space AI-driven installation about love, decay, and environmental change.

LLM · installation · sensor-fusion · media-art

2024 · Artist–researcher; concept, systems, and installation

Snapshot

Research Focus

Computer vision, temporal perception models, sensor fusion & IoT, XR/motion capture, embodied AI, computational media.

Systems

Open-source mocap (Mesquite), environmental IoT (Opuntia), multi-phone 3D capture, Temporal Fusion Transformers for time-series.

Teaching

Instructor of record for photography and digital media at ASU; 8+ courses taught, plus TA roles in gallery and studio contexts.