Mesquite MoCap – Open-Source Wearable 6-DoF Motion Capture
Open-source, wireless full-body 6-DoF motion capture using low-cost IMU nodes.
2024–2025 · Co-developer – firmware, sensor fusion, benchmarking

I build low-cost motion-capture systems, sensing platforms, and AI-driven media installations to study how embodied systems perceive and act in the physical world.
My work spans computer vision, sensor fusion & IoT, XR/motion capture, and computational media, with a focus on embodied and agentic AI.
Heterogeneous sensors (IMUs, cameras, environmental data) fused into robust real-time perception.
Affordable motion and volumetric capture using ESP32 nodes and smartphones.
Temporal models for noisy, mixed-frequency streams such as financial or sensor data.
LLMs and perception models in installations exploring emotion, memory, and environment.
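As an illustration of the IMU fusion mentioned above, a classic starting point is a complementary filter, which blends the accelerometer's absolute but noisy gravity reference with the gyroscope's smooth but drifting angle integration. This is a minimal single-node sketch under assumed conventions, not Mesquite's actual fusion pipeline; the function name, axis layout, and `alpha` gain are illustrative.

```python
import numpy as np

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """Estimate roll/pitch by blending gyro integration with the accel gravity vector.

    gyro:  (N, 2) angular rates [rad/s] about x (roll) and y (pitch)
    accel: (N, 3) accelerometer samples [m/s^2]
    dt:    sample period [s]
    """
    angles = np.zeros((len(gyro), 2))
    roll = pitch = 0.0
    for i, (w, a) in enumerate(zip(gyro, accel)):
        # Accelerometer: absolute tilt from gravity, but noisy under motion.
        acc_roll = np.arctan2(a[1], a[2])
        acc_pitch = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        # Gyro integration: smooth but drifts; alpha weights the blend.
        roll = alpha * (roll + w[0] * dt) + (1 - alpha) * acc_roll
        pitch = alpha * (pitch + w[1] * dt) + (1 - alpha) * acc_pitch
        angles[i] = (roll, pitch)
    return angles
```

With `alpha` near 1 the gyro dominates over short timescales while the accelerometer slowly corrects drift; yaw is not observable this way and needs a magnetometer or external reference.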
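For the mixed-frequency streams mentioned above, a common preprocessing step before any temporal model is aligning every channel onto a single clock. A minimal sketch using linear interpolation; the stream names, rates, and signals here are made-up placeholders, not data from the project.

```python
import numpy as np

# Two streams sampled at different rates, e.g. a 100 Hz IMU and a 10 Hz sensor.
t_imu = np.arange(0.0, 1.0, 0.01)      # 100 Hz timestamps
imu = np.sin(2 * np.pi * t_imu)        # fast stream (placeholder signal)
t_env = np.arange(0.0, 1.0, 0.1)       # 10 Hz timestamps
env = t_env ** 2                       # slow stream (placeholder signal)

# Interpolate the slow stream onto the fast clock so a temporal model
# sees one fixed-rate, multi-channel input.
env_on_imu_clock = np.interp(t_imu, t_env, env)
features = np.stack([imu, env_on_imu_clock], axis=1)   # shape (100, 2)
```

Linear interpolation is a reasonable default for slowly varying channels; for irregular or bursty streams, forward-fill or learned encodings of sampling gaps are common alternatives.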
Four-space AI-driven installation about love, decay, and environmental change.
2024 · Artist–researcher; concept, systems, and installation
Computer vision, temporal perception models, sensor fusion & IoT, XR/motion capture, embodied AI, computational media.
Open-source mocap (Mesquite), environmental IoT (Opuntia), multi-phone 3D capture, Temporal Fusion Transformers for time-series.
Instructor of record for photography & digital media at ASU; 8+ courses taught and TA roles in gallery and studio contexts.