Project Overview
This project presents a mid-air gesture recognition system built around a dual-IMU Arduino controller, developed for hands-free 3D box manipulation tasks. Combining a two-handed ergonomic design with a neural model trained on dual-IMU motion signals, the system supports 9 gestures for intuitive real-time control. Predictions are streamed wirelessly over BLE, and inference runs fully on-device via TinyML.
✨ Built and deployed a two-handed Arduino Nano 33 BLE gesture controller with custom neural classifier
✨ Achieved 99% on-device classification accuracy using stability filtering and confidence thresholds
✨ Reduced average task time from ~170s to ~33s across 5 users by optimizing gesture mappings and prototype comfort
✨ Integrated real-time BLE gesture streaming with Python client and custom UI for live square control
Description
The final system used two Arduino boards mounted on a cardboard cutout for ergonomic handling. Extensive prototyping was done with glove and wand variants before this version was finalized based on user feedback, comfort, and accuracy. Supported gestures included directional moves (up/down/left/right), rotations (clockwise/counterclockwise), and resizing (+/-). A trained dense neural network was deployed with TensorFlow Lite Micro for real-time on-device inference.
1. Interaction Prototyping & User Testing
- Compared 3 form factors: wand, glove, and dual-hand controller
- Final prototype mounted on cardboard for comfort and stability
- Gesture sets included static poses, flicks, and rotation gestures
- Tested across 5 users for timing, fatigue, and prediction stability
- Final task averaged ~33s vs ~170s for initial glove/wand variants
2. Model Architecture & Training
- Trained a feed-forward dense neural network (ReLU activations) on 50×12 time-series windows
- Collected balanced gesture data with capped 1-minute recordings per class
- Evaluated confusion matrices, training/validation curves, and misclassification overlap
- Accuracy peaked at 99% on-device due to clean data, axis separation, and class balancing
- Model exported to `.tflite` for microcontroller deployment (see the training/export sketch below)
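A minimal training-and-export sketch in Keras. The layer widths, optimizer, and file names here are illustrative assumptions; the project only specifies a dense ReLU network over 50×12 windows with 9 gesture classes.

```python
import tensorflow as tf

# Assumed layer widths -- the report only states a dense ReLU network
# over 50x12 windows (2 boards x 6 axes) with 9 gesture classes.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(50, 12)),                  # 50 time steps x 12 IMU channels
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(9, activation="softmax"),  # one output per gesture class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=30)

# Export to .tflite for TensorFlow Lite Micro deployment
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("gesture_model.tflite", "wb") as f:
    f.write(converter.convert())
```

Flattening the window keeps the model small enough for a microcontroller while still exposing the full gesture trace to the dense layers.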
3. On-Device Deployment & BLE Integration
- Deployed the `.tflite` model on Arduino Nano 33 BLE Rev2 using TFLite Micro
- Used `xxd` to convert the model into a C byte array
- Real-time inference pipeline: IMU → buffer → model → BLE transmission
- Bluetooth client built in Python using the `bleak` library (see the client sketch after this list)
- Enabled untethered, mobile operation powered by a power bank
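A minimal `bleak` client sketch showing how predictions could be received on the host. The device address, characteristic UUID, and payload format are placeholders, not the project's actual values.

```python
import asyncio
from bleak import BleakClient

# Placeholder address and characteristic UUID -- substitute the values
# advertised by the Arduino sketch.
DEVICE_ADDRESS = "XX:XX:XX:XX:XX:XX"
GESTURE_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"

def handle_gesture(_, data: bytearray):
    # Assumes the Arduino notifies the predicted gesture label as a short byte string.
    print("Gesture:", data.decode(errors="ignore"))

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(GESTURE_CHAR_UUID, handle_gesture)
        await asyncio.sleep(60)  # stream predictions for one minute
        await client.stop_notify(GESTURE_CHAR_UUID)

asyncio.run(main())
```

In the full system these incoming predictions drive the PyQt5 square-control UI instead of being printed to the console.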
4. Post-Processing & Stability Enhancements
- Confidence thresholding (≥ 0.95) filtered noisy predictions
- Added a 20-frame stability window to trigger sensitive gestures like `+`, `-`, CW, and CCW (see the filter sketch after this list)
- Remapped axes for rotation to reduce overlap with movement controls
- Dynamic gestures (flicks) improved intuitiveness over static poses
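A sketch of how the confidence and stability filters could work together; the function name and reset behavior are illustrative assumptions, and only the 0.95 threshold and 20-frame window come from the project (which applied the window specifically to the sensitive +, -, CW, and CCW gestures).

```python
from collections import deque

CONF_THRESHOLD = 0.95   # minimum confidence to accept a prediction frame
STABILITY_FRAMES = 20   # consecutive agreeing frames before a gesture fires

_recent = deque(maxlen=STABILITY_FRAMES)

def filter_prediction(label, confidence):
    """Return a gesture to act on, or None while predictions are unstable."""
    if confidence < CONF_THRESHOLD:
        _recent.clear()      # low-confidence frames reset the stability window
        return None
    _recent.append(label)
    if len(_recent) == STABILITY_FRAMES and len(set(_recent)) == 1:
        _recent.clear()      # fire once, then wait for the next stable run
        return label
    return None
```

For brevity the sketch applies the stability window to every gesture; gating only the sensitive ones would mirror the deployed behavior more closely.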
5. Results Summary
- Task time reduced from ~170s (glove) and ~146s (wand) to ~33s in final version
- Accuracy: 99% on-device with filtered predictions
- Error rate: Near-zero for all users after filtering
- User feedback: Positive on cardboard comfort, flick gesture intuitiveness, and UI responsiveness
Tools & Frameworks
Area | Tools / Stack Used |
---|---|
Microcontroller + Sensors | Arduino Nano 33 BLE Rev2, BMI270 IMU, Two-Board Sync |
ML Model Training | Keras, TensorFlow, NumPy, scikit-learn, Matplotlib, Pandas |
On-Device Inference | TensorFlow Lite Micro, xxd, ArduinoBLE, C++, tflite::MicroInterpreter |
Real-Time BLE Streaming | bleak (Python), Serial, UUID |
UI + Post-Processing | PyQt5, Matplotlib, Python UI Thread, Stability Filter, Confidence Filter |
Prototyping & Analysis | Training Curves, User Logs, Confusion Matrix, Cardboard Mounting, GIFs |