Run gesture recognition, activity tracking, and motion classification on a sub-€1 microcontroller. No cloud, no phone dependency, battery life measured in months.
Streaming IMU data over BLE to a phone for classification kills battery life. Users expect days of autonomy, not hours.
Most wearable AI requires a connected smartphone for inference. Without the phone, the device is a dumb sensor.
Running CNN-based gesture models requires expensive MCUs with large flash and SRAM. This inflates BOM cost at scale.
Luviner's Neural Networks are 6x smaller and 30x more energy-efficient than traditional CNNs. A full gesture recognition model fits in ~7 KB and runs on a Cortex-M0 — the cheapest ARM core available.
Classify hand and wrist gestures from accelerometer/gyroscope data. Swipe, tap, rotate, flick — all recognized on-device in real time.
Walk, run, cycle, stairs, rest — continuous activity classification from IMU data without cloud processing or phone connectivity.
Sleep stage classification and movement detection from wrist-worn accelerometer. All processing on-chip, no data leaves the device.
A ~7 KB model runs on a Cortex-M0 with 32 KB of flash (under €1); a comparable CNN needs a Cortex-M4 with 256 KB (€3–5). At 10K units, that's roughly €30K saved on the BOM.
Your gesture recognition algorithm is your competitive advantage. UID binding ensures competitors cannot extract and clone your model.
Upload IMU data as CSV, click Train, download binary. Your firmware team handles the entire ML pipeline without a data scientist.
Capture accelerometer/gyroscope data for each gesture class. Export as CSV with labels.
Upload to Luviner. Edge V3 trains an ultra-compact neural network and quantizes for MCU automatically.
Register your device UIDs. The compiled model only runs on your authorized hardware.
Download the compiled C library, link into your firmware, and deploy. Zero dependencies.
Start free with the Evaluation plan. No credit card required.