Gesture recognition

I want to recognise gestures made with generic interface devices for artistic purposes, in realtime. Is that so much to ask?

Related: synestizer, time warping, functional data analysis

To Use

BTW, you can also roll your own with any machine learning library; it’s not clear how much you actually need all the fancy time-warping tricks.

Likely bottlenecks are constructing a training data set and getting the damn thing to work in realtime. I should make some notes on that theme.
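For the roll-your-own route, a minimal sketch of the classic time-warping approach: dynamic time warping plus nearest-neighbour template matching, stdlib only. The function and data names here are my own; gestures are just lists of sample points.

```python
import math

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping between two
    gestures, each a list of (x, y) sample points, with Euclidean
    point-to-point cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best alignment cost of a[:i] against b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(gesture, templates):
    """templates: dict mapping label -> one example gesture.
    Returns the label of the nearest template under DTW."""
    return min(templates, key=lambda lbl: dtw_distance(gesture, templates[lbl]))
```

One labelled example per gesture class is enough to get started, which is exactly why the training-set bottleneck is less scary with this family of methods than with a big supervised model.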

Apropos that, Museplayer can record Open Sound Control (OSC) data.
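If you would rather capture the stream yourself, the buffering side is trivial: a sketch of a timestamped message recorder you could hang off the callback of whatever OSC library you use (no particular library assumed; names are mine), dumping to CSV as raw material for a training set.

```python
import csv
import time

class GestureRecorder:
    """Buffer timestamped (address, values) control messages, e.g. from
    an OSC callback, then dump them to CSV as training data."""

    def __init__(self):
        self.rows = []

    def on_message(self, address, *values):
        # Register this as the message handler in your OSC library of choice.
        self.rows.append((time.monotonic(), address) + values)

    def save(self, path):
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(self.rows)
```

Monotonic timestamps rather than wall-clock time, since inter-sample intervals are what matter when you later resample or warp the sequences.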

skatvg got a grant to make people’s gesturing and whooshing and beatboxing produce real soundtracks.