# Controllerism

### Also gestural interfaces, and other fancy words for making things happen by waving your arms about on stage

“[…]Now. Where is it?”

“Where is what?”

“The time machine.”

“You're standing in it,” said [X].

“How… does it work?” [Y] said, trying to make it sound like a casual enquiry.

“Well, it's really terribly simple,” said [X], “it works any way you want it to. You see, the computer that runs it is a rather advanced one. In fact it is more powerful than the sum total of all the computers on this planet including – and this is the tricky part – including itself. Never really understood that bit myself, to be honest with you. But over ninety-five per cent of that power is used in simply understanding what it is you want it to do. I simply plonk my abacus down there and it understands the way I use it. I think I must have been brought up to use an abacus when I was a… well, a child, I suppose.

“[R], for instance, would probably want to use his own personal computer. If you put it down there, where the abacus is, the machine's computer would simply take charge of it and offer you lots of nice user-friendly time-travel applications complete with pull-down menus and desk accessories if you like. Except that you point to 1066 on the screen and you've got the Battle of Hastings going on outside your door, er, if that's the sort of thing you're interested in.”

[X]'s tone of voice suggested that his own interests lay in other areas.

“It's, er, really quite fun in its way,” he concluded. “Certainly better than television and a great deal easier to use than a video recorder. If I miss a programme I just pop back in time and watch it. I'm hopeless fiddling with all those buttons.” […]

“You have a time machine and you use it for… watching television?”

“Well, I wouldn't use it at all if I could get the hang of the video recorder.”

On the dark art of persuading the computer to respond intuitively to your intentions, with particular regard to music.

The input-data side of gesture recognition.

In non-musical circles they call this “physical computing”, or “natural user interfaces”, or “tangible computing”, depending upon whom they are pitching to for funding this month.

• I just designed an interesting digital instrument with a bunch of free control parameters.

• I have an interface with a different (usually much smaller) number of control parameters.

• There is no obvious “best”, or even immediately intuitive, mapping from one to the other.

How do I plug these into each other in an intelligible, expressive way so as to perform using them?

This question is broad, vague, and comes up all the time.

Ideas I would like to explore:

• Interpolating between interesting parameters using arbitrary regression. Rebecca Fiebrink's Wekinator does this using simple neural networks.

• Constructing basis vectors in some clever way, e.g. sparse basis dictionaries

• constructing quasi-physical models that explore parameter space in some smart, intuitive way, e.g. swarm systems, Hamiltonian models

• doing basic filtering of generic UI signals

• leaky integration

• differentiation (smoothed)

• gating

• thresholding and Schmitt-triggering

• constructing random IIR convolutional filters and harnessing for control. How do you select the best ones, though? What is the right objective function?
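
As a sketch of a few of the filtering primitives above (leaky integration plus thresholding with hysteresis), assuming the raw controller signal arrives as a stream of floats in [0, 1]; the class names and parameter values are mine, not from any particular library:

```python
# Illustrative filtering primitives for a raw controller signal stream.
# Not tuned for any particular device; tweak leak and thresholds to taste.

class LeakyIntegrator:
    """Smooths a jittery control signal; leak in (0, 1), higher = smoother."""
    def __init__(self, leak=0.5):
        self.leak = leak
        self.state = 0.0

    def __call__(self, x):
        self.state = self.leak * self.state + (1.0 - self.leak) * x
        return self.state


class SchmittTrigger:
    """Thresholding with hysteresis: avoids chatter near a single threshold."""
    def __init__(self, low=0.3, high=0.7):
        self.low, self.high = low, high
        self.on = False

    def __call__(self, x):
        if self.on and x < self.low:
            self.on = False
        elif not self.on and x > self.high:
            self.on = True
        return self.on


smooth = LeakyIntegrator(leak=0.5)
gate = SchmittTrigger(low=0.3, high=0.7)
noisy = [0.0, 0.9, 1.0, 1.0, 0.65, 1.0, 0.1, 0.0, 0.35, 0.0]
events = [gate(smooth(x)) for x in noisy]
```

Tuning `leak` against the device's jitter is most of the game; the Schmitt trigger's two thresholds stop a noisy signal from machine-gunning note events the way a single threshold would.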

## Random mappings

• sparse correlation

• physical models as input

• random sparse physical models as input

• annealing/Gibbs distribution style process

• Der/Zahedi/Bertschinger/Ay-style information sensorimotor loop

## “Copula” Models

And related stuff.

Copulas are an intuitive way to relate two or more (monotonically varying?) values by their quantiles.

The most basic one is the Gaussian copula, whose parameter is essentially the correlation. For various reasons I'm not keen on it in practice; I don't have time to go into my intuitions as to why, but Gaussian tails “feel” wrong for control. Student-t, perhaps?

See copulas.
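
For concreteness, a minimal sketch of the Gaussian-copula idea: couple two control channels through their quantiles via a shared latent Gaussian. Pure standard library; the function name and parameter choices are mine:

```python
import math
import random
from statistics import NormalDist

N = NormalDist()  # standard normal: gives us cdf / inv_cdf

def couple(u1, rho, rng):
    """Given one uniform control value u1 in (0, 1), draw a second value u2
    whose dependence on u1 is a Gaussian copula with correlation rho."""
    z1 = N.inv_cdf(u1)                            # up to the latent Gaussian scale
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
    return N.cdf(z2)                              # back down to the quantile scale

rng = random.Random(42)
pairs = [(u, couple(u, 0.95, rng)) for u in (0.05, 0.5, 0.95)]
```

With `rho=1.0` the second channel tracks the first exactly; with `rho=0.0` it is independent noise; in between you get the sliding scale of dependence the copula parameterises.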

## UI design ideas

• circular sequencer

• gesture classifiers

• accelerometer harvesting for iPhone

## Protocols

### MIDI

More-or-less working since the 1980s; still the best idea, if you can live with 7-bit scalars as your lingua franca. See MIDI.
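
To make the 7-bit point concrete, here is what a control-change message actually is on the wire – three bytes, with the value capped at 127. A standard-library sketch, no real MIDI port involved:

```python
def control_change(channel, controller, value):
    """Encode a MIDI Control Change message as its three raw bytes.
    channel 0-15, controller 0-127, value 0-127 (the 7-bit ceiling)."""
    if not (0 <= value <= 127):
        raise ValueError("MIDI CC values are 7-bit: 0-127")
    status = 0xB0 | (channel & 0x0F)   # 0xB0 = Control Change, low nibble = channel
    return bytes([status, controller & 0x7F, value & 0x7F])

msg = control_change(channel=0, controller=1, value=64)  # mod wheel, halfway
```

Everything you control rides in that last byte, hence the 128-step zipper noise on any slow filter sweep (14-bit CC pairs and MPE exist to paper over exactly this).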

### OpenSoundControl

A short-lived project from the 1990s to produce a more flexible protocol than MIDI. Insinuated its way into many projects before its death, and still haunts them. Because it is more flexible than MIDI, it is sometimes discussed as if it were the apotheosis of protocols, rather than an incremental improvement on MIDI with many debilities of its own, and much narrower support.

1. Stateless protocol designed around UDP – and therefore a one-way protocol. No question-and-response here, so you always end up re-inventing TCP if you want two-way communication. Which presumably you do, or you'd be using MIDI.

2. Doesn't guarantee delivery, because of the UDP assumption. When you mention this, SuperCollider fans tell you that you *can* in fact use TCP instead. Which you can, but only if you are using SuperCollider, which rather diminishes the “universal ultimate controller protocol for everything” argument.
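
For what it's worth, the wire format itself is simple: a null-padded address string, a type-tag string, then big-endian arguments. A standard-library sketch of encoding a single-float message, which you then fire into the void as a UDP datagram (hence the one-way complaint); the function names are mine:

```python
import struct

def osc_pad(b):
    """OSC strings are null-terminated, then padded out to a 4-byte boundary."""
    b = b + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_float_message(address, value):
    """Encode a one-float OSC message, e.g. /synth/freq 440.0."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")                 # type-tag string: one float32
            + struct.pack(">f", value))      # big-endian float32 argument

packet = osc_float_message("/synth/freq", 440.0)
# Sending is then just a UDP datagram, e.g.:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, 9000))
```

No acknowledgement comes back; whether the synth heard you is between you and your network stack.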


TBD.

## Software

• libmapper bundles together UI signals and provides a discovery protocol; the core library is in C, with Python bindings and some Pure Data/Max MSP implementations.

But if you were doing that, why not use some IoT tools and benefit from greater mindshare?

• MusicBricks includes gestural controllers and syncing among its various open-source umbrella projects.

• The Autobahn project:

provides open-source implementations of The WebSocket Protocol and The Web Application Messaging Protocol (WAMP) network protocols.

WebSocket allows bidirectional real-time messaging on the Web and WAMP adds asynchronous Remote Procedure Calls and Publish & Subscribe on top of WebSocket.

WAMP is ideal for distributed, multi-client and server applications, such as multi-user database-driven business applications, sensor networks (IoT), instant messaging or MMOGs (massively multi-player online games).

It includes JavaScript and Python implementations, plus a routing infrastructure called Crossbar.

• Luis Lloret's OSMID aims to provide a lightweight, portable, easy-to-use tool to convert MIDI to OSC and OSC to MIDI.

• Mark Francombe's browser MIDI/OSC converter, MIDI MESSAGE GENERATOR.

### python

• MIDI

• Magenta's MIDI interface shows how Google does it so they can be cool.

• MIDO “is a library for working with MIDI messages and ports. It’s designed to be as straight forward and Pythonic as possible.”

• OSC

• The original, sorta-working thing: pyOSC

• C-based and easier to use: pyliblo

### javascript

• tangible.js is a resource for real-world interfaces plugging into JavaScript. They intermittently publish useful reviews.

• There are various handy GUI frameworks designed for musical control.

• OpenSoundControl bridges

• osc.js is an Open Sound Control (OSC) library for JavaScript that works in both the browser and Node.js (and, unlike many, is still maintained).

• supercollider.js does this and much more.

• OSC-JS exists, bridging websockets to OSC, but doesn't look as maintained as osc.js. Are there others?

• Yes. Legato.

legato is a small node.js library written in coffeescript, but that doesn't really matter. legato is designed to let you create beautifully simple connections between devices and software, regardless of environment.

### Lua

Reasonably comprehensive support for MIDI with decent timing in Löve2d.

### Supercollider

• mmExtensions by Martin Marier has the best-designed preset interpolation system I have seen, all so that its creator may plug a networked bath sponge into clarinet recordings.

## Interesting hardware

### Tablet computers

For iOS: TouchOSC, Lemur…

For anything paired with Ableton: Yeco.

## 3d interaction

The classic depth camera is the Kinect. A more open depth camera: Orbbec3d.

Calibration is tricky; Rulr attempts to solve this in an open-source, general way (Rulr docs). OpenKinect handles the Kinect.

TBC.

### myo

myo is a wristband sensor that measures your muscles directly using EMG. Similar: the XTH, using MMG – “which captures motion, direction and orientation sensors (integrated in a 9-DoF IMU) and muscle sound (also known as mechanomyogram or MMG)”.

### leapmotion

Infrared hand tracker. In my experience, not really stable enough for on-stage use (it needs better Kalman filtering), but gee it's small and portable.
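
By “better Kalman filtering” I mean something like the following toy one-dimensional filter over a jittery position stream; the noise parameters are invented for illustration, not taken from any Leap Motion SDK:

```python
class Kalman1D:
    """Toy scalar Kalman filter for a jittery position stream.
    q: how fast we believe the true position drifts (process noise);
    r: how noisy we believe the sensor is (measurement noise)."""
    def __init__(self, q=0.01, r=0.5):
        self.q, self.r = q, r
        self.x = 0.0   # position estimate
        self.p = 1.0   # variance of that estimate

    def update(self, z):
        self.p += self.q                    # predict: uncertainty grows
        k = self.p / (self.p + self.r)      # Kalman gain: trust in the sensor
        self.x += k * (z - self.x)          # correct toward the measurement
        self.p *= (1.0 - k)
        return self.x

kf = Kalman1D()
jittery = [0.0, 0.1, -0.1, 0.05, 2.0, 2.1, 1.9, 2.05]  # a hand jumping to ~2
smoothed = [kf.update(z) for z in jittery]
```

The trade-off is the usual one: a low `q`/high `r` filter eats the tracker's flicker but also lags deliberate gestures, which is exactly the tension that makes on-stage use hard.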

### Keith McMillan fancy controllers

e.g. the QuNexus, a multi-dimensional MIDI controller. (Ongoing project – find out how to work them in Bitwig.)

### Makeymakey

makeymakey

[Turns] everyday objects into touchpads and combine them with the internet. It's a simple Invention Kit for Beginners and Experts doing art, engineering, and everything in between

Hmm. I'm not sold on this, as it's a rather expensive way of getting one-dimensional controllers out of \$2 contact mics, and you could do a lot more with this if you were clever. Nice if you are short of time and quirk, but not short of money.

### wiimote

Wiimote should be a normal HID device, but has nasty sharp edges. So you avoid them using alternate libraries:

• wiiuse is a library written in C that connects with several Nintendo Wii remotes. It supports motion sensing, IR tracking, the nunchuk, the classic controller, the Balance Board, and the Guitar Hero 3 controller. It is single-threaded and non-blocking, making for a lightweight, clean API.

• OS X mapper Darwiinremote.

• OSX driver wjoy

• osculator is a commercial product which does this; it's pretty good.

• libcwiid seems to be Linux-happy, but it's a naked C library and apparently tricky with threading. It has a Python interface.

• nodewii wraps it for Node on Linux, though.

See synestizer.

## Accessibility

Human Instruments does good accessible interface work.