Real-Time Quantisation of Gestural Midi Controllers
12th January 2020

Throughout our work at OmniMusic we very often use devices which detect the movement of whatever someone has voluntary control of – for example a finger or arm – and, simply put, convert this movement into music.
We call these devices variously ‘gestural synthesisers’, ‘gestural midi controllers’, or just ‘gestural controllers’. ‘Gestural midi controller’ is probably the most accurate term, as they are usually linked to, and control, a midi device which is the piece of technology actually creating the sounds.
One great problem we’ve encountered over the many years we’ve been using these devices is that they are very difficult to play rhythmically or ‘in time’. In the case of the SoundBeam, the trigger for the note being played is the breaking of an invisible ultrasonic beam with, for example, a hand. The player therefore can’t see or sense where any of the ‘notes’ actually are along the beam. It’s a similar problem with other gestural midi controllers, for example a proportional switch used with the Midi Creator.

It’s therefore been an aspiration of ours for a long time to address this problem somehow or other. Recently we’ve teamed up with Apollo Ensemble, the company behind the Apollo software we use a lot in our workshops, to develop something with the somewhat grandiose title of a ‘Real-Time Quantiser’.
Very simply, in music production, quantising is the process by which a piece of software corrects the errant timing of a musician’s recorded playing. In other words, if a musician has played out of time whilst recording some music, the recording software has a ‘quantising’ function which can correct any sloppy playing! However, this function is usually performed on the music after the recording has been made.
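To make this a little more concrete, here is a minimal Python sketch of offline quantising under some assumed values – the tempo, grid size and example note times are purely illustrative. Each recorded note time is simply snapped to the nearest grid line after the recording is finished:

```python
# Offline quantising sketch: snap recorded note-on times (in seconds)
# to the nearest grid line once the recording is complete.
# The tempo, grid resolution and example data are illustrative assumptions.

def quantise_offline(note_times, bpm=120, grid=0.25):
    """Snap each note time to the nearest grid line.

    grid is a fraction of a beat: 0.25 = sixteenth notes at the given tempo.
    """
    beat_length = 60.0 / bpm      # seconds per beat
    step = beat_length * grid     # seconds per grid division
    return [round(t / step) * step for t in note_times]

# Sloppily played times (seconds) and their corrected versions.
recorded = [0.03, 0.27, 0.52, 0.71, 1.04]
print(quantise_offline(recorded))  # -> [0.0, 0.25, 0.5, 0.75, 1.0]
```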
What we need is something which will re-time the playing of a gestural midi controller as it’s being played, so that it becomes possible to play it rhythmically and in-time during a rehearsal or performance.
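To give a flavour of how such a real-time quantiser might behave, here is a small Python sketch, again with an assumed tempo and grid, and a stand-in play_note() function rather than real midi output. Instead of correcting the timing after the event, each incoming trigger is simply held back until the next grid line before it is passed on to the sound source:

```python
import time
import threading

# Real-time quantising sketch: hold each incoming trigger until the next
# grid boundary, then let the note sound. Tempo, grid and play_note() are
# assumptions for illustration; a real version would forward midi messages.

BPM = 120
GRID = 0.25                          # sixteenth-note grid
STEP = (60.0 / BPM) * GRID           # seconds per grid division
START = time.monotonic()             # when the 'performance clock' began

def play_note(note):
    print(f"note {note} sounded at {time.monotonic() - START:.3f}s")

def on_trigger(note):
    """Called the instant the player breaks the beam or presses the switch."""
    elapsed = time.monotonic() - START
    delay = STEP - (elapsed % STEP)  # time left until the next grid line
    threading.Timer(delay, play_note, args=(note,)).start()

# Simulate a couple of slightly early or late gestures.
on_trigger(60)
time.sleep(0.3)
on_trigger(64)
time.sleep(1.0)                      # keep the script alive for the timers
```

One obvious consequence of this approach is that every note sounds slightly later than it was triggered – by at most one grid division – which is the trade-off for it always landing in time.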
At the moment we’re at the early stages of developing the technology. As anyone with experience of making things knows, developing new technology is always a bit of a ‘suck-it-and-see’ process! One always starts with some educated guesswork to come up with a solution to a practical problem, but experience shows that very often one has to be flexible about how that solution is changed and adapted as work progresses. Occasionally a particular approach has to be abandoned altogether and then it’s back to the drawing board!