Real-Time Quantisation of Gestural Midi Controllers
Throughout our work at OmniMusic we very often use devices which detect the movement of whatever a person has voluntary control of – a finger or an arm, for example – and, simply put, convert this movement into music. One example of such a device is the SoundBeam.
We call these devices variously ‘gestural synthesisers’, ‘gestural midi controllers’, or just ‘gestural controllers’. ‘Gestural midi controller’ is probably the most accurate term, as these devices are usually connected to, and control, a midi device – the piece of technology which actually creates the sounds.
One great problem we’ve encountered over the many years we’ve been using these devices is that they are very difficult to play rhythmically, or ‘in time’. In the case of the SoundBeam, the trigger for a note being played is the breaking of an invisible ultrasonic beam with, for example, a hand. It can therefore be appreciated that the player can’t see or sense where any of the ‘notes’ actually are along the beam. It’s a similar problem with other gestural midi controllers, for example a proportional switch used with the Midi Creator.
It’s therefore been an aspiration of ours for a long time to address this problem somehow or other. Recently we’ve teamed up with Apollo Ensemble, the company behind the Ensemble software we use a lot in our workshops, to develop something with the somewhat grandiose title of a ‘Real-Time Quantiser’.
Very simply, in music production, quantising is the process by which a piece of software corrects the errant timing of a musician’s recorded playing. In other words, if a musician has played out of time whilst recording some music, the recording software’s ‘quantising’ function can tidy up the sloppy playing! However, this correction is usually applied to the music after the recording has been made.
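For the technically curious, the basic idea behind this after-the-fact quantising can be sketched in a few lines of Python. This is purely illustrative – the function name, the seconds-based timing, and the ‘strength’ parameter are our own assumptions, not any particular software’s implementation:

```python
def quantise(note_times, grid=0.25, strength=1.0):
    """Snap each recorded note time towards the nearest grid point.

    note_times -- list of note-on times in seconds
    grid       -- grid spacing in seconds (e.g. 0.25 s = sixteenth notes at 60 BPM)
    strength   -- 1.0 snaps fully onto the grid; 0.5 moves notes halfway there
    """
    quantised = []
    for t in note_times:
        nearest = round(t / grid) * grid   # closest grid point to this note
        quantised.append(t + (nearest - t) * strength)
    return quantised
```

A ‘strength’ of less than 1.0 is how many packages avoid making corrected playing sound mechanical: the notes are nudged towards the grid rather than snapped rigidly onto it.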
What we need is something which will re-time the playing of a gestural midi controller as it’s being played, so that it becomes possible to play it rhythmically and in-time during a rehearsal or performance.
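One simple way to re-time notes as they happen – again, just a sketch of the general idea, not a description of how our quantiser is actually built – is to hold each incoming note back until the next grid point, so a note can never sound ‘early’ or ‘late’, only on the beat:

```python
import math

def next_grid_time(event_time, grid=0.5):
    """Return the next grid point at or after event_time (both in seconds).

    A note played just before a beat is delayed onto that beat; a note
    played exactly on a grid point sounds immediately.
    """
    return math.ceil(event_time / grid) * grid
```

The trade-off is a small amount of latency: the player hears their note a fraction of a second after making the gesture, in exchange for it landing in time.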
At the moment we’re at the early stages of developing the technology. As anyone with experience of making things knows, developing new technology is always a bit of a ‘suck-it-and-see’ process! One always starts with a bit of educated guesswork to come up with a solution to a practical problem, but experience shows that one very often has to be flexible about how that solution is changed and adapted as work progresses. Occasionally a particular approach has to be abandoned altogether and then it’s back to the drawing board!
Update: August 2021
After agreeing a design brief for the quantiser block with Apollo Ensemble, we’re very pleased to have received the new Quantiser block and started testing it, with a regular rhythm from a midi sequencer acting as the trigger input and a gestural midi controller input from a SoundBeam. So far the results are very encouraging, with the SoundBeam solo – for the first time ever! – being ‘in time’ with a walking bass line. Best of all, the SoundBeam doesn’t follow the bass line ‘slavishly’; it has a very natural feel whilst still reflecting the bass line’s rhythm.
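The essence of this two-input arrangement can be pictured as follows. This is purely our own illustration of the concept, not Apollo Ensemble’s actual design: the sequencer’s trigger input decides when a note sounds, while the gestural controller decides which note sounds – the most recent gesture is held and released on the next trigger pulse:

```python
class TriggeredQuantiser:
    """Illustrative model: gesture notes are held until the next trigger pulse."""

    def __init__(self):
        self.pending = None  # most recent note number from the gestural controller

    def gesture_in(self, note):
        # Remember the note the player has reached, but don't sound it yet.
        self.pending = note

    def trigger_in(self):
        # Called on each pulse from the sequencer's regular rhythm;
        # returns the note to play now, or None if no gesture arrived.
        note, self.pending = self.pending, None
        return note
```

Because the trigger pulses come from the same sequencer as the bass line, whatever the player does with the beam comes out locked to that rhythm.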
The next step is to use it in a workshop situation and see how it works there. Just to reiterate – we believe this is a significant development in the use of gestural midi controllers in Assistive Music Technology. For the first time that we’re aware of, this helps to overcome the very great difficulty in playing any gestural midi controller with a sense or feel of rhythm.