2016 - 2018 // website
I often get absorbed by the tool-building phase of an art project, and sometimes the tool becomes its own project.
Often while working on a Processing sketch, I’ll want to control some aspect of it more directly, to explore or test a range of variability. It’s usually easy enough to do this with a mouse or trackpad by mapping mouseX and mouseY to control the values I want to manipulate, but sometimes I want more control. And for showing projects in front of an audience, control becomes critical.
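The quick version of that looks something like this - map() rescales the mouse position into whatever range the sketch needs (the specific ranges here are just for illustration):
import processing.core.*;

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  // Horizontal position drives size, vertical position drives brightness.
  float diameter = map(mouseX, 0, width, 10, 200);
  float gray = map(mouseY, 0, height, 0, 255);
  fill(gray);
  ellipse(width/2, height/2, diameter, diameter);
}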
controlP5 is a Processing library
meant for this, but it ties you to manipulating sliders and dials with your
mouse.
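For comparison, a minimal controlP5 setup looks roughly like this - an on-screen slider, dragged with the mouse, bound by name to a sketch variable (the variable name and ranges here are placeholders):
import controlP5.*;

ControlP5 cp5;
float diameter = 50;  // controlP5 plugs the slider below into this field by name

void setup() {
  size(400, 400);
  cp5 = new ControlP5(this);
  // An on-screen slider you drag with the mouse - exactly the part
  // I wanted to move onto physical hardware.
  cp5.addSlider("diameter").setPosition(20, 20).setRange(10, 200);
}

void draw() {
  background(0);
  ellipse(width/2, height/2, diameter, diameter);
}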
When I first saw the Korg nanoKONTROL2, it seemed like a better approach. I connected it to my laptop and started a Processing sketch to read its values, using The MidiBus as an example-slash-troubleshooting instrument, and quickly realized that raw MIDI would get me about forty percent of what I wanted: the sliders and dials worked right out of the box, but that was it. The buttons didn’t quite work - I’ve forgotten most of the details, but I remember that their mode (press-to-toggle, or press-and-hold) didn’t work. I had to dive into the nanoKONTROL2’s system-exclusive messages, and its “scene”: the onboard configuration, and all the proprietary messages that configure it. And anyway, tracking all the sliders and dials was going to be tedious to organize in each sketch; I found myself wanting to instantiate some kind of Kontrol2 class and have an object for the device that organized all that stuff for me.
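For context, the raw-MIDI starting point with The MidiBus looks something like the sketch below: every control-change message lands in one callback, and it’s up to you to remember which CC number is which slider or dial. The device index and the comment about CC numbers are assumptions - check MidiBus.list() and your own device:
import themidibus.*;

MidiBus midi;

void setup() {
  size(400, 400);
  // Print the available MIDI devices so you can spot the nanoKONTROL2.
  MidiBus.list();
  // Input device 0, no output device (-1); swap in the right index from list().
  midi = new MidiBus(this, 0, -1);
}

void draw() {
  background(0);
}

// The MidiBus calls this for every control-change message. On the default
// scene, each slider and dial arrives as its own CC number, so you end up
// keeping your own table of which number means what.
void controllerChange(int channel, int number, int value) {
  println("channel " + channel + ", cc " + number + ", value " + value);
}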
Here’s an example sketch that shows the library in use. The k2 object is a Kontrol2 instance. You just instantiate it, and it finds the device from the available MIDI connections. It has methods to configure it (e.g. recordMode), and methods to read its current values (e.g. slider(2) for the second slider).
import korgnano.*;

Kontrol2 k2;

void setup() {
  k2 = new Kontrol2();

  // The buttons all default to Momentary mode - they're only "on" while you're
  // pressing them down. Let's set some to Toggle mode.
  k2.recordMode(3, ButtonMode.Toggle);
  k2.soloMode(8, ButtonMode.Toggle);
}

void draw() {
  background(
    map(k2.slider(1), 0, 127, 0, 255),
    map(k2.slider(2), 0, 127, 0, 255),
    map(k2.dial(3), 0, 127, 0, 255));

  // If the eighth solo button ("S") is pressed, toggle between stroke & fill.
  // Note: it's 1-based, not 0-based.
  if (k2.solo(8)) {
    stroke(255);
    noFill();
  }
  else {
    noStroke();
    fill(255);
  }

  // If the seventh mute button ("M") is pressed, draw a circle.
  if (k2.mute(7)) {
    ellipse(width/2, height/2, 50, 50);
  }

  // If the third record button ("R") is pressed, toggle RGB & HSB.
  if (k2.record(3)) {
    colorMode(HSB);
  }
  else {
    colorMode(RGB);
  }
}
The library is a simple, thin layer between your code and the device - it doesn’t map values to your target range; it reflects the underlying MIDI reality and reports values as integers from 0..127, per the MIDI spec.
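So any mapping to a useful range happens in the sketch, as in the draw() above - for instance (the variable names here are just illustrative):
float angle = map(k2.dial(1), 0, 127, 0, TWO_PI);    // dial -> rotation angle
float alpha = map(k2.slider(3), 0, 127, 0, 255);     // slider -> opacity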
I also bought a nanoPAD and a nanoKEY. The original plan was to write an adapter for each of them, but the balance of forces didn’t work out the same in these cases. The nanoKEY is basically a piano, and plays MIDI notes; there’s not much that an intermediary could contribute. It’s the same for the nanoPAD, but even more so: my clearest idea for it would be to use it to trigger different modes for a sketch (color blend? motion blur? kinds of randomness? …color palettes?), but the MIDI part of that is trivial - the hard part is building the modes and how they interact…and that’s just not about MIDI.
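To make that concrete, the MIDI side of a nanoPAD mode-switcher would amount to something like the sketch below - a noteOn callback via The MidiBus that flips a flag per pad. The note numbers are guesses, and everything interesting would still have to happen in draw():
import themidibus.*;

MidiBus pad;
boolean blur = false;

void setup() {
  size(400, 400);
  MidiBus.list();
  pad = new MidiBus(this, 0, -1);  // use the nanoPAD's index from list()
}

void draw() {
  // ...the actual modes (blur, palettes, kinds of randomness) would go here,
  // and that's the part that isn't about MIDI at all.
}

// Each pad sends its own note number, so picking a mode is just a lookup.
void noteOn(int channel, int pitch, int velocity) {
  if (pitch == 36) {  // guessing at the pad's note number - check what yours sends
    blur = !blur;
  }
}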