How do you make use of technology? Would you say your background as a software developer plays into this? In terms of the feedback mechanism between technology and creativity, what do humans excel at, what do machines excel at?
Academic studies confirm that the complexity of music has declined steadily since my childhood. Music is increasingly made by non-musicians, that is, by people who lack musical training and are unfamiliar with the theory and practice of instrumental music. Music technology corporations are partly to blame, because they market their products by spreading the convenient fiction that music is sound design.
More generally, the widespread adoption of music technology is a double-edged sword. On the one hand it’s had a democratizing effect: now nearly everyone can be a music producer. But on the other hand it’s led to de-skilling, often reducing musical expression to the level of a video game. For example, when ReBirth was released, people imagined they could use it to become the next Richie Hawtin, but what they were really doing was playing Richie Hawtin in a simulation.
Like all corporations, music technology corporations seek to maximize their profits, so it shouldn’t surprise us that their products are conceptually conservative and have the effect of reinforcing the musical status quo. If you use the same tools as others, you will have similar degrees of freedom and therefore unavoidably achieve comparable results.
To achieve unique results, you need unique tools and methods, and that’s why I decided long ago to create my own composition tools. My career as a professional software designer made this decision possible, but there were many daunting hurdles. For example, by 2003 my original polymeter MIDI sequencer had become hopelessly obsolete and was limiting my creativity, but at that time I lacked the skill to adapt it to a modern platform. During the fifteen years before I returned to the electronic music scene in 2018, I worked as a software consultant in the 3D printing industry, and it was during those years that I gradually acquired sufficient programming skill to modernize my sequencer.
Since the 1990s I’ve been acutely aware that collaborating with technology could not only allow me to overcome my limitations as an instrumentalist, but more importantly allow me to explore unknown musical territory that would otherwise be inaccessible or even inconceivable. Computers can perform complex calculations accurately in real time, and easily manipulate huge datasets, and these capabilities are indispensable to my artistic process. By offloading music theory computations onto machines, I free myself to approach musical expressiveness in a more abstract and intuitive way.
Above all, I value orthogonality, meaning I strive to isolate fundamental aspects of music—timbre, rhythm, pitch, melody, harmony—into independently controllable parameters, so that for example the rhythm can be changed without changing the harmony, or vice versa.
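The orthogonality idea can be illustrated with a minimal sketch (my own toy reconstruction, not the author’s actual sequencer; all names are hypothetical): rhythm and pitch are kept as independent cycling streams, so swapping the rhythm leaves the pitch content, and therefore the harmony, untouched.

```python
from itertools import cycle, islice

def render(rhythm, pitches, length):
    """Pair an onset pattern with a pitch sequence independently.
    rhythm: list of durations in beats; pitches: list of MIDI notes.
    Because the two streams cycle separately, changing one stream
    leaves the other untouched."""
    notes = []
    t = 0.0
    pitch_iter = cycle(pitches)
    for dur in islice(cycle(rhythm), length):
        notes.append((t, next(pitch_iter), dur))  # (onset, pitch, duration)
        t += dur
    return notes

# Same harmony, two rhythms: the pitch content is identical.
arpeggio = [60, 64, 67, 71]            # Cmaj7
straight = render([1.0], arpeggio, 8)
swung    = render([1.5, 0.5], arpeggio, 8)
assert [p for _, p, _ in straight] == [p for _, p, _ in swung]
```

Swapping in a different `pitches` list would change the harmony without disturbing the rhythm, which is the vice-versa case mentioned above.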
You used algorithmic music techniques and a robot choir on the album, tell me a bit about these, please. Do you see a potential for AI in exploring novel musical concepts?
I was writing two albums at once during this period. The other was my “Polymeter” album, which consists of generative solo piano and solo guitar, in a fusion of neoclassical and jazz, reminiscent of “stride” piano. I was also teaching myself atonal music theory, and that’s audible on both albums, for example on “Overshoot.”
Polymeter modulation is an outstanding tool for rule-based harmony generation. My software defines a “scale” very abstractly as any collection of pitches, and a “chord” as any subset of a scale. I’m headed away from the common scales, and towards generative atonal harmony, because it has tremendous potential for ambiguity and surprise. Atonal music often suffers from the “cat walking around on the piano” problem—too many adjacent semitones—but I have methods for avoiding that.
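These definitions are easy to model. Here is a hedged sketch (my own illustration, not the author’s software): a scale is any collection of pitch classes, a chord is any subset of one, and two loops of coprime lengths drift against each other polymetrically, so their combined harmony keeps shifting until the loops realign.

```python
# A "scale" is any collection of pitch classes; a "chord" is any subset.
scale = frozenset({0, 1, 4, 6, 9})     # an arbitrary atonal collection
chord = frozenset({0, 4, 9})
assert chord <= scale                  # chord is a subset of its scale

# Polymeter: loops of coprime lengths (3 and 4) realign only every
# lcm(3, 4) = 12 steps, so their combination keeps changing.
bass   = [0, 6, 9]        # 3-step loop of pitch classes
melody = [4, 1, 0, 9]     # 4-step loop of pitch classes
pairs = [(bass[i % 3], melody[i % 4]) for i in range(12)]
# All 12 combinations are distinct before the pattern repeats.
assert len(set(pairs)) == 12
```

The surprise the author describes comes from exactly this drift: no vertical combination recurs until the least common multiple of the loop lengths is reached.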
In 2014 I developed a program called ChordEase that makes it easier to improvise over jazz changes. You can play jazz using only the white keys, because the software automatically translates them to the needed scales in real time. It codifies a lot of knowledge about jazz, and that makes it an expert system, which is a type of AI. It’s also an example of offloading, which is a hot topic in AI. I wrote a paper about it and presented it at NIME. Guess who really hated it? Jazz musicians. I almost got beaten up in a jazz club once just for talking about ChordEase.
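The core trick of playing only white keys while hearing the right scale can be sketched roughly as follows. This is my own toy reconstruction, not ChordEase’s actual algorithm: each white key is treated as a scale degree of C major and remapped to the same degree of whatever scale the current chord change requires.

```python
# White keys are the degrees of C major. Map each white-key note to
# the same degree of the scale demanded by the current chord change.
WHITE = [0, 2, 4, 5, 7, 9, 11]        # pitch classes C D E F G A B

def remap(note, scale, tonic):
    """Translate a white-key MIDI note into the given 7-note scale.
    scale: pitch classes relative to its tonic; tonic: pitch class."""
    octave, pc = divmod(note, 12)
    degree = WHITE.index(pc)          # which scale degree was played
    return octave * 12 + tonic + scale[degree]

D_DORIAN = [0, 2, 3, 5, 7, 9, 10]
# Playing C D E on the white keys while a Dm7 chord is active
# sounds as D E F (D dorian degrees 1 through 3).
played = [60, 62, 64]
heard = [remap(n, D_DORIAN, 2) for n in played]
assert heard == [62, 64, 65]
```

A real implementation would also have to handle black keys, voice-leading across chord boundaries, and latency, which is presumably where much of the codified jazz knowledge lives.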
The robot choir draws inspiration from the chorus in classical Greek tragedy. It seems plausible that our machines will outlive us, so it makes sense for them to tell the story of our hubris and demise.
Production tools, from instruments to complex software environments, contribute to the compositional process. For “Apologize to the Future,” you eventually spent many years developing your own sequencer. How does this manifest itself in your work? Can you describe the co-authorship between yourself and your tools?
Having covered such questions above, I’m going to pivot and talk about the elephant in the room. “Apologize to the Future” relentlessly expounds the pivotal issues of the 21st century: climate change, economic inequality, intergenerational injustice, artificial intelligence, overpopulation and overconsumption, antinatalism, and human extinction. This is unprecedented in electronic music. And yet here we are blithely discussing compositional processes as if nothing were amiss. It’s as if I told you an asteroid is headed straight for Earth and you responded by asking me about my childhood influences. It feels like an example of denial, which is another major theme of the record.
My earlier work is often ironic, but on this album I felt an obligation to speak from the heart, in plain language that anyone could understand. “Apologize to the Future” preaches that procreating isn’t just selfish, it’s cruel. There’s no ethical justification for creating new humans only to abandon them on a wrecked planet. Future generations will suffer for crimes they didn’t commit, while the perpetrators abscond, smugly dead.
I have spent nearly thirty years attempting to increase public awareness of the climate crisis and its causes, through art, music, writing, street theatre, culture jamming and more. These efforts were not in vain: public awareness has increased greatly and we may be approaching a cultural tipping point. But the disaster is already upon us, and our usual routines won’t suffice. We either wise up fast, or the future won’t include us.
Collaborations can take on many forms. What role do they play in your approach and what are your preferred ways of engaging with other creatives through, for example, file sharing, jamming or just talking about ideas?
I’m a very solitary artist. My musical methods are incomprehensible to most people. I have a friend who is a gifted mathematician and he talks me through some of the thornier problems. I sometimes share unfinished pieces with close friends whose judgement I trust, but only if they’re gentle. Harsh criticism can be very destructive. I try to make everything open source. I should write a book.