
Name: David Roif aka Rezident
Nationality: German
Occupation: Producer
Current release: Rezident's Inner Circle EP is out via Purified.
Gear Recommendations: Granulizers or weird effects can be really fun for coming up with new ideas. Paulxstretch is one that’s completely free and lets you stretch audio into infinity, basically. Other plugins worth checking out are Fragments by Arturia, Crystallizer by Soundtoys and Permut8 by Sonic Charge.
Snapchat world map: Of course I’d suggest you record your own samples, but if that’s not an option, it can be fun to sample some random stories from the Snapchat app. A lot of people don’t seem to know about this, but if you zoom out on the map, you can see stories from all around the world and sample an exotic bird, some random footsteps, or whatever else you can find. You can then send that through the audio tools mentioned above and get crazy, unique sounds that you’d never get using synthesizers.

If you enjoyed this interview with Rezident and would like to find out more about his music, visit his official website. He is also on Instagram, Facebook, and Soundcloud.



What was your first studio like?

I started making music in my bedroom when I was about 15 with a Windows PC and some obscure headphones.

Nowadays I do sessions at ‘real’ studios every now and then for collaborations or recording vocals, but most of the work still gets done out of my Berlin apartment with a laptop and headphones.

How and for what reasons has your set-up evolved over the years and what are currently some of the most important pieces of gear for you?

My studio has evolved and de-evolved many times over the years. I’ve tried different gear and software: Windows and macOS, desktop and laptop PCs, different hardware synths, controllers, plugins and DAWs in my search for a fun and intuitive workflow.

Around 2018 I had four Launchpads hooked up to my computer in my student dorm in Stuttgart. It was fun until I fried my new music laptop, which I had saved up for, by using the wrong power supply for my USB hub, haha.

Sometime later I finally switched to Mac, and when I started getting my first royalties I bought some synth clones that I made new music with.

Examples:

A bass layer in the second drop of ‘Inner Circle’ comes from my Moog Model D clone.



The bassline from ‘Behind The Scenes’ came from my Moog Model D clone and the hi-hats came from a Korg MS20 clone.



The bassline from ‘Aura’ came from my Pro-1 clone.



The bassline from ‘Echoes’ came from my Arp 2600 clone.



There are also a lot of those synth sessions in my IG highlights from the last years. Lately I haven’t been using the hardware much; I found it quite distracting, because it doesn’t really help with finishing music (arrangement, mixdown, etc.), so I sold a bunch of it and went back to doing everything on my laptop.

That’s also the most important piece of gear for me and it’s amazing how good laptops have become over the last couple of years.

Some see instruments and equipment as far less important than actual creativity, others feel they go hand in hand. What's your take on that?

Of course I rely a lot on technology - my laptop is my ‘instrument’ and I wouldn’t be able to make music without it and the software on it. I’ve set it up in a way that makes it easy for me to make music, and it does go hand in hand, from making the first sketch to finishing the final track.

That said, tech can provide a lot of inspiration and sounds to start from, but it doesn’t really help with making decisions about the arrangement, mixdown, sound selection, etc. Oftentimes it’s the little imperfections or things that are a bit odd that make up the character of a track, and it’s hard to imagine that a piece of tech or an algorithm would be able to do that right now.

A studio can be as minimal as a laptop with headphones and as expansive as a multi-room recording facility. Which studio situation do you personally prefer – and why?

At times, I’ve been to bigger studios but they come with some compromises. Sometimes there’s no space to put my laptop down because of a big mixing desk that I don’t use, or the booking system is too complex / inflexible, the studio is far away, I’m not familiar with the setup, etc.

There isn’t really a way around professional studios for things like collaborations, recording sessions, final mixdown or mastering sessions. Sometimes I book one for these purposes, but for everything else, I work from wherever I feel comfortable with a laptop and headphones.

I’ve actually found a free spot at the Funkhaus studios in Berlin that I’ll be moving into next month. It’s pretty close to my apartment and located in a nice area around some nature and lots of other musicians and friends in the same building, so I’m looking forward to that!

From traditional keyboards to microtonal ones, from re-configured instruments (like drums or guitars) to customised devices, what are your preferred controllers and interfaces? What role does the tactile element play in your production process?

I’ve settled on the Push 2 as my main controller for Ableton Live. I can record and sequence melodies or drums on it, trigger clips, edit and automate some parameters, and it has nice visual feedback with colors and track / parameter labels. Also sometimes I record percussion or instruments using a microphone or my phone. A tactile element like this is great for finding ideas and making studio sessions more fun.

Other than that, I do most of my work on my laptop, and I really enjoy using it - the trackpad, keyboard and screen are great and it definitely enhances the whole workflow.

I don’t use any separate ‘customised devices’, but there are a lot of exploration possibilities with VST plugins, signal paths inside the DAW, modulations, etc.

In the light of picking your tools, how would you describe your views on topics like originality and innovation versus perfection and timelessness in music? Are you interested in a “music of the future” or “continuing a tradition”?

Tools can have a certain history or emotion attached to them. It could be a vintage synth, or a legendary guitar pedal, or something else. It could also be a recording from my phone of a moment I like to remember, or a cool place, even if no one else will notice it.

But I’m not really interested in making ‘music of the future’ or ‘continuing a tradition’ - it’s nice to work on whatever feels right at the moment.

Most would regard recording tools like microphones and mixing desks as different in kind from instruments like keyboards, guitars, drums and samplers. Where do you stand on this?

I think playing a guitar and using a mixing desk involve quite different skillsets.

I really respect people who play traditional instruments, but personally I’ve always been more interested in using sequencers and digital tools. I like that I can put down a melody or groove from my head into the software and then move on to working on the details until it’s exactly how I want it to be, instead of focusing on playing it again and again without making mistakes.

How do you retain an element of surprise for your own work – are there technologies which are particularly useful in this regard?

Sometimes I can surprise myself by bringing an element of randomness into the production. It can be fun to see what tools do when you throw something at them that they were not designed for, or to create some surprises with modulations or randomized parameters.

Whenever a ‘happy accident’ like this happens, I look through the session to see where it came from and then sample it or decide how to re-use and place it in the project. I usually work separately on small ideas, and then find ways to combine them - that’s where the element of surprise also comes in.



On ‘Morning Gate’, the piano in the intro was the first sketch of the track, and the part afterwards came later. They don’t share the same chord progression, but somehow they still work together nicely - probably because they originated from the same idea.

On the track ‘Inner Circle’ it was the opposite - the second drop was the first sketch of the track, and the whole first part came from a later session. They share the same piano chords as a lead element, but the bassline is different.

However, I think the ‘unsurprising’ elements aren’t getting enough credit. For many years, all of my tracks were surprising, but not in a good way - elements didn't really fit in and drew too much attention to themselves. It takes some time to figure out how to create the ‘core’ of the track that gives the listener something to hold on to without drawing too much attention to itself, and then spacing out the more interesting elements throughout.

To some, the advent of AI and 'intelligent' composing tools offers potential for machines to contribute to the creative process. Do you feel as though technology can develop a form of creativity itself? Is there possibly a sense of co-authorship between yourself and your tools?

I think AI can definitely develop a form of creativity and a sense of co-authorship if or when it becomes conscious. Until then, the question is - can there be creativity without consciousness?

Most things in nature seem very creative to me without a consciousness behind them - I don’t think the maple tree in my garden understands what’s going on here, but I’m still impressed by its use of little helicopter seeds to spread throughout the world. There were millions of useless mutations of this mechanism, but only the good one survived to this day, with natural selection as the selector.

It could be similar with AI - I can see how, for every 10,000 AI-composed tracks or ideas, one of them is great. But it would need a selective process to ‘rate’ each idea and dismiss all the bad ones. Because music is tied so closely to our culture, it’s hard for me to imagine how this could be achieved without a consciousness.

But if there’s a human making these final decisions and judging the work of an AI, I don’t see how that’s different from what’s already happening today. Everyone is using pre-made software, equipment, samples, presets, loops, etc. If you open up a plugin, there are already hundreds of presets at your fingertips. And we’ve all seen the ads for MIDI chord packs and things like that.

Electronic music production will get a bit more accessible with advances in AI, and maybe companies won’t need to hire humans for generic background music. But other than that, I don’t think there will be any profound changes in the coming years.

What tools/instruments do you feel could have a deeper impact on creativity but need to still be invented or developed?

I hope there will be some advances in DAW controllers in the coming years. They should be a lot easier to customize while being dynamically mapped to the DAW, and it should be a lot easier to receive feedback like color tags, track / device names, meter levels, etc. Auto-mapping your favorite plug-ins exactly how you want them should also be easier.

The MIDI standard itself is kind of outdated, and in many applications outside of the music world it has been replaced by OSC, which has a lot of advantages for smart controller mappings. There’s great software for building custom OSC touch-screen controllers (TouchOSC, Open Stage Control), but what’s mostly missing is the script that makes the controller actually talk to and control the DAW. There isn’t any official documentation on how to write these scripts, and I hope this will become more accessible in the future.

Some interesting projects that are worth checking out:

Isotonik Studios makes a lot of custom controller and Max4Live scripts for Ableton Live.

AbletonOSC is a GitHub project by ideoforms that lets Live send and receive OSC messages.
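For anyone curious about what those OSC messages actually look like on the wire, the format is simple enough to build by hand. The sketch below (Python, standard library only) encodes a basic OSC 1.0 message and sends it over UDP; note that the /live/song/set/tempo address and port 11000 are assumptions based on AbletonOSC’s documented defaults, so check the project’s README before relying on them.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC 1.0 spec."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with float32 ('f') and int32 ('i') arguments."""
    type_tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, float):
            type_tags += "f"
            payload += struct.pack(">f", arg)  # big-endian float32
        elif isinstance(arg, int):
            type_tags += "i"
            payload += struct.pack(">i", arg)  # big-endian int32
        else:
            raise TypeError(f"unsupported argument type: {type(arg)!r}")
    return osc_pad(address.encode()) + osc_pad(type_tags.encode()) + payload

# Hypothetical usage: AbletonOSC listens on UDP port 11000 by default,
# and /live/song/set/tempo is assumed from its address namespace.
msg = osc_message("/live/song/set/tempo", 124.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 11000))
sock.close()
```

Since OSC is just UDP packets in this fixed layout, the same few lines work for TouchOSC or Open Stage Control targets as well - only the address strings change.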