I’m an integrationist by nature. One of my creative strengths is in putting elements together in ways that other people might not have considered. This is true of tools and techniques as well as sounds: I get a kick out of making different bits of technology work together, combining their strengths. My latest track, Within Epsilon of Bleem, is eclectic even by my standards, so I thought it might interest people to know how it was assembled.

During the making of Bleem, my studio changed quite a bit as I subtracted several pieces of gear that I no longer use enough to justify keeping, and found new ways to integrate several other bits and pieces that I have found useful or fun to use. Most of what I’ve removed has been hardware, and most of what I’ve integrated has been software, though this wasn’t an explicit part of the plan. Software just takes up less space, so it’s easier to justify keeping it. That being the case, most of what I’m about to write concerns getting several of my favorite synthesis software packages to talk to each other.

At the center of my studio is Ableton Live. I’ve used it for the better part of ten years, and have yet to find any other sequencer that more easily allows me to sketch out simple ideas — either in audio or MIDI — and then pull the best of those ideas out into a fully-realized composition. I’m not an expert at DAWs, but I have a hard time understanding why people use the more complex, “serious” workstation software that’s out there. Before I started using Live, I had a brief go at learning Digital Performer. Although I got some good results, using it made me feel like kicking puppies more than it made me feel like writing music. Live always feels like it’s at least trying to help, so even when I stumble into some dark corner of the software I don’t recognize, it seems like it might be an exciting place to explore rather than a place I might get knifed and have my wallet stolen. Live is easy to get started with, surprisingly deep, and always inviting.

Another favorite tool of mine is Renoise. It’s an interesting beast. Renoise is a tracker, a style of music sequencer with a long and rich history. Tracker-style music editors co-evolved with the early ’90s demo scene, which means that the earliest ones were designed for efficiency, working as they did with very limited resources. Their earliest users also tended to lean more in the direction of hot-dogging hackers than professional musicians.

Given those constraints, the design sensibility is eccentric, to say the least. The main composition screen is something like a sideways piano roll crossed with a spreadsheet. Notes are traditionally entered through the computer keyboard, and appear in text form as entries in a vertically-arranged grid. There’s a conceptual learning curve when you first get started, but the upside is that, once you get your head around the basics, actually using the thing is very, very fast. In that regard, I think it’s more like learning a traditional musical instrument than a lot of more standard computer music environments. The people who really work at it achieve levels of virtuosity that I find hard to imagine reaching with a point-and-click interface. Renoise comes from that tracker tradition, but its feature set rivals that of a commercial-grade DAW: MIDI implementation, VST hosting, Rewire integration, you name it. Its timing is rock solid, and it’s very robust, too. I can’t remember ever having crashed my copy.

Renoise excels at quickly laying down complex, multi-layered percussion tracks, and this is usually where I start when I use it. Most of the faster, fiddlier bits of percussion programming in Bleem were written in Renoise. You can hear an excerpt here. I’ve written many tracks entirely in Renoise, and enjoyed doing it; but for all my praise, it does have its weak spots. I find it hard to see the large-scale structure of a composition written with a tracker, in a way that I don’t with something like Ableton Live. It’s also too easy to get ensnared endlessly fiddling with minor details instead of moving along in the composition: when I write with Renoise I tend to think in terms of creating a single pattern with layer upon layer of detail, stacking ideas vertically on top of each other instead of moving along horizontally through the composition.

So for Bleem, I tried something new: I used Ableton Live as my starting point, but tied Renoise in as a Rewire slave. By doing this, I gave myself the best of both worlds: Live as my wide-angle overview, with the ability to easily bring in parts sourced from other places, and Renoise for the lowest-level drum-programming details, where its pattern editor really shines. I liked this setup so much that I might make it the default starting point for future efforts.

Another favorite music tool of mine, and one that figures heavily in Within Epsilon of Bleem, is Metasynth. Metasynth is another strange creature, basically a consequence of asking what it would be like if you took a spectrogram and made it editable, and then pushing that idea in many different directions. It was famously used to create the “hidden face” in an Aphex Twin track, as described here. Many of the sounds in this piece started out life as recordings mutated into something new via Metasynth filtering. For example, the wobbly metallic loop that starts the track was created by processing a recording in Metasynth; it recurs, completely transformed into a series of high-pitched textures, later in the track.

I find it interesting to contrast Metasynth with Renoise, because they are so similar in some ways: both are utterly unique in their approach, doing things in a way that nothing else quite does; and both can be used to write entire compositions on their own, but each has strengths and weaknesses that I feel make them work best as players contributing within a larger context. Metasynth is very different from Renoise in other ways, though. Where Renoise is visually minimal and keyboard-intensive, Metasynth is by its nature highly visual and graphically oriented; and where Renoise’s developers are very responsive to user feedback, Metasynth’s creators seem to have an idiosyncratic vision which they follow nearly to a fault. Nowhere is this more evident than in Metasynth’s stubbornly isolationist insistence on being a stand-alone tool: no MIDI integration, no Rewire, no VST hosting. It just doesn’t fit with their way of doing things, and if you’re using Metasynth, the attitude goes, you’ve got everything you need.

As you might expect, this rubs my integrationist tendencies the wrong way. So I took the unusual step of configuring Metasynth as the default external audio editor in my copy of Ableton Live. This means that whenever I want to do something particularly weird and Metasynth-like to a sound in Live, I can easily edit it in Metasynth, pull the results back into Live, and keep going where I left off. It’s not a perfect approach, but it works for me. You can hear a lot of Metasynth-driven sounds on Bleem. A brief excerpt showing why I love Metasynth can be heard here.

The melody lines that come in around 2:48 were written in Buzz Tracker. It’s hard to say what keeps me coming back to Buzz. The version I have is old, and crashy, and I have to keep a separate Windows laptop running almost exclusively so I can run it, but it was an amazing synth program when I first discovered it, and nearly ten years later I still find new things to like whenever I start it up. Usability and workflow issues aside, there are synths in Buzz whose sound I love, and so far that’s continued to justify the hassle of maintaining a separate environment.

In the case of Buzz, integration came in the form of DJ’ing, essentially. Luckily, both Ableton Live and Buzz have very solid timing, so I was able simply to program out the Buzz sections while a looped section of Ableton Live ran in the background, record the output of my Windows laptop into a new Ableton Live audio channel, and everything stayed more or less synchronized.

The newest contributors in my studio are Numerology and Native Instruments’ Komplete plugin bundle. I bought both of them this past year, during the making of Bleem. I’m still getting comfortable using them, but together they make an appearance as the bleepy loop that sneaks its way into the track early on but only appears prominently toward the end. I integrated Numerology similarly to Renoise: by tying it in through Rewire while composing, and then mixing its output down to a separate audio track in Ableton Live. I briefly flirted with running Renoise, Numerology and Live all at the same time, including plugins. It worked, but with enough timing glitches and dropouts to convince me that I need either better hardware or to spend some time optimizing my MacBook before trying to use all three together. So it’s back to layering, which is fine. I think switching between three radically different environments is too complicated for my brain anyway.

Oh, and hardware? Yeah, I use hardware sometimes too. In particular, Bleem includes some rhythm loops I created on a Korg ER-1, and some background textures created with a Nord Micro Modular. This has gone on a bit too long, though, so I think those can be the subject of another post.