Don’t tell the brass, but I’ll let you in on a secret: as an iZotope blog contributor and mix engineer, I wasn’t too enthused when I heard about iZotope’s assistive audio technology. I thought it would be cheating to use it.
Here are five examples of recent mastering sessions where iZotope RX took on a more creative audio mastering role. Featured in this article are GRAMMY-winning mastering engineer Glenn Schick (Future, Ludacris, The Weeknd) and iZotope’s very own Education Director Jonathan Wyner (David Bowie, Howie Day), who share some key insights on how they’ve used RX in their mastering workflows.
RX is used by sound designers, film post-production teams, and dialogue editors the world over. With RX 7 offering the ability to isolate vocals from songs and automatically detect noise in samples, it’s moved into the hands of creative producers too.
Today I’m sharing six of my favourite RX 7 tricks for music production, along with audio examples to back them up. I suggest listening on a pair of headphones.
The purpose of reverb is to create a sense of ambiance, foster a feeling of depth, or take listeners to new locales. But today we depart from these more prosaic usages to focus on something a little more creative—namely, how to use reverb as a tool for sound design.
The vocal is often (nearly always) the most important element in a track. The presence that you hear in a professional vocal helps the listener understand the lyrics and connect with the song. This human element is what the listener latches onto, and it should be clearly audible.
Barring distortion, few effects are as essential to mixing guitars as reverb and delay. From reggae strokes to stadium rock epicness and blissful tape echo soundscapes, we’ve relied heavily on ambience processors to shape some of the most distinctive guitar sounds in contemporary music.
In the world of rock music, being ridiculous and flashy can get you a long way. For decades, rock has been propelled by bombastic lead singers, drummers, and guitar players. Despite the revelry often attached to the job description of “rock musician,” bass players face the distinct challenge of having to blend in. While the bass in rock music has long served as a humble anchor underneath the cacophony, don’t be fooled—there are more possibilities for rock bass lines than another unexciting eighth-note cadence on the root note.
In this article, we’ll cover six ways to use reverb in a sound design and arranging context. We’ll cover how reverb can be used as an insert effect when creating a sound, how to give return reverbs more character, and how to use reverb as a standalone transitional effect or groove element.
The best place to start with psychoacoustics is to get familiar with the limits of human hearing. You probably already know that we can hear sounds within a range of 20 Hz to 20 kHz (20,000 Hz), with the upper limit decreasing to around 16 kHz with age. Noise-induced hearing loss and tinnitus will impact the perception of sound too, and for producers with these conditions, workarounds need to be developed to achieve balanced mixes.
First, make sure something isn’t actually amiss with your gear. Many are the times when it hasn’t actually been my ears. With so many pieces of hardware and software strung together, it’s easy for something in the chain to mangle the proceedings.
Let me hear…route the audio through T-RackS ONE, Sonarworks, Ozone 8, and out through Sonarworks again. Where did all that sub-50 Hz information come from? 😉
Convolution is one of the more sophisticated processes regularly used in audio production. Its ability to accurately impart the characteristic timbres of spaces and objects on other signals is useful in both sound design and standard processing applications. With a wide range of realistic and otherworldly sonic possibilities, convolution can be a fantastic addition to any producer’s toolkit.
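To make the idea concrete, here is a minimal Python/NumPy sketch of convolution reverb. The impulse response here is synthetic decaying noise—an assumed stand-in for a measured room IR, not anything a real convolution reverb ships with:

```python
import numpy as np

sr = 44100  # sample rate in Hz

# Dry signal: a short 440 Hz sine burst (any mono signal would do)
t = np.arange(int(0.25 * sr)) / sr
dry = np.sin(2 * np.pi * 440.0 * t)

# Synthetic impulse response: exponentially decaying noise,
# a crude stand-in for a measured room IR
ir_len = int(1.0 * sr)
rng = np.random.default_rng(0)
ir = rng.standard_normal(ir_len) * np.exp(-4.0 * np.arange(ir_len) / sr)
ir /= np.abs(ir).sum()  # normalize so the wet signal cannot clip

# Convolving the dry signal with the IR imparts the "room" onto it;
# real-time convolution reverbs do the same math in the frequency domain
wet = np.convolve(dry, ir)
```

Swap in an IR recorded in a stairwell, a spring tank, or a cardboard box and the same two lines of math impart that object’s character instead.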
In this article, we’ll discuss what digital reverb, both algorithmic and convolution, technically does to an audio signal to achieve the effect of reverb. With this information in mind, we’ll also cover some considerations for handling reverb in your own projects.
Reverb can be tricky to deal with in a mix. The space that it adds can be very helpful, but sloppy reverb sounds can often become smeared over the mix, reducing clarity. Achieving the proper balance when mixing reverb will give a sense of space without becoming distracting in the mix.
In this article, we’ll cover some methods for mixing reverb. We’ll discuss EQing, ducking, timing, and retriggering reverb.
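Of those techniques, ducking is the easiest to demystify in code. The sketch below is a bare-bones envelope-follower ducker—threshold, ratio, and time constants are illustrative assumptions, not a production compressor:

```python
import numpy as np

def duck(reverb, key, sr=44100, threshold=0.1, ratio=4.0,
         attack_ms=5.0, release_ms=120.0):
    """Attenuate the `reverb` return whenever the `key` (dry) signal is loud."""
    atk = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env = 0.0
    gain = np.ones_like(reverb)
    for i, x in enumerate(np.abs(key)):
        # One-pole envelope follower: fast rise, slow fall
        coef = atk if x > env else rel
        env = coef * env + (1.0 - coef) * x
        if env > threshold:
            # Downward compression of the reverb, keyed by the dry signal
            gain[i] = (env / threshold) ** (1.0 / ratio - 1.0)
    return reverb * gain
```

The effect: reverb sits out of the way while the vocal or snare speaks, then blooms back in the gaps—exactly what a sidechain-compressed reverb return does in a DAW.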
I try to make sure I post to the blog when I add a section to the iZotope Tools binder. I file the article, and when possible, all of the sound samples and videos. Videos go in the videos folder with bookmarks attached to the article. Sound samples are stored in the article (which becomes an outline element).
In Episode 6, learn how EQ in mastering can help correct and restore the clarity and intelligibility of a track, why you should prep before applying EQ, why filter shapes matter, and how to make thoughtful subtractive and additive EQ decisions. Practice your skills at home by downloading a free trial of Ozone, iZotope’s mastering software!
Will the track receive further processing? Use an insert.
Reverb, delay, compression, modulation, distortion—these are some effects that often wind up on auxiliary channels. You send some of your track to a reverb aux and dial in as much as needed. But if the effect sits in a series chain and will receive further processing at the track level, it’s wiser to use it as an insert, even for parallel operation (as in, a chorus with a wet/dry control).
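The routing difference is easy to see in a toy signal-flow sketch. The `chorus` and `distortion` functions below are hypothetical stand-ins for any effect and any downstream track processing, not real plug-ins:

```python
import numpy as np

def chorus(x):
    """Toy stand-in for a modulation effect (hypothetical)."""
    return 0.9 * np.roll(x, 3)

def distortion(x):
    """Toy stand-in for downstream track-level processing (hypothetical)."""
    return np.tanh(2.0 * x)

x = np.linspace(-1.0, 1.0, 8)  # stand-in for the track's audio

# Insert (series): the downstream processor hears the effected signal
insert_path = distortion(chorus(x))

# Send (parallel): the dry path reaches the distortion untouched,
# and the chorus return is only mixed in afterward
send_path = distortion(x) + 0.5 * chorus(x)
```

The two paths produce different results because, on a send, anything after the dry track never processes the effected signal—which is exactly why a series effect belongs on an insert.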
Worth considering: Logic Pro X instruments use both inserts and sends to achieve their polished results. There’s a reason for that.