
Depth and space in the mix

Part 1

by Piotr Pacyna


"When some things are harder to hear and others very clearly, it gives you an idea of depth." - Mouse On Mars

There are a few things that immediately distinguish an amateur mix from a professional one. One of them is depth. Depth is the perceived distance of each instrument from the listener. Amateur mixes often have no depth at all: you can hardly tell which instruments are in the foreground and which are in the background, because all of them seem to be the same distance from the listener. Everything is flat. Professional tracks, in turn, reveal careful attention to positioning individual instruments in a virtual space: some appear close to the listener's ear, while others hide further back.

For a long time people kept telling me that there was no space in my mixes. I was like: guys, what are you talking about, I use reverbs and delays, can't you hear that?! But they were right. At the time I had serious problems understanding the difference between using reverb and creating a space. The truth is, everyone uses reverbs and delays, but only the best can make a mix sound three-dimensional. The first dimension is, of course, the panorama, the left/right spread. The second is the up/down spread, achieved by proper frequency distribution and EQ. The third dimension is depth, and that is what this text is about.

There are three main elements that help to build depth.

1. VOLUME LEVEL

The first, most obvious and pretty much self-explanatory element is the volume level of each instrument track. The way each level relates to the others lets us judge the distance of the sound source. Sound arriving from far away is necessarily weaker: as a rule of thumb, every time you double the distance from the source, the signal level drops by about 6 dB (the inverse-distance law). Conversely, the closer the sound source, the louder it appears.
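As a sketch, the 6 dB rule above can be written in a few lines of Python (a minimal model of the inverse-distance law for a point source; the function name is my own, not from any audio library):

```python
import math

def level_change_db(ref_distance_m: float, new_distance_m: float) -> float:
    """Relative level change (dB) when a point source moves from
    ref_distance_m to new_distance_m. Each doubling of distance
    drops the level by about 6 dB (20 * log10 of the distance ratio)."""
    return 20.0 * math.log10(ref_distance_m / new_distance_m)

# Doubling the distance (1 m -> 2 m) loses about 6 dB:
print(round(level_change_db(1.0, 2.0), 2))  # -6.02
```

In a mix you rarely compute this literally, but the rule gives a useful mental scale: pulling a fader down roughly 6 dB reads as "twice as far away".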

It is a very important issue that often gets forgotten...

2. TIME THE REFLECTED SIGNAL NEEDS TO REACH OUR EARS

The second element is the time the reflected signal takes to reach our ears. In every room we hear a direct signal and one or more reflected signals. If the gap between these two signals is less than about 25-30 ms, the first-arriving signal gives us a clue as to the direction of the sound source. If the difference grows to about 35 ms or more, our ears (and brain) recognize the second signal as a separate echo.

So, how to use it in practice?

Because pan knobs move in only one dimension, from left to right, it's easy to fall into the trap of habitually setting everything in the same dull, obvious way: drums here, piano there, the keys here... as if the music were played in a straight line from one side to the other. And we all know that is not the case. At a concert we hear depth, a certain "multidimensionality", quite brilliantly. Even without looking at the stage, it is not hard to tell that the drummer is at the back, the guitarist slightly closer on the left, and the singer in the middle, at the front.

And although the relative loudness of the instruments matters greatly for creating a realistic scene, it's the time the signal needs to reach our ears that really counts here. Our brain translates these tiny delays between elements of the mix into meaningful information about the position of each sound in space. Sound travels at roughly 343 m/s, about 34 cm per millisecond (the common rule of thumb is one foot per millisecond). So if the snare drum in our band stands 1.5 m behind the guitar amps, the snare's sound reaches us about 4-5 ms later than the signal from the amplifier.
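The distance-to-delay conversion is simple enough to sketch (the constant and function name are mine; 343 m/s assumes air at about 20 °C):

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C

def delay_ms(extra_distance_m: float) -> float:
    """Arrival-time difference in milliseconds for sound that travels
    an extra extra_distance_m (sound covers ~0.343 m per millisecond)."""
    return extra_distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

# A snare 1.5 m behind the guitar amps arrives roughly 4-5 ms later:
print(round(delay_ms(1.5), 1))  # 4.4
```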

Let's say we want the drums to sound as if they stood at the back of the stage, near the rear wall. How do we do that? When setting the reverb parameters, pay attention to the pre-delay. This parameter adds a short delay between the direct signal and the reflected signal; it separates the two, letting us control how long after the direct sound the reflections arrive. It's an extremely powerful tool for building a scene. A short pre-delay means the reflections are heard almost immediately after the direct signal, so the two hit our ears at nearly the same time, which our brain reads as a source standing close to the reflective surface (here, the rear wall) and far from us. A longer pre-delay, in turn, moves the source away from the reflective surface and closer to the listener. If we set a pre-delay of a few milliseconds on the snare, a longer one on the guitar, and a longer one still on the vocals, the differences are fairly easy to hear: vocals with a long pre-delay sound a lot closer than the snare drum.
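A rough way to pick a starting pre-delay value is from the geometry of the imaginary stage. This is a sketch under a simplified assumption (a single bounce off the wall directly behind the source; the function name is mine, not a plug-in parameter):

```python
def pre_delay_ms(source_to_wall_m: float, speed_of_sound: float = 343.0) -> float:
    """Approximate first-reflection pre-delay for a source standing
    source_to_wall_m in front of a reflective wall: the bounced path
    is longer than the direct path by roughly twice that distance."""
    return 2.0 * source_to_wall_m / speed_of_sound * 1000.0

# Drums right against the back wall -> tiny pre-delay (reads as distant);
# a vocalist standing 5 m clear of any wall -> long pre-delay (reads as close).
print(round(pre_delay_ms(0.5), 1))  # drums near the wall
print(round(pre_delay_ms(5.0), 1))  # vocalist far from the wall
```

The absolute numbers matter less than the relationship: the element meant to sit furthest back gets the shortest pre-delay.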

We can also play with pre-delay when we want a real, natural piano sound. Let's say we place the piano on the left side of our imaginary stage. When sending it to a stereo reverb, try setting a shorter pre-delay for the left channel of the reverb, because in reality the signal would bounce back from the left side of the stage (the nearer side wall) first.
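The same single-bounce sketch extends to the two reverb channels. This is an illustration under my own simplifying assumptions (one bounce per side wall, names invented for the example):

```python
def stereo_pre_delays_ms(dist_left_wall_m: float,
                         dist_right_wall_m: float,
                         speed_of_sound: float = 343.0):
    """Per-channel reverb pre-delays for a source at the given distances
    from the left and right side walls. The nearer wall's reflection,
    and thus that reverb channel, should arrive first."""
    def bounce_ms(d: float) -> float:
        return 2.0 * d / speed_of_sound * 1000.0
    return bounce_ms(dist_left_wall_m), bounce_ms(dist_right_wall_m)

# A piano 2 m from the left wall and 8 m from the right one:
left, right = stereo_pre_delays_ms(2.0, 8.0)
print(round(left, 1), round(right, 1))  # the left reflection comes back sooner
```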

[Pre-delays.mp3]

First we hear the dry signal. Then we are in a (quite big) room, close to the drummer. Then we are in the same room again, but this time the drummer stands by the back wall, far from us.

3. HIGH FREQUENCY CONTENT

The third element is the high-frequency content of the signal. Imagine walking towards an open-air concert or a pub with live music. Which frequencies do you hear most of all? The lowest, of course; air absorbs high frequencies over distance. The closer we get to the source, the less dominant the "bass" becomes. We can conclude that the fewer high frequencies we hear, the further away the sound source seems. Hence a fairly common practice for moving an instrument into the background: gently roll off the high frequencies with a low-pass filter (instead of boosting the bass).
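To illustrate the idea (not a production EQ: a crude one-pole low-pass filter, with names and parameter values of my own choosing):

```python
import math

def one_pole_lowpass(samples, cutoff_hz: float, sample_rate: int = 44100):
    """Crude one-pole low-pass filter. Rolling off the highs like this
    is what makes a source feel further away from the listener."""
    # Smoothing coefficient derived from the cutoff frequency.
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += a * (x - y)   # each output leans toward the input
        out.append(y)
    return out

# A unit impulse through a 5 kHz filter: the energy gets smeared and
# softened, which the ear reads as extra distance.
impulse = [1.0] + [0.0] * 7
print([round(v, 3) for v in one_pole_lowpass(impulse, 5000.0)])
```

In practice you would of course reach for the low-pass filter in your DAW's EQ, but the behaviour is the same: less high-frequency energy, more perceived distance.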

I often like to additionally filter the reverb and delay returns; the reflections seem more distant this way, deepening the mix even more.

Speaking of frequency bands, we should also pay attention to the range around 4-5 kHz. Boosting it "brings" the signal up close to the listener. Rolling it off has, of course, the opposite effect.

"It is totally important when producing music that certain elements are a bit more in the background, a bit disguised. It is easiest to do that with a reverb or something similar. In doing that, other elements are more in focus. When everything is dry, in the foreground, it all has the same weight. When some things are harder to hear and others very clearly, it gives you an sense of depth. And you can vary that. That is what makes producing music interesting for us. To create this depth, images or spaces repeatedly. Where, when hearing it, you think wow, that is opening up so much and the next moment it is so close again. And some times both at the same time. It is like watching… you get the feeling you need to read just the lense. What is foreground and background, what is the melody, what is the rhythm, what is noise and what is pleasant. And we try to juxtapose that over and over again." (Mouse on Mars)

THE PROBLEMATIC BAND

Almost all modern pop music has one thing in common: it is recorded at close range with directional microphones, i.e. near-field recording. This is how most instruments are recorded, even those you would never normally put your ear right next to: bass drum, toms, snare, hi-hat, piano (does anyone stick their head inside a piano to listen to music?), trumpet, vocals... Yet even the musicians playing these instruments, and certainly the listeners, normally hear them from a certain distance. That's the first thing. The second: the majority of studio and stage microphones are cardioid, directional, close-up mics. Directional mics are chosen because they pick up a lot of the wanted sound while being much less sensitive to background noise. But they have a side effect: placed very close to the source, within ten or so microphone diameters, they over-emphasise the bass frequencies. This is the proximity effect, and it means we record practically everything with the proximity effect printed onto the tracks. Literally everything.

Tracks with the proximity effect printed on them sound anything but natural. Everyone has got used to it, and even to the musicians their close-miked instruments sound okay. What does this mean? That almost all of our music carries a redundant frequency hump around 300 Hz. Some say it's closer to 250 Hz, others 400 Hz, but it's more or less there, and it's fair to conclude that almost every mix would benefit from taking a few dB off the low mids with a rather broad Q.

Rolling off these frequencies makes the track sound more "real", and it's also a common move on the mix bus. The mix cleans up immediately, loses its muddiness and, despite the lower level, actually sounds louder. The low mids often carry surprisingly little essential musical information.

And this problem affects not only music recorded live: samples and synthesized sounds are produced the way customers expect them to sound, which makes them "compatible" with the sound of close-miked recordings. So it's worth getting familiar with this issue even if you produce electronic music only.

The bottom line: if you want to move an instrument to the back, roll off the frequencies around 300 Hz. If you want to bring it closer, simply add some extra energy in the same range.
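A broad 300 Hz cut of this kind is just a peaking EQ with negative gain. As a sketch, here are the standard biquad coefficients from Robert Bristow-Johnson's Audio EQ Cookbook (the function names and the -3 dB / Q 0.7 starting values are my own choices, not the article's):

```python
import math

def peaking_eq_coeffs(f0_hz: float, gain_db: float, q: float = 0.7,
                      fs: int = 44100):
    """Biquad peaking-EQ coefficients (RBJ Audio EQ Cookbook).
    A broad (low-Q) cut around 300 Hz pushes a track back; the same
    filter with positive gain pulls it forward. Returns (b, a),
    normalized so a[0] == 1."""
    A = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * math.pi * f0_hz / fs
    alpha = math.sin(w0) / (2.0 * q)
    b = [1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A]
    a = [1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A]
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

def process(samples, b, a):
    """Run samples through the biquad (direct form I)."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x1, x2, y1, y2 = x, x1, y, y1
        out.append(y)
    return out

# A gentle, broad -3 dB dip at 300 Hz ("move it back"):
b, a = peaking_eq_coeffs(300.0, -3.0, q=0.7)
```

Any DAW EQ does the same thing internally; the point of the sketch is only that "a few dB off with a broad Q" is one small, well-defined filter.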

Continue to part 2 of this article -->

About the author: Piotr "JazzCat" Pacyna is a Poland-based producer who specializes in video game sound effects and music. He has scored a number of Java games for mobile phones and, most recently, for the iPhone/iPad platform. You can license some of his tracks here.


Copyright notice: This article and all other text on this web site is under Copyright to Shockwave-Sound.com. This text may not be copied, re-printed, re-published, in print or electronically, in whole or in part, without written permission from Shockwave-Sound.com.
