Essential Physics for Music Producers: What You Need To Know


There's been a lot of talk on my Facebook account (feel free to send me a friend request!) about sound wave physics and how it relates to music. There seems to be a lot of misunderstanding on the subject, so I thought I'd do a 'back to basics' post about what you need to know, why you need to know it, and how you can apply it (and unconsciously already do) in your own music productions.

1. What is a sound wave?

Simply put, a sound wave is a series of pressure changes travelling through the air in our environment. As a sound wave travels, it has the physical effect of compressing air molecules together and then expanding them apart. The rate of this compression and expansion is called frequency. This diagram is a visual representation of a sound wave moving through the air:

Sine Wave

This is a sine wave. It's the simplest waveform of them all, and it represents a single frequency. Frequency is measured in Hertz (Hz), which counts the number of times per second that a sound wave completes a cycle (the complete journey from compression to expansion and back again, as shown in the picture above).

The number of cycles per second defines the frequency of the sound wave. So, if a sound wave completes 200 cycles per second, it is said to have a frequency of 200Hz.
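To make that concrete, here's a minimal Python sketch (my own illustration, not from any particular audio library) that generates one second of a 200Hz sine wave and then counts its cycles by looking for upward zero crossings:

```python
import math

SAMPLE_RATE = 44100  # CD-quality audio: 44,100 samples per second
FREQUENCY = 200      # our example sound wave, in Hz

# One second of a 200Hz sine wave, sampled 44,100 times.
samples = [math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

# Each upward zero crossing (the wave passing from expansion back
# into compression, i.e. from negative to positive) marks the start
# of a new cycle.
cycles = sum(1 for a, b in zip(samples, samples[1:]) if a <= 0 < b)

print(cycles)  # 200 cycles in one second -> a frequency of 200Hz
```

Change `FREQUENCY` to 400 and the count doubles too: more cycles per second simply means a higher frequency.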


2. What Sound Waves Can We Hear?

Humans can only perceive sound waves within a certain range of frequencies.

20Hz to 20,000Hz.

That's why, when you look at an EQ plugin (in Ableton Live, for example), the graphic readout shows only that range.


3. Why Is This Relevant?

Because it's the foundation of everything we do in music, whether that's playing the piano, creating a sound with a synthesizer, performing a mixdown, or mastering a track.


All Music Is Essentially Mathematics


This becomes clear when you overlay a piano keyboard onto the frequency range, as in the example below:


Here we see that each note on the keyboard actually relates to a frequency. Therefore, you are not really playing an 'A' when you play one on the keyboard (let's say Concert Pitch A, written A4 in scientific pitch notation, though some DAWs such as Ableton label it A3), you're actually triggering a frequency of 440Hz. Concert Pitch A is what most musical instruments are tuned to, ensuring that they play together harmoniously.

This is also why frequency and pitch are often used interchangeably (strictly speaking, pitch is how we perceive frequency).

So in reality, you can look upon a piano keyboard not just as a musical instrument, but as a ruler for measuring sound wave frequency! As you play further down the keyboard (to the left), the notes get lower in pitch and the frequency reduces. The sound wave completes fewer cycles per second (a lower Hz value), which ultimately means lower frequencies produce sounds with more bass.

As you play notes further up the keyboard, the frequency increases. The sound wave completes more cycles per second (a higher frequency), so it produces sounds with more "treble" (sounds such as hi-hats, for example).
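That "ruler" relationship between keys and frequencies can even be written as a simple formula. In standard twelve-tone equal temperament, each semitone step multiplies the frequency by the twelfth root of 2, starting from Concert Pitch A at 440Hz. A quick Python sketch, using the standard MIDI convention where Concert Pitch A is note number 69:

```python
def note_frequency(midi_note: int) -> float:
    """Frequency in Hz of a note in twelve-tone equal temperament.

    MIDI note 69 is Concert Pitch A (440Hz); each semitone step
    multiplies the frequency by the twelfth root of 2.
    """
    return 440.0 * 2 ** ((midi_note - 69) / 12)

print(round(note_frequency(69), 2))  # 440.0  -> Concert Pitch A
print(round(note_frequency(57), 2))  # 220.0  -> the A one octave down
print(round(note_frequency(81), 2))  # 880.0  -> the A one octave up
print(round(note_frequency(60), 2))  # 261.63 -> middle C
```

Notice the octave pattern: moving twelve keys up or down exactly doubles or halves the frequency, which is why notes an octave apart sound so naturally "the same".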

4. How Can I Apply This In My Own Productions?

You already do. By even starting to produce music, you are already applying these fundamental pieces of knowledge. In order for your productions to improve, a solid understanding of sound waves and physics is an absolute necessity.

How can you hope to control something you don't fully understand?


It was one of the first things I learned when I decided to take music production and audio engineering seriously, and it's something I use every single day in the studio without fail, no matter how simple or complex the task. It's also the key to understanding so much of our world; as Nikola Tesla famously put it, if you want to understand the universe, think in terms of energy, frequency and vibration.


So, here's how to start understanding this core knowledge in the context of your own music production:

1. Begin to notice what sounds have what ranges of frequencies

Start to analyse with your ears (try not to use your eyes too much) what your kick drum is made of. Is it mostly lows? Highs? Somewhere in the middle? Then listen very carefully to the other elements in your track: claps, snares, hi-hats and so on...

2. Start to understand where everything sits within our frequency range of 20Hz - 20,000Hz.

Once you start to see how all of the instruments in your productions are placed, both within our range of hearing, and in comparison to one another, you can really start to deeply understand how everything fits and works together. Over time, your understanding will deepen even further, as this post is the tip of a massive iceberg that smashes wide open the 'source code' of all music, especially music made on computers.
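If you want to check your ears against the numbers, you can measure where a sound's energy sits within that 20Hz - 20,000Hz range. This Python sketch builds a synthetic signal (a loud 60Hz sine standing in for a kick's low end, plus a quieter 8,000Hz sine standing in for hi-hat sizzle; purely illustrative values, not taken from any real track) and measures the energy at a few frequencies directly:

```python
import cmath
import math

SAMPLE_RATE = 44100
DURATION = 0.1  # a tenth of a second of audio
N = int(SAMPLE_RATE * DURATION)

# A toy 'track': a loud low-end component plus a quiet high one.
signal = [1.0 * math.sin(2 * math.pi * 60 * n / SAMPLE_RATE)
          + 0.3 * math.sin(2 * math.pi * 8000 * n / SAMPLE_RATE)
          for n in range(N)]

def energy_at(freq_hz, samples):
    """Magnitude of one frequency component (a single DFT bin)."""
    return abs(sum(x * cmath.exp(-2j * math.pi * freq_hz * n / SAMPLE_RATE)
                   for n, x in enumerate(samples)))

# The 60Hz bin dominates, 8,000Hz shows the quieter hats, and an
# unused frequency like 1,000Hz measures close to zero.
for freq in (60, 1000, 8000):
    print(f"{freq:>5} Hz: {energy_at(freq, signal):8.1f}")
```

In a real project you'd run an FFT over an actual recording (with numpy, for instance), but the principle is the same: every sound in your mix occupies its own region of the frequency range.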

3. Learn to trust your ears, not your eyes.

Whilst learning, using a graphical readout of the frequencies a sound possesses is extremely useful. At first, your ears won't be able to place exactly which frequencies any particular sound contains, as you haven't yet developed your critical listening skills.

However, you ultimately hear and listen with your ears, not with your eyes! I sometimes say in my 1-2-1 Training Sessions that trying to listen with your eyes is like buying a Ferrari and then having a horse pull it down the road instead of using the engine. The only way your critical listening and hearing skills will develop is through listening deeply, which means restricting, or denying yourself, visual stimulation regarding the sounds within your productions.

Learn to trust your ears, because if you don't, you'll never reach your full potential as a producer.

We will return to this subject regularly on the blog, as we have only just scratched the surface today!

So, please use the comments and let me know how this has helped you to expand your understanding. Any feedback or discussion would be brilliant as well, and if you'd like me to cover anything specific, please don't hesitate to get in touch.