Phase: What Does It Mean In Sound?

by Joost Nusselder | Updated on:  May 26, 2022

Hi there! I love creating free content full of tips for my readers, you. I don't accept paid sponsorships; my opinion is my own, but if you find my recommendations helpful and you end up buying something you like through one of my links, I could earn a commission at no extra cost to you. Learn more

Understanding phase in sound is essential for mixing and mastering music.

A sound’s phase is determined by its timing with respect to other sounds, and it affects how the sound is perceived when multiple sounds are heard together.

This introduction will provide an overview of the concept of phase and how it can be used in sound to create different effects.

Definition of phase


In sound production and recording, phase is the timing relationship between sounds from different sources. It may also be used to describe the relationship between two waveforms at a particular point in time. When phase first comes up, we typically think of microphone placement and phasing issues; however, it matters anywhere multiple sound sources are combined in the same environment, including multitrack recording and live mixing for music performance or sound reinforcement.

Phase relationships are measured in degrees of relative timing. When two signals arrive with a timing offset between them, the result is either cancellation (attenuation) of frequencies or an overpressure (“building”) effect where frequencies are enhanced. To determine how two signals interact in this way, they can be analyzed on a graph (a frequency response curve). This type of analysis shows whether the signals combine constructively (in phase, each contributing its own level) or destructively (out of phase, creating cancellations), depending on their relative angle to each other. The term “phase” also comes up constantly when discussing multi-miking techniques, since it describes how mics interact with each other and ties into mic placement techniques such as X/Y configurations.
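
To make that interaction concrete, here’s a minimal numpy sketch (the 440 Hz test tone and 48 kHz sample rate are arbitrary choices of mine, not anything standard). It sums two equal sine waves at three different phase offsets and prints the level of the combined signal: fully in phase they build to +6 dB, at 90° they reach about +3 dB, and at 180° they cancel to effectively nothing.

```python
import numpy as np

f = 440.0                # test frequency in Hz (arbitrary for this sketch)
fs = 48_000              # sample rate in Hz (arbitrary for this sketch)
t = np.arange(fs) / fs   # one second of time values

for phase_deg in (0, 90, 180):
    a = np.sin(2 * np.pi * f * t)
    b = np.sin(2 * np.pi * f * t + np.radians(phase_deg))
    peak = np.max(np.abs(a + b))   # peak level of the combined signal
    print(f"{phase_deg:>3} deg offset -> combined peak {20 * np.log10(peak + 1e-12):+.1f} dB")
```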

Types of phase


The phase of an audio signal refers to the timing relationship between two or more signals. When two sound waves of the same frequency are in phase, their peaks and troughs occur at exactly the same place and time, so they reinforce each other.

Phase can be described in terms of degrees, with 360° representing one complete cycle of a waveform. For example, a signal offset by 180° is completely out of phase with its original form, while one offset by 90° is “half out” of phase. There are four main types of phase relationships:
-In-Phase: 0°; both signals move in the same direction at the same time
-Quarter Out-of-Phase: 45°; the signals move in the same direction but slightly out of sync
-Half Out-of-Phase: 90°; one signal is at its peak while the other is crossing zero
-Out-of-Phase: 180°; one signal moves up while the other moves down at exactly the same time, producing maximum cancellation.

Understanding how these different types of phase work helps engineers create more nuanced mixes and recordings, as they can emphasize certain sounds to create interesting sonic effects or balance levels throughout a mix.
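
Because degrees describe a fraction of one cycle, the same phase angle corresponds to different time offsets at different frequencies. A tiny illustrative helper (hypothetical, written just for this article) makes the relationship Δt = (φ/360) × (1/f) concrete:

```python
def phase_to_time_offset(phase_deg: float, freq_hz: float) -> float:
    """Time offset in seconds corresponding to a phase angle at one frequency."""
    period = 1.0 / freq_hz             # one full 360-degree cycle
    return (phase_deg / 360.0) * period

print(phase_to_time_offset(180, 1_000))  # 0.0005 s: half a millisecond flips 1 kHz
print(phase_to_time_offset(180, 100))    # 0.005 s: the same angle takes 5 ms at 100 Hz
```

This is why a single fixed delay between two mics can put some frequencies completely out of phase while barely touching others.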

How Phase Affects Sound

Phase is a concept in sound that helps determine how sound is heard. It can either add clarity and definition, or it can create mud and muddle. Understanding phase can help you create better-sounding mixes. Let’s look at how phase affects sound and why it’s important when producing audio.

Phase Cancellation


Phase cancellation occurs when sound waves interact so that the amplitude of the combined sound is reduced, in some cases disappearing completely. It happens when two (or more) sound waves of the same frequency are out of phase with each other and their amplitudes interfere destructively.

In other words, if one wave is at its peak while another is at its lowest, the result is cancellation and a loss of volume. This can be caused by two or more mics being placed too close to each other and picking up similar sounds, or by an instrument’s placement within a room – for example a guitar standing directly next to its amp with both pickups turned on.

It also happens when two speakers placed close together play the same signal but with one inverted (out of phase). In theory the result should still be audible, since not all frequencies will cancel equally at every listening position, but the level changes can be dramatic. In practice, when adding multiple speakers together you may experience some degree of cancellation depending on their exact placement – particularly when they are close together.
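
Here’s a quick sketch of the inverted-signal case in code, using broadband noise as a stand-in for real program material (the names and numbers are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(48_000)          # one second of noise at 48 kHz

def rms_db(sig: np.ndarray) -> float:
    return 20 * np.log10(np.sqrt(np.mean(sig ** 2)) + 1e-12)

print(rms_db(x + -x))                # exact polarity inversion: effectively silence
print(rms_db(x + -np.roll(x, 1)))    # inverted copy delayed one sample:
                                     # only low frequencies cancel, highs remain
```

Notice that only the exact inverted copy cancels completely; once a delay sneaks in, lows cancel while highs survive, which is exactly the partial cancellation described above.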

This effect has relevance in recording too, where awareness of it can help us improve mic placement by letting us hear exactly which frequencies are cancelled when, for example, two mics capture the same sound source from different distances or angles.

Phase Shifting


When two or more audio sources are combined (mixed), they naturally interact with one another, sometimes enhancing and other times competing with the original sound. This phenomenon is known as phase shift or phase cancellation.

Phase shifts occur when one of the signals is delayed in time, resulting in either constructive or destructive interference. Constructive interference occurs when the signals combine to amplify certain frequencies, producing a stronger overall signal. By contrast, destructive interference occurs when the two signals are out of phase, causing certain frequencies to cancel each other out and resulting in a quieter overall sound.

To avoid destructive interference, it is important to be aware of any possible time offsets between sound sources and adjust accordingly. This can be accomplished by recording the separate audio tracks at the same time, by using a mixer to send a copy of the signal from one source directly into another with minimal delay, or by introducing a slight delay into one track until the desired outcome is achieved.
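
One practical way to find and remove such a time offset is cross-correlation: slide one track against the other, find the lag where they match best, and shift by that amount. A minimal sketch with integer-sample accuracy (the function name and test setup are my own, not from any particular tool):

```python
import numpy as np

def align_to_reference(reference: np.ndarray, other: np.ndarray) -> np.ndarray:
    """Time-align `other` with `reference` via cross-correlation
    (integer-sample resolution; arrays are assumed equal length)."""
    corr = np.correlate(reference, other, mode="full")
    lag = int(np.argmax(corr)) - (len(other) - 1)
    return np.roll(other, lag)            # positive lag shifts `other` later

# Demo: a copy of the reference arriving 120 samples late
rng = np.random.default_rng(1)
ref = rng.standard_normal(10_000)
late = np.roll(ref, 120)
print(np.allclose(ref, align_to_reference(ref, late)))   # True: back in phase
```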

In addition to preventing frequencies from cancelling out, combining audio tracks also allows for some interesting effects, such as stereo imaging by panning copies of a signal left and right, or comb filtering, where a delayed copy mixed with the original carves a regular pattern of notches into the frequency spectrum. Experimentation with these subtle details can create powerful and engaging mixes that stand out in any sonic context!

Comb Filtering


Comb filtering occurs when two identical signals are mixed together with one of them slightly delayed. This cuts certain frequencies and reinforces others, producing interference patterns that are clearly audible. When looking at the frequency response, you will notice a repeating pattern of peaks and notches with a comb-like shape.

When this type of effect is applied to sound, it makes some areas sound dull and lifeless while other sections seem overly resonant. The spacing of the “comb” notches depends on the delay time between the signals used while tracking or mixing, as well as the tuning and frequency content of the recorded instruments.

The primary causes of comb filtering are phase misalignment (when one set of sounds is out of phase with another) and environmental acoustic problems such as reflections from walls, ceilings, or floors. It can affect any type of audio signal (vocals, guitar, or drums) but is particularly noticeable on vocal tracks in recording studios, where out-of-phase issues are common due to a lack of accurate monitoring. To eliminate comb filtering you must rectify phase misalignment or other environmental effects, using proper acoustic treatment and design in recording spaces as well as checking phase alignment during mixing at both the track level and the master level.
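
The notch positions follow directly from the delay: a copy delayed by τ seconds cancels at odd multiples of 1/(2τ). A small sketch of that arithmetic (the helper function is hypothetical):

```python
def comb_notch_frequencies(delay_ms: float, max_hz: float = 20_000.0) -> list[float]:
    """Frequencies fully cancelled when a signal is summed with a copy of
    itself delayed by `delay_ms` milliseconds."""
    delay_s = delay_ms / 1000.0
    notches = []
    k = 0
    while (f := (2 * k + 1) / (2 * delay_s)) <= max_hz:
        notches.append(f)       # odd multiples of half the inverse delay
        k += 1
    return notches

# A 1 ms delay (about 34 cm of extra path length in air) notches
# 500 Hz, 1500 Hz, 2500 Hz, and so on up the spectrum:
print(comb_notch_frequencies(1.0, 6_000))
```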

How to Use Phase in Recording

Phase is an important concept to understand when recording audio. It describes the relationship between two or more audio signals and how they interact with each other. It is an essential element of sound engineering as it affects the sound of a recording in a number of ways. Understanding how to use phase in recording can help you create a more professional sounding mix. Let’s discuss the basics of phase and how it affects the recording process.

Using Phase Shifting


Phase shifting is the alteration of the timing relationship between two waves. It’s a useful tool when mixing and recording because it lets you control the output level, frequency balance, and imaging in an audio production. With phase shifting you can also alter the tonal color of a sound by changing its harmonic content, which is why it is essential for achieving the recordings you want.

Phase shifting does this by offsetting the signal in time, which shifts each frequency by a different phase angle and creates a filter effect. This filter effect is controlled by adjusting the time differences between the left and right channels of a single signal. By delaying one of those channels slightly, you can create an interference pattern that has interesting effects on the frequency response and stereo imaging of a sound.

For example, if you place a mono pad (a keyboard part) in front of an acoustic guitar and send them both out to their own separate channels on your audio interface, they will naturally combine but remain completely in phase – meaning they will sum together evenly when heard in both speakers or headphones. However, if you were to invert the polarity of one channel (a 180-degree phase shift), the shared waves would cancel each other out; this can be used as a creative tool to create contrast between two instruments that clash harmonically when recorded together. In addition, frequencies that don’t serve your desired sound – or unwanted hiss – can be reduced with this technique, as long as you handle the phase relationships carefully.
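
As a sanity check on channel-delay tricks like this, you can simulate the mono fold-down: the stereo image widens, but summing the two channels back together reveals comb-filter notches. A rough numpy sketch, with noise standing in for the pad and every value an arbitrary assumption:

```python
import numpy as np

fs = 48_000
mono = np.random.default_rng(2).standard_normal(fs)   # stand-in source signal

d = int(0.0005 * fs)                                  # 0.5 ms delay on the right channel
left = mono
right = np.concatenate([np.zeros(d), mono[:-d]])

mono_sum = left + right                               # what a mono listener hears
spectrum = np.abs(np.fft.rfft(mono_sum))
freqs = np.fft.rfftfreq(len(mono_sum), 1 / fs)
band = (freqs > 200) & (freqs < 2_000)
print(f"first comb notch near {freqs[band][np.argmin(spectrum[band])]:.0f} Hz")
# expect roughly 1000 Hz: the first cancellation for a 0.5 ms offset
```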

It’s important to note that working with phase requires very delicate balance adjustments since even minor misalignments will have profound effects in terms of frequency balance and imaging on recordings – but as long as it’s done properly, it can also result in enhanced tonalities that were simply never achievable before.

Using Phase Cancellation


Phase cancellation describes the process of adding two signals together that have exactly the same frequency, amplitude and wave shape but are in opposite polarity. When signals of this nature mix together, they have the potential to cancel each other out when their amplitudes are equal. This lends itself quite nicely to recording situations as it can be used to mute and isolate sounds within a track while still allowing instruments with similar properties to sit nicely in a blend.

It is also possible to use phase cancellation creatively as an effect while recording or mixing. For example, if you combine two or more mics on one source and adjust the relative level and polarity of one mic, you can create dynamic changes in the sound by cancelling certain frequencies at certain points during playback. Depending on where you position your mics and how much opposing polarity you introduce into their signal chain, this can create anything from a wide-sounding mix to a tight, centred sound.

Phase relationships between instruments will also play an important role during recording sessions. By aligning all of your instrument tracks to each other in terms of phase/polarity, it ensures that as each element goes through its own individual reshaping process (compression, EQ), there won’t be any audible artifacts created due to unexpected cancellation between recorded elements when they mix together. Ensuring all your tracks have proper phase alignment before bouncing them down is essential if you’re looking for clean mixes with minimal EQ adjustments needed afterwards.
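
A quick way to catch an accidental polarity flip before bouncing is to correlate pairs of tracks that capture the same source; a strongly negative correlation is a red flag. A heuristic sketch (the −0.5 threshold is my own arbitrary choice):

```python
import numpy as np

def polarity_flipped(track_a: np.ndarray, track_b: np.ndarray) -> bool:
    """Heuristic: strongly negative correlation between two takes of the
    same source suggests one of them has inverted polarity."""
    a = track_a - track_a.mean()
    b = track_b - track_b.mean()
    r = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return r < -0.5          # arbitrary threshold; tune to taste

# If the check fires, the fix is a simple polarity flip: track_b = -track_b
```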

Using Comb Filtering


One of the essential applications of phase in recording is known as “comb filtering,” a type of temporal interference that can create hollow-sounding resonances between multiple tracks or microphone signals.

This effect occurs when the same sound is recorded through two or more microphones or signal paths. The delayed version of the track will be out of phase with the original, resulting in destructive interference (aka “phasing”) when the two tracks are combined. This interference causes certain frequencies to appear louder than others, creating a distinctive kind of frequency shaping and coloration in the signal.

Using comb filtering to intentionally color audio signals is common practice in recording studios. It’s often employed when an engineer needs to add a distinct tone to an instrument, vocal part, or mix element such as a reverb return. Achieving this distinctive sound requires careful manipulation of microphone and signal balance, blending delayed copies with the raw dry signal – an approach quite different from traditional equalization based on static frequency boosts and cuts on individual tracks or channels.
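
In code terms, the basic move is simply blending the dry signal with a delayed copy of itself – a feed-forward comb. A minimal sketch with arbitrary delay and mix values:

```python
import numpy as np

def comb_color(dry: np.ndarray, fs: int, delay_ms: float = 2.0,
               mix: float = 0.5) -> np.ndarray:
    """Impose comb-filter coloration by mixing in a delayed copy (feed-forward)."""
    d = int(fs * delay_ms / 1000.0)
    delayed = np.concatenate([np.zeros(d), dry[:-d]])
    return dry + mix * delayed

# A 2 ms delay puts notches at 250 Hz, 750 Hz, 1250 Hz, ...
# (partial notches here, since mix < 1)
fs = 48_000
colored = comb_color(np.random.default_rng(3).standard_normal(fs), fs)
```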

While it requires thoughtful decision making and skillful execution, this kind of equalization can help bring life and character to audio that traditional EQ often can’t provide. With a better understanding of how phase works, you’ll be well on your way toward becoming an expert ‘colouriser’!

Conclusion


Phase plays an important role in sound engineering and production. From adjusting the timing of one track in order to fit perfectly with another to making sure vocals & guitar stand out in a mix, understanding how to use it effectively can add an incredible amount of clarity, width and texture to your mixes.

In summary, phase is all about time and how your sound interacts with other sounds when their start points are offset from one another, even by less than a millisecond. It’s not always as simple as adding delay or reverb; sometimes it’s beneficial to adjust the timing of different tracks rather than just their tone or levels. This means taking into account what is going on between the speakers, too! Once you understand how phase works and make that extra effort to get it right, your tracks will start sounding great in no time.

I'm Joost Nusselder, the founder of Neaera, a content marketer, and a dad who loves trying out new equipment, with guitar at the heart of my passion. Together with my team, I've been creating in-depth blog articles since 2020 to help loyal readers with recording and guitar tips.

Check me out on Youtube where I try out all of this gear.