Advanced Live Sound 201

ALS 201 is an 8-week course that aims to take a novice sound engineer to the next level. It's very hard to pick up the "conventional wisdom" of the industry from an on-the-job training (OJT) situation. Yet everything you are going to learn here is essential to the overall job of "Sound Engineering." Below is a list and description of the classes in this course.


You don't have to be a mathematician to learn about sound and light. Yet understanding some of the basic concepts and simple rule-of-thumb calculations will help you a great deal. We will examine some of the physics and some simple math that can help.
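
To give a flavor of the kind of rule-of-thumb math this class refers to, here is a minimal sketch in Python of one such calculation, wavelength from frequency. The speed-of-sound value is an assumption (roughly 343 m/s at room temperature), not a figure taken from the course material.

def wavelength_m(frequency_hz, speed_of_sound_m_s=343.0):
    # Wavelength = speed of sound / frequency
    return speed_of_sound_m_s / frequency_hz

for f_hz in (60, 1000, 10000):
    print(f_hz, "Hz ->", round(wavelength_m(f_hz), 3), "m")
# A 60 Hz tone is roughly 5.7 m (about 19 ft) long, which is one reason low
# frequencies behave so differently from highs in a performance space.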

An overview of the techniques used to distribute power in the entertainment production industry.

Safety is of vital concern in this area, and visual identification of power devices and connectors is essential. Power distribution is paramount to the correct, noise-free operation of your equipment as well as the safety of you, your artists, and your guests. Alternating current is dangerous, and learning the safe and acceptable methods to distribute power is fundamental to any production. The vast majority of buzzing, humming, 60-cycle noise, and the like is caused by improper grounding and multiple ground paths. Electricity always seeks the path of least resistance.
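
As one small, hedged illustration of the load arithmetic behind distributing power safely (the wattage, voltage, and breaker figures below are hypothetical examples, not values from this course), current draw is simply power divided by line voltage:

def amps_drawn(total_watts, line_voltage=120.0):
    # Current (amps) = power (watts) / voltage (volts)
    return total_watts / line_voltage

print(round(amps_drawn(1800), 1), "A")
# Two 900 W amplifiers on one 120 V circuit draw about 15 A,
# which already maxes out a typical 15 A breaker.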

Feedback, or acoustic feedback as it's sometimes called, is the bane of the live sound engineer.

Because we must add huge amounts of gain to some microphones, a vocalist's for instance, in order to make them heard above the basic stage level, we risk the occurrence of feedback.

How to train your ears, and use technology, to tune acoustic feedback out of amplified audio systems.

Eliminate feedback before it can happen. Learn how professional audio engineers eliminate feedback from the audio system before any artist walks into the venue. Identifying the room modes and free-air resonance frequencies of a performance space and eliminating them before sound check is the only way to ensure that acoustic feedback cannot happen during a performance. There are straightforward methods that do not involve machinery or analysis by anything but your ears. There are also many useful devices for pinpointing troublesome frequencies, nodes and anti-nodes, and other wave phenomena that can severely affect your mix. We will examine both approaches in this class.
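
For readers who want a taste of the analysis side, here is a minimal sketch of the axial room-mode calculation for a simple rectangular room. The 12 m dimension is a hypothetical example, and real rooms add tangential and oblique modes on top of these.

SPEED_OF_SOUND = 343.0  # m/s, roughly, at room temperature

def axial_modes_hz(dimension_m, count=5):
    # Axial mode frequencies along one dimension: f_n = n * c / (2 * L)
    return [n * SPEED_OF_SOUND / (2 * dimension_m) for n in range(1, count + 1)]

print([round(f, 1) for f in axial_modes_hz(12.0)])
# A 12 m dimension piles up energy near 14, 29, 43, 57, and 71 Hz,
# exactly the kind of frequencies that ring first as gain comes up.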

Mixing Console Overview...Analog and digital consoles examined and compared. 

With the advent of digital mixing consoles, what seemed like easy operations on a simple analog console can become daunting to perform on a digital mixer. But in reality they both use the same audio toolbox. In fact, a digital mixer often gives the live sound engineer the same dynamic tools that a studio engineer or a mastering engineer would use to enrich the quality of the sounds in a recording studio environment. Despite using the same audio toolbox, the two different types of mixers, analog and digital, have inherently different approaches to signal path and gain structure. In this class we will examine the fundamental physical differences between the two without delving into subjective judgments about their quality of sound.

Gain structure, the most misunderstood concept in professional audio engineering.

This is an overview of the logic of Gain Structure to ensure proper throughput in any type of system. Gain structure is defined as the balance of signals throughout the signal chain of the entire sound system. The signal chain starts at the input stage of an audio mixing console. The connected source could be a microphone or it could be an iPod. The former requires a large amount of added gain, whereas the latter requires relatively little added gain in comparison to the microphone.

There are many "gain stages" in a mixing console. Just as there is an input amplifier for the microphone, there is an output signal fed from that amplifier to the next gain stage. Balancing the inputs and outputs of all the gain stages in a console is achieved by using metering. The best consoles allow you to meter levels at any point in the signal chain. This is crucial to what I call Relative Audio Balancing. Voices and instruments vary wildly in timbre and pitch, and making adjustments that balance individual sounds relative to one another involves adjusting the gain at different points. Dynamic processing such as limiting or compression is designed to help the engineer keep levels within the acceptable design limits of the circuitry. Input clipping, over-saturation, phase cancellation, and other phenomena that affect audio signals "in the wire" are to be avoided at all costs. In this class we will learn how to effectively manage gain structure.
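
As a hedged illustration of the balancing act described above (the level and gain figures are hypothetical, in dBu), following a source through successive gain stages is simply addition in decibels:

def through_chain(source_dbu, stage_gains_db):
    # Level after each gain stage, in dBu; gain staging is addition in dB
    levels, level = [], source_dbu
    for gain_db in stage_gains_db:
        level += gain_db
        levels.append(level)
    return levels

mic  = through_chain(-50, [54, 0, 0])  # preamp gain, channel fader at unity, master at unity
line = through_chain(-10, [14, 0, 0])  # a line-level source such as an iPod needs far less gain
print(mic, line)  # both sources arrive at +4 dBu nominal without clipping any stage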

Dynamic Processors, can't live with'em, can't live without'em.

Several of the most difficult problems for audio engineers are over-saturation of signal, bleed-through of unwanted sounds, and excessive dynamic range. This class will examine the nature of audio dynamic processors such as noise gates and compressor-limiters, and the concept of side-chain insertion, which allows an engineer to combine these line-level devices in novel ways that solve those problems. Add to that the fact that most of these "devices" are now plug-ins and work strictly in the digital realm. Still...

Audio engineers must know how to control signals and dynamic range through the use of dynamic processing devices. Audio signals can often be affected by a wide range of both internal and external influences. Controlling excessive signal level or excessive dynamic range, suppressing bleed-through into mics from adjacent sound sources, and brick-wall limiting are all achieved through the use of dynamic processing. Unfortunately, these devices can ruin your sound just as easily as they can fix it! Understanding the operation of these devices, how and where they are connected to other devices, and the proper approach to setting their adjustments are all topics we will cover in this week's class.
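
To make the idea concrete, here is a minimal sketch of a compressor's gain computer. The threshold and ratio values are hypothetical, and a real unit adds attack, release, and make-up gain on top of this.

def compressor_gain_db(input_db, threshold_db=-20.0, ratio=4.0):
    # Gain reduction applied once the signal exceeds the threshold
    if input_db <= threshold_db:
        return 0.0
    over_db = input_db - threshold_db
    return -(over_db - over_db / ratio)  # e.g. 12 dB over at 4:1 is pulled down by 9 dB

for level_db in (-30, -20, -12, -4):
    print(level_db, "dB in ->", level_db + compressor_gain_db(level_db), "dB out")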

Live audio effects such as reverb, echo, chorus, and delays are examined and explained in terms of content and technique of use.

Audio effects take your mix to the next level by punctuating certain emotional elements of performances. Audio engineers must know how to operate audio FX while mixing music. The "raw" sound of a live performance can have a great deal more emotional impact when subtle effects like echo and reverb are added. Learning to use effects for live audio mixing will take your sound to a whole new space! The psycho-acoustic phenomena associated with delays and echoes have been used for years by audio engineers to improve their mixes. Reverb, chorus, and other types of effects also play an important role in defining the emotional content of songs and performances.
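
One small, hedged example of the technique side: tying delay times to a song's tempo is simple arithmetic. The 120 BPM figure below is a hypothetical example; the conversion is the standard 60,000 milliseconds-per-minute rule.

def delay_times_ms(bpm):
    # One beat in milliseconds: 60,000 ms per minute divided by beats per minute
    quarter_ms = 60000.0 / bpm
    return {"quarter": quarter_ms, "eighth": quarter_ms / 2, "dotted eighth": quarter_ms * 0.75}

print(delay_times_ms(120))  # at 120 BPM a quarter-note echo repeats every 500 ms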

Advanced techniques for mixing and the conventional wisdom of the industry are examined.

We compare and contrast the different workflows and sensory inputs, and attempt to gain a deeper understanding of the role of the sound engineer and the methods required to achieve very high levels of satisfaction. Mixing technique can be one of the hardest things to learn in an OJT environment. The role of the sound engineer is much more complex than most people imagine. First and foremost, techniques for controlling a large number of inputs simultaneously are examined in light of my personal experience and industry standards. By definition, the term "conventional wisdom" translates to "adjustable truth." Engineers can easily fall into the trap of relying on the conventional wisdom of the industry rather than believing their own eyes and ears.