1 Introduction to Audio

Audio is an integral part of the video production experience, and to create professional-level content you will need to pay as much attention to sound as you do to visuals. If you doubt this, just turn off the sound next time you are watching a broadcast TV programme! All too often the emphasis is placed on the visual side of video production, but a well-recorded and well-mixed soundtrack is often the difference between an average production and one that stands head and shoulders above the rest.

This section covers all aspects of creating a professional soundtrack for video – comprising the principles of sound and its characteristics, microphone types and their applications, sound monitoring, external recording devices, and post-production treatment of the soundtrack. A professional videographer will need to understand all of these principles and applications in order to create perfect audio.

1.1 General Aims and Objectives


Before we go into too much detail, there are a few general principles of audio production that are worth keeping in mind.

1.2 The Nature of Sound Waves


To record good sound, a videographer must learn something of the nature of sound. In general, sound travels away from its source in a similar way to ripples caused by a stone thrown into a pool of water; the waves are higher at first and then fade out as they travel.

The difference is that sound spreads in three dimensions rather than across a level surface, and the waves are actually variations in air pressure, but it makes for a reasonable analogy. Of course light is also transmitted in a similar way, as electromagnetic waves, but at a very much higher frequency. One consequence of this is that light tends to be blocked by an object in its path, whereas as the wave frequency gets lower its energy can more easily pass around (or through) an object. This is partly why you will hear the bass frequencies coming through the half-open door (and the walls and floor) of a teenager's bedroom, but not the higher treble frequencies.

The speed of sound in air is approximately 1,130 feet (344 metres) per second. This is much slower than light, so at outdoor events such as a rock festival the sound may appear out of sync with the live action if you are half a field or more back from the stage. It is also what gives rise to reverberation in big halls, and colouration in smaller rooms, as the sound takes an appreciable time to bounce from one wall to another, each wave slowly diminishing as it does so.
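As a quick illustration of that sync problem, the delay can be worked out simply by dividing distance by the speed of sound. The sketch below assumes the 344 m/s figure from the text, and the 100-metre distance is a hypothetical example:

```python
# Speed of sound in air, as quoted in the text (344 m/s).
SPEED_OF_SOUND_M_S = 344.0

def sync_delay_ms(distance_m: float) -> float:
    """Milliseconds by which the sound lags behind what you see, at a given distance."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

# Half a field back from a festival stage – say 100 metres:
print(round(sync_delay_ms(100.0)))  # roughly 291 ms, nearly a third of a second
```

At 100 metres the lag is already long enough to be clearly visible against, say, a drummer's stick hitting a cymbal.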

Waves, whether generated in water or air, have both positive and negative peaks, with respect to the prevailing conditions. An extreme example of this, in the case of water, can be seen in the conditions that immediately precede the tidal wave of a tsunami hitting a beach, where the water is first drawn away to an abnormally low (negative) level before the high (positive) peak of the tidal wave.

The height or ‘amplitude’ of sound waves determines their volume or ‘loudness’. The rate at which waves pass a given point determines their ‘frequency’ or pitch – the faster the peaks pass, the higher the frequency. Frequency is measured in ‘cycles per second’ (cps), also known as hertz; the unit hertz (Hz) was adopted by the IEC in 1930 in honour of the pioneering physicist Heinrich Hertz for his work on electromagnetic waves.

The human ear can discern frequencies only within a limited spectrum, between about 20Hz and 20kHz, with the upper limit typically diminishing with age. In musical terms the range is around 10 octaves, and the ear's sensitivity varies according to the frequencies involved and the volume at which they are reproduced.

The inverse of wave frequency is the ‘period’, the time between two successive peaks. The period is proportional to the ‘wavelength’ – the distance that one cycle of a wave of a particular frequency travels through the open air during its period.

The formula is: Wavelength = Wave Speed × Period (or Wavelength = Wave Speed / Frequency), where “Wave Speed” is the speed of sound, 344 m/s.

This means that higher frequencies have shorter wavelengths and lower frequencies have longer wavelengths.

The wavelength is important because it dictates how sound waves interact with real-world environments. For example, in a room with parallel surfaces (walls and ceiling), sound waves whose half-wavelengths fit exactly into the distance between those surfaces will form standing waves and resonate, like the effect of blowing over the top of an empty bottle, introducing unnatural emphasis to those particular frequencies – a phenomenon referred to by Sound Recordists as making a room “honk”.
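The lowest of these resonances between a pair of parallel surfaces can be estimated with the standard axial-mode formula, f = Wave Speed / (2 × distance). The sketch below uses that formula; the room dimensions are purely hypothetical:

```python
# Speed of sound in air, as quoted in the text (344 m/s).
SPEED_OF_SOUND_M_S = 344.0

def axial_mode_hz(dimension_m: float) -> float:
    """Lowest standing-wave frequency between two parallel surfaces
    separated by dimension_m (half a wavelength fits the gap)."""
    return SPEED_OF_SOUND_M_S / (2.0 * dimension_m)

# A hypothetical 4 m x 3 m x 2.5 m room:
for name, d in [("length", 4.0), ("width", 3.0), ("height", 2.5)]:
    print(name, round(axial_mode_hz(d), 1), "Hz")
```

All three figures land in the low bass region, which is why small-room “honk” is usually a bass-frequency problem.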

1.3 The Inverse Distance Law


As sound waves travel through air, each doubling of distance from a sound source results in a halving of sound pressure. This is known as the Inverse Distance Law.

This loss is equal to 6 decibels (dB). A decibel is not a unit with an absolute value: it is a relative, logarithmic unit that represents the ratio between one value and another. Because our ears perceive the volume of sound in a similarly non-linear, logarithmic way, it is an ideal unit for use in an audio environment. An increase of 3dB is roughly equal to a doubling of audio power, and as humans we perceive a 3dB increase in volume in much the same way whether it is caused by an increase from 1 watt to 2 watts of power, or from 100 watts to 200 watts.
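The ratio-based nature of the decibel is easy to demonstrate. The standard conversion for a power ratio is 10 × log10(ratio), and the sketch below shows that doubling power always gives the same +3dB step regardless of the absolute wattage:

```python
import math

def power_ratio_db(p_out: float, p_in: float) -> float:
    """Decibel change for a power ratio: 10 x log10(out/in)."""
    return 10.0 * math.log10(p_out / p_in)

print(round(power_ratio_db(2.0, 1.0), 2))      # 3.01 – 1 W to 2 W
print(round(power_ratio_db(200.0, 100.0), 2))  # 3.01 – 100 W to 200 W, same step
```

Both doublings give an identical +3dB figure, matching the way we perceive them as equally sized increases in volume.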

To look at it another way, if you were to move away from a sound source (over an absorbent ground surface) from 4 feet to 8 feet, the sound level falls by 6dB; move from 8 feet to 16 feet and the level falls by another 6dB (a total of 12dB); move from 16 feet to 32 feet and the total combined reduction is 18dB, and so on.
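Because the halving applies to sound pressure, the usual conversion is 20 × log10 of the distance ratio. A short sketch, reproducing the 4/8/16/32-foot example above:

```python
import math

def distance_loss_db(d_near: float, d_far: float) -> float:
    """Level drop in dB moving from d_near to d_far from the source
    (inverse distance law on sound pressure: 20 x log10)."""
    return 20.0 * math.log10(d_far / d_near)

print(round(distance_loss_db(4, 8)))   # 6  – first doubling of distance
print(round(distance_loss_db(4, 16)))  # 12 – total at 16 feet
print(round(distance_loss_db(4, 32)))  # 18 – total at 32 feet
```

Any units work, since only the ratio of the two distances matters – which is also why the building-site example below scales the same way in yards.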

In practical terms, this rule helps to explain why microphones should be placed as close to the sound source as possible. The closer the microphone is to a given source, the greater the sound pressure it receives and the higher the signal output it generates – improving the signal-to-noise ratio and reducing the amount of colouration picked up from the surroundings.

It also explains why it is difficult to move away from unwanted background noise that is not being generated close by. If the source is 3 feet away, moving another 9 feet away (to 12 feet) reduces it by 12dB. If it is a building site 100 yards away, you will have to move 300 yards further away to achieve the same 12dB reduction.

1.4 Practical Applications


Whenever you turn up at a location to shoot, take note of the environment. Listen, and note down what is happening around you. The human brain is accustomed to tuning out distracting sounds that it is not interested in, whereas microphones record or ‘catch’ everything without distinction. It is therefore important to get your microphone(s) as close as you can to the source subject you are recording, and as far away as possible from any sound source that is producing unwelcome noise.

Remember too that bass or low frequencies carry better than high frequencies, and are more likely to pass around or through obstacles. Choose the right microphone for the setting that you are in and, if faced with audible background noise, remember that a ‘Wild Track’ (a separate recording of the ambient or background sound) recorded on location can save any number of complications in editing when you are trying to adjust your source sound in post-production, and match the background sound of one shot to another.

© 2025 IoV
