Historically, media texts have often incorporated other media texts within themselves, thereby redefining their meaning. Books, for example, have long used illustrations and photographs to help convey meaning, while the internet, more obviously, has swiftly become a haven for multimedia. Since its inception, film has incorporated music, and later foley and effects (each of which can be classified as a media type in its own right). These elements, mixed with the dialogue, form the soundtrack of the film: the laborious product of a sound design sometimes as elaborate as the parent media text itself.
As visual technology advanced significantly, audio technology improved in step. Many of the improvements to movie soundtracks are rooted in, and were driven by, industries that produce audio-only media texts, such as the radio and recording industries. A few key examples, spanning almost the entire twentieth century, are the invention of magnetic tape recording (1930s); the shift from monophonic to stereophonic sound (1930s), followed by later shifts towards quadraphonic sound (late 1960s) and then 5.1 surround sound (1980s); the digitisation of analogue audio (mid-20th century); and, shortly thereafter, the arrival of the first modern, commercially available synthesizer (1965). Some advances, such as the Theremin (1928), were purely scientific in nature, yet still managed to significantly influence the stylisation of film soundtracks.
This technological revolution in the world of audio evoked new styles and practices in the production of soundscapes. When synchronised image and audio became the standard in cinematic production, for example, it introduced the use of dialogue to help define the media text. One step that has vastly influenced the way a soundtrack is put together is the introduction of multi-track audio recorders. The ability to layer audio tracks, later to be mixed down to a final master stereo or surround track, revolutionised the way soundscapes were designed. Foley, effects, and music now gave moving pictures new meaning in a way that was dynamic and exciting: sounds could move across a panorama, or a given space, to match particular actions. Movement was now perceived by an audience both visually and aurally, making the viewing experience that much richer.
As synthesizer technology evolved, the stylisation of movie soundscapes shifted towards synthetic sounds, effects, and music. One of the most renowned pioneers to embrace such possibilities was none other than Sir Alfred Hitchcock. In 1945 he directed “Spellbound”, for which Dr. Samuel Hoffman was commissioned to perform an audio track on the Theremin. In 1963’s “The Birds”, Hitchcock again revolutionised soundscape stylisation by forgoing the classic accompanying score altogether in favour of source music (or diegetic sound), and by using effects created on the synthesizer’s direct predecessor, the Mixtur-Trautonium.
Synthesizers did not only influence the moving-picture industry; they were also responsible for significant cultural reform, evident in the popular music of the time. The influx of music produced by means of such technology soon filtered through to mainstream audiences and thus became an accepted and valued art form. The movie industry in turn, governed by audience research and demographics, instinctively needs to embrace youth culture, modern trends, and popular music in its quest to remain current, relevant, and profitable, particularly during such displacements in popular culture. The Hitchcock features already mentioned paved the way, and later, important works such as Stanley Kubrick’s “A Clockwork Orange” (1971) and Alan Parker’s “Midnight Express” (1978) further highlighted the departure from traditional orchestral scores.
During the mid-to-late Seventies, disco music took the world by storm. Hollywood was quick to capitalise with releases such as “Mahogany” (1975), “Car Wash” (1976), and “Saturday Night Fever” (1977). The soundtracks to these movies all carried a special meaning, fixed at a particular point in popular culture, and with it came box office success. Unsurprisingly, the theme music for all three movies generated number-one singles and renewed careers for singers such as Diana Ross, Rose Royce, and The Bee Gees, whose iconic “Stayin’ Alive” remains synonymous with “Saturday Night Fever” to this day.
Popular music (of all genres) was by now well intertwined with cinematography for the masses. But things changed when George Lucas released “Star Wars” in 1977. His vision brought about an interesting merger of pioneering visual technology with a more traditional sound design, comprising organic sound effects and a traditional orchestral score. Lucas commissioned John Williams to compose the original score and conduct the London Symphony Orchestra during the recordings, and sound designer Ben Burtt to create foley and effects by editing and processing organic sounds. Star Wars made movie history, winning six Academy Awards, including Best Original Score and Best Sound, as well as a Special Achievement Award for its sound effects editing.
The relationship between sound and moving pictures had come full circle, and Lucas changed how filmmakers and sound designers would approach audio production forever. At this point, the concepts of traditional and modern soundscapes flourished hand in hand, and after all the technological development and experimentation in stylisation, choices would once again be based on the creative needs of the project rather than on demographics. Now, on to the music itself.
It could be argued that much of the stylistic progression of the latter twentieth century passed by unnoticed by mainstream audiences. Since most audiences (to some extent) direct popular culture and trends through their purchasing power and media consumption, they may have perceived the shifts in style as a natural progression rather than as changes driven by technological development; simply put, they remain so engulfed in the experience of popular culture that they might not perceive the industry behind it. Irrespective of this, there has always been the same basic understanding of music, whether traditionally scored and orchestrated or electronically synthesized.
Even in the days of silent movies, music was adding new meaning to film. During those early screenings, musicians would play the score supplied with the film, further enhancing the sentiment portrayed by actors and filmmakers. This was true for screenings aimed at the higher classes, where a small orchestra could be afforded to provide the music, as well as for those held in lower-class establishments, where only a pianist (and at times a violinist) could be employed. Yet in both scenarios, the musicians present successfully conveyed the same intended and wide-ranging sentiments.
Audience perception has, over the years, generally relied on associations and familiarity acquired through the repeated use of signifiers, stereotypes, and other templates. It is for this very reason that composers can draw on pre-established techniques to empower musicians, in whatever numbers, to convey exactly the same emotion. These musical connotations were forged through years of composition, long before music’s affiliation with the movies, which is why they were immediately applicable.
Lalo Schifrin has been composing music for movies since 1957. His illustrious career has earned him four Grammy Awards, countless nominations (including six for an Oscar), and a star on the Hollywood Walk of Fame, among other honours. He has composed an impressive body of work, including such famous film scores as Mission: Impossible, Dirty Harry, and The Amityville Horror. In his book Music Composition for Film and Television (2011), Schifrin reveals the very essence of how composers convey sentiment through their music.
These connotations are not implied by one magical formula, but by the correct combination and application of multiple techniques. By playing a melody in a specific mode or scale, the composer can immediately draw on a lexicon of very basic but well-established associations. One can evoke happy and jovial emotions by use of the major modes (appendix 1), or affirm a sad, melancholic state of mind by switching to the minor counterpart (appendix 2). Furthermore, there are other modes that can be drawn upon to invoke a sense of hope, affirmation, or adventure (appendix 3). Intervals bring meaning to music in the same way, since their definition is intrinsically intertwined with that of modes: the nature (major or minor) of an interval or double stop is ultimately governed by the mode the piece is in.
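As an illustrative aside (not drawn from Schifrin’s text), the difference between the major and minor modes can be reduced to a pattern of semitone steps. The hypothetical Python sketch below derives each scale from its step pattern, showing that the two diverge at the third degree, the interval most commonly associated with the happy/sad distinction; for simplicity it spells flats as sharps (e.g. E-flat appears as D#).

```python
# Illustrative sketch: a major or minor scale derived from its
# whole/half-step (semitone) pattern around the chromatic circle.

MAJOR = [2, 2, 1, 2, 2, 2, 1]          # W W H W W W H
NATURAL_MINOR = [2, 1, 2, 2, 1, 2, 2]  # W H W W H W W

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def scale(root: str, pattern: list[int]) -> list[str]:
    """Walk the chromatic circle from `root`, following `pattern`."""
    idx = NOTE_NAMES.index(root)
    notes = [root]
    for step in pattern[:-1]:          # the final step returns to the octave
        idx = (idx + step) % 12
        notes.append(NOTE_NAMES[idx])
    return notes

print(scale("C", MAJOR))          # ['C', 'D', 'E', 'F', 'G', 'A', 'B']
print(scale("C", NATURAL_MINOR))  # ['C', 'D', 'D#', 'F', 'G', 'G#', 'A#']
```

Comparing the two outputs, only the scale pattern changed, yet the third degree drops from E to D# (E-flat), which is precisely the major-versus-minor colour the essay describes.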
The subtleties conveyed by the use of modes, scales, and intervals can be further enhanced by the choice of instruments (or synth sounds). The piercing characteristics of trumpets, trombones, and clarinets can easily depict crisp rhythmic effects and melodies, while warmer sounds with more swell (or amplitude), like the French horn, tuba, or bassoon, are typically used for layering. One can similarly emphasise different moods through clever use of phrasing: broken-up, unpredictable rhythms can bring an element of excitement and suspense to a scene (appendix 4), whereas repetitive, predictable rhythmic phrasing can induce a sense of inevitability (appendix 5).
Finally, parallel motion is often used to musically interpret a subject moving up or down on screen, e.g. a fast ascending scale on a xylophone as a cartoon character runs up the stairs. Contrary motion is not so much its opposite in terms of pitch as in terms of mood. Schifrin (2011) explains this clearly when he describes a very serious western bar brawl in which the antagonism is sonically heightened by the continuous (and almost incongruous) honky-tonk piano.
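For the strictly pitch-based sense of parallel and contrary motion that underlies the xylophone example (as distinct from Schifrin’s mood-based usage above), a minimal hypothetical sketch using MIDI note numbers might look like this: a parallel voice tracks the melody at a fixed interval, while a contrary voice mirrors each melodic step in the opposite direction.

```python
# Illustrative sketch: parallel vs. contrary motion over MIDI pitches.

def parallel(melody: list[int], interval: int) -> list[int]:
    """Second voice a fixed interval above (or below) each melody pitch."""
    return [pitch + interval for pitch in melody]

def contrary(melody: list[int]) -> list[int]:
    """Second voice inverting each melodic step: melody up, voice down."""
    voice = [melody[0]]
    for prev, cur in zip(melody, melody[1:]):
        voice.append(voice[-1] - (cur - prev))  # mirror the step
    return voice

ascent = [60, 62, 64, 65, 67]   # MIDI pitches C4 D4 E4 F4 G4, running upstairs
print(parallel(ascent, 4))      # a major third above: [64, 66, 68, 69, 71]
print(contrary(ascent))         # mirrored descent:    [60, 58, 56, 55, 53]
```

The parallel voice reinforces the on-screen ascent, while the contrary voice pulls against it, which is why the two devices read so differently to an audience.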
Throughout all the years of technical and cultural development in the history of movies (as the original media type to include sound and music within itself), a distinct sense of emotion has always been successfully transmitted, aurally, to the audience’s subconscious, irrespective of the technology applied, the working practices invoked, or the popular culture at the time of production. From the first silent movies to the present day, music has consistently featured in soundtracks. Irrespective of its own transformations, cultural or technological, music has, since before the movie industry, successfully portrayed such meanings as those already discussed. It is therefore logical to conclude that technology and practices have only enhanced the delivery of audio media within other texts, while the basic connotations and associations must be rooted deep within our popular culture, ever since the earliest days of musical composition.
Christian Gadd (1713 words).
Semester 1 (September – December 2013): Language and Image.
Theme music to “The NeverEnding Story”, written by Giorgio Moroder and Keith Forsey, performed by Limahl.
Theme music to “The Godfather” written by Nino Rota, conducted by Carlo Savina.
Theme music to “Raiders of the Lost Ark” written and conducted by John Williams.
Theme music to “Back to the Future” written and conducted by Alan Silvestri.
Theme music to “Eyes Wide Shut”, written and performed by György Ligeti.
Theme music to “The Exorcist”, written and performed by Mike Oldfield.
Schifrin, L. (2011) Music Composition for Film and Television. Boston: Berklee Press. Available from…
[Accessed 1st December 2013]