The ear can hear faster than the eye can see.

Our ears can perceive and decode far more of the information in sound in a given time than our visual systems can decipher in an image. We can easily hear a melody that, when presented in visual form, is simply too rapid to follow.


This post and a companion After Effects tutorial have been under construction for some time, but were abandoned when an earlier version using some of my favourite music got copyright-flagged on YouTube. So, rather grudgingly, I have been re-doing them with some of the open-use music that YouTube supplies. The animated GIF at the top is a short excerpt of a medium-paced latin melody; see the YouTube video at the bottom of this post for the whole piece, and more, with sound.

Attempts to link visual effects to music have a long history. I remember a travelling act that would arrive in a local theatre for a week or so with a water, lights and organ setup. The gimmick was that popular show tunes and classics were rendered with all the bombast the multi-tone organs of the day could produce, while an array of fountains and water sprays rose and fell in time with the phrases of the music, and a rainbow of lights coloured the shapes created by the water in time with the faster changes.
Jean Michel Jarre was not the first.

Electronics advanced the options from turning different colours on and off to more sophisticated links between sound and light. I suspect every electronics geek has built a sound-to-light control box with the sound divided into low, mid and high bands triggering the brightness of three lamps or channels. Discos made much use of such flashing light effects, as well as adopting some of the psychedelic experiments with strobes and lasers from the progressive rock field.

Computer graphics opens up the possibilities even further. Now any aspect of the music can be mapped onto any aspect of the visual field. With that removal of constraints comes the need to confront the inherent limitations of the human viewers the effect is aimed at.

Sound events have to be spread over a longer time to enable the visual system to grasp them. A practical example is the peak meter.
The peak meter compensates for the slowness of the eye, which would otherwise miss short high peaks, by introducing a lag in its response: it reacts to a rise quickly but to a fall slowly, holding the peak long enough to be grasped by the visual system.
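As a minimal sketch (Python with NumPy, not any actual meter's implementation), this fast-attack, slow-release behaviour is just an envelope follower; the coefficient names and values here are illustrative:

```python
import numpy as np

def peak_meter(samples, attack=1.0, release=0.01):
    """Envelope follower: jumps up with the signal,
    but decays slowly so brief peaks stay visible."""
    level = 0.0
    out = []
    for s in np.abs(samples):
        if s > level:
            # fast attack: jump straight to the new peak
            level += attack * (s - level)
        else:
            # slow release: let the displayed level fall gradually
            level += release * (s - level)
        out.append(level)
    return np.array(out)
```

A one-sample spike of 1.0 would drop back almost instantly on a literal plot of the signal, but the follower holds it above 0.9 for many frames, which is what makes it readable.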

While the visual system is slow at decoding a new input, it is fast and sensitive to small changes in an existing pattern. Once the shape and movement are resolved, any changes to the shape or pace are picked up faster than would be the case if an entirely novel visual image were presented. Even so, it is usually necessary to introduce a peak lag to give the eye time to see what the audio signal is doing.

On this basis I have been working on variations of the graphic frequency plot. The default effect in AE gives a frequency plot, with options on what range of the audio to cover and in what detail. As ever there are inaccuracies. Sampling the audio to determine the frequencies present takes time, and our ears are slightly better than most generally available software and hardware setups. There is also the problem that picking out a dominant melody from the audio can be difficult. The human voice and most lead instruments have many overtones: the root note is overlaid with numerous mathematical multiples, so picking the note out from among the harmonics and chords is all but impossible.
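A small sketch of why naive peak-picking goes wrong, assuming a hypothetical note at 220 Hz whose 2nd and 3rd harmonics are louder than the fundamental, as is common for voices and lead instruments:

```python
import numpy as np

sr = 8000                       # sample rate in Hz (illustrative)
t = np.arange(sr) / sr          # one second of audio
f0 = 220.0                      # hypothetical root note (A3)

# A "note" with strong overtones: the 2nd and 3rd harmonics
# here carry more energy than the fundamental.
note = (0.3 * np.sin(2 * np.pi * f0 * t)
        + 1.0 * np.sin(2 * np.pi * 2 * f0 * t)
        + 0.8 * np.sin(2 * np.pi * 3 * f0 * t))

spectrum = np.abs(np.fft.rfft(note))
freqs = np.fft.rfftfreq(len(note), 1 / sr)

# The loudest bin is a harmonic, not the root, so simply
# taking the biggest peak reports the wrong note (440 Hz).
loudest = freqs[np.argmax(spectrum)]
```

This is only the simplest failure mode; add chords and several instruments on top and the spectrum alone no longer identifies the melody.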
There is an option to use the actual separate note data, effectively putting in the musical notation either from written sheet music or from MIDI, the standard digital music code. In the days of multitrack recording it is also possible to use the separate audio tracks to trigger different visual effects. Often the rhythm of the bass and drums can be derived from the full audio by filtering out the high frequencies, but a cleaner and more manipulable result comes from access to the separate elements.
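One way to sketch that filtering step is a simple one-pole low-pass filter, an illustrative stand-in for whatever filter an audio tool would actually apply; the `alpha` coefficient here is an assumption that sets the cutoff:

```python
import numpy as np

def low_pass(samples, alpha=0.05):
    """One-pole low-pass: keeps bass and kick-drum energy,
    attenuates the higher frequencies."""
    out = np.empty(len(samples), dtype=float)
    level = 0.0
    for i, s in enumerate(samples):
        # each output sample moves only a fraction alpha
        # toward the input, so fast wiggles are smoothed away
        level += alpha * (s - level)
        out[i] = level
    return out
```

Run a low tone and a high tone through it and the low tone passes almost untouched while the high tone is heavily attenuated, leaving an envelope that tracks the rhythm section.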

So most music visualisation is directed towards the modulation of existing cyclic or evolving patterns. Gravilocity, an app on the iPad and other tablets, is a good example. There is also a vast store of flashy VJ grafex on YouTube with patterns that modulate at different tempos for easy application to existing music.

This video shows the difficulty of seeing as fast as the melody, and my own efforts to disguise the frequency plot within an existing pattern or image.

The intention is to post an After Effects tutorial on the details of these effects shortly. Unless I get in the studio and redo the whole thing again with some of my own music!
