To develop my experiments with JavaScript animation from Project 2, I have extended the interaction so that it is based not just on an audio input but on the frequencies within that input. This gives a more nuanced reaction and greater scope for animation interaction: the frequency spectrum can be broken up, and individual elements isolated so that they only react to a certain range. Initially (with the help of James Field) I have set this up so that three mouths react by rotating to a degree driven by the low, mid-range and high frequencies of a loaded-in audio track.
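As a rough sketch of how this band-splitting can work with the Web Audio API: an AnalyserNode gives an array of frequency bins, which can be averaged in low, mid and high slices and mapped to rotation. The band boundaries, smoothing and rotation scaling below are illustrative rather than my exact values, and `lowMouth`, `midMouth` and `highMouth` stand in for the loaded mouth objects (assumed here to have a three.js-style `rotation` property).

```javascript
// Analyse a playing <audio> element and split its spectrum into three bands.
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;                 // 128 frequency bins
analyser.smoothingTimeConstant = 0.8;

const audioEl = document.querySelector('audio');
const source = audioCtx.createMediaElementSource(audioEl);
source.connect(analyser);
analyser.connect(audioCtx.destination); // still hear the track

const bins = new Uint8Array(analyser.frequencyBinCount);

// Average a slice of bins and normalise to 0–1.
function bandLevel(data, from, to) {
  let sum = 0;
  for (let i = from; i < to; i++) sum += data[i];
  return sum / (to - from) / 255;
}

function animate() {
  requestAnimationFrame(animate);
  analyser.getByteFrequencyData(bins);

  // Rough low / mid / high split of the 128 bins (illustrative boundaries).
  const low  = bandLevel(bins, 0, 16);
  const mid  = bandLevel(bins, 16, 64);
  const high = bandLevel(bins, 64, 128);

  // Map each band's level to a rotation of up to ~45 degrees.
  lowMouth.rotation.z  = low  * Math.PI / 4;
  midMouth.rotation.z  = mid  * Math.PI / 4;
  highMouth.rotation.z = high * Math.PI / 4;
}
animate();
```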
The next step is to have this react to microphone input, again pulling out the different frequency ranges for the .obj file to respond to.
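In principle this should only mean swapping the audio source: a sketch of feeding the microphone into the same analyser (and the same animation loop) as above might look like the following. getUserMedia needs the user's permission and a secure (https or localhost) context, and the mic source is deliberately not connected to the speakers to avoid feedback.

```javascript
// Replace the loaded track with live microphone input, reusing the analyser.
navigator.mediaDevices.getUserMedia({ audio: true })
  .then((stream) => {
    const micSource = audioCtx.createMediaStreamSource(stream);
    micSource.connect(analyser);
    // Not connected to audioCtx.destination, so the mic isn't played back.
  })
  .catch((err) => {
    console.error('Microphone access was refused or failed:', err);
  });
```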