Hi to All!
I’ve recently had to do some basic research on the Web Audio API, and I’ve come across several things that lead me to believe animations could be controlled by sound in a future version of Hype. @Daniel does this seem feasible to pull off?
The attached example at the bottom was created with Apple’s motion graphics program “Motion”. An object in Motion can be set to respond to amplitude & frequency with most properties that can be keyframed (scale, position, color, rotation, skew, opacity, etc.). You can also control what range of frequencies and amplitude levels affect the object, as well as floor & ceiling values. Additionally, you can control the amount (intensity) of the object’s response to these cues.
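For anyone curious what this might look like on the Web Audio side, here’s a rough sketch of the idea: sample the amplitude from an `AnalyserNode` each frame, run it through a Motion-style floor/ceiling mapping, and drive a CSS property with the result. This is browser-only, the helper names and threshold values are my own invention, and it isn’t a Hype (or Motion) API — just a proof of plausibility.

```javascript
// Pure helper: map an analyser level (0–255) to a property value,
// honoring Motion-style floor & ceiling thresholds. (Hypothetical name.)
function mapLevelToProperty(level, { floor = 0, ceiling = 255, min = 0, max = 1 } = {}) {
  const clamped = Math.min(Math.max(level, floor), ceiling);
  const t = (clamped - floor) / (ceiling - floor); // normalized 0..1
  return min + t * (max - min);
}

// Browser-only wiring: sample average amplitude every animation frame
// and drive an element's scale with it.
function driveScaleWithAudio(audioEl, targetEl) {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;
  ctx.createMediaElementSource(audioEl).connect(analyser);
  analyser.connect(ctx.destination);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function tick() {
    analyser.getByteFrequencyData(data);
    const avg = data.reduce((a, b) => a + b, 0) / data.length;
    // Example thresholds: ignore quiet passages below 40, cap loud ones at 200.
    const scale = mapLevelToProperty(avg, { floor: 40, ceiling: 200, min: 0.5, max: 2 });
    targetEl.style.transform = `scale(${scale})`;
    requestAnimationFrame(tick);
  })();
}
```

Restricting the analysis to a frequency band (for “respond only to the bass” behaviors) would just mean averaging a slice of the `data` array instead of the whole thing.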
Below is a screenshot of the Properties inspector showing one object’s audio behavior parameters. You can assign more than one audio behavior to an object.
A different sound track assigned to the same object using these same parameters would elicit different visuals.
This is about more than just creating “musical” animations. For instance, you could use songs, or create simple sound patterns in a program like GarageBand, that generate specific responses from graphic objects, then mute the soundtrack itself, leaving just the animation.
Like any tool it would have its specific uses, such as producing complex animated backgrounds (or main objects) that would be tedious to keyframe or script manually in Hype. It wouldn’t replace anything per se, so much as expand Hype’s animation possibilities.
Just below is a simple example from Motion, put together in less than five minutes, of objects animated by sound. In this demo the animations of the rectangle (Color), two circles (one Scale & the other variable Positioning) & a line (Rotation) are driven by the frequency & amplitude of the music. The circles have been set on motion paths that are not under the control of the audio parameters.