Animation Controlled by Sound

Hi to All!

I’ve recently had to do some basic research into the Web Audio API, and I’ve come across several things that lead me to believe animations could be controlled by sound in a future version of Hype. @Daniel, does this seem feasible to pull off?

The attached example at the bottom was created with Apple’s motion graphics program “Motion”. An object in Motion can be set to respond to amplitude & frequency with most properties that can be keyframed (scale, position, color, rotation, skew, opacity, etc.). You can also control what range of frequencies and amplitude levels affects the object, as well as floor & ceiling values. Additionally, you can control the amount (intensity) of the object’s response to these cues.

Below is a screen shot of the Properties inspector showing one object’s audio behavior parameters. You can assign more than one audio behavior to an object.

A different sound track assigned to the same object using these same parameters would elicit different visuals.

This is about more than just creating “musical” animations. For instance, you could use songs, or create simple sound patterns in a program like GarageBand, that generate certain types of responses in graphic objects, and then mute the soundtrack itself, leaving just the animation.

Like any tool it would have its specific uses ~ such as producing complex animated backgrounds or main objects ~ compared to keyframing or scripting the animation manually in Hype. It wouldn’t replace anything per se so much as expand Hype’s animation possibilities.

Just below is a simple example from Motion, put together in less than five minutes, of objects animated by sound. In this demo the animations of the rectangle (Color), two circles (one Scale & the other variable Positioning) & a line (Rotation) are driven by the frequency & amplitude of the music. The circles have been set on motion paths that are not under the control of the audio parameters.


Very cool animation. For simple effects, you could do this now with a bit of coding tying Web Audio API information to the Hype getter/setter API.

However, I do see that it would be interesting to have some of this built in. Just brainstorming: it would be rather neat to also have an aspect of the sound (volume, frequency, etc.) affect the timeline playhead position, so you could precompose animations and have them go back/forth in time depending on the audio!
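To make the “bit of coding” idea concrete, here is a minimal sketch of that wiring: an `AnalyserNode` averages the spectrum each frame and feeds Hype’s `setElementProperty` and `goToTimeInTimelineNamed` APIs. The element id `'pulse'`, the floor/ceiling values, and the timeline name are illustrative assumptions, not anything shipped.

```javascript
// Pure helper: map an amplitude level (0-255 from an AnalyserNode) into a
// property range, clamping to floor/ceiling values as in Motion's parameters.
function mapLevel(level, floor, ceiling, outMin, outMax) {
  var clamped = Math.min(Math.max(level, floor), ceiling);
  var t = (clamped - floor) / (ceiling - floor);
  return outMin + t * (outMax - outMin);
}

// Browser-only sketch: drive a Hype element's scale (and, per the
// brainstorm above, a timeline playhead) from the average amplitude.
// `hypeDocument` and the 'pulse' element id are assumptions.
function driveScale(hypeDocument, audioElement) {
  var ctx = new AudioContext();
  var source = ctx.createMediaElementSource(audioElement);
  var analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination);
  var data = new Uint8Array(analyser.frequencyBinCount);

  function frame() {
    analyser.getByteFrequencyData(data);
    var avg = data.reduce(function (a, b) { return a + b; }, 0) / data.length;
    var el = hypeDocument.getElementById('pulse');
    var scale = mapLevel(avg, 20, 200, 0.5, 2.0); // floor/ceiling -> scale range
    hypeDocument.setElementProperty(el, 'scaleX', scale);
    hypeDocument.setElementProperty(el, 'scaleY', scale);
    // Scrub a precomposed animation back/forth with the sound:
    hypeDocument.goToTimeInTimelineNamed(mapLevel(avg, 20, 200, 0, 3), 'Main Timeline');
    requestAnimationFrame(frame);
  }
  frame();
}
```

The per-frame mapping is where Motion-style intensity, floor, and ceiling controls would live; everything else is plumbing.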


Awesome! Very interesting.

Hi jonathan!

Thank you for responding & delighted this has piqued your interest. The creative possibilities would really open up, indeed. Sound can store a lot of info (not just music or voice). Looking forward to your implementation!

Hi michelangelo!

Glad you liked it. I thought you would be one of several people on this forum who could see the possibilities.


Steve Warby


Thanks Steve!

A well-written piece. There was one key component not discussed in this article (though it was long enough already), and that is the PannerNode, which is expressly designed for 3D environments such as games (or a Hype animation).

If you could take the same sound parameters and somehow translate them to a visual object, you could move that object around in 3D space - many of the properties & methods of the PannerNode appear to be a perfect fit for this purpose.

How this translation would be accomplished is above my pay grade. @jonathan ;->
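One way that translation might work, sketched under stated assumptions: keep a PannerNode’s 3D position in sync with a visual object’s stage coordinates, so the sound appears to come from where the object is. The stage dimensions, the listener-at-centre convention, and the depth scale are all illustrative choices.

```javascript
// Pure helper: map an element's 2D stage coordinates into the normalized
// right/up/forward space a PannerNode expects, with the listener assumed
// to sit at the centre of the stage.
function stageToPannerPosition(x, y, stageWidth, stageHeight, depth) {
  return {
    x: (x - stageWidth / 2) / (stageWidth / 2),   // -1 (left) .. +1 (right)
    y: (stageHeight / 2 - y) / (stageHeight / 2), // -1 (bottom) .. +1 (top)
    z: -depth                                     // negative z is "in front of" the listener
  };
}

// Browser-only sketch: call this whenever the visual object moves so the
// PannerNode's AudioParams follow it.
function syncPanner(panner, x, y, stageWidth, stageHeight, depth) {
  var p = stageToPannerPosition(x, y, stageWidth, stageHeight, depth);
  panner.positionX.value = p.x;
  panner.positionY.value = p.y;
  panner.positionZ.value = p.z;
}
```

Run in the other direction (read the panner’s position and set the element’s left/top), the same mapping would let sound parameters drive the visual object instead.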

Very cool ideas, @JimScott! I’m wondering whether something similar would work with voice input, i.e., saying certain words would trigger specific scenes or timelines. Just pondering…
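For what it’s worth, the browser side of that already exists: a rough sketch using the Web Speech API’s SpeechRecognition to match spoken keywords against scene names. The keyword-to-scene map and the scene names are made-up examples, not part of any Hype feature.

```javascript
// Pure helper: return the scene mapped to the first recognised keyword in
// a transcript, or null if no keyword matches.
function sceneForTranscript(transcript, keywordScenes) {
  var words = transcript.toLowerCase().split(/\s+/);
  for (var i = 0; i < words.length; i++) {
    if (keywordScenes.hasOwnProperty(words[i])) return keywordScenes[words[i]];
  }
  return null;
}

// Browser-only sketch (SpeechRecognition is prefixed in Chrome). The scene
// names 'OceanScene' / 'ForestScene' are assumptions for illustration.
function listenForScenes(hypeDocument) {
  var Recognition = window.SpeechRecognition || window.webkitSpeechRecognition;
  var rec = new Recognition();
  rec.continuous = true;
  rec.onresult = function (event) {
    var transcript = event.results[event.results.length - 1][0].transcript;
    var scene = sceneForTranscript(transcript, { ocean: 'OceanScene', forest: 'ForestScene' });
    if (scene) hypeDocument.showSceneNamed(scene);
  };
  rec.start();
}
```

The same matcher could just as easily start a timeline via `startTimelineNamed` instead of switching scenes.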


I’ve done a lot of this with After Effects using Red Giant’s Trapcode Sound Keys plug-in. A nice thing about it is that you can isolate multiple aspects of an audio track for various uses. For one project I placed a logo under virtual 3D water, had the kick drum create a circular ripple, and had the snare trigger a series of text animations, while a wide swath of midrange was used to control the hue and saturation of various swirling elements.

For another project I had explosions create a pulse that shook objects in a 3D scene, with the degree of the effect on each object dialed in to correspond to that object’s perceived mass. Anything that can be keyframed, including triggered scripts (ExtendScript or whatever), can be controlled. Dial in the pitch, set the sensitivity to amplitude, then create a formula that widens the range of sensitivity to affect individual objects.

I must agree. Trapcode and plug-ins of that sort are great. As a core Hype function I think it would be too specialized; it could be an Extension, though. Live audio visualization only makes sense when something is dynamic - otherwise just prebake it as a movie.

Yes please - I’d love the feature of pairing the keyframing of layer elements and object motion-path points to audio, with elements and touch events all responsive to a stroke across the image.
I’m looking for a way to create the UX illusion of touch feel - physics is part of it - with elements in the image.
An example: a user can dip a finger into a virtual fishbowl, catch a wriggling fish with that finger, hoist it up, let it slip around, and splash back into the water after having had a chance to look at it.
Is this sort of programming possible yet for phones and tablets?
It can be 2D first - a competent artist can pull the illusion off in a fun way.
But maybe mini-3D, manipulating a layer by touch, is also possible in part already - I’m still researching what’s out there.

  • It sounds as if the fine-tuning of this feature is in development - I hope.
    Just to second the feature request - I’ll be looking around to find it when it comes.



Here’s an intro video of an example project using Trapcode Sound Keys: