Recognition of audio markers

This is not a high priority, but it would be nice if Hype could recognize markers in audio files. It’s easy to create them in an audio editor and it would certainly help with triggering actions.

Another great feature would be audio triggering: a tool that tracks the frequencies of an audio track in three bands, each with separate sensitivity controls. The user could then set an action to watch one of the three bands and be triggered by its peaks. This is common in motion-graphics apps.

Huh, that would be quite interesting!

This could theoretically be done now; the Web Audio API has a lot of hooks for inspecting the audio passing through it, and on specific triggers you could use the Hype API to manipulate timelines.
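A rough sketch of that approach, kept deliberately simple: the band ranges, threshold, and callback wiring below are all hypothetical choices, not anything Hype or the Web Audio API prescribes. The two functions are plain JavaScript; the commented section shows how they might be fed from an `AnalyserNode` in a browser, with the peak callback calling into Hype's runtime API (e.g. `hypeDocument.startTimelineNamed`).

```javascript
// Average byte amplitude (0-255) over a bin range of AnalyserNode frequency data.
function bandAverage(freqData, startBin, endBin) {
  let sum = 0;
  for (let i = startBin; i < endBin; i++) sum += freqData[i];
  return sum / (endBin - startBin);
}

// Build a trigger that fires a callback when a band's level crosses a
// threshold, on the rising edge only (so a sustained note fires once).
function makeBandTrigger(startBin, endBin, threshold, onPeak) {
  let wasAbove = false;
  return function (freqData) {
    const level = bandAverage(freqData, startBin, endBin);
    const isAbove = level >= threshold;
    if (isAbove && !wasAbove) onPeak(level);
    wasAbove = isAbove;
  };
}

// Browser wiring (sketch only; bin range, threshold, and timeline name are
// made up for illustration):
//
//   const analyser = audioCtx.createAnalyser();
//   source.connect(analyser);
//   const data = new Uint8Array(analyser.frequencyBinCount);
//   const kickTrigger = makeBandTrigger(0, 8, 180, function () {
//     hypeDocument.startTimelineNamed('bounce');
//   });
//   (function tick() {
//     analyser.getByteFrequencyData(data);
//     kickTrigger(data);
//     requestAnimationFrame(tick);
//   })();
```

The rising-edge check is what keeps a loud bass note from re-triggering the timeline on every animation frame while it sustains.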

This is a video I created some years ago using this technique. It was superseded and dropped from the original site, and has since been repurposed by a franchise in France, where it doesn’t play smoothly. In any case, check out the section near the start where the brand name is spinning. The ripple effect and the vertical jump of the logo are triggered by the kick drum in the lower band, with the acceleration/deceleration of the rotation controlled by a formula that responds to one beat in four.

Cool example. I definitely like the idea of “implicit animations” - Physics was our first foray into this but audio would be another amazing area.

Here’s a link to an intro video about the plug-in used:

Very cool. (It also makes me appreciate node-based compositors a bit 🙂!)

Red Giant’s plug-ins can be extremely complex. Their particles plug-in – Particular – has many more operators than something like Sound Keys, as you need to control how a particle evolves after it is emitted, how it interacts with other particles, how to handle animated images attached to particles (such as an explosion of butterflies that flap their wings and rotate in 3-D space, etc.), and a whole lot more.

Developing sound triggering in Hype would obviously need to be far simpler: for example, dialing in only three user-determined bands at three volume levels, and assigning a percentage factor that multiplies against the manual setting for an assigned keyframe channel (height, position, color, etc.). Adding a formula to affect the release rate and time would be too much to start with. I would envision a window displaying the audio histogram, overlaid with three columns representing the three available bandwidths (which could be moved horizontally to center on a desired frequency), with each column divided into three boxes that you click in. Clicking a box would present a list of enabled attributes (such as width or x rotation); you select the attribute for that frequency and amplitude, then move a slider to set the intensity (the multiplication factor). A button would also be available to assign the entire amplitude range of a column as one.

The user would then select the audio-enabled attribute, drop in a keyframe, and set the default value. For example, if vertical position is audio-enabled and the user sets a position of 100 px with a multiplication factor of 0.8, the assigned object would bounce between 80 and 100 pixels with each beat.
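That arithmetic can be stated as a one-line mapping (the function name is hypothetical, and it assumes the band amplitude has been normalized to 0–1):

```javascript
// Map a normalized band amplitude (0..1) onto a keyframed base value.
// At amplitude 0 the value rests at the base; at amplitude 1 it reaches
// base * factor. With base = 100 and factor = 0.8 the value swings
// between 100 and 80.
function audioModulate(base, factor, amplitude) {
  return base * (1 - (1 - factor) * amplitude);
}
```

With a base of 100 px and a factor of 0.8, silence leaves the object at 100 px and a full-strength beat pushes it to 80 px, matching the bounce described above.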

Obviously, there would be special cases, such as looping a color spectrum, but it would be doable. Of course, it would require the full attention of a programmer for several weeks and would involve other programmers as well, which would mean that other, more vital issues would be ignored, but one can always dream.

Apple’s Motion app covers a wide range of possibilities in a very simple interface.

In the screen capture below, the “Scale” of an image (see “Apply to” at the very bottom of the image) can be driven by a wide range of sonic triggers. Any keyframable property can be bound to a discrete set of audio parameters. Very straightforward in execution.

Possibly Hype & the Web Audio API could make beautiful music together in some future version.


I definitely agree about Motion’s audio triggering.

I used to be a huge fan of Motion, from the week it was released up until they overhauled the UI. I loved the way that everything could be controlled by everything else. When, due to Apple’s tunnel-visioned assumption that everyone uses laptops or single-monitor rigs, they dropped the floating, dockable palettes and crammed them into tabs inside the main window, it knocked my productivity down catastrophically. Instead of instantly jumping to any control I wanted, every time I wanted to perform an operation I had to waste several seconds clicking through nested tabs: digging, digging, digging, every second I was using the app.
Having a house-afire need to get projects out the door quickly, I had to drop Motion. If Apple ever restores the dockable palettes, I’ll go back to Motion without hesitation. It took years for Apple to recognize their false assumptions regarding the presumed appeal of the Mac Pro cylinder, and it will take even longer with Motion (primarily, I suspect, because most Motion users these days never experienced the vastly more productive nature of the earlier UI). Aiming for the lowest common denominator is never the best choice for a pro product.
