How Could Hype Plug Into Apple Intelligence?

It's funny! I was getting text messages from friends saying the WWDC 2024 keynote was underwhelming. But when you think about it… and I did… this could be a good direction for AI. I've seen lots of tech demos for AI, but Apple's felt less dystopian. It felt like they struck the right balance for this new technology.

So, this seems like an opportunity for Tumult. Perhaps some of those AI features should be available in Hype…

Anyway, here's the video if you missed it…

I thought they had done so much.
Unrelated to programming: I have been giving feedback to Apple every now and then, asking them to extend the privacy model that governs which photos apps can access to your contacts as well, so that access to your contacts is enclaved per app. They finally did it… well chuffed.

But back to the subject: I think adding the Apple Intelligence API to your app is just a few lines of code, and in some cases, where text fields are used, you just get it for free.
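To make that concrete, here is a minimal sketch of the "you just get it" case on iOS 18: system text views pick up Writing Tools automatically, and the `writingToolsBehavior` property lets an app opt in fully or out. The property and its cases are real UIKit API; the view name is just illustrative.

```swift
import UIKit

// A plain UITextView already participates in Writing Tools on iOS 18.
let notesField = UITextView()

// Opt in to the full inline experience (.limited keeps it to the
// overlay panel, .none opts out entirely).
notesField.writingToolsBehavior = .complete
```

So for anything in Hype's runtime built on standard text controls, a large part of Apple Intelligence really would arrive with almost no code.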

Within the Hype IDE, the first things I think of:

Hype could use the image creation in Apple Intelligence to let users create new images inside the Hype IDE using natural language.
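As a rough sketch of how that might look: Apple ships this as the ImagePlayground framework (macOS 15.2 / iOS 18.2 and later), where an app presents `ImagePlaygroundViewController`, seeds it with concepts, and receives a file URL for the generated image. The framework types are real, but the class name, prompt text, and the hand-off into a Hype document are my assumptions.

```swift
import UIKit
import ImagePlayground  // requires iOS 18.2+ / macOS 15.2+

// Hypothetical helper: lets a user describe an image in natural
// language and hands the result back for import into a document.
final class SceneImageCreator: NSObject, ImagePlaygroundViewController.Delegate {
    func present(from host: UIViewController) {
        let playground = ImagePlaygroundViewController()
        playground.delegate = self
        // Seed the generator with a text concept; the user refines it in the sheet.
        playground.concepts = [.text("A watercolor banner for a travel site")]
        host.present(playground, animated: true)
    }

    // Called with a file URL for the generated image.
    func imagePlaygroundViewController(_ controller: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        controller.dismiss(animated: true)
        // Hypothetical: import imageURL into the open Hype scene here.
    }
}
```

The heavy lifting (model, UI, safety) all lives in the system sheet, which is presumably why Apple says adoption is only a few lines.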

Hype could possibly include a way to tap into the new auto-localisation in the Apple Intelligence API, channeling that output into a new Hype localised JSON file, or even a Hype function.

On another note: Apple Intelligence will only be available on the iPhone 15 Pro onwards and, more importantly for Hype, on M1 Macs onwards.

The iPhone bit is annoying to me personally, as it means most of the billion people with iPhones will need to upgrade. Luckily, I now have an M1 Mac.

I think Apple chose to do this because of the speed and data-compression demands of running the AI onboard the devices. But maybe also to get us all to buy new devices?


I believe Hype could also support the Continuity feature for previewing on the phone while displaying on the main computer.