All of a sudden I'm getting the dreaded spinning beach ball every time I attempt anything, from nudging an element's position to hiding a layer. Super frustrating.
The one thing that's different is that I recently “upgraded” to OS 10.12.6.
I also updated to Hype version 3.6.7 (596) Professional Edition.
I'm using PNGs in this project. Could that be why? File below.
The problem stems from your images being gigantic (in resolution, not in file size). Computationally, there's a lot of memory and a lot of operations required to resize and display them. Even opening them in Preview takes a while!
Fish HBKLT.png: 30,419 × 10,671
Logo tales.png: 19,962 × 13,284
Bobbles.png: 5,739 × 4,566
I recommend resizing to below 2048×2048, preferably below 1024×1024, and ideally to the final size at which the image will be exported.
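As a quick sketch of how to pick a target size, here's a small Python helper (the 1024 cutoff comes from the recommendation above; the function name and rounding choice are mine, not anything Hype provides):

```python
# Hedged sketch: compute the dimensions needed to fit an image
# within a maximum bounding box while preserving aspect ratio.

def fit_within(width: int, height: int, max_dim: int = 1024) -> tuple[int, int]:
    """Return a (width, height) scaled down so the longer side is at most max_dim.

    Images already small enough are returned unchanged (never upscaled).
    """
    scale = min(1.0, max_dim / max(width, height))
    return round(width * scale), round(height * scale)

# The large fish image from the post above:
print(fit_within(30419, 10671))  # -> (1024, 359)
```

You'd then resize the actual file to those dimensions in your image editor of choice before importing it into Hype.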
It depends on how complex the SVG file is, as they can be pretty heavy. If they are, you can minify SVGs with https://www.svgminify.com. If you still think they're too big, maybe use a transparent PNG. But looking at your file, the SVG should be a few KB at most.
The SVGO GUI is nifty, thanks. For some strange reason, ImageOptim's SVG compression messes up SVGs for me; not sure what I'm doing wrong. For PNGs, lossy compression does the job.
It depends on how the web rendering engine implements it, but I just did a test with an SVG defined as 6,000 × 6,000 and didn't find any terrible performance hiccups, so it seems likely that browsers have correctly optimized rendering for vector-based images.
One must always keep in mind that all images at some point need to be decompressed into their raw bitmap data. Thus the formula of pixels wide × pixels high × 4 bytes per pixel tells you how much memory an image will occupy. The large fish in the OP's document is 30,419 × 10,671 × 4 = 1,298,404,596 bytes, or about 1.2 GB. No matter what, this will always take a long time to render, even if it only needs to happen once. And then the system may need to shuffle that image data around, or even make room via virtual memory, because of how much space it occupies! Computers are fast, but 1.2 GB is still a lot of data.
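That formula is easy to check yourself; here's a minimal Python sketch using the image names and dimensions from the post above:

```python
# Back-of-the-envelope estimate of decompressed bitmap memory:
# width * height * 4 bytes per pixel (RGBA).

def bitmap_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Memory an image occupies once decoded, ignoring any padding or mipmaps."""
    return width * height * bytes_per_pixel

# The three PNGs from the original post:
images = {
    "Fish HBKLT.png": (30419, 10671),
    "Logo tales.png": (19962, 13284),
    "Bobbles.png": (5739, 4566),
}

for name, (w, h) in images.items():
    size = bitmap_bytes(w, h)
    print(f"{name}: {size:,} bytes (~{size / 1024**3:.2f} GiB)")
```

Run against those three files, the fish alone comes out to roughly 1.2 GB decoded, and the three together approach 2.5 GB, which goes a long way toward explaining the beach ball.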
I tried using the largest image from that document and still hit some performance hiccups, but sprite sheets are probably immune to some of the effects beyond initial loading, because the image is encapsulated within the sprite sheet element. This means two things: resizing uses the transform: scale property on the smaller sprite sheet element rather than on the entire image, and Hype specifically disables WebKit graphics acceleration on the sprite sheet element. Graphics acceleration tends to help small images but hurt larger ones, where there's a lot of data to shuffle to the graphics card.
I should also note that these images have alpha channels, which means there's a lot more computation to do. If your images/sprite sheets don't have alpha channels, it will be much faster.