One must always keep in mind that all images at some point need to be decompressed into their raw bitmap data. Thus the formula of pixels wide × pixels high × 4 bytes per pixel tells you how much memory an image will occupy. The large fish in the OP’s document is 30,419 × 10,671 × 4 = 1,298,404,596 bytes, or roughly 1.2 GB. No matter what, this will take a long time to render, even if the image only needs to be read once. And then the system may need to shuffle that image data around in memory, or even make room for it via virtual memory, because of how much space it occupies! Computers are fast, but 1.2 GB is still a lot of data.
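As a quick back-of-the-envelope check, the formula can be expressed as a tiny helper (the function name here is just for illustration; the 4 bytes per pixel assumes 8-bit RGBA):

```python
def decompressed_size_bytes(width_px, height_px, bytes_per_pixel=4):
    """Estimate the decompressed (bitmap) footprint of an image.

    4 bytes per pixel assumes RGBA with 8 bits per channel;
    an image without an alpha channel would be 3 bytes per pixel.
    """
    return width_px * height_px * bytes_per_pixel

# The large fish image from the OP's document:
fish = decompressed_size_bytes(30_419, 10_671)
print(f"{fish:,} bytes")          # 1,298,404,596 bytes
print(f"{fish / 1024**3:.2f} GiB")  # 1.21 GiB
```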
I tried using the largest image from that document and still hit some performance hiccups, but sprite sheets will probably be immune to some of these effects beyond the initial load, because the image is encapsulated within the sprite sheet element. This means two things: resizing uses the transform: scale CSS property on the smaller sprite sheet element rather than on the entire image, and Hype specifically disables WebKit graphics acceleration on the sprite sheet element. Graphics acceleration tends to help small images but hurt large ones, where there’s a lot of data to shuffle to the graphics card.
I should also note that these images have alpha channels, which means there’s a lot more computation to do during compositing. If your images/sprite sheets don’t have alpha channels, rendering will be much faster.