I’ve drafted a proposal for asynchronous conversions between Blob, Image and ImageData:
I’ve been asked what the potential use cases are for this on real-world sites. I don’t really know of any beyond my own, but perhaps others here have feedback on use cases (or on any other aspect of the spec draft).
In general this is aimed at providing an async alternative to getImageData/putImageData, which are synchronous and sometimes very slow (100 ms+), causing UI jank. Also, if you need to batch-convert Blobs to ImageData, the async methods allow the work to be parallelised across multiple cores, improving performance.
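As a rough sketch of the batching point: with async conversions, a set of Blobs can be kicked off together and the browser is free to decode them on multiple cores. Here `blobToImageData` is a hypothetical stand-in for the proposed API (the proposal's actual method names may differ), stubbed out so the pattern itself runs:

```javascript
// Hypothetical async conversion (assumed name, not part of any shipped API).
// In the proposal this would decode off the main thread; here it's stubbed
// so the parallel batching pattern can be demonstrated.
async function blobToImageData(blob) {
  return {
    width: blob.width,
    height: blob.height,
    data: new Uint8ClampedArray(blob.width * blob.height * 4),
  };
}

// Start all conversions at once; Promise.all resolves when every
// decode finishes, letting the implementation use multiple cores.
async function batchConvert(blobs) {
  return Promise.all(blobs.map(blobToImageData));
}
```

Contrast this with a loop of synchronous getImageData calls, which would serialise all the work on the main thread and jank the UI for the duration.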
Let me know if you have any ideas or feedback.
Half-baked, pet-project use case – this would make fooling with Laplacian pyramids for responsive images (inspired by Yoav’s proposal) a little easier, and it would make such a thing much more palatable if it ever made it to production.
A more realistic use case might be things like Photoshop in the browser.
What about doing this with asm in a Worker?
One of the goals of the spec is to reduce intermediate copies of data to keep peak memory usage down. A canvas in a worker seems like it would require an intermediate copy. Also, Blob is not currently transferable, and HTMLImageElement is unlikely to ever be transferable.
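To illustrate why transferability matters for peak memory: structured cloning copies pixel data, while transferring detaches the source buffer and hands it over with no copy. A small demonstration using `structuredClone` with a transfer list (the same semantics `postMessage` uses when sending data to a worker):

```javascript
// 16 bytes of fake pixel data (4 RGBA pixels).
const pixels = new Uint8ClampedArray(16).fill(255);

// Structured clone: the source stays usable, but the data is
// duplicated, so peak memory briefly doubles.
const copied = structuredClone(pixels);

// Transfer: zero-copy, but the original buffer is detached
// and becomes unusable (byteLength drops to 0).
const transferred = structuredClone(pixels, { transfer: [pixels.buffer] });

console.log(copied.length);              // 16
console.log(transferred.length);         // 16
console.log(pixels.buffer.byteLength);   // 0 (detached)
```

Since Blob and HTMLImageElement can't be transferred like this, routing them through a worker-side canvas would force the copying path rather than the zero-copy one.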
Ah, yeah, you’re right. I was mistaken about how the decoder worked.