[Proposal] Asynchronous JSON parsing and serialization

Currently, the JSON global object exposes two methods: parse, which parses JSON strings into ordinary JavaScript values, and stringify, which serializes JavaScript values into JSON strings. Alas, both methods are synchronous, which is a problem when they’re used on the main thread. Although usually very fast, on larger inputs (e.g. a string over 1 MB) they can block for hundreds of milliseconds.

My proposal would be to introduce two new methods, parseAsync and stringifyAsync (tentative names), with the same signatures as parse and stringify respectively, but returning a Promise of the expected result type. For example:

```js
const data = JSON.parse(rawJSON);
// would become
JSON.parseAsync(rawJSON).then(data => { ... });

const payload = JSON.stringify(data);
// would become
JSON.stringifyAsync(data).then(payload => { ... });
```

I think this could be especially valuable on a Node-based server, where not blocking the main thread is paramount.

These new methods would also pair well with an eventual streaming JSON parsing and serialization API.


This is a good idea if it can be done in a way that improves performance, but AFAIK there’s not much to be gained here. In particular, to make stringification async, the engine would first have to synchronously copy the entire object to be stringified, to prevent it from mutating while it’s being stringified asynchronously. That copy could well be slower than just stringifying on the spot, in which case there’s no point in an async version. I’m not sure about parsing, but I would guess there’s a similar issue there.
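To illustrate the hazard that copy avoids, here is a hypothetical sketch (parseAsync/stringifyAsync don’t exist; these stand-ins just defer the built-in synchronous call, and the snapshot variant assumes the global structuredClone from Node 17+):

```javascript
// Naive version: defers the synchronous stringify to a later microtask,
// making no copy. The caller can mutate the object before the work runs.
function stringifyAsyncNaive(value) {
  return Promise.resolve().then(() => JSON.stringify(value));
}

// Snapshot version: copies the object synchronously up front
// (structuredClone is global in Node 17+), so later mutations by the
// caller cannot affect the result.
function stringifyAsyncSnapshot(value) {
  const snapshot = structuredClone(value);
  return Promise.resolve().then(() => JSON.stringify(snapshot));
}

const obj = { n: 1 };
const naive = stringifyAsyncNaive(obj);
const safe = stringifyAsyncSnapshot(obj);
obj.n = 2; // mutate before the deferred work runs

naive.then(s => console.log(s)); // {"n":2} -- call-time state was lost
safe.then(s => console.log(s));  // {"n":1} -- snapshot preserved it
```

The point above is that the synchronous structuredClone in the snapshot variant is exactly the cost that might dwarf the stringify itself.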

Parsing operates on a primitive string, so I guess that side should be fine, but you’re probably onto a good point about the serialization part. I’d be surprised if copying the object took as long as stringifying it, but the cost could still be significant. Anyway, I’m not an expert on this.

The alternative would be… not copying the object at all. Developers would have to be careful not to modify the object being serialized until the task completes. That shouldn’t be hard, and it would encourage the good practice of working with immutable values.

I guess this behaviour wouldn’t align well with the rest of the API ecosystem, though. I wonder how bad that would be.

I know it requires a bit more setup, but couldn’t you just do this by sending the object to a worker and having the worker post back the stringified data?
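For concreteness, a minimal browser-only sketch of that workaround (file names and the sample object are hypothetical):

```js
// main.js
const worker = new Worker("stringify-worker.js");
worker.onmessage = (event) => {
  const payload = event.data; // JSON string produced off the main thread
  // ... use payload ...
};
// postMessage structured-clones its argument into the worker
worker.postMessage({ some: "large", object: true });

// stringify-worker.js
onmessage = (event) => {
  postMessage(JSON.stringify(event.data));
};
```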

Yes, that’s how one would do it today, indeed. But beyond requiring extra work, that approach only works in browsers, not in Node.

And I actually had Node in mind for this proposal. But thanks for pointing out a workaround :+1:

Ah, the Node point is true. I wonder then if it’s worth moving this down the stack to the TC39 folks?

@domenic, do you have any opinions? I imagine this must have been discussed before.

You should benchmark this first: I think it’s likely that the overhead of posting to a worker (especially posting an object, which invokes the structured clone algorithm) will be larger than just doing the work on the main thread, where nothing needs to be cloned.
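A rough way to sanity-check that claim, as a sketch (assumes Node 17+ for the global structuredClone; cloning is only a lower bound on the full worker round-trip, and the payload shape here is made up):

```javascript
// Build a largish nested object to serialize.
function buildPayload(entries) {
  const obj = {};
  for (let i = 0; i < entries; i++) {
    obj["key" + i] = { index: i, label: "value-" + i, flags: [true, false] };
  }
  return obj;
}

const payload = buildPayload(50000);

// Time stringifying on the main thread...
let t = process.hrtime.bigint();
const json = JSON.stringify(payload);
const stringifyNs = process.hrtime.bigint() - t;

// ...versus just the structured clone a postMessage would incur.
t = process.hrtime.bigint();
const copy = structuredClone(payload);
const cloneNs = process.hrtime.bigint() - t;

console.log(`stringify: ${Number(stringifyNs) / 1e6} ms`);
console.log(`clone:     ${Number(cloneNs) / 1e6} ms`);
```

Absolute numbers will vary by engine and payload shape, but if the clone time is comparable to the stringify time, offloading gains little.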