JSON number limits

So we were using an API that returns a JSON response, one of whose attributes is a numeric key. For historical reasons the server, which is not based on JavaScript, started returning longer numbers (longs) for that key.

I had heard about issues like this but hadn't come across a real use case before.

So what started happening on our JavaScript clients (browser and React Native alike) is that the primitive value we get once the fetch json() promise resolves is an overflown number.

JavaScript engines expose the static property Number.MAX_SAFE_INTEGER, so one can retrieve the largest integer above which precision problems start to appear (it is 9007199254740991, i.e. 2^53 - 1).
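Both the limit and a per-value check are exposed on Number:

```javascript
// The largest integer a binary64 double can represent exactly,
// and the built-in check for whether a given value is still exact.
Number.MAX_SAFE_INTEGER;                 // 9007199254740991 (2^53 - 1)
Number.isSafeInteger(9007199254740991);  // true
Number.isSafeInteger(55555555555555555); // false: the literal is already rounded
```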

So if you go to your browser's console or node prompt and do

> JSON.parse('{"a":55555555555555555}')
{ a: 55555555555555550 }

Even a direct assignment shows the problem:

> 55555555555555555
55555555555555550

the spec says...

So, two things come from this: is the JSON specification definitive about numbers having to respect these constraints or not? I must remind you, reader, that JSON arose from Douglas Crockford's idea of serializing/deserializing a subset of JavaScript between server and client... It turns out to be up to each implementation to decide on the range and precision of the numbers being carried. The spec (RFC 8259) does suggest, though:
Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide.

temporary solutions

In the meantime there are a couple of approaches one can take. Instead of consuming the json() promise, we can request text(), which returns the plain body (.responseText for those who still remember XHR usage). One can then either parse the JSON with a custom algorithm or, in the rare case the payload has redundancy, map the value from another attribute.

custom parsing

There are a couple of custom JSON parsers on npm, such as lossless-json, that can be employed instead of the regular parsing; the trade-off is that one then needs to retrieve and manipulate those values in a different manner.

One could also go for an approach such as csander's, where all numbers in the unparsed JSON get replaced by their string counterparts, assuming they have more than n digits (16 in the example).

mapping from related data in the payload

In our specific case we had a sibling attribute, a URL for retrieving the given asset's information, which carried the key as one of the slugs making up the URL. So we could let the JSON be incorrectly parsed, retrieve the string version of the key via a regexp (or by splitting on slashes), and assign it to the overflown attribute ourselves.

sending back payloads to the service...

Remember that the same problem happens when building JSON payloads to send back. If the server expects a number, stringifying won't help either, because that assumes we had captured a valid primitive number in the first place. Even the recent BigInt type doesn't save us: JSON.stringify does not yet support it, as shown:

> 55555555555555555n.toString()
'55555555555555555'

> JSON.stringify(55555555555555555n)
TypeError: Do not know how to serialize a BigInt

> JSON.stringify({a:55555555555555555n})
TypeError: Do not know how to serialize a BigInt

I'd say the simplest approach would therefore be to fill the value with a placeholder string and replace its occurrences after stringify takes place.
