JSON numbers interpreted as Decimals

Turns out JSON numbers don’t have to be interpreted as floating point numbers if you don’t want them to be.

>>> import json
>>> import decimal
>>> json.loads('1.1', parse_float=decimal.Decimal)
Decimal('1.1')
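Here's a slightly fuller sketch of what that buys you (the `price` and `qty` field names are made up for illustration). Note that `parse_float` only kicks in for numbers with a fractional part or exponent; plain integers still come back as `int`:

```python
import json
from decimal import Decimal

# Every float in the JSON document becomes a Decimal,
# built from the original text, so no precision is lost.
data = json.loads('{"price": 19.99, "qty": 3}', parse_float=Decimal)

subtotal = data['price'] * data['qty']   # Decimal * int is exact
print(subtotal)                          # 59.97, not 59.970000000000006
```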


Background: The floating-point hardware that became standard in the mid-1980s (IEEE 754) was engineered for arithmetic in scientific notation, where what matters is keeping a fixed number of significant digits rather than exact decimal values. This is really useful in science and engineering circles. That binary format became the default way of representing non-integer numbers in many programming languages, including JavaScript, which is where JSON's number syntax comes from.

However, there are some problems with floating point numbers. A really common example is that 0.1 + 0.2 evaluates to 0.30000000000000004. For science and engineering that's not a problem, but for a lot of other applications it's a big one. It doesn't make any sense when you are working with money, for example. Slightly more than 30 cents, you say.
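You can see the problem directly in the REPL. 0.1 has no exact binary representation, so the tiny rounding errors accumulate:

```python
# Add ten cents three times using binary floats.
total = sum([0.10] * 3)
print(total)            # 0.30000000000000004
print(total == 0.30)    # False
```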

Anyway, there are other ways of representing numbers, like Python's decimal.Decimal, which stores values in base ten and which I find much easier to work with: 0.1 + 0.2 really is 0.3, just like you would expect.
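A quick sketch of the difference. One caveat: construct Decimals from strings (as `parse_float=Decimal` does, since it receives the raw JSON text), because `Decimal(0.1)` would inherit the float's error:

```python
from decimal import Decimal

# String construction captures the exact decimal value.
print(Decimal('0.1') + Decimal('0.2'))                    # 0.3
print(Decimal('0.1') + Decimal('0.2') == Decimal('0.3'))  # True

# Constructing from a float preserves the binary error instead:
print(Decimal(0.1))  # 0.1000000000000000055511151231257827021181583404541015625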

