When you create a variable with a number value, it is assigned the Number type.
const one = 1
const minusOne = -1
const bigNumber = 5354576767321
A number can be defined using hexadecimal syntax with the 0x prefix:
const thisIs204 = 0xCC //204
There is an octal syntax too: the legacy zero-prefixed form (like 010) is disallowed in strict mode, but ES2015 introduced the 0o prefix as a replacement.
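A quick sketch of the 0o prefix in action (the variable name here is just for illustration):

```js
// The 0o prefix (ES2015) defines octal literals and works in strict mode too.
// 0o14 is 1*8 + 4 = 12 in decimal.
const twelve = 0o14
console.log(twelve) // 12

// The legacy zero-prefixed form is a SyntaxError in strict mode:
// 'use strict'; const x = 014
```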
We can define decimals too:
const someNumber = 0.2
const pi = 3.14
All JavaScript numbers are stored internally as 64-bit IEEE 754 floating-point values, and this has one big consequence: some numbers cannot be represented exactly.
What does this mean in practice?
No problem for integers, numbers defined without a decimal part: 1, 2, 100000, 2328348438… Integers are exact up to Number.MAX_SAFE_INTEGER, which is 2^53 − 1 (9007199254740991, a 16-digit number). Beyond that you'll have approximation issues.
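You can see the approximation kick in right past the safe-integer limit; Number.isSafeInteger() lets you check whether a given integer is still exact:

```js
console.log(Number.MAX_SAFE_INTEGER) // 9007199254740991

// Past the limit, distinct integers collapse to the same
// floating-point value:
console.log(9007199254740992 === 9007199254740993) // true (!)

// Check whether an integer can be represented exactly:
console.log(Number.isSafeInteger(9007199254740991)) // true
console.log(Number.isSafeInteger(9007199254740992)) // false
```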
Decimal numbers are the ones that give the most problems.
Here’s a simple example of what this means:
2.2 * 2 //4.4
2.2 * 20 //44
2.2 * 200 //440.00000000000006 (???)
2.2 * 2000 //4400
2.2 * 20000 //44000
0.1 * 0.1
You might expect 0.01 from this operation? No, the result is 0.010000000000000002.
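One practical consequence is that comparing decimals with === is unreliable. A common workaround, sketched here with a hypothetical helper called nearlyEqual, is to check whether the difference falls below a tolerance such as Number.EPSILON (which works well for values near 1; larger values need a scaled tolerance):

```js
// Direct comparison fails because of floating-point error:
console.log(0.1 * 0.1 === 0.01) // false

// Compare within a tolerance instead. Number.EPSILON is the gap
// between 1 and the next representable floating-point number.
const nearlyEqual = (a, b) => Math.abs(a - b) < Number.EPSILON
console.log(nearlyEqual(0.1 * 0.1, 0.01)) // true
```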
You might never run into problems, but you might, so you need to keep this in mind.
The problem is generally solved by avoiding processing numbers as decimals:
(0.1 * 10) * (0.1 * 10) / 100
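This trick works because 0.1 * 10 evaluates to exactly 1 in JavaScript, so the multiplication happens on integers and only the final division reintroduces a decimal:

```js
// Scale the decimals up to integers, do the math, scale back down.
console.log(0.1 * 10) // 1
console.log((0.1 * 10) * (0.1 * 10) / 100) // 0.01
console.log((0.1 * 10) * (0.1 * 10) / 100 === 0.01) // true
```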
But the problem is best avoided by not storing decimals at all: store integers (for example, cents instead of dollars) and only format the value as a decimal when rendering it to the user.
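A minimal sketch of this integer-storage approach, using money as an example (the price and helper name here are made up for illustration):

```js
// Store 19.99 as an integer number of cents.
const priceInCents = 1999

// Arithmetic on integers stays exact:
const threeItemsInCents = priceInCents * 3
console.log(threeItemsInCents) // 5997

// Only convert to a decimal string when displaying to the user:
const display = (cents) => (cents / 100).toFixed(2)
console.log(display(threeItemsInCents)) // "59.97"
```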