I am trying to write a function that returns a boolean after checking whether a value is an integer or not. See my definition of an integer in the desired outputs below, where I call the function.
So far, my code is failing because it returns true for validInteger ( 10.0 ) when I want it to be false. JavaScript treats 10.0 and 10 as the same value.
DESIRED OUTPUT
validInteger('10')   // should return true
validInteger(10)     // should return true
validInteger('-10')  // should return false
validInteger(-10)    // should return false
validInteger(0.0)    // should return false
validInteger(10.0)   // should return false
validInteger(-10.0)  // should return false
I wonder if anyone has an idea on how I can get a false result for validInteger ( 10.0 ). So far, this is the code I have been able to write:
function validInteger(value) { // value can be a string or a number (integer)
  const numberValue = Number(value);
  return Number.isInteger(numberValue) && numberValue >= 0 && String(numberValue) === String(value);
}
Answers
10 and 10.0 are the same in JS. It's a primitive value of type number. You cannot distinguish between these notations at runtime. https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number
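For example, by the time your function sees the value, the literal 10.0 has already become the number 10:

console.log(10 === 10.0);            // true
console.log(Number.isInteger(10.0)); // true
console.log(String(10.0));           // "10", not "10.0"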
With Number.isInteger() you can only check whether a number has a fractional part, not how it was written. Also, this looks like an XY problem, since the purpose of the check isn't clear.
If you need to store integers specifically, you could look at typed arrays.
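Just to illustrate what that means (assuming a 32-bit integer array is enough for your case):

const ints = new Int32Array(3); // every element is a 32-bit integer
ints[0] = 10;
ints[1] = 10.7;                 // fractional part is dropped on assignment
console.log(ints);              // Int32Array(3) [10, 10, 0]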
The solution I see here is to make the function receive its argument as a string, so that '10.0' and '10' stay distinguishable. Something like this.
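A rough sketch of that idea, using a digits-only regular expression (just one possible way to express the check):

function validInteger(value) {
  // Works as intended only when the caller passes the value as a string,
  // e.g. validInteger('10.0'); a numeric literal such as 10.0 has already
  // collapsed to 10 before this function runs.
  return /^\d+$/.test(String(value));
}

validInteger('10');   // true
validInteger('-10');  // false
validInteger('10.0'); // false
validInteger(10);     // true  (String(10) is "10")
validInteger(10.0);   // still true, because 10.0 and 10 are the same number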