I have an if statement in JavaScript. The problem was fixed, but I don't know why.
When the parameters val1 and val2 are objects, this code does not work:

if (typeof(val1) === typeof(val2) === 'object' && val1 !== null && val2 !== null)

but this code does:

if (typeof(val1) === 'object' && typeof(val2) === 'object' && val1 !== null && val2 !== null)
How is
typeof(val1) === 'object' && typeof(val2) === 'object'
different from:
typeof(val1) === typeof(val2) === 'object'
I tried asking ChatGPT; this was the answer, and it worked, but I want to know why:
It looks like the issue is with how the comparisons are structured. The expression typeof(val1) === typeof(val2) === 'object' doesn't work as intended because it evaluates left to right. Instead, try breaking it down like this:
if (typeof(val1) === 'object' && typeof(val2) === 'object' && val1 !== null && val2 !== null) {
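As an aside, the extra !== null guards in the corrected condition are needed because typeof null is 'object' in JavaScript. A minimal sketch (bothObjects is a hypothetical helper mirroring the corrected condition, not part of the original code):

```javascript
// typeof null is historically 'object', so a bare typeof check
// would wrongly accept null; the !== null guards exclude it.
console.log(typeof null); // 'object'
console.log(typeof {});   // 'object'

// hypothetical helper wrapping the corrected condition
function bothObjects(val1, val2) {
  return typeof val1 === 'object' && typeof val2 === 'object'
      && val1 !== null && val2 !== null;
}

console.log(bothObjects({}, {}));   // true
console.log(bothObjects({}, null)); // false, null is filtered out
```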
2 Answers
Because of operator precedence and associativity,

typeof(val1) === typeof(val2) === 'object'

is equivalent to

(typeof(val1) === typeof(val2)) === 'object'

This first evaluates typeof(val1) === typeof(val2), which returns either true or false. Neither of these booleans can be equal to the string 'object', so the whole condition is always false.
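The two parses can be demonstrated directly; since === is left-associative, the chained form compares a boolean against a string:

```javascript
const val1 = {};
const val2 = {};

// The chained form parses as (typeof val1 === typeof val2) === 'object':
// typeof val1 === typeof val2  ->  true  (a boolean)
// true === 'object'            ->  false (boolean vs string)
const chained = typeof val1 === typeof val2 === 'object';
console.log(chained); // false, even though both values are objects

// The explicit form compares each typeof result to the string:
const explicit = typeof val1 === 'object' && typeof val2 === 'object';
console.log(explicit); // true
```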
In the first case we get boolean && boolean, and both must be true for the condition to be satisfied.

In the second case we first compare "object" with "object" and get the boolean true, then compare that boolean with the string and get false. So the evaluation goes true -> false, and false && anything is false.
If you want to understand this better, study data types, and TypeScript.