I have an if statement in JavaScript. The problem is fixed now, but I don't know why.
When the parameters val1 and val2 are objects, this code doesn't work:

if (typeof(val1) === typeof(val2) === 'object' && val1 !== null && val2 !== null) 

but this code does:

 if (typeof(val1) === 'object' && typeof(val2) === 'object' && val1 !== null && val2 !== null)

How is

typeof(val1) === 'object' && typeof(val2) === 'object' 

different from:

typeof(val1) === typeof(val2) === 'object'

I tried asking ChatGPT; this is its answer, and it worked, but I want to know why:

It looks like the issue is with how the comparisons are structured. The expression typeof(val1) === typeof(val2) === 'object' doesn't work as intended because it evaluates left to right. Instead, try breaking it down like this:

if (typeof(val1) === 'object' && typeof(val2) === 'object' && val1 !== null && val2 !== null) {
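
For concreteness, a quick check (the object values here are arbitrary) shows the two conditions disagree even when both arguments are plain objects:

const a = { x: 1 };
const b = { y: 2 };

console.log(typeof a === 'object' && typeof b === 'object'); // true
console.log(typeof a === typeof b === 'object');             // false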

2 Answers


  1. Because of operator precedence and associativity,

    typeof(val1) === typeof(val2) === 'object'
    

    is equivalent to

    (typeof(val1) === typeof(val2)) === 'object'
    

    This first evaluates typeof(val1) === typeof(val2), which returns either true or false. Neither of these can be equal to the string 'object', so the whole condition is always false.
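
    A minimal sketch of the evaluation steps (variable names are illustrative):

    const a = {};
    const b = {};

    console.log(typeof a === typeof b);                // true ('object' === 'object')
    console.log(true === 'object');                    // false: a boolean is never strictly equal to a string
    console.log((typeof a === typeof b) === 'object'); // false, no matter what a and b are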

  2. typeof(val1) === 'object' && typeof(val2) === 'object'
    
    typeof(val1) === typeof(val2) === 'object'
    

    In the first case we get boolean && boolean, and both comparisons must be true for the condition to be satisfied.
    In the second case we first compare "object" with "object" and get the boolean true, and then compare that boolean with the string 'object' and get false.

    if (typeof(val1) === typeof(val2) === 'object' && val1 !== null && val2 !== null)

    Here the first comparison yields true (or false), comparing that boolean to 'object' yields false, and false && anything is false, so the whole condition is always false.
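
    As an aside, the val1 !== null and val2 !== null checks are there because typeof null is 'object', a long-standing JavaScript quirk. A minimal illustration:

    console.log(typeof null);              // 'object'
    console.log(typeof null === 'object'); // true, even though null is not an object you can use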

    If you want to understand this better, study data types and TypeScript.
