How can two JavaScript objects be equal and not equal at the same time?

The following example compares two JavaScript objects, but the return values confuse me.

var i = new Object();
var j = new Object();

i==j false

i!=j true

i>=j true

i<=j true

i>j false

i<j false

How are these values defined? I am having trouble understanding.


Here are the reasons:

i==j false //== compares references, and i and j refer to two different objects

i!=j true  //Same reason: the two references differ

i>=j true  //For relational operators, both objects are first converted
           //to primitives, so ToPrimitive(i) >= ToPrimitive(j) is
           //evaluated as "[object Object]" >= "[object Object]".
           //That is why the result here is true.

i<=j true  //Similar to the >= case

i>j false  //Similar to the >= case: "[object Object]" is not
           //strictly greater than itself

i<j false  //Similar to the >= case

i<-j false //This is parsed as i < (-j). Unary minus applies ToNumber,
           //so -j becomes NaN ("[object Object]" is not numeric).
           //Any relational comparison involving NaN is false,
           //as per the abstract relational comparison algorithm
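A quick sketch (plain Node.js, nothing assumed beyond the standard language) confirming the string-conversion reasoning above:

```javascript
var i = new Object();
var j = new Object();

// Both objects convert to the same primitive string.
console.log(String(i)); // "[object Object]"
console.log(String(j)); // "[object Object]"

// Relational operators compare those equal strings.
console.log(i >= j); // true
console.log(i <= j); // true
console.log(i > j);  // false
console.log(i < j);  // false

// == and != compare references, which differ.
console.log(i == j); // false
console.log(i != j); // true
```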

You mentioned that i<-j would evaluate to true. But that is wrong; it evaluates to false. See the reasons above.
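A small sketch of the i<-j case: unary minus applies ToNumber, which yields NaN for "[object Object]", and every relational comparison involving NaN is false:

```javascript
var i = new Object();
var j = new Object();

console.log(Number(j));        // NaN ("[object Object]" is not numeric)
console.log(Number.isNaN(-j)); // true

// All relational comparisons with NaN are false.
console.log(i < -j);  // false
console.log(i > -j);  // false
console.log(i >= -j); // false
console.log(i <= -j); // false
```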


Objects are compared by reference, not by value, so two distinct objects are never equal, even if they have identical contents. Two variables compare equal only when they refer to the same object. For example:

var a = {};
var b = a;

a == b

Here the comparison returns true, because a and b refer to the same object.
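A short sketch of reference semantics: equality holds only when both variables point at the same object, and mutation through one reference is visible through the other:

```javascript
var a = {};
var b = a;   // b refers to the same object as a
var c = {};  // a distinct object with identical contents

console.log(a == b);  // true  (same reference)
console.log(a === b); // true
console.log(a == c);  // false (different objects)

// One object, two names: changes are shared.
b.x = 42;
console.log(a.x); // 42
```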

