Under the hood, when JavaScript applies a bitwise operator, it converts the operand to a signed 32-bit integer, performs the operation on those 32 bits, and then converts the result back to an ordinary JavaScript number.
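You can observe that 32-bit truncation directly in the console; a quick sketch:

(2 ** 32 + 5) | 0   // 5  -- bits above the 32nd are discarded
1.9 | 0             // 1  -- the fractional part is dropped as well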
So your input value 010001 becomes 00000000 00000000 00000000 00010001.
Then it is inverted:
~00000000 00000000 00000000 00010001 => 11111111 11111111 11111111 11101110
In hexadecimal, the inverted value is 0xFFFFFFEE, which, interpreted as a signed 32-bit integer, is -18. JavaScript therefore converts this signed 32-bit result back to an ordinary number with the value -18.
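You can verify each step yourself; a minimal sketch (the variable name x is just for illustration):

const x = 0b10001;            // 17
~x                            // -18
(~x >>> 0).toString(16)       // "ffffffee" -- the raw 32-bit pattern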
When JavaScript then prints it in base 2 (e.g. via toString(2)), it sees a negative number with magnitude 18 and prints -10010, since 10010 is the binary representation of positive 18.
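So the minus sign in the output comes from toString, not from a sign bit in the printed digits. If you want the actual two's-complement bit pattern instead, you can force an unsigned 32-bit interpretation with the zero-fill right shift; a sketch:

(-18).toString(2)             // "-10010"  -- sign plus the magnitude in binary
(~0b10001 >>> 0).toString(2)  // "11111111111111111111111111101110" -- all 32 bits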