JavaScript bitwise masking

This question is similar to this other question; however, I would like to understand why it behaves the way it does.

The following code:

console.log((parseInt('0xdeadbeef', 16) & parseInt('0x000000ff', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0x0000ff00', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0x00ff0000', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0xff000000', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0x000000ff', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0x0000ffff', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0x00ffffff', 16)).toString(16));
console.log((parseInt('0xdeadbeef', 16) & parseInt('0xffffffff', 16)).toString(16));

returns:

 ef
 be00
 ad0000
 -22000000
 ef
 beef
 adbeef
 -21524111

whereas what I expect from .toString(16) is:

 ef
 be00
 ad0000
 de000000
 ef
 beef
 adbeef
 deadbeef

What is going on with this?

Thank you in advance for your help.


Thanks to the respondents and commenters below, as well as to the sources cited in the code comments:

Here is a working solution, built around utility functions that convert a radix-16 string to and from a signed 32-bit integer:

 // Convert 'x' to a signed 32-bit integer, treating 'x' as a radix-16 number
 // cf http://speakingjs.com/es5/ch11.html
 function toInt32Radix16(x) {
     return (parseInt(x, 16) | 0);
 }

 // Convert a signed 32-bit integer 'x' to a radix-16 number
 // cf /questions/791168/javascript-c-style-type-cast-from-signed-to-unsigned
 function toRadix16int32(x) {
     return ((x >>> 0).toString(16));
 }

 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x000000ff')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x0000ff00')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x00ff0000')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0xff000000')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x000000ff')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x0000ffff')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0x00ffffff')));
 console.log(toRadix16int32(toInt32Radix16('0xdeadbeef') & toInt32Radix16('0xffffffff')));

Which gives the expected result:

 ef
 be00
 ad0000
 de000000
 ef
 beef
 adbeef
 deadbeef

This was good training for me on how JavaScript handles integers in bitwise operations.

2 answers

In JavaScript, all bitwise operations (& among them) return a signed 32-bit integer as a result, in the range -2^31 through 2^31 - 1 inclusive. So that extra top bit (0xde000000 is greater than 0x7fffffff) now represents the sign, meaning that you get a negative value instead.
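For instance, a minimal sketch with plain numeric literals (equivalent to the parseInt calls in the question) shows where the flip happens:

 console.log(0x7fffffff | 0);          // 2147483647 -- largest positive signed 32-bit value
 console.log(0x80000000 | 0);          // -2147483648 -- wraps to the minimum
 // 0xde000000 has the top bit set, so the AND result comes back negative
 console.log(0xdeadbeef & 0xff000000); // -570425344 (-0x22000000)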

One possible fix:

 var r = 0xdeadbeef & 0xff000000;
 if (r < 0) {
     r += (1 << 30) * 4;
 }
 console.log(r.toString(16)); // 'de000000'
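For comparison, a minimal sketch of the same correction using the unsigned right shift that the question's update relies on:

 // '>>> 0' reinterprets the signed 32-bit result as an unsigned value,
 // which is what adding (1 << 30) * 4 (i.e. 2^32) achieves above.
 var r2 = 0xdeadbeef & 0xff000000;
 console.log((r2 >>> 0).toString(16)); // 'de000000'

The original fix writes 2^32 as (1 << 30) * 4 because shift counts in JavaScript are taken modulo 32, so 1 << 32 would simply evaluate to 1.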

Bitwise operations (e.g. &) are performed on signed 32-bit integers in JavaScript, so 0xFFFFFFFF, where all the bits are set, is actually -1, and 0xde000000, the value you expect, is actually -570425344.
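A quick sketch to check those values in a console:

 console.log(0xffffffff | 0); // -1
 console.log(0xde000000 | 0); // -570425344
 // Reinterpreting the result as unsigned recovers the expected hex digits:
 console.log(((0xdeadbeef & 0xff000000) >>> 0).toString(16)); // 'de000000'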

