Int32 storage in memory

I have a question about int32 storage (c#).

32 bits means that the largest number for an int is 2^32.

2^32 = 4294967296. If you divide that by 2, you get the maximum value for Int32:

4294967296 / 2 = 2147483648, so the range is -2147483648 to 2147483648

So I thought half of the bits were for negative numbers and the other half for positive ones. But that cannot be true, because then the maximum would only be 2^16 = 65536.

Now my question is:

How is it actually laid out in memory?

I am really interested to know your answers.

1 answer

Only one bit is used for the sign (negative or positive).

An Int32 uses 31 bits for the value and 1 bit for the sign. See the Int32 documentation on MSDN for details.

Int32.MaxValue =  2^31 - 1 = 01111111111111111111111111111111                
Int32.MinValue = -2^31     = 10000000000000000000000000000000
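
As a quick check, here is a minimal C# sketch that prints these limits and their bit patterns (the class name is arbitrary); Convert.ToString with base 2 produces the stored two's complement bits:

using System;

class Int32Limits
{
    static void Main()
    {
        // int.MaxValue = 2^31 - 1, int.MinValue = -2^31
        Console.WriteLine(int.MaxValue);                      // 2147483647
        Console.WriteLine(Convert.ToString(int.MaxValue, 2)); // 31 ones
        Console.WriteLine(int.MinValue);                      // -2147483648
        Console.WriteLine(Convert.ToString(int.MinValue, 2)); // 1 followed by 31 zeros
    }
}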

Negative numbers are stored in two's complement form.

Take the value 0xFFFFFFFF. In binary, that is:

1111 1111 1111 1111 1111 1111 1111 1111

What number is this? Look at the most significant (leftmost) bit: if it is 1, the number is negative; if it is 0, the number is zero or positive. Here the leftmost bit is 1, so this is a negative number.

So which negative number is it? To find out, undo the two's complement encoding: invert all the bits (0 becomes 1, 1 becomes 0) and then add 1.

Inverting all the bits gives:

0000 0000 0000 0000 0000 0000 0000 0000

Then add 1:

0000 0000 0000 0000 0000 0000 0000 0001

So the magnitude of 0xFFFFFFFF is 0x00000001, that is, 1. Therefore 0xFFFFFFFF represents -1.
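
A minimal C# sketch to verify this: reinterpreting the all-ones pattern as a signed int yields -1, and the invert-and-add-1 rule shows up as ~n + 1 == -n (the names x and n are just for illustration):

using System;

class TwosComplementDemo
{
    static void Main()
    {
        // Reinterpret the bit pattern 0xFFFFFFFF as a signed Int32.
        int x = unchecked((int)0xFFFFFFFF);
        Console.WriteLine(x);            // -1

        // Invert all bits and add 1: ~n + 1 equals -n for any int n.
        int n = 1;
        Console.WriteLine(~n + 1);       // -1
        Console.WriteLine(~n + 1 == -n); // True
    }
}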

It works the same way in the other direction. To store -30 in two's complement, start with the binary representation of 30:

0000 0000 0000 0000 0000 0000 0001 1110

Invert all the bits:

1111 1111 1111 1111 1111 1111 1110 0001

Add 1:

1111 1111 1111 1111 1111 1111 1110 0010

So -30 is stored as 0xFFFFFFE2.
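
The same round trip as a small C# sketch; Convert.ToString with base 2 prints the stored bit pattern:

using System;

class MinusThirtyDemo
{
    static void Main()
    {
        // The stored bit pattern of -30 is the two's complement of 30.
        Console.WriteLine(Convert.ToString(-30, 2));
        // 11111111111111111111111111100010  (= 0xFFFFFFE2)

        // Reading 0xFFFFFFE2 back as a signed int gives -30.
        Console.WriteLine(unchecked((int)0xFFFFFFE2)); // -30
    }
}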

Why store negative numbers this way?

Because the CPU can then use the same adder circuitry for subtraction as for addition. Here is an 8-bit example: to compute 7 - 4, we add 7 to the two's complement of 4.

7 = 00000111
4 = 00000100
Two's complement of 4:

Step 1: invert the bits of 00000100 (0 becomes 1, 1 becomes 0)

00000100 -> 11111011

Step 2: add 1

11111011
00000001
========
11111100

7 - 4 = 7 + (two's complement of 4)

00000111 (binary representation of 7)
11111100 (two's complement of 4)
========
00000011 (binary representation of 3; the carry out of the top bit is discarded)
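
Here is the same 8-bit subtraction as a C# sketch, using byte arithmetic so that the carry out of the top bit is discarded by the cast back to byte:

using System;

class EightBitSubtraction
{
    static void Main()
    {
        byte a = 7; // 00000111
        byte b = 4; // 00000100

        // Two's complement of b in 8 bits: invert the bits, add 1.
        byte negB = unchecked((byte)(~b + 1)); // 11111011 + 1 = 11111100

        // 7 - 4 computed as 7 + (two's complement of 4); casting back
        // to byte keeps only the low 8 bits, dropping the carry.
        byte result = unchecked((byte)(a + negB));
        Console.WriteLine(Convert.ToString(result, 2).PadLeft(8, '0')); // 00000011
        Console.WriteLine(result); // 3
    }
}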
