Can anyone help spot the error? Here is the code:
byte[] oriBytes = { 0xB0, 0x2D };                     // oriBytes -> 0xB0, 0x2D
string oriInStr = Encoding.ASCII.GetString(oriBytes); // oriInStr -> "?-"
oriBytes = Encoding.ASCII.GetBytes(oriInStr);         // oriBytes -> 0x3F, 0x2D
I cannot get back to the original byte values 0xB0, 0x2D.
0xB0 is not a valid ASCII code. You can read here:
Any byte greater than hex 0x7F is decoded as a Unicode question mark ("?")
This is because .NET does not support the extended ASCII table. Every byte value above 127 is decoded as '?', whose character code is 63 (0x3F).
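If silent substitution is undesirable, you can request an ASCII encoding with exception fallbacks, so invalid bytes throw instead of quietly turning into '?'. A minimal sketch:

using System;
using System.Text;

// ASCII with fallbacks that throw instead of substituting '?'
Encoding strictAscii = Encoding.GetEncoding(
    "us-ascii",
    EncoderFallback.ExceptionFallback,
    DecoderFallback.ExceptionFallback);

byte[] oriBytes = { 0xB0, 0x2D };
try
{
    string s = strictAscii.GetString(oriBytes);
}
catch (DecoderFallbackException ex)
{
    Console.WriteLine(ex.Message); // reports that byte 0xB0 cannot be decoded
}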
If you use UTF8 instead, the conversion still does not round-trip: newBytes ends up with 4 bytes instead of 2:
byte[] oriBytes = { 0xB0, 0x2D };
string oriInStr = Encoding.UTF8.GetString(oriBytes); // 0xB0 is not valid UTF-8, so it becomes U+FFFD
byte[] newBytes = Encoding.UTF8.GetBytes(oriInStr);  // newBytes -> 0xEF, 0xBF, 0xBD, 0x2D
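To see where those 4 bytes come from, print them: 0xEF 0xBF 0xBD is the UTF-8 encoding of the replacement character U+FFFD. A quick sketch:

using System;
using System.Text;

byte[] oriBytes = { 0xB0, 0x2D };
string oriInStr = Encoding.UTF8.GetString(oriBytes);
byte[] newBytes = Encoding.UTF8.GetBytes(oriInStr);

Console.WriteLine(BitConverter.ToString(newBytes)); // EF-BF-BD-2D
Console.WriteLine((int)oriInStr[0]);                // 65533 (U+FFFD)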
In your array, 0xB0 is 176 and 0x2D is 45. ASCII only defines values up to 127, so 176 is decoded as '?' (undefined) while 45 is decoded as '-'. Once 0xB0 has been replaced by '?', the original value is gone, which is why the round trip cannot work.
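You can verify the character codes directly:

using System;
using System.Text;

byte[] oriBytes = { 0xB0, 0x2D };
string s = Encoding.ASCII.GetString(oriBytes); // "?-"
Console.WriteLine((int)s[0]);                  // 63 -> '?'
Console.WriteLine((int)s[1]);                  // 45 -> '-'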
ahaah.. got it! I'll use Encoding.Unicode instead of ASCII then... ;)
byte[] oriBytes = { 0xB0, 0x2D };                       // oriBytes -> 0xB0, 0x2D
string oriInStr = Encoding.Unicode.GetString(oriBytes); // the two bytes form one UTF-16 char, U+2DB0
oriBytes = Encoding.Unicode.GetBytes(oriInStr);         // oriBytes -> 0xB0, 0x2D
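Note that this round-trips only because the two bytes happen to form one valid UTF-16 code unit; an odd-length array, or bytes that decode to an unpaired surrogate (0xD800-0xDFFF), would again be replaced and lost. If the goal is just a reversible string representation of arbitrary bytes, Base64 is the safe choice. A minimal sketch:

using System;

byte[] oriBytes = { 0xB0, 0x2D };
string s = Convert.ToBase64String(oriBytes);    // "sC0="
byte[] back = Convert.FromBase64String(s);      // back -> 0xB0, 0x2D
Console.WriteLine(BitConverter.ToString(back)); // B0-2D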
.NET simply does not support extended ASCII. If you want the round trip anyway, you can build the string yourself by casting each byte to char, and cast each char back to byte afterwards:
byte[] oriBytes = { 0xB0, 0x2D };
string oriInStr = "";
for (int a = 0; a < oriBytes.Length; a++)
    oriInStr += (char)oriBytes[a];       // oriInStr -> "°-"

// Cast each char back to byte; Encoding.ASCII.GetBytes would turn '°' into 0x3F again
byte[] newBytes = new byte[oriInStr.Length];
for (int a = 0; a < oriInStr.Length; a++)
    newBytes[a] = (byte)oriInStr[a];     // newBytes -> 0xB0, 0x2D
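On .NET 5 and later (an assumption about your target framework), Encoding.Latin1 performs exactly this one-byte-per-char mapping for you:

using System.Text;

byte[] oriBytes = { 0xB0, 0x2D };
string oriInStr = Encoding.Latin1.GetString(oriBytes); // "°-" (Encoding.Latin1 requires .NET 5+)
byte[] back = Encoding.Latin1.GetBytes(oriInStr);      // back -> 0xB0, 0x2D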