I have an interesting problem handling 8-bit "ASCII" characters in LINQ-to-Entities, and I hope someone can give me a hint.
I inherited a SQL Server 2000 database that has some pseudo-encrypted columns, where the contents were simply XOR'd with 0xFF . I don't know why, and I know it's lame, but that's where we are now.
These columns have the SQL data type char(7) and char(14) . XORing with 0xFF sets the 8th bit of every byte, so you end up with non-ASCII characters (as Microsoft defines ASCII); for example, 'a' (0x61) becomes 0x9E. UTF-8 might seem like the obvious encoding here, but decoding with it gets corrupted.
I can read and decode these values as follows (a sketch of these steps follows the list):
- Get the field via LINQ as a String .
- Get a byte[] using System.Text.Encoding.GetEncoding(1252).GetBytes() .
- Decode by XORing each byte with 0xFF .
- Return the decoded string using System.Text.Encoding.GetEncoding(1252).GetString() .
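For clarity, here is a minimal sketch of those four steps; DecodeXorField is just an illustrative name, and the LINQ query that fetches the field is omitted:

    using System.Linq;
    using System.Text;

    static string DecodeXorField(string encodedFromLinq)
    {
        Encoding cp1252 = Encoding.GetEncoding(1252);

        // char(n) columns come back space-padded, so trim first.
        string encoded = encodedFromLinq.TrimEnd();

        // String -> raw bytes via code page 1252.
        byte[] encodedBytes = cp1252.GetBytes(encoded);

        // Decode by XORing every byte with 0xFF.
        byte[] decodedBytes = encodedBytes.Select(b => (byte)(b ^ 0xFF)).ToArray();

        // Bytes -> decoded string, again via code page 1252.
        return cp1252.GetString(decodedBytes);
    }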
This works great.
The problem I am facing is that I cannot write the ENCODED string back to SQL Server using LINQ.
I basically follow the reverse process and do:
- Get bytes using ASCIIEncoding.GetBytes() . (Code page 1252 is not needed here, since the plain text is straight ASCII.)
- Encode the bytes by XORing each one with 0xFF .
- Return the encoded string using GetEncoding(1252).GetString() (see the sketch below).
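Again as a sketch, under the same assumptions (EncodeXorField is an illustrative name, not the actual method):

    using System.Linq;
    using System.Text;

    static string EncodeXorField(string plainText)
    {
        // The plain text is straight ASCII, so ASCIIEncoding is sufficient here.
        byte[] plainBytes = Encoding.ASCII.GetBytes(plainText);

        // Encode by XORing every byte with 0xFF (this is what sets the high bit).
        byte[] encodedBytes = plainBytes.Select(b => (byte)(b ^ 0xFF)).ToArray();

        // Bytes -> encoded string, mirroring the decode path with code page 1252.
        return Encoding.GetEncoding(1252).GetString(encodedBytes);
    }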
If I look at the string in the debugger, it is exactly what I expect. But if I assign it to my entity and call SaveChanges() , the resulting value in SQL Server is always "?????" of some length.
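Roughly, the failing save looks like this, reusing the entity and column names that appear in the snippet in the update below, plus the hypothetical EncodeXorField sketched above:

    using (HCBPWEBEntities ent = new HCBPWEBEntities())
    {
        var user = (from u in ent.tblEmployer
                    where u.ipkEmpId == 357
                    select u).First();

        // The string looks correct in the debugger at this point...
        user.sKey = EncodeXorField("abcd");

        // ...but after SaveChanges() the column in SQL Server holds only "?" characters.
        ent.SaveChanges();
    }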
I am sure I am missing something here, but I have tried everything I can think of and cannot get it to work. For now, I have fallen back to the old way of using SqlCommand and doing the UPDATE with the encoded string as a SqlParameter . That works every time, no problem.
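For reference, the working fallback amounts to something like this (same assumed table and column names as above; connectionString is a placeholder):

    using System.Data.SqlClient;

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "UPDATE tblEmployer SET sKey = @key WHERE ipkEmpId = @id", conn))
    {
        cmd.Parameters.AddWithValue("@key", EncodeXorField("abcd"));
        cmd.Parameters.AddWithValue("@id", 357);
        conn.Open();
        cmd.ExecuteNonQuery();   // this path stores the encoded value correctly
    }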
Thanks in advance for any help.
Update:
I tried the suggestion from JamieSee, and I don't even get a clean decode with that method. Here is what I have:
static void Main(string[] args)
{
    Encoding characterEncoding = Encoding.GetEncoding(28591);
    HCBPWEBEntities ent = new HCBPWEBEntities();

    var encUser = (from users in ent.tblEmployer
                   where users.ipkEmpId == 357
                   select users.sKey).First();

    Console.Out.WriteLine("Original XOR Encoded PW: {0}", encUser.ToString().Trim());

    byte[] originalBytes = (from character in characterEncoding.GetBytes(encUser.ToString().Trim())
                            select (byte)(character)).ToArray();
    Console.Write("Original Bytes:\t");
    foreach (byte b in originalBytes)
    {
        Console.Write("{0:x} ", b);
    }
    Console.WriteLine(String.Empty);

    byte[] decodedBytes = (from character in characterEncoding.GetBytes(encUser.ToString().Trim())
                           select (byte)(character ^ 0xFF)).ToArray();
    Console.Write("Decoded Bytes:\t");
    foreach (byte b in decodedBytes)
    {
        Console.Write("{0:x} ", b);
    }
    Console.WriteLine(String.Empty);

    string decoded = characterEncoding.GetString(decodedBytes);
    Console.WriteLine("Decoded PW: {0}", decoded);

    ent.Dispose();
}
But the results of this are:
Original XOR Encoded PW: z?o>
Original Bytes: 7a 9d 6f 3e
Decoded Bytes:  85 62 90 c1
Decoded PW: ?b?Á
The actual password is "abcd".