C# System.Security.Cryptography.HMACSHA1.ComputeHash() does not return the expected result

I am trying to implement an OTP solution in C# based on RFC 4226: http://tools.ietf.org/html/rfc4226

I found an example implementation, and it looks like this:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Security.Cryptography;

    namespace OTP
    {
        class Program
        {
            static void Main(string[] args)
            {
                System.Text.UTF8Encoding encoding = new System.Text.UTF8Encoding();
                byte[] secretKey = encoding.GetBytes("12345678901234567890");
                byte[] counter = encoding.GetBytes("1");

                Console.WriteLine(CalculateHotp(secretKey, counter));
                Console.ReadKey();
            }

            public static int CalculateHotp(byte[] key, byte[] counter)
            {
                var hmacsha1 = new HMACSHA1(key);
                byte[] hmac_result = hmacsha1.ComputeHash(counter);

                // Dynamic truncation (RFC 4226, section 5.3)
                int offset = hmac_result[19] & 0x0f;
                int bin_code = (hmac_result[offset] & 0x7f) << 24
                               | (hmac_result[offset + 1] & 0xff) << 16
                               | (hmac_result[offset + 2] & 0xff) << 8
                               | (hmac_result[offset + 3] & 0xff);

                // 6-digit HOTP value
                int hotp = bin_code % 1000000;
                return hotp;
            }
        }
    }

The problem is that the call:

 byte[] hmac_result = hmacsha1.ComputeHash(counter); 

does not return the expected result, and therefore the returned OTP is incorrect. RFC 4226 Appendix D (http://tools.ietf.org/html/rfc4226#appendix-D) provides test values to check against, and my result does not match them:

    From RFC 4226, Appendix D:

    The following test data uses the ASCII string "12345678901234567890" for the secret:

    Secret = 0x3132333435363738393031323334353637383930

    Table 1 details for each count the intermediate HMAC value.

    Count    Hexadecimal HMAC-SHA-1(secret, count)
    0        cc93cf18508d94934c64b65d8ba7667fb7cde4b0
    1        75a48a19d4cbe100644e8ac1397eea747a2d33ab
    2        0bacb7fa082fef30782211938bc1c5e70416ff44
    3        66c28227d03a2d5529262ff016a1e6ef76557ece
    4        a904c900a64b35909874b33e61c5938a8e15ed1c
    <snip>

    Table 2 details for each count the truncated values (both in hexadecimal
    and decimal) and then the HOTP value.

             Truncated
    Count    Hexadecimal    Decimal        HOTP
    0        4c93cf18       1284755224     755224
    1        41397eea       1094287082     287082
    2        82fef30        137359152      359152
    3        66ef7655       1726969429     969429
    4        61c5938a       1640338314     338314
    <snip>

Since I used "12345678901234567890" as the key and "1" as the counter in my example above, I would expect ComputeHash() to return 75a48a19d4cbe100644e8ac1397eea747a2d33ab and the OTP to be 287082.

But I get OTP: 906627
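
To rule out the truncation step itself, here is a minimal sketch (my own check, not from the RFC) that applies the same dynamic truncation to the expected HMAC value for count 1, taken from the table above:

    using System;
    using System.Linq;

    class TruncationCheck
    {
        static void Main()
        {
            // Expected HMAC-SHA-1 value for count 1 (RFC 4226, Appendix D).
            string hex = "75a48a19d4cbe100644e8ac1397eea747a2d33ab";
            byte[] hmac = Enumerable.Range(0, hex.Length / 2)
                                    .Select(i => Convert.ToByte(hex.Substring(i * 2, 2), 16))
                                    .ToArray();

            // Same dynamic truncation as in CalculateHotp above.
            int offset = hmac[19] & 0x0f;              // low nibble of the last byte -> 11
            int binCode = (hmac[offset] & 0x7f) << 24
                          | (hmac[offset + 1] & 0xff) << 16
                          | (hmac[offset + 2] & 0xff) << 8
                          | (hmac[offset + 3] & 0xff); // 0x41397eea = 1094287082

            Console.WriteLine(binCode % 1000000);      // prints 287082
        }
    }

That check prints 287082, matching the RFC table, so the truncation logic appears fine; the mismatch has to be in the HMAC computation itself.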

I really can't understand what I'm doing wrong here. Has anyone successfully implemented counter-based OTP in C# using the HMACSHA1 class?

1 answer

You are using the counter incorrectly. The counter should not be an ASCII string; it should be an 8-byte big-endian encoding of a numeric (long) value.

Use

 var counter = new byte[] { 0, 0, 0, 0, 0, 0, 0, 1 }; 

for this test, and your code will return the correct OTP.
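
The counter is the HOTP "moving factor", which RFC 4226 defines as an 8-byte value in big-endian (network) byte order. As a minimal sketch of how to build that array from a numeric counter (the helper name is just for illustration):

    using System;

    static class HotpCounter
    {
        // Encode a long counter as the 8-byte big-endian moving factor required by RFC 4226.
        public static byte[] ToBytes(long counter)
        {
            byte[] bytes = BitConverter.GetBytes(counter);  // little-endian on typical platforms
            if (BitConverter.IsLittleEndian)
                Array.Reverse(bytes);                       // flip to big-endian (network order)
            return bytes;
        }
    }

With that in place, CalculateHotp(secretKey, HotpCounter.ToBytes(1)) should produce 287082 for the RFC test secret.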
