What algorithm does CreditCardAttribute use to check the format of a credit card number

.NET 4.5 includes a new validation attribute called CreditCardAttribute, which indicates that a data field value is a credit card number. When I decompile the assembly that contains this class, I can see the following code checking the credit card number:

    public override bool IsValid(object value)
    {
        if (value == null)
        {
            return true;
        }
        string text = value as string;
        if (text == null)
        {
            return false;
        }
        text = text.Replace("-", "");
        text = text.Replace(" ", "");
        int num = 0;
        bool flag = false;
        foreach (char current in text.Reverse<char>())
        {
            if (current < '0' || current > '9')
            {
                return false;
            }
            int i = (current - '0') * (flag ? 2 : 1);
            flag = !flag;
            while (i > 0)
            {
                num += i % 10;
                i /= 10;
            }
        }
        return num % 10 == 0;
    }
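For reference, here is a minimal sketch of how this attribute might be used; the PaymentInfo model and the IsValidCard helper are my own illustration, not code from the framework or the original question:

```csharp
using System;
using System.ComponentModel.DataAnnotations;

// Hypothetical model: [CreditCard] marks the property as a credit card number.
public class PaymentInfo
{
    [CreditCard]
    public string CardNumber { get; set; }
}

public static class Demo
{
    // The attribute can also be invoked directly via ValidationAttribute.IsValid.
    public static bool IsValidCard(string number)
    {
        var attr = new CreditCardAttribute();
        return attr.IsValid(number);
    }

    public static void Main()
    {
        // Dashes and spaces are stripped before the check, per the decompiled code.
        Console.WriteLine(Demo.IsValidCard("4012-8888-8888-1881")); // a well-known test number; prints True
        Console.WriteLine(Demo.IsValidCard("1234-5678-9012-3456")); // prints False
    }
}
```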

Does anyone know which algorithm is used here to check the format of the numbers? Is it the Luhn algorithm? Also, is it an ISO standard? Finally, is this a correct, 100% reliable implementation?

MSDN does not provide much information about this. In fact, it contains incorrect information, as shown below:

Remarks

The value is checked using a regular expression. The class does not confirm that the credit card number is valid for purchases, only that it is well-formed.

c# algorithm validation credit-card
2 answers

The last line:

 return num % 10 == 0; 

is a very strong hint that this is the Luhn algorithm.


This is indeed the Luhn algorithm. Unfortunately, not all card numbers can be checked with it, so it is not a 100% method. However, Mastercard and Visa card numbers that allow keyed entry must pass this check.
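As a sketch, the Luhn check in the decompiled code can be written more directly like this; this is a standalone reimplementation for illustration, not the framework's code:

```csharp
using System;
using System.Linq;

public static class Luhn
{
    // Returns true if the digit string passes the Luhn (mod 10) check.
    public static bool IsValid(string number)
    {
        int sum = 0;
        bool doubleIt = false; // every second digit, counted from the right, is doubled

        foreach (char c in number.Reverse())
        {
            if (c < '0' || c > '9')
            {
                return false;
            }

            int digit = (c - '0') * (doubleIt ? 2 : 1);
            doubleIt = !doubleIt;

            // A doubled digit >= 10 contributes the sum of its two digits
            // (e.g. 14 -> 1 + 4 = 5); digit - 9 gives the same result and
            // matches what the inner while-loop in the decompiled code computes.
            sum += digit > 9 ? digit - 9 : digit;
        }
        return sum % 10 == 0;
    }
}
```

The `digit - 9` shortcut works because for a doubled digit in the range 10..18, the sum of its two digits is always the value minus 9.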

The only 100% reliable way to check a card number is to complete a transaction. Typically, the host-system protocols used for PoS connections include provisions for confirming that the card is not on a stop list and exists in the routing tables.

