.NET 4.5 includes a new validation attribute called CreditCardAttribute, which indicates that a data field value is a credit card number. When I decompile the assembly that contains this class, I see the following code for checking the credit card number:
public override bool IsValid(object value)
{
    if (value == null)
    {
        return true;
    }
    string text = value as string;
    if (text == null)
    {
        return false;
    }
    text = text.Replace("-", "");
    text = text.Replace(" ", "");
    int num = 0;
    bool flag = false;
    foreach (char current in text.Reverse<char>())
    {
        if (current < '0' || current > '9')
        {
            return false;
        }
        int i = (int)((current - '0') * (flag ? '\u0002' : '\u0001'));
        flag = !flag;
        while (i > 0)
        {
            num += i % 10;
            i /= 10;
        }
    }
    return num % 10 == 0;
}
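For comparison, here is a minimal standalone sketch of the same mod-10 check (doubling every second digit from the right and requiring the digit sum to be divisible by 10), restated in plain C#. The class name Mod10Check and the sample numbers are purely illustrative and are not part of the framework:

using System;
using System.Linq;

static class Mod10Check
{
    // Strips dashes and spaces, then walks the digits from right to left,
    // doubling every second digit and summing the digit sums; the number
    // is considered well-formed when the total is divisible by 10.
    public static bool IsWellFormed(string number)
    {
        if (number == null)
        {
            return false;
        }

        string digits = number.Replace("-", "").Replace(" ", "");
        int sum = 0;
        bool doubleIt = false;

        foreach (char c in digits.Reverse())
        {
            if (c < '0' || c > '9')
            {
                return false;
            }

            int d = (c - '0') * (doubleIt ? 2 : 1);
            doubleIt = !doubleIt;

            // Doubling can produce a two-digit value (e.g. 7 * 2 = 14),
            // in which case both digits are added (1 + 4).
            sum += d / 10 + d % 10;
        }

        return sum % 10 == 0;
    }

    static void Main()
    {
        // "4111 1111 1111 1111" is a commonly used well-formed test number.
        Console.WriteLine(IsWellFormed("4111 1111 1111 1111")); // True
        Console.WriteLine(IsWellFormed("4111 1111 1111 1112")); // False
    }
}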
Does anyone know which algorithm is used here to validate the number format? Is it the Luhn algorithm? Also, is it an ISO standard? Finally, do you think this is a complete and 100% correct implementation?
MSDN does not provide much information about this. In fact, it contains incorrect information, as shown below:
Remarks
The value is checked using a regular expression. The class does not confirm that the credit card number is valid for purchases, only that it is well-formed.
tugberk