First of all, the post you are linking to is only about random numbers used for security purposes. So it does not claim that Random is bad for non-security purposes.
But I do claim that it is. The .NET 4 implementation of Random has several flaws. I would only use it if you do not care much about the quality of your random numbers; otherwise I recommend a better third-party implementation.
Flaw 1: Seeding
The parameterless constructor seeds from the current time. So all Random instances created with the parameterless constructor within a short time window (about 10 ms) return the same sequence. This is documented and "by design". It is especially annoying when you want to multithread your code, because you cannot simply create a Random instance at the start of each thread.
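A minimal sketch of the problem (assuming .NET Framework 4, where the parameterless constructor is seeded from Environment.TickCount; newer runtimes may behave differently):

    Random a = new Random();
    Random b = new Random();   // created within the same timer tick (~10-15 ms)

    // Both instances will typically print the same three numbers.
    Console.WriteLine("{0} {1} {2}", a.Next(), a.Next(), a.Next());
    Console.WriteLine("{0} {1} {2}", b.Next(), b.Next(), b.Next());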
The workaround is to be extra careful when using the parameterless constructor and to seed manually where necessary.
Another problem is that the seed space is rather small (31 bits). So if you create 50k instances of Random with truly random seeds, you will probably get one sequence of random numbers twice (due to the birthday paradox). So manual seeding is not easy to get right either.
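One possible workaround (just a sketch, not how the class itself works): derive each seed from a cryptographic RNG and give every thread its own instance. RandomFactory is a hypothetical helper name, and the 31-bit seed space limitation still applies.

    using System;
    using System.Security.Cryptography;
    using System.Threading;

    static class RandomFactory
    {
        // RNGCryptoServiceProvider is documented as thread safe, so one shared
        // instance can hand out seeds to all threads.
        private static readonly RNGCryptoServiceProvider SeedSource =
            new RNGCryptoServiceProvider();

        public static Random Create()
        {
            byte[] buffer = new byte[4];
            SeedSource.GetBytes(buffer);
            return new Random(BitConverter.ToInt32(buffer, 0));
        }
    }

    // Usage: each thread lazily gets its own, independently seeded Random.
    ThreadLocal<Random> threadRandom = new ThreadLocal<Random>(RandomFactory.Create);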
Flaw 2: The distribution of random numbers returned by Next(int maxValue) is biased
There are parameters for which Next(int maxValue) is clearly non-uniform. For example, if you compute r.Next(1431655765) % 2, you get 0 for about 2/3 of the samples. This is presumably because Next(maxValue) scales a 31-bit internal sample down by multiplication, so with maxValue at about 2/3 of int.MaxValue each even result has two internal values mapping to it while each odd result has only one. (Sample code at the end of the answer.)
Flaw 3: The NextBytes() method is inefficient
The cost per byte of NextBytes() is about the same as the cost of generating a full integer sample with Next(). From this I suspect that they create one sample for each byte.
A better implementation that uses 3 bytes out of each sample would speed NextBytes() up by a factor of almost 3.
Because of this flaw, Random.NextBytes() is only about 25% faster than System.Security.Cryptography.RNGCryptoServiceProvider.GetBytes on my machine (Win7; Core i3, 2600 MHz).
I am sure that if somebody inspected the source / decompiled bytecode, they would find even more flaws than I found with my black-box analysis.
Code examples
r.Next(0x55555555) % 2 is strongly biased:

    Random r = new Random();
    const int mod = 2;
    int[] hist = new int[mod];
    for (int i = 0; i < 10000000; i++)
    {
        int num = r.Next(0x55555555);
        int num2 = num % 2;
        hist[num2]++;
    }
    for (int i = 0; i < mod; i++)
        Console.WriteLine(hist[i]);
Performance:
    byte[] bytes = new byte[8 * 1024];
    var cr = new System.Security.Cryptography.RNGCryptoServiceProvider();
    Random r = new Random();

    // Random.NextBytes
    for (int i = 0; i < 100000; i++)
    {
        r.NextBytes(bytes);
    }

    // One sample per byte
    for (int i = 0; i < 100000; i++)
    {
        for (int j = 0; j < bytes.Length; j++)
            bytes[j] = (byte)r.Next();
    }

    // One sample per 3 bytes
    for (int i = 0; i < 100000; i++)
    {
        for (int j = 0; j + 2 < bytes.Length; j += 3)
        {
            int num = r.Next();
            bytes[j + 2] = (byte)(num >> 16);
            bytes[j + 1] = (byte)(num >> 8);
            bytes[j] = (byte)num;
        }
        // Yes, I know I'm not handling the last few bytes, but that won't
        // have a noticeable impact on performance.
    }

    // Crypto
    for (int i = 0; i < 100000; i++)
    {
        cr.GetBytes(bytes);
    }