Why use the C# System.Random class in general, and not System.Security.Cryptography.RandomNumberGenerator?

Why would anyone use the "standard" random number generator from System.Random at all, rather than always using the cryptographically secure random number generator from System.Security.Cryptography.RandomNumberGenerator (or its subclasses, since RandomNumberGenerator is abstract)?

Nate Lawson tells us in his Google Tech Talk presentation "Crypto Strikes Back", at minute 13:11, not to use the "standard" random number generators from Python, Java and C#, and instead to use a cryptographically secure version.

I know the difference between the two kinds of random number generators (see question 101337 ).

But what is the rationale for not always using the secure random number generator? Why use System.Random at all? Is it performance, perhaps?

+60
c# random cryptography
Aug 10 '09 at 21:25
13 answers

Speed and intent. If you are generating a random number and have no need for security, why use a slow crypto function? You don't need security, so why make someone else think the number might be used for something secure when it won't be?

+105
Aug 10 '09 at 21:29

In addition to speed and a more convenient interface ( NextDouble() , etc.), it also makes it possible to produce a repeatable pseudo-random sequence by using a fixed seed. This is very useful, among other things, during testing.

```csharp
Random gen1 = new Random();  // auto-seeded from the clock
Random gen2 = new Random(0); // fixed seed: gen2.Next(10) always yields 7,8,7,5,2,...
```
+55
Aug 10 '09 at 21:30

First of all, the talk you are linking to is only about random numbers used for security purposes. So it does not claim that Random is bad for non-security purposes.

But I claim that it is. The .NET 4 implementation of Random has several flaws. I would recommend using it only if you do not care about the quality of your random numbers, and using a better third-party implementation otherwise.

Flaw 1: Seeding

The default constructor seeds with the current time. Thus, all Random instances created via the default constructor within a short time window (about 10 ms) return the same sequence. This is documented and "by design". It is especially annoying if you want to multithread your code, because you cannot simply create a Random instance at the start of each thread's execution.

The workaround is to be especially careful when using the default constructor, and to seed manually where necessary.
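A sketch of one such manual-seeding workaround (my own illustration, not from the original answer): derive each instance's seed from a shared atomic counter, so that Random objects created within the same clock tick, even on different threads, still receive distinct seeds.

```csharp
using System;
using System.Threading;

static class ThreadSafeSeeder
{
    // A single global counter: instances created within the same
    // ~10 ms clock tick still receive distinct seeds.
    private static int counter = Environment.TickCount;

    public static Random NewRandom()
    {
        return new Random(Interlocked.Increment(ref counter));
    }
}
```

Note that this only avoids time-based collisions; the small seed space discussed next still applies.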

Another problem is that the seed space is quite small (31 bits). So if you create 50,000 instances of Random with completely random seeds, you will probably get one sequence of random numbers twice (due to the birthday paradox). Thus, manual seeding is not easy to get right either.

Flaw 2: The distribution of random numbers returned by Next(int maxValue) is biased

There are parameters for which Next(int maxValue) is clearly non-uniform. For example, if you compute r.Next(1431655765) % 2 , you get 0 in about 2/3 of the samples. (Sample code at the end of the answer.)

Flaw 3: The NextBytes() method is inefficient.

The per-byte cost of NextBytes() is about the same as the cost of generating a full integer sample with Next() . From this I suspect that it generates one sample for each byte.

A better implementation that used 3 bytes from each sample would speed NextBytes() up by almost a factor of 3.

Because of this flaw, Random.NextBytes() is only about 25% faster than System.Security.Cryptography.RNGCryptoServiceProvider.GetBytes on my machine (Win7, Core i3 2600MHz).

I am sure that if someone inspected the source or the decompiled code, they would find even more flaws than I found with my black-box analysis.




Code examples

Showing that r.Next(0x55555555) % 2 is strongly biased:

```csharp
Random r = new Random();
const int mod = 2;
int[] hist = new int[mod];
for (int i = 0; i < 10000000; i++)
{
    int num = r.Next(0x55555555);
    int num2 = num % 2;
    hist[num2]++;
}
for (int i = 0; i < mod; i++)
    Console.WriteLine(hist[i]);
```

Performance:

```csharp
byte[] bytes = new byte[8 * 1024];
var cr = new System.Security.Cryptography.RNGCryptoServiceProvider();
Random r = new Random();

// Random.NextBytes
for (int i = 0; i < 100000; i++)
{
    r.NextBytes(bytes);
}

// One sample per byte
for (int i = 0; i < 100000; i++)
{
    for (int j = 0; j < bytes.Length; j++)
        bytes[j] = (byte)r.Next();
}

// One sample per 3 bytes
for (int i = 0; i < 100000; i++)
{
    for (int j = 0; j + 2 < bytes.Length; j += 3)
    {
        int num = r.Next();
        bytes[j + 2] = (byte)(num >> 16);
        bytes[j + 1] = (byte)(num >> 8);
        bytes[j] = (byte)num;
    }
    // Yes, I know I'm not handling the last few bytes,
    // but that won't have a noticeable impact on performance.
}

// Crypto
for (int i = 0; i < 100000; i++)
{
    cr.GetBytes(bytes);
}
```
+39
Jul 27

System.Random is much more efficient since it does not generate cryptographically secure random numbers.

A simple test on my machine, filling a 4-byte buffer with random data 1,000,000 times, takes 1,000 ms with Random but 2,845 ms with RNGCryptoServiceProvider. Note that if you increase the size of the buffer you are filling, the difference narrows, since the per-call overhead of RNGCryptoServiceProvider becomes less relevant.
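A sketch of such a micro-benchmark (the harness is my own assumption; absolute numbers will differ by machine and runtime):

```csharp
using System;
using System.Diagnostics;
using System.Security.Cryptography;

class RandomBenchmark
{
    public static void Main()
    {
        const int iterations = 1000000;
        byte[] buffer = new byte[4];

        // Time the fast, non-crypto generator.
        var random = new Random();
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            random.NextBytes(buffer);
        sw.Stop();
        Console.WriteLine("System.Random:         {0} ms", sw.ElapsedMilliseconds);

        // Time the cryptographically secure generator on the same workload.
        using (var crypto = RandomNumberGenerator.Create())
        {
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                crypto.GetBytes(buffer);
            sw.Stop();
            Console.WriteLine("RandomNumberGenerator: {0} ms", sw.ElapsedMilliseconds);
        }
    }
}
```

Increasing `buffer` to a few kilobytes should show the gap narrowing, as described above.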

+23
Aug 10 '09 at 21:27

The most obvious reasons have already been mentioned, so here is a more obscure one: cryptographic PRNGs usually need to be continually reseeded with "real" entropy. So if you use a CSPRNG too often, you can deplete the system's entropy pool, which (depending on the implementation of the CSPRNG) either weakens it (allowing an attacker to predict it) or makes it block while it tries to refill its entropy pool (thus becoming a vector for a DoS attack).

Either way, your application has now become an attack vector for other, completely unrelated applications which, unlike yours, actually depend critically on the cryptographic properties of the CSPRNG.

This is a real-world problem, BTW, that has been observed on headless servers (which naturally have rather small entropy pools, because they lack entropy sources like mouse and keyboard input) running Linux, where applications misuse the /dev/random kernel CSPRNG for all kinds of random numbers, while the correct behavior would be to read a small seed from /dev/urandom and use it to seed their own PRNG.
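The same correct pattern, translated to .NET as a sketch (my illustration, not part of this answer): read one small seed from the crypto provider, then serve all subsequent values from the cheap PRNG, leaving the system entropy pool alone.

```csharp
using System;
using System.Security.Cryptography;

class SeedOnce
{
    // One small read from the CSPRNG (the analogue of reading a few
    // bytes from /dev/urandom)...
    public static Random CreateSeededRandom()
    {
        byte[] seed = new byte[4];
        using (var rng = RandomNumberGenerator.Create())
            rng.GetBytes(seed);
        return new Random(BitConverter.ToInt32(seed, 0));
    }

    public static void Main()
    {
        // ...then all further values come from the fast non-crypto PRNG.
        var random = CreateSeededRandom();
        for (int i = 0; i < 5; i++)
            Console.WriteLine(random.Next(100));
    }
}
```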

+17
Aug 10 '09 at 22:11

If you are programming an online card game or a lottery, you need to make sure the sequence is practically impossible to predict. However, if you are showing users, say, a quote of the day, performance matters more than security.
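For instance, a quote of the day only needs a stable, non-secret pick. A sketch (the quote list and the base date are made up for illustration):

```csharp
using System;

class QuoteOfTheDay
{
    static readonly string[] Quotes =
    {
        "Premature optimization is the root of all evil.",
        "Simplicity is prerequisite for reliability.",
        "Make it work, make it right, make it fast."
    };

    // Seed with the day number so the pick is stable for the whole day;
    // predictability is harmless here, so no crypto RNG is needed.
    public static string PickQuote(DateTime date)
    {
        int daySeed = (int)(date.Date - new DateTime(2000, 1, 1)).TotalDays;
        var random = new Random(daySeed);
        return Quotes[random.Next(Quotes.Length)];
    }

    public static void Main()
    {
        Console.WriteLine(PickQuote(DateTime.UtcNow));
    }
}
```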

+11
Aug 10 '09 at 21:36

This has been discussed in some detail already, but ultimately the performance issue is a secondary consideration when choosing an RNG. There is a huge variety of RNGs, and the stock Lehmer LCG that most system RNGs consist of is not the best, nor even the fastest. On old, slow systems it was a great compromise. That compromise is rarely relevant these days. The thing persists in modern systems primarily because: A) it is already built, and there is no real reason to "reinvent the wheel" here, and B) for the huge mass of uses people will put it to, it is "good enough."

Ultimately, choosing an RNG comes down to a risk/reward ratio. In some applications, such as a video game, there is no risk. A Lehmer RNG is more than sufficient, as well as being small, concise, fast, well understood, and available out of the box.

If the application is, say, an online poker game or a lottery, where there are real prizes and real money enters the equation at some point, a Lehmer RNG is no longer adequate. In its 32-bit form it has only 2^32 possible valid states before it begins to cycle, at best. These days that is an open door to a brute-force attack. In this case the developer will want to move to a very-long-period RNG of some kind, probably seeded by a cryptographically strong provider. This gives a good compromise between speed and security. Here one would look at something like a Mersenne Twister or a multiple recursive generator.

If the application is something like transferring large amounts of financial information over the network, the risk is now huge, and it vastly outweighs any possible reward. Armored cars still exist because sometimes armed guards are the only adequate security, and believe me, if a team of special forces with tanks, fighter jets and helicopters were financially feasible, it would be the method of choice. In such a case, using a cryptographically strong RNG makes sense, because no level of security you can get is as much as you want. So you take as much as you can find, and cost is a very distant second concern, whether in time or in money. And if that means each random sequence takes 3 seconds to generate on a very powerful computer, you wait the 3 seconds, because in the scheme of things this is a trivial cost.

+9
Sep 22 '09 at 19:10

Note that the System.Random class in C# is implemented incorrectly, so it should be avoided.

https://connect.microsoft.com/VisualStudio/feedback/details/634761/system-random-serious-bug#tabs

+8
Mar 20 '14 at 0:35

Not everyone needs cryptographically secure random numbers, and some can benefit more from a fast, simple PRNG. Perhaps more importantly, you can control the sequence of System.Random numbers through the seed.

In a simulation driven by random numbers, you may want to recreate a run: you re-run the simulation with the same seed. This can be useful for tracking down bugs, when you want to reproduce a failing scenario by starting your program with the same sequence of random numbers that broke it.
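A minimal sketch of that debugging workflow (the names are illustrative): log the seed of every run, and replay a failure by feeding the logged seed back in.

```csharp
using System;

class ReplayableSimulation
{
    // The whole run is a pure function of the seed, so any failure
    // can be reproduced by re-running with the same seed.
    public static int RunSimulation(int seed)
    {
        var random = new Random(seed);
        int score = 0;
        for (int i = 0; i < 1000; i++)
            score += random.Next(10);
        return score;
    }

    public static void Main()
    {
        int seed = Environment.TickCount;
        Console.WriteLine("Seed: {0} (log this to replay the run)", seed);
        Console.WriteLine("Result: {0}", RunSimulation(seed));

        // Replaying with the same seed gives the identical result.
        Console.WriteLine("Replay: {0}", RunSimulation(seed));
    }
}
```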

+4
Aug 10 '09 at 21:33

If I do not need security, i.e. I just want a somewhat arbitrary value and it does not have to be cryptographically strong, Random has a much easier interface to use.

+2
Aug 10 '09 at 21:31

Different needs call for different RNGs. For crypto, you want your random numbers to be as random as possible. For a Monte Carlo simulation, you want them to fill the space evenly, and you want to be able to start the RNG from a known state.

+2
25 Oct '09 at 9:09

Random is not a random number generator; it is a deterministic pseudo-random sequence generator, which keeps its name for historical reasons.

The reason to use System.Random is that you want exactly these properties, namely a deterministic sequence that is guaranteed to produce the same sequence of results when initialized with the same seed.

If you want to improve the "randomness" without sacrificing the interface, you can inherit from System.Random and override several of its methods.
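A sketch of such a subclass (my own illustration; depending on the runtime version, some overloads may not route through Sample(), so override every member you rely on):

```csharp
using System;
using System.Security.Cryptography;

// Same System.Random interface, crypto-backed output. Overriding the
// protected Sample() covers Next(maxValue) and NextDouble(); Next()
// and NextBytes() are overridden directly.
class CryptoBackedRandom : Random
{
    private readonly RandomNumberGenerator rng = RandomNumberGenerator.Create();

    protected override double Sample()
    {
        byte[] bytes = new byte[8];
        rng.GetBytes(bytes);
        // Keep 53 bits so the result is uniform in [0, 1).
        ulong value = BitConverter.ToUInt64(bytes, 0) >> 11;
        return value / (double)(1UL << 53);
    }

    public override int Next()
    {
        return (int)(Sample() * int.MaxValue);
    }

    public override void NextBytes(byte[] buffer)
    {
        rng.GetBytes(buffer);
    }
}
```

Usage stays familiar: `var r = new CryptoBackedRandom(); int roll = r.Next(6);` draws from the crypto provider without changing the calling code.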

Why do you need a deterministic sequence

One reason for having a deterministic sequence rather than true randomness is that it is repeatable.

For example, if you are running a numerical simulation, you can initialize the sequence with a (truly) random number and record which number was used.

Then, if you want to repeat the same simulation, e.g. for debugging purposes, you can do so by initializing the sequence with the recorded value instead.

Why would you want this particular, not very good, sequence

The only reason I can think of is backward compatibility with existing code that uses this class.

In short: if you want to improve the sequence without changing the rest of your code, go ahead and subclass it.

+2
Nov 02 '16 at 14:27

I wrote a game (Crystal Sliders on the iPhone: Here ) that lays out a "random" series of gems (images) on the board; you work the board as you like, select gems, and they disappear, like Bejeweled. I used Random(), seeded with the number of 100 ns ticks since the phone was booted, a rather random seed.

It seemed surprising to me that it would generate games that were almost identical to each other: out of 90 or so gems in two colors, I would get two boards EXACTLY the same except for 1 to 3 gems! If you flip 90 coins and get the same pattern except for 1-3 flips, that is very unlikely! I have several screenshots showing them side by side, identical. I was shocked at how bad System.Random() was! I assumed I MUST have written something terrible in my code and misused it. I was wrong, though; it was the generator.

As an experiment, and the eventual solution, I went back to the random number generator I have been using since about 1985, which performs far better. It is faster, and has a period of 1.3 * 10^154 (2^521) before it repeats itself. The original algorithm was seeded with a 16-bit number, but I changed it to a 32-bit number and improved the initial seeding.

The original one is here:

ftp://ftp.grnet.gr/pub/lang/algorithms/c/jpl-c/random.c

Over the years I have thrown every random number test I could think of at it, and it has passed all of them. I do not expect it to qualify as cryptographic, but it returns a number as fast as "return *p++;" until it runs out of its 521 bits, and then it quickly does some bit manipulation to create new random ones.

I created a C # wrapper, calling it JPLRandom (), implemented the same interface as Random (), and changed all the places where I named them in the code.

The difference was VASTLY better. OMG, I was amazed: there should be no way I could tell just by looking at screens of 90 or so gems in a pattern, but I made an emergency release of my game after that.

And I would never use System.Random() again for anything. Imagine: their version is beaten by something that is now 30 years old!

-Traderhut Games

-1
Mar 26 '14 at 23:59


