I was reading the C FAQ and found a question that recommends using rand() / (RAND_MAX / N + 1) instead of the more popular rand() % N.
The reason given is that when N is a small number, rand() % N uses only a few low-order bits of rand().
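To illustrate what that means, here is a minimal sketch of my own (not from the FAQ), assuming N == 2 and a RAND_MAX that is one less than a power of two, as is typical: rand() % 2 depends only on the lowest bit of the value, while rand() / (RAND_MAX / 2 + 1) depends on the highest bit.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int r = rand();                      /* one sample value */
    int low  = r % 2;                    /* same as r & 1: the low-order bit */
    int high = r / (RAND_MAX / 2 + 1);   /* 0 or 1, decided by the top bit */
    printf("r = %d, low bit -> %d, high bit -> %d\n", r, low, high);
    return 0;
}

So the two expressions pick their result from opposite ends of the value; whether that matters depends on how good the low-order bits of a particular rand() implementation are.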
I tested both approaches with N = 2, on both Windows and Linux, but could not see any difference.
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 2

int main(void)
{
    srand(0);
    printf("rand() %% N:\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", rand() % N);
    }
    putchar('\n');

    srand(0);
    printf("rand() / (RAND_MAX / N + 1):\n");
    for (int i = 0; i < 40; ++i) {
        printf("%d ", rand() / (RAND_MAX / N + 1));
    }
    putchar('\n');

    return 0;
}
The output is this (on my GNU/Linux machine):
rand() % N:
1 0 1 1 1 1 0 0 1 1 0 1 0 1 1 0 0 0 0 0 1 0 1 1 0 0 0 1 1 1 1 0 0 0 1 1 1 0 1 0
rand() / (RAND_MAX / N + 1):
1 0 1 1 1 0 0 1 0 1 0 1 0 1 1 1 1 1 0 1 0 0 0 1 0 0 0 0 1 0 1 1 1 0 1 1 0 1 0 1
Both alternatives look completely random to me; if anything, the second approach seems slightly worse than rand() % N.
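Of course, eyeballing 40 values may not be enough to reveal a bias, so I also tried a rough sketch of my own (not from the FAQ) that tallies each method over many draws and compares the counts of 0s and 1s:

#include <stdio.h>
#include <stdlib.h>

#define N 2
#define DRAWS 1000000

int main(void)
{
    int mod_count[N] = {0};   /* tallies for rand() % N */
    int div_count[N] = {0};   /* tallies for rand() / (RAND_MAX / N + 1) */

    srand(0);
    for (int i = 0; i < DRAWS; ++i) {
        int r = rand();
        ++mod_count[r % N];
        ++div_count[r / (RAND_MAX / N + 1)];
    }

    for (int v = 0; v < N; ++v)
        printf("%d: %%-method %d, /-method %d\n", v, mod_count[v], div_count[v]);
    return 0;
}

The counts come out close to 50/50 for both methods on my machine, which still does not tell me which expression I should prefer.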
Should I use rand() % N or rand() / (RAND_MAX / N + 1)?