Improved Perlin noise implementation keeps returning 0

I am trying to implement improved Perlin noise in my XNA game, but the noise function keeps returning 0.0f. This is the same code as Ken Perlin's reference implementation (http://mrl.nyu.edu/~perlin/noise/), just ported to C#.

I tried rewriting the class and even copying it straight from the site (porting it to C# again, of course), but it just will not output any value other than 0.

Here is the code I'm using:

    public class PerlinNoise
    {
        private int[] permutations = new int[512];
        private Random random;

        public PerlinNoise() : this(Environment.TickCount) { }

        public PerlinNoise(int seed)
        {
            random = new Random(seed);

            // Fill the first half of the table with 0..255, then shuffle it
            // and mirror it into the second half.
            for (int i = 0; i < 256; i++)
            {
                permutations[i] = i;
            }
            for (int i = 0; i < 256; i++)
            {
                int k = random.Next(256 - i) + i;
                int l = permutations[i];
                permutations[i] = permutations[k];
                permutations[k] = l;
                permutations[i + 256] = permutations[i];
            }
        }

        private int fastfloor(float x)
        {
            return x > 0 ? (int)x : (int)x - 1;
        }

        private float fade(float t)
        {
            return t * t * t * (t * (t * 6 - 15) + 10);
        }

        private float lerp(float t, float a, float b)
        {
            return a + t * (b - a);
        }

        public float grad(int hash, float x, float y, float z)
        {
            int h = hash & 15;
            float u = h < 8 ? x : y,
                  v = h < 4 ? y : h == 12 || h == 14 ? x : z;
            return ((h & 1) == 0 ? u : -u) + ((h & 2) == 0 ? v : -v);
        }

        public float noise3d(float x, float y, float z)
        {
            // Unit cube that contains the point.
            int X = fastfloor(x) & 0xff,
                Y = fastfloor(y) & 0xff,
                Z = fastfloor(z) & 0xff;

            // Relative x, y, z of the point within that cube.
            x -= fastfloor(x);
            y -= fastfloor(y);
            z -= fastfloor(z);

            // Fade curves for each coordinate.
            float u = fade(x);
            float v = fade(y);
            float w = fade(z);

            // Hash coordinates of the 8 cube corners.
            int A = permutations[X] + Y, AA = permutations[A] + Z, AB = permutations[A + 1] + Z,
                B = permutations[X + 1] + Y, BA = permutations[B] + Z, BB = permutations[B + 1] + Z;

            // Blend the gradient results from the 8 corners.
            return lerp(w, lerp(v, lerp(u, grad(permutations[AA], x, y, z),
                                           grad(permutations[BA], x - 1, y, z)),
                                   lerp(u, grad(permutations[AB], x, y - 1, z),
                                           grad(permutations[BB], x - 1, y - 1, z))),
                           lerp(v, lerp(u, grad(permutations[AA + 1], x, y, z - 1),
                                           grad(permutations[BA + 1], x - 1, y, z - 1)),
                                   lerp(u, grad(permutations[AB + 1], x, y - 1, z - 1),
                                           grad(permutations[BB + 1], x - 1, y - 1, z - 1))));
        }

        public float noise2d(float x, float y)
        {
            return noise3d(x, y, 0f);
        }
    }

To test this, I simply did:

    string[] args = Console.ReadLine().Split(' ');
    PerlinNoise noise = new PerlinNoise();
    int x = int.Parse(args[0]);
    int y = int.Parse(args[1]);
    int z = int.Parse(args[2]);
    Console.WriteLine(noise.noise3d(x, y, z));

And, as I said above, it will always output 0.

2 answers

The output is 0.0f whenever all of the arguments are integers. Change your test code to

    var input = Console.ReadLine()
        .Split(' ')
        .Select(s => float.Parse(s, System.Globalization.CultureInfo.InvariantCulture))
        .ToArray();

and try entering, for example, 4234.2123 3123.12312 423.2434.

I'm not quite sure if this is the desired behavior, but

    x -= Math.Floor(x);   // FIND RELATIVE X, Y, Z
    y -= Math.Floor(y);   // OF POINT IN CUBE.
    z -= Math.Floor(z);

will always make x, y, and z equal to 0 when the inputs are integers, and fade(0.0f) is always zero as well.
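As a quick check, here is a minimal sketch (using the PerlinNoise class from the question; the seed and sample coordinates are arbitrary) that shows the difference between sampling exactly on the integer lattice and sampling inside a cell:

    var noise = new PerlinNoise(42);

    // Integer coordinates land exactly on the lattice, so the fractional
    // parts (and therefore the fade weights) are all zero.
    Console.WriteLine(noise.noise3d(3f, 7f, 1f));         // prints 0

    // Offsetting into the cell produces a non-zero sample.
    Console.WriteLine(noise.noise3d(3.37f, 7.12f, 1.9f)); // some value in roughly [-1, 1]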


Multiply your inputs by (1 / MAX_VALUE). For your console test case, just multiply by 1/256 or so and never pass in raw integers. When you use it in a game, multiply the input by (1 / MAXIMUM_COORD_VALUE).
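For example, here is a minimal sketch of that scaling idea; the frequency of 1f / 256f and the 64x64 heightmap size are just illustrative assumptions, not values from the answer:

    var noise = new PerlinNoise(1234);
    const float frequency = 1f / 256f;   // assumed maximum coordinate value of 256

    float[,] heights = new float[64, 64];
    for (int x = 0; x < 64; x++)
    {
        for (int y = 0; y < 64; y++)
        {
            // Scale the integer world coordinates into fractional noise space
            // before sampling, so the sample points fall inside the lattice cells.
            heights[x, y] = noise.noise2d(x * frequency, y * frequency);
        }
    }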

