I have tried many different strategies to get a usable noise function, and none of them works. So: how do you implement Perlin noise on an ATI graphics card in GLSL?
Here are the methods I have tried. First, I put the permutation and gradient data into a GL_RGBA 1D texture and fetched it with the texture1D function. However, a single call to this noise implementation results in 12 texture fetches and kills the frame rate.
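Roughly what that texture version looked like (a sketch, not my exact code; the name perm_tex, the RGBA packing, and the texel-center offset are my own assumptions):

    // 256-texel GL_RGBA texture: permutation index in .a, gradient in .rgb, packed to [0,1]
    uniform sampler1D perm_tex;

    // one permutation lookup = one texture fetch
    float perm(float x)
    {
        return texture1D(perm_tex, (x + 0.5) / 256.0).a * 255.0;
    }

    // one gradient lookup = one texture fetch
    vec3 grad(float x)
    {
        return texture1D(perm_tex, (x + 0.5) / 256.0).rgb * 2.0 - 1.0;
    }

With the hashes and gradients of all the surrounding lattice points needed per sample, that adds up to the dozen or so fetches per noise() call mentioned above.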
Next, I tried loading the permutation and gradient data into a single vec4 array, but the compiler will not let me read an element of the array unless the index is a constant. For instance:
int i = 10; vec4 a = noise_data[i];
will give a compiler error:
ERROR: 0:43: Not supported when using the indirect index of a temporary array.
I can only get the data as follows:
vec4 a = noise_data[10];
I also tried hard-coding the array directly into the shader source, but I ran into the same indexing problem. I have heard that NVIDIA cards do allow this, but ATI does not.
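For reference, the hard-coded variant looked roughly like this (a sketch only: just 4 of the 256+ entries shown, the values are illustrative, and it assumes #version 120 for the array initializer syntax):

    #version 120

    // first component: permutation value, last three: gradient (illustrative values only)
    const vec4 noise_data[4] = vec4[4](
        vec4(151.0,  1.0,  1.0, 0.0),
        vec4(160.0, -1.0,  1.0, 0.0),
        vec4(137.0,  1.0, -1.0, 0.0),
        vec4( 91.0, -1.0, -1.0, 0.0)
    );

    void main()
    {
        int i = int(mod(gl_FragCoord.x, 4.0)); // any non-constant index
        vec4 a = noise_data[i];                // this indexing is what the ATI compiler rejects
        gl_FragColor = vec4(a.yzw * 0.5 + 0.5, 1.0);
    }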
I then tried writing a function that returns a hard-coded data point for a given index, but a function with 64 if statements, called 12 times per noise evaluation, made the shader binding time unbearable.
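That lookup function was along these lines (abbreviated sketch; mine had 64 branches, and the values here are illustrative):

    // returns a hard-coded permutation/gradient entry for index i
    vec4 noise_lookup(int i)
    {
        if (i == 0) return vec4(151.0,  1.0,  1.0, 0.0);
        if (i == 1) return vec4(160.0, -1.0,  1.0, 0.0);
        if (i == 2) return vec4(137.0,  1.0, -1.0, 0.0);
        if (i == 3) return vec4( 91.0, -1.0, -1.0, 0.0);
        // ... 60 more branches ...
        return vec4(0.0);
    }

Presumably the compiler inlines and unrolls all of this for every call site, which would explain why binding takes so long.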
ATI does not support the "built-in" GLSL noise functions, and I cannot simply precompute the noise and load it as a texture, because I am dealing with fractals. That means I need effectively infinite precision, so the noise has to be calculated at runtime.
So the main question remains: how?