HDR bloom rendering pipeline using OpenGL / GLSL

I have implemented HDR rendering with bloom using OpenGL and GLSL... at least I think I have! I am not sure about it.

I followed this tutorial on the Intel website:

https://software.intel.com/en-us/articles/compute-shader-hdr-and-bloom 

And regarding the performance of the Gaussian blur, I scrupulously followed all the performance tips on the following website:

 https://software.intel.com/en-us/blogs/2014/07/15/an-investigation-of-fast-real-time-gpu-based-image-blur-algorithms 

According to the first site:

“The output of the bright pass is then downscaled by half 4 times. Each of the downscaled bright pass outputs is blurred with a separable Gaussian filter and then added to the next higher-resolution bright pass output. The final result is a ¼-size bloom texture, which is upsampled and added to the HDR output before tone mapping.”

Here is the outline of the bloom pipeline (the images below were captured with the NVIDIA NSight debugger).

The window resolution in my test is 1024x720 (for this algorithm, this resolution is halved 4 times: 512x360, 256x180, 128x90 and 64x45).

Step 1:

Lighting pass (blending of the material pass + shadow mask pass + skybox pass):

[image]

Step 2:

Extraction of the bright areas into a bright pass; more precisely, 4 downscaled textures are generated (“the output of the bright pass is then downscaled by half 4 times” → 1/2, 1/4, 1/8 and finally 1/16):

[image]
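For reference, here is a minimal GLSL sketch of what this kind of bright pass looks like conceptually (the uniform names u_HDRTexture and u_Threshold are illustrative placeholders, not taken from the Intel article):

```glsl
#version 330 core

in vec2 v_TexCoord;
out vec4 FragColor;

uniform sampler2D u_HDRTexture; // full-resolution HDR lighting output
uniform float     u_Threshold;  // e.g. 1.0: keep only over-bright pixels

void main()
{
    vec3 hdrColor = texture(u_HDRTexture, v_TexCoord).rgb;

    // Perceptual luminance (Rec. 709 weights).
    float luminance = dot(hdrColor, vec3(0.2126, 0.7152, 0.0722));

    // Keep the pixel only if it is brighter than the threshold.
    FragColor = (luminance > u_Threshold) ? vec4(hdrColor, 1.0)
                                          : vec4(0.0, 0.0, 0.0, 1.0);
}
```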

Step 3:

"Each of the bright passages that were revealed is washed out by a separate Gaussian filter, and then added to the next output with a high resolution.

I want to point out that bilinear filtering is enabled (GL_LINEAR), and the pixelation in the images below is the result of resizing the textures to the NSight debugger window (1024x720).
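As a point of reference for the blur itself, here is a minimal GLSL sketch of one pass of a separable Gaussian blur. It is only illustrative: it shows a 9-tap kernel with classic weights, whereas my real kernel is 35x35, and the uniform names (u_Source, u_TexelSize, u_Direction) are mine:

```glsl
#version 330 core

in vec2 v_TexCoord;
out vec4 FragColor;

uniform sampler2D u_Source;     // downscaled bright pass output
uniform vec2      u_TexelSize;  // 1/width, 1/height of THIS level
uniform vec2      u_Direction;  // (1,0) horizontal pass, (0,1) vertical pass

// 5 weights of a 9-tap Gaussian, exploiting the kernel's symmetry.
const float weights[5] = float[](0.227027, 0.194595, 0.121622, 0.054054, 0.016216);

void main()
{
    vec3 result = texture(u_Source, v_TexCoord).rgb * weights[0];
    for (int i = 1; i < 5; ++i)
    {
        vec2 offset = u_Direction * u_TexelSize * float(i);
        result += texture(u_Source, v_TexCoord + offset).rgb * weights[i];
        result += texture(u_Source, v_TexCoord - offset).rgb * weights[i];
    }
    FragColor = vec4(result, 1.0);
}
```

The important detail is that u_TexelSize is expressed in texels of the level being blurred, not of the full-resolution image.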

a) Resolution 1/16 x 1/16 (64x45)

“1/16 x 1/16 blurred output”

[image]

b) Resolution 1/8 x 1/8 (128x90)

“1/8 x 1/8 downscaled bright pass, combined with the 1/16 x 1/16 blurred output”

[image]

“1/8 x 1/8 blurred output”

[image]

c) Resolution 1/4 x 1/4 (256x180)

“1/4 x 1/4 downscaled bright pass, combined with the 1/8 x 1/8 blurred output”

[image]

“1/4 x 1/4 blurred output”

[image]

d) Resolution 1/2 x 1/2 (512x360)

“1/2 x 1/2 downscaled bright pass, combined with the 1/4 x 1/4 blurred output”

[image]

“1/2 x 1/2 blurred output”

[image]
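For clarity, the “combine” half of step 3 boils down to something like the following sketch (sampler names are mine; thanks to GL_LINEAR, sampling the lower-resolution blurred texture at the current resolution gives the bilinear upsample for free):

```glsl
#version 330 core

in vec2 v_TexCoord;
out vec4 FragColor;

uniform sampler2D u_CurrentBrightPass; // bright pass at this resolution
uniform sampler2D u_PreviousBlurred;   // blurred result from the level below

void main()
{
    // GL_LINEAR on u_PreviousBlurred performs the bilinear upsample.
    vec3 current  = texture(u_CurrentBrightPass, v_TexCoord).rgb;
    vec3 previous = texture(u_PreviousBlurred,   v_TexCoord).rgb;
    FragColor = vec4(current + previous, 1.0);
}
```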

To render into the desired downscaled level, I resize the FBO (but it might be wiser to use separate FBOs, created once during initialization, rather than resizing the same one several times; what do you think of this idea? See the sketch below).
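Here is a rough sketch of that alternative, using the resolutions listed above (hypothetical code illustrating the idea I am asking about, not my current implementation):

```cpp
#include <GL/glew.h>
#include <vector>

// One render target per downscale level, created once at init time.
struct BloomLevel {
    GLuint  fbo;
    GLuint  texture;
    GLsizei width, height;
};

std::vector<BloomLevel> createBloomChain()
{
    // The chain used in this question: 1024x720 halved 4 times.
    const GLsizei sizes[4][2] = { {512, 360}, {256, 180}, {128, 90}, {64, 45} };
    std::vector<BloomLevel> levels;

    for (const auto& s : sizes) {
        BloomLevel level{0, 0, s[0], s[1]};

        glGenTextures(1, &level.texture);
        glBindTexture(GL_TEXTURE_2D, level.texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, level.width, level.height,
                     0, GL_RGBA, GL_HALF_FLOAT, nullptr);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        glGenFramebuffers(1, &level.fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, level.fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, level.texture, 0);

        levels.push_back(level);
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return levels;
}
```

Each pass would then just bind the right FBO and set glViewport(0, 0, level.width, level.height), instead of reallocating anything per frame.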

Step 4:

Tone mapping display pass:

[image]
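For completeness, this final pass looks conceptually like the following sketch (I use a simple exposure operator here purely for illustration; it is not necessarily the one from the Intel article):

```glsl
#version 330 core

in vec2 v_TexCoord;
out vec4 FragColor;

uniform sampler2D u_HDRScene;  // HDR lighting output
uniform sampler2D u_Bloom;     // final bloom texture, upsampled via GL_LINEAR
uniform float     u_Exposure;  // e.g. 1.0

void main()
{
    // Add bloom to the HDR color before tone mapping, as the article says.
    vec3 hdr = texture(u_HDRScene, v_TexCoord).rgb
             + texture(u_Bloom,    v_TexCoord).rgb;

    // Simple exposure-based tone mapping operator.
    vec3 mapped = vec3(1.0) - exp(-hdr * u_Exposure);

    // Gamma correction to sRGB.
    FragColor = vec4(pow(mapped, vec3(1.0 / 2.2)), 1.0);
}
```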

At this point, I would like some outside feedback on my work. Is it right or wrong? I am not very sure about the result of step 3 (the downscaling and blurring part).

I find that the blur effect is not very pronounced! Yet I use a 35x35 convolution kernel (that should be more than enough, I think :)).

But I'm really intrigued by this PDF article. Here is its presentation of the bloom pipeline (the pipeline is the same as mine).

[image]

Link:

https://www.google.fr/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0CCMQFjAA&url=https%3A%2F%2Ftransporter-game.googlecode.com%2Faltesight pdf&ei=buBhVcLmA8jaUYiSgLgK&usg=AFQjCNFfbP9L7iEiGT6gQNW6dB2JFVcTmA&bvm=bv.93990622,d.d24

As you can see in the picture, the bloom blur effect is much stronger than mine! Do you think the author uses larger convolution kernels (at higher resolutions)?

The first thing I don't understand is how the Gaussian blur algorithm produces colors other than white (grayscale values) in the third image. I looked very closely (zoomed in) at the bright pass picture (the second one), and all the pixels seem to be white or close to white (shades of gray). One thing is certain: there are no blue or orange pixels in the bright pass texture. So how can this transition from picture 2 to picture 3 be explained? It is very strange to me.

The second thing I don't understand is the big difference in the bloom blur effect between images 3, 4, 5 and 6! On my side, I use a 35x35 convolution kernel, and my end result is close to the third image here.

How can you explain this difference?

PS: Please note that I use the GL_HALF_FLOAT data type and the GL_RGBA16F internal format to initialize the bloom render target textures (all the other render passes are initialized with the GL_RGBA internal format and the GL_FLOAT data type).
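For reference, the two kinds of allocation described above look roughly like this (a sketch, not my exact code). Note that the unsized GL_RGBA internal format stores 8 bits per channel even when the supplied client data type is GL_FLOAT:

```cpp
// Bloom chain render targets: half-float HDR storage.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height,
             0, GL_RGBA, GL_HALF_FLOAT, nullptr);

// Other passes: unsized GL_RGBA, i.e. 8-bit storage per channel,
// regardless of the GL_FLOAT client data type passed here.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height,
             0, GL_RGBA, GL_FLOAT, nullptr);
```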

Is there something wrong with my program?

Many thanks for your help!

Tags: shader, opengl, glsl
1 answer

Your small-size blurred textures do not look blurred. I think there is some kind of problem with the filter width (not the number of samples, but the distance between the samples) or with the framebuffer size.

Let's say you have a 150x150 original FBO and a 15x15 downscaled version for the bloom, and you use a 15x15 blur filter.

Blurring the high-resolution version will only affect a 7px border around the bright parts. But when blurring the low-resolution image, the kernel width covers practically the entire image area: at that size, a 7px border means the whole image. So every pixel of the blurred low-resolution version gets some contribution in the final composited image. In other words, a high-resolution blur only bleeds 7px around the bright details, while a low-resolution blur should visibly change the entire image area.

Your low-resolution images just don't look properly blurred, because the blur still stays within a 35/2 px border around the bright parts, which is wrong.

I hope I was able to explain what is wrong. What exactly needs to change, maybe the viewport size when blurring the low-resolution images, I just can't be 100% sure.
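To illustrate the viewport hypothesis, a rough sketch (hypothetical names; u_TexelSize matches the blur sketch from the question). Both the viewport and the shader's sample spacing must match the level actually being rendered:

```cpp
#include <GL/glew.h>

// Hypothetical blur dispatch for one downscaled level (width x height).
void blurPass(GLuint fbo, GLuint program, int width, int height)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    // If the viewport stayed at the full 1024x720 here, the fullscreen quad
    // would overflow the small attachment and the blur would cover the
    // wrong area.
    glViewport(0, 0, width, height);

    glUseProgram(program);

    // Sample spacing in texels of THIS level, not of the full-size image.
    glUniform2f(glGetUniformLocation(program, "u_TexelSize"),
                1.0f / width, 1.0f / height);

    // ... bind the source texture, set the blur direction, draw a
    // fullscreen quad, then repeat for the second (vertical) pass ...
}
```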

