I use OpenGL for rendering, and when I write linear values to a standard framebuffer (without any gamma correction), they appear linear on my monitor. This contradicts everything I thought I knew about gamma correction (as described here: http://gamedevelopment.tutsplus.com/articles/gamma-correction-and-why-it-matters--gamedev-14466 ). Without gamma correction, I would expect the mid-tones to be darkened non-linearly on my monitor.
But here is what I actually see; first without gamma correction on my part, then with gamma correction: 
Here is my fragment shader without gamma correction (drawn on a full-screen quad to the default framebuffer). This produces the linear-looking image on the left:
out vec4 fsOut0;
void main( void )
{
    // Write a linear value straight to the framebuffer.
    // (Body reconstructed for illustration; the exact gradient/value
    // from the original post was not preserved.)
    fsOut0 = vec4( vec3( 0.5 ), 1.0 );
}
And here is the same shader with gamma correction added (linear to sRGB). This produces the brighter image on the right:
out vec4 fsOut0;
void main( void )
{
    // Same linear value, encoded with the common 1/2.2 gamma approximation.
    // (Body reconstructed for illustration.)
    vec3 linearColor = vec3( 0.5 );
    fsOut0 = vec4( pow( linearColor, vec3( 1.0 / 2.2 ) ), 1.0 );
}
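As a side note (not part of the original post), the `pow(c, 1.0/2.2)` in the shader is an approximation; the sRGB standard actually defines a piecewise transfer function. A minimal sketch of the exact encode, showing why the corrected image looks brighter:

```python
def linear_to_srgb(c):
    """Exact sRGB encoding (OETF) for a linear value in [0, 1].

    The shader uses the common pow(c, 1/2.2) approximation; this is
    the exact piecewise curve from the sRGB specification.
    """
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

# A linear mid-gray of 0.5 encodes to roughly 0.735, which is why the
# gamma-corrected image on the right appears noticeably brighter.
print(round(linear_to_srgb(0.5), 3))  # -> 0.735
```

For values above the small linear toe near black, the two formulas differ by less than about 0.5%, so the 1/2.2 approximation is usually fine for display output.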
I check whether the colors are linear by simply looking at them, and by using the color picker in Photoshop to inspect the differences in RGB values between the color bars. For a linear-looking image, the difference between each color is (mostly) constant.
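To illustrate that checking method (a sketch with assumed values, not the actual bars from the screenshots): if the framebuffer values were sRGB-encoded, the picker would report steps between evenly spaced bars that shrink toward white rather than staying constant.

```python
def linear_to_srgb(c):
    # Exact sRGB encode (piecewise curve from the sRGB specification).
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

# Five evenly spaced linear values, standing in for a test gradient of bars.
linear_ramp = [0.0, 0.25, 0.5, 0.75, 1.0]

# 8-bit values the color picker would report if the bars were sRGB-encoded:
encoded = [round(255 * linear_to_srgb(v)) for v in linear_ramp]
diffs = [b - a for a, b in zip(encoded, encoded[1:])]

print(encoded)  # [0, 137, 188, 225, 255]
print(diffs)    # [137, 51, 37, 30] -- clearly non-constant steps
```

So constant picker differences do suggest the stored values are linear, with the caveat that Photoshop reports the values as stored in the document, not what the monitor physically emits.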
I also tried requesting an sRGB framebuffer. In that case, writing linear values without gamma correction looks like the second image (non-linear).
What am I doing wrong? Or are both of my monitors miscalibrated, and does Photoshop's color picker not report colors in linear space? Or is my "non-linear" image really the correct linear result, and it just doesn't look linear to my eyes?
My question is a possible duplicate of: Is gamma still needed to correct the final color on a modern computer/monitor? Unfortunately, the accepted answer there is extremely confusing, and the parts I could follow seem contradictory, or at least not fully explained for someone less knowledgeable than the answerer.