Writing an HLSL 4 Pixel Shader to Perform a 2D Texture Lookup

I am a budding author of pixel shaders and I have problems. I want to take a 256×256 16-bit input image (DXGI_FORMAT_R16_UINT), pass it through a 256×256 lookup texture (DXGI_FORMAT_R8_UNORM), and convert it to a 256×256 8-bit output.

Unfortunately, I'm facing a lot of problems, and the output always seems to be clamped to either black or white.

I'm also not sure which DXGI formats I should use, or which HLSL data type corresponds to each format.

```hlsl
// Global Variables
Texture2D<uint> imgTexture : register( t0 );
Texture2D lutTexture : register( t1 );
SamplerState SampleType : register( s0 );

// Structures
struct PS_INPUT
{
    float4 Pos : SV_POSITION;
    float2 Tex : TEXCOORD0;
};

// Pixel Shader
float4 PS( PS_INPUT input ) : SV_Target
{
    uint pixelValue = imgTexture[input.Tex];
    uint2 index = { pixelValue / 256, pixelValue % 256 };
    // uint row = pixelValue / 256;
    // uint col = pixelValue % 256;

    float4 output = lutTexture[index];
    output.g = output.r;
    output.b = output.r;
    output.a = 1.0f;
    return output;
}
```

Should I normalize pixelValue before trying to turn it into a 2D index?

Should I normalize the index before using it?

Should a sample be taken instead?

Am I even on the right track here?

I would be grateful for any help, thanks!

1 answer

You are definitely on the right track. But, as Valmont mentioned, the value of pixelValue will be in the range [0..1].

How exactly is the LUT laid out? I assume the first axis is the value to be converted, but what is the second? Once I know that, I can post a solution with code.
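In the meantime, here is a sketch of the direction I would take. Two things stand out in the posted code: `operator[]` on a `Texture2D` expects integer texel coordinates, but `input.Tex` is a float2 in [0..1] (so it almost always indexes texel (0,0), which would explain the all-black/all-white output); and the `uint2` index is built as (row, col) when `operator[]` takes (x, y), i.e. (col, row). Assuming you render a full-screen quad with `Tex` in [0..1], and assuming the LUT's y axis holds the high byte of the input value and the x axis the low byte (both assumptions on my part, pending your answer), the shader might look like this:

```hlsl
Texture2D<uint>  imgTexture : register( t0 ); // DXGI_FORMAT_R16_UINT -> uint in HLSL
Texture2D<float> lutTexture : register( t1 ); // DXGI_FORMAT_R8_UNORM -> float in [0..1]

struct PS_INPUT
{
    float4 Pos : SV_POSITION;
    float2 Tex : TEXCOORD0;
};

float4 PS( PS_INPUT input ) : SV_Target
{
    // input.Tex is in [0..1]; scale by the texture size (256, assumed here)
    // to get integer texel coordinates before indexing.
    uint2 texel = uint2( input.Tex * 256.0f );

    // Load the raw 16-bit value directly; no sampler, no normalization.
    uint pixelValue = imgTexture[texel];

    // Split the 16-bit value into a 2D LUT index: x = low byte, y = high byte.
    // (This byte order is an assumption about your LUT layout.)
    uint2 index = uint2( pixelValue % 256, pixelValue / 256 );

    // R8_UNORM reads back as a float in [0..1], so no further scaling is needed.
    float v = lutTexture[index].r;
    return float4( v, v, v, 1.0f );
}
```

Note that no `SamplerState` is needed at all for this: both reads are exact texel loads, so `operator[]` (equivalent to `Load` without a mip argument) is the right tool, and there is nothing to normalize by hand.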
