DirectX 9 HLSL vs DirectX 10 HLSL: Is the syntax the same?

Over the past month or so I have been teaching myself DirectX, and along the way I have been going back and forth between DirectX 9 and 10. One of the main differences I have noticed between the two is how vertex data is described to the video card.

One of the more radical changes I noticed is how you get the GPU to understand your vertex structures. In DirectX 9, you define a Flexible Vertex Format (FVF).

Your typical setup would be this:

#define CUSTOMFVF (D3DFVF_XYZRHW | D3DFVF_DIFFUSE) 

In DirectX 10, I believe the equivalent is an input layout description:

D3D10_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0,  0, D3D10_INPUT_PER_VERTEX_DATA, 0 },
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0 },
};

The DirectX 10 version is clearly more descriptive. Apart from this, what are the other significant changes, and is the HLSL syntax the same for both?

+6
c++ c directx
3 answers

I would say there have been no radical changes in HLSL syntax between DX9 and DX10 (or its DX11 extension).

As Codeka said, the changes are mostly a matter of cleaning up the API and generalizing it (with GPGPU in mind). Still, there are some differences you will notice.

Noticeable differences:

  • To pass constants to shaders, you now have to go through constant buffers (see the short HLSL sketch after this list).

  • Common shader core: all shader stages have access to the same instruction set (with some exceptions, for example at the GS stage). Integer and bitwise operations are now true 32-bit integer operations (no longer emulated with floating point), and floating-point arithmetic is IEEE-compliant. You also now have access to bit-cast intrinsics to reinterpret an int as a float, a float as a uint, and so on.

  • Textures and samplers have been decoupled. You now use the syntax g_myTexture.Sample( g_mySampler, texCoord ) instead of tex2D( g_mySampledTexture, texCoord ).

  • Buffers: a new kind of resource for randomly accessing data that needs no filtering, using the new Object.Load function.

  • System-value semantics: a generalization and extension of the POSITION, DEPTH, and COLOR semantics, which become SV_Position, SV_Depth, and SV_Target, plus new per-stage semantics such as SV_InstanceID, SV_VertexID, and so on.
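To make those last few points concrete, here is a minimal, untested HLSL sketch; the DX9-style version is shown as comments for comparison, and all names (PerObject, g_WorldViewProj, g_DiffuseTex, g_LinearSampler, psMain) are made up purely for illustration.

// DX9-style HLSL for comparison:
//   float4x4  g_WorldViewProj;                   // loose global constant
//   sampler2D g_DiffuseSampler;                  // texture and sampler fused together
//   float4 psMain(float2 uv : TEXCOORD0) : COLOR
//   {
//       return tex2D(g_DiffuseSampler, uv);
//   }

// DX10-style HLSL equivalent:
cbuffer PerObject                                 // constants now live in constant buffers
{
    float4x4 g_WorldViewProj;
};

Texture2D    g_DiffuseTex;                        // texture and sampler are separate objects
SamplerState g_LinearSampler;

float4 psMain(float4 pos : SV_Position,           // system-value semantics replace POSITION/COLOR
              float2 uv  : TEXCOORD0) : SV_Target
{
    return g_DiffuseTex.Sample(g_LinearSampler, uv);
}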

That is all I can think of for now. If anything else comes to mind, I will update my answer.

+7

The biggest change I noticed between DX9 and DX10 is that in DX10 you have to set an entire render-state block (a state object), whereas in DX9 you could change individual states. That broke my architecture somewhat, because I relied on being able to make small changes and leave all the other states as they were (it really becomes a problem when setting states from a shader).
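As a rough sketch of that difference (assuming a DX9 device pDevice9 and a DX10 device pDevice; untested and purely illustrative):

// DX9: tweak individual render states one call at a time
pDevice9->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
pDevice9->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
pDevice9->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

// DX10: describe the whole blend state up front, create an immutable
// state object, and bind it as a single block
D3D10_BLEND_DESC blendDesc = {};
blendDesc.BlendEnable[0]           = TRUE;
blendDesc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;
blendDesc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;
blendDesc.BlendOp                  = D3D10_BLEND_OP_ADD;
blendDesc.SrcBlendAlpha            = D3D10_BLEND_ONE;
blendDesc.DestBlendAlpha           = D3D10_BLEND_ZERO;
blendDesc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
blendDesc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

ID3D10BlendState* pBlendState = NULL;
pDevice->CreateBlendState(&blendDesc, &pBlendState);

FLOAT blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
pDevice->OMSetBlendState(pBlendState, blendFactor, 0xFFFFFFFF);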

Another big change is that in DX10 vertex declarations are tied to a compiled shader (via CreateInputLayout). In DX9 this was not the case: you simply set the declaration and set the shader independently. In DX10 you first compile the shader and then create an input layout validated against that shader's input signature.
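A minimal sketch of the DX10 side (assuming layout is the D3D10_INPUT_ELEMENT_DESC array from the question and pVSBlob holds the compiled vertex shader bytecode, e.g. from D3DX10CompileFromFile; untested, with error handling mostly omitted):

// DX9: declaration and shader are bound independently
// pDevice9->SetVertexDeclaration(pDecl);
// pDevice9->SetVertexShader(pVertexShader9);

// DX10: the input layout is created against a specific compiled shader's
// input signature and only then bound to the input-assembler stage
ID3D10InputLayout* pInputLayout = NULL;
HRESULT hr = pDevice->CreateInputLayout(
    layout,                              // D3D10_INPUT_ELEMENT_DESC array (see the question)
    sizeof(layout) / sizeof(layout[0]),
    pVSBlob->GetBufferPointer(),         // compiled vertex shader bytecode
    pVSBlob->GetBufferSize(),
    &pInputLayout);
if (SUCCEEDED(hr))
    pDevice->IASetInputLayout(pInputLayout);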

As Codeka points out, D3DVERTEXELEMENT9 has been the recommended way of declaring vertex layouts since DX9 came out. FVF codes are effectively deprecated, and with an FVF you cannot express something like a tangent. Vertex declarations are much more powerful and do not force you into a fixed layout; you can place vertex elements wherever you want.
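Here is a rough, untested example of what such a DX9 vertex declaration looks like, with a tangent element added (something an FVF could not express); the names are illustrative:

D3DVERTEXELEMENT9 decl[] =
{
    // stream, offset, type,               method,                usage,                 usage index
    { 0,  0, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
    { 0, 16, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TANGENT,  0 },
    D3DDECL_END()
};

IDirect3DVertexDeclaration9* pDecl = NULL;
pDevice9->CreateVertexDeclaration(decl, &pDecl);
pDevice9->SetVertexDeclaration(pDecl);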

If you want to know more about DX9 vertex declarations, I suggest you start with MSDN.

+1

FVFs were (kind of) deprecated in favour of D3DVERTEXELEMENT9 (a.k.a. vertex declarations), which is in any case surprisingly similar to D3D10_INPUT_ELEMENT_DESC. In fact, most of DirectX 10 is surprisingly similar to DirectX 9, minus the fixed-function pipeline.

The biggest change between DirectX 9 and DirectX 10 was the cleanup of the API (in terms of separation of concerns, making it much clearer what happens at which stage of the pipeline, and so on).

0
