HLSL Render To Texture / Stream Output in Unity

I am trying to implement an HLSL shader in Unity 5 that shifts the vertices of a sphere.

What I'm trying to work out is how to efficiently carry information from one frame to the next, so that I can "save" the computed velocity of each vertex.

On the Nvidia developer site I found an interesting article about modeling fluid dynamics on the GPU. The article says:

However, on the GPU, the output of the fragment processors is always written to the frame buffer. Think of the frame buffer as a two-dimensional array that cannot be read directly. There are two ways to get the contents of the frame buffer into a texture that can be read:

  • Copy to texture (CTT) copies from the frame buffer to the texture.
  • Render to texture (RTT) uses the texture as a frame buffer, so the GPU can write directly to it.

I remember reading (I've lost the source) that in Unity a RenderTexture is an FBO, but after several hours of searching I still don't understand how to implement this, or even whether it's the right way to achieve what I want. My current idea is to create two passes: one that computes the new velocities from the existing ones, and a second that displaces the mesh.
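To make that concrete, here is a minimal sketch of the ping-pong / render-to-texture idea in Unity terms, assuming a velocityMaterial whose fragment pass computes new velocities from the previous frame's texture and a mesh material that samples the result in its vertex shader. All names here are placeholders I've made up, not working code:

```csharp
using UnityEngine;

// Rough sketch only: ping-pong two velocity RenderTextures so the values
// written in frame N can be read back as input in frame N+1.
public class VelocityPingPong : MonoBehaviour
{
    public Material velocityMaterial;  // fragment pass: newVel = f(oldVel, forces, dt)
    public Material displaceMaterial;  // mesh material that offsets vertices by the velocity texture

    RenderTexture velA, velB;

    void Start()
    {
        // Float format so velocities are not clamped to [0, 1].
        velA = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBFloat);
        velB = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBFloat);
        velA.Create();
        velB.Create();
    }

    void Update()
    {
        // Pass 1 (RTT): read last frame's velocities from velA, write new ones into velB.
        Graphics.Blit(velA, velB, velocityMaterial);

        // Pass 2: the mesh shader samples the fresh velocity texture in its
        // vertex function (tex2Dlod) to displace the sphere.
        displaceMaterial.SetTexture("_VelocityTex", velB);

        // Swap so that next frame reads what was just written.
        var tmp = velA; velA = velB; velB = tmp;
    }

    void OnDestroy()
    {
        velA.Release();
        velB.Release();
    }
}
```

What I can't figure out is whether these two passes can live in the same shader file, or whether the velocity pass necessarily needs its own material and Blit call as above.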

I was thinking of writing a GPGPU shader, but I would really like all of this functionality to live in a single shader.

I'm at a dead end; could someone just point me in the right direction?

Clarification:

In a Unity compute shader I can output whatever I want to one or more render textures:

Render Textures can also be written into from compute shaders, if they have the "random access" flag set ("unordered access view" in DX11); see RenderTexture.enableRandomWrite. http://docs.unity3d.com/Manual/ComputeShaders.html
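For example, something along these lines (the kernel and property names are placeholders of my own, and the force integration is elided):

```hlsl
// Velocity.compute -- hypothetical kernel that updates a velocity texture in place.
#pragma kernel VelocityKernel

RWTexture2D<float4> _Velocities;  // the RenderTexture must have enableRandomWrite = true
float _DeltaTime;

[numthreads(8, 8, 1)]
void VelocityKernel(uint3 id : SV_DispatchThreadID)
{
    float4 v = _Velocities[id.xy];
    // ... integrate forces over _DeltaTime here ...
    _Velocities[id.xy] = v;
}
```

driven from C# with something like:

```csharp
var tex = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBFloat);
tex.enableRandomWrite = true;   // makes the texture a UAV the kernel can write to
tex.Create();

velocityCompute.SetTexture(0, "_Velocities", tex);     // kernel index 0
velocityCompute.SetFloat("_DeltaTime", Time.deltaTime);
velocityCompute.Dispatch(0, 256 / 8, 256 / 8, 1);      // 256x256 texture, 8x8 threads per group
```

But that means a separate .compute asset on top of the rendering shader, which is exactly what I'd like to avoid.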

And:

Starting with DirectX 10 there is a feature called Stream Output, which lets the output of the vertex/geometry stage be written into a buffer instead of (or in addition to) continuing down the pipeline. The OpenGL 3.0 equivalent is Transform Feedback. http://ogldev.atspace.co.uk/www/tutorial28/tutorial28.html

Is anything like this available in Unity? I'm on Unity 5.1+. If so, how do I use it?
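From what I've found so far, Unity doesn't seem to expose Stream Output / Transform Feedback directly; the nearest equivalent I can see is keeping per-vertex state in a ComputeBuffer, which persists on the GPU between frames and can be bound both to a compute shader and to an ordinary material. A rough sketch (buffer and property names are placeholders of mine):

```csharp
// One float4 of velocity per vertex, kept permanently on the GPU.
// Remember to call velocities.Release() when finished with it.
ComputeBuffer velocities = new ComputeBuffer(vertexCount, sizeof(float) * 4);

// Update pass: the kernel declares RWStructuredBuffer<float4> _Velocities in HLSL.
velocityCompute.SetBuffer(0, "_Velocities", velocities);
velocityCompute.Dispatch(0, Mathf.CeilToInt(vertexCount / 64f), 1, 1);

// Render pass: the mesh shader declares StructuredBuffer<float4> _Velocities
// (Shader Model 5.0 / DX11 only) and indexes it with the vertex id.
meshMaterial.SetBuffer("_Velocities", velocities);
```

Is that the intended way to do this in Unity 5.1+, or is there something closer to real Stream Output that I've missed?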
