All the Stage3D examples I've seen build the model-view-projection matrix in AS3 on every render event, e.g.:
modelMatrix.identity();
modelMatrix.appendTranslation(...) / appendRotation(...) / appendScale(...)
...
modelViewProjectionMatrix.identity();
modelViewProjectionMatrix.append( modelMatrix );
modelViewProjectionMatrix.append( viewMatrix );
modelViewProjectionMatrix.append( projectionMatrix );
context3D.setProgramConstantsFromMatrix( Context3DProgramType.VERTEX, 0, modelViewProjectionMatrix, true );
...
And a single line in the vertex shader transforms the vertex into clip space:
m44 op, va0, vc0
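Spelled out, that per-frame pattern looks roughly like this (just a sketch; the obj fields and the draw call are placeholders, not taken from any particular example):

import flash.display3D.Context3D;
import flash.display3D.Context3DProgramType;
import flash.geom.Matrix3D;
import flash.geom.Vector3D;

private var modelMatrix:Matrix3D = new Matrix3D();
private var modelViewProjectionMatrix:Matrix3D = new Matrix3D();

private function renderObject(context3D:Context3D, obj:Object,
                              viewMatrix:Matrix3D, projectionMatrix:Matrix3D):void
{
    // Rebuild the model matrix from the object's transform every frame.
    modelMatrix.identity();
    modelMatrix.appendScale(obj.scaleX, obj.scaleY, obj.scaleZ);
    modelMatrix.appendRotation(obj.rotationY, Vector3D.Y_AXIS);
    modelMatrix.appendTranslation(obj.x, obj.y, obj.z);

    // Combine model, view and projection on the CPU...
    modelViewProjectionMatrix.identity();
    modelViewProjectionMatrix.append(modelMatrix);
    modelViewProjectionMatrix.append(viewMatrix);
    modelViewProjectionMatrix.append(projectionMatrix);

    // ...and upload the single combined matrix into vc0-vc3.
    context3D.setProgramConstantsFromMatrix(
        Context3DProgramType.VERTEX, 0, modelViewProjectionMatrix, true);

    // setVertexBufferAt / drawTriangles for obj would follow here.
}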
Is there a reason for this? Aren't these calculations exactly what the GPU was made for?
Why not instead update the view and projection matrices only when they change, and upload each into its own block of registers:
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, projectionMatrix, true);
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 4, viewMatrix, true);
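Concretely, the "when they change" part could be driven by events rather than by the frame loop. A minimal sketch of what I mean (the dirty flags, the aspect parameter and the PerspectiveMatrix3D helper from com.adobe.utils are my own assumptions):

import com.adobe.utils.PerspectiveMatrix3D;
import flash.display3D.Context3D;
import flash.display3D.Context3DProgramType;
import flash.geom.Matrix3D;

private var projectionMatrix:PerspectiveMatrix3D = new PerspectiveMatrix3D();
private var viewMatrix:Matrix3D = new Matrix3D();
private var projectionDirty:Boolean = true; // set from a resize handler
private var viewDirty:Boolean = true;       // set whenever the camera moves

private function uploadCameraConstants(context3D:Context3D, aspect:Number):void
{
    if (projectionDirty)
    {
        projectionMatrix.perspectiveFieldOfViewLH(45 * Math.PI / 180, aspect, 0.1, 1000);
        context3D.setProgramConstantsFromMatrix(
            Context3DProgramType.VERTEX, 0, projectionMatrix, true); // vc0-vc3
        projectionDirty = false;
    }
    if (viewDirty)
    {
        context3D.setProgramConstantsFromMatrix(
            Context3DProgramType.VERTEX, 4, viewMatrix, true);       // vc4-vc7
        viewDirty = false;
    }
}

As far as I can tell, constant registers keep their values between draw calls (they belong to the Context3D rather than to a particular program), so a single upload should be enough until the camera actually changes.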
Then on each frame and for each object:
modelMatrix.identity();
// Create model matrix here
modelMatrix.appendTranslation(...) / appendRotation(...) / appendScale(...)
...
// Model matrix to vertex constant register 8
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 8, modelMatrix, true);
...
The vertex shader would then look like this:
m44 vt0, va0, vc8   // model
m44 vt0, vt0, vc4   // view
m44 op, vt0, vc0    // projection
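Put together, the alternative might be set up like this (a sketch assuming the usual com.adobe.utils.AGALMiniAssembler helper; the fragment shader and the obj fields are placeholders):

import com.adobe.utils.AGALMiniAssembler;
import flash.display3D.Context3D;
import flash.display3D.Context3DProgramType;
import flash.display3D.Program3D;
import flash.geom.Matrix3D;
import flash.geom.Vector3D;

private function createProgram(context3D:Context3D):Program3D
{
    var vertexAssembler:AGALMiniAssembler = new AGALMiniAssembler();
    vertexAssembler.assemble(Context3DProgramType.VERTEX,
        "m44 vt0, va0, vc8 \n" +  // model:      object space -> world space
        "m44 vt0, vt0, vc4 \n" +  // view:       world space  -> camera space
        "m44 op, vt0, vc0");      // projection: camera space -> clip space

    var fragmentAssembler:AGALMiniAssembler = new AGALMiniAssembler();
    fragmentAssembler.assemble(Context3DProgramType.FRAGMENT,
        "mov oc, fc0");           // flat colour, just to keep the sketch complete

    var program:Program3D = context3D.createProgram();
    program.upload(vertexAssembler.agalcode, fragmentAssembler.agalcode);
    return program;
}

private var modelMatrix:Matrix3D = new Matrix3D();

private function drawObject(context3D:Context3D, obj:Object):void
{
    // Only the model matrix is rebuilt and uploaded per object;
    // view and projection stay in vc0-vc3 / vc4-vc7 from the earlier uploads.
    modelMatrix.identity();
    modelMatrix.appendRotation(obj.rotationY, Vector3D.Y_AXIS);
    modelMatrix.appendTranslation(obj.x, obj.y, obj.z);
    context3D.setProgramConstantsFromMatrix(
        Context3DProgramType.VERTEX, 8, modelMatrix, true); // vc8-vc11

    // setVertexBufferAt / drawTriangles for obj would follow here.
}

The trade-off I can see is three m44 instructions per vertex instead of one, but that seems like exactly the kind of work the GPU is built for.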
UPDATE
There is a similar question for DirectX: should these matrix calculations be done on the GPU or on the CPU?