OpenGL 3 (LWJGL) LookAt Matrix Confusion

I am learning OpenGL 3 using LWJGL. I tried to implement a gluLookAt() equivalent, and although it works, I am somewhat confused as to why.

I admit that I just copied this code from various sources on the Internet, but after much research I think I understand the math behind it, and I think I understand what LWJGL is doing.

However, the "correct" gluLookAt behaved incorrectly in my application: the camera seemed to be the wrong way around. I managed to get my code working by transposing the matrix built from the orthonormal vectors forward, side and up (I hope I am using the correct terminology!), but I am fairly sure that this is wrong...

    private static final Vector3f forward = new Vector3f();
    private static final Vector3f side = new Vector3f();
    private static final Vector3f up = new Vector3f();
    private static final Vector3f eye = new Vector3f();

    public static Matrix4f lookAt(float eyeX, float eyeY, float eyeZ,
                                  float centerX, float centerY, float centerZ,
                                  float upX, float upY, float upZ) {
        forward.set(centerX - eyeX, centerY - eyeY, centerZ - eyeZ);
        forward.normalise();
        up.set(upX, upY, upZ);
        Vector3f.cross(forward, up, side);
        side.normalise();
        Vector3f.cross(side, forward, up);
        up.normalise();

        Matrix4f matrix = new Matrix4f();
        matrix.m00 = side.x;     matrix.m01 = side.y;     matrix.m02 = side.z;
        matrix.m10 = up.x;       matrix.m11 = up.y;       matrix.m12 = up.z;
        matrix.m20 = -forward.x; matrix.m21 = -forward.y; matrix.m22 = -forward.z;
        matrix.transpose(); // <------ My dumb hack

        eye.set(-eyeX, -eyeY, -eyeZ);
        matrix.translate(eye);
        return matrix;
    }
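
For context, this is roughly how I feed the resulting matrix to my shader (viewLocation is my uniform handle; the shader setup itself is not shown):

    // Sketch of my usage; "viewLocation" is a uniform handle I set up elsewhere.
    FloatBuffer buf = BufferUtils.createFloatBuffer(16);
    Matrix4f view = lookAt(0f, 0f, 5f, 0f, 0f, 0f, 0f, 1f, 0f);
    view.store(buf); // Matrix4f.store() writes in column-major (OpenGL) order
    buf.flip();
    GL20.glUniformMatrix4(viewLocation, false, buf);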

I don't think I should be doing that transpose, but without it, it doesn't work. I put in the transpose() because I couldn't be bothered to retype all the matrix cell positions, by the way!

I understand that the lookAt matrix form should be as follows:

    [ side.x  up.x  fwd.x  0 ]   [ 1  0  0  -eye.x ]
    [ side.y  up.y  fwd.y  0 ] x [ 0  1  0  -eye.y ]
    [ side.z  up.z  fwd.z  0 ]   [ 0  0  1  -eye.z ]
    [   0      0      0    1 ]   [ 0  0  0     1   ]

And I believe that the LWJGL Matrix4f class names its cells as m<col><row>. Its translate(Vector3f) method does the following:

    public static Matrix4f translate(Vector3f vec, Matrix4f src, Matrix4f dest) {
        ...
        dest.m30 += src.m00 * vec.x + src.m10 * vec.y + src.m20 * vec.z;
        dest.m31 += src.m01 * vec.x + src.m11 * vec.y + src.m21 * vec.z;
        dest.m32 += src.m02 * vec.x + src.m12 * vec.y + src.m22 * vec.z;
        dest.m33 += src.m03 * vec.x + src.m13 * vec.y + src.m23 * vec.z;
        ...
    }
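
To double-check my reading of the m<col><row> layout, I tried this little sanity check (the class and the test values are just mine):

    import org.lwjgl.util.vector.Matrix4f;
    import org.lwjgl.util.vector.Vector3f;

    public class TranslateCheck {
        public static void main(String[] args) {
            Matrix4f m = new Matrix4f(); // the no-arg constructor gives the identity
            m.translate(new Vector3f(1f, 2f, 3f));
            // The offset ends up in m30/m31/m32, i.e. the fourth column:
            System.out.println(m.m30 + ", " + m.m31 + ", " + m.m32); // 1.0, 2.0, 3.0
        }
    }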

So I am left very confused as to which part of it I have messed up. Is it my understanding of the lookAt matrix, the column/row major-ness (is that a word?!) of Matrix4f, or something else? Is the rest of my code at fault? Or is it actually all correct and I am just worrying too much? Am I just an idiot?

Thanks.

1 answer

You do not have to transpose anything. What you must negate in the lookAt() matrix are the "eye" vector and the look direction. "Eye" corresponds to the camera position, which should always be inverted. All of this is done inside lookAt().

Here is the lookAt() method from the Java port of the well-known GLM math library.

    public static Mat4 lookAt(Vec3 eye, Vec3 center, Vec3 up) {
        Vec3 f = normalize(Vec3.sub(center, eye));
        Vec3 u = normalize(up);
        Vec3 s = normalize(cross(f, u));
        u = cross(s, f);

        Mat4 result = new Mat4(1.0f);
        result.set(0, 0, s.x);
        result.set(1, 0, s.y);
        result.set(2, 0, s.z);
        result.set(0, 1, u.x);
        result.set(1, 1, u.y);
        result.set(2, 1, u.z);
        result.set(0, 2, -f.x);
        result.set(1, 2, -f.y);
        result.set(2, 2, -f.z);
        return translate(result, new Vec3(-eye.x, -eye.y, -eye.z));
    }
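
Typical usage would be something like this, assuming the port's Vec3 has a plain (x, y, z) constructor; the camera values are just examples:

    // Camera at (0, 2, 5), looking at the origin, with +Y as the up direction:
    Mat4 view = lookAt(new Vec3(0.0f, 2.0f, 5.0f),
                       new Vec3(0.0f, 0.0f, 0.0f),
                       new Vec3(0.0f, 1.0f, 0.0f));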

I use it with my LWJGL-based OpenGL 4 renderer, and it works like a charm :)
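
And if you would rather stay with LWJGL's own Matrix4f, the same construction, with no transpose at all, would look roughly like this (an untested sketch against LWJGL 2's m<col><row> fields; s, u and f are the normalized side, up and forward vectors computed as above):

    Matrix4f matrix = new Matrix4f();
    // Row 0 holds the side vector; with m<col><row> naming that is m00, m10, m20
    // (not m00, m01, m02, which fills a column and is why you needed the transpose):
    matrix.m00 = s.x;  matrix.m10 = s.y;  matrix.m20 = s.z;
    matrix.m01 = u.x;  matrix.m11 = u.y;  matrix.m21 = u.z;
    matrix.m02 = -f.x; matrix.m12 = -f.y; matrix.m22 = -f.z;
    matrix.translate(new Vector3f(-eyeX, -eyeY, -eyeZ)); // negated eye, as above
    return matrix;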

