Hi!
If I understand OpenGL correctly, when a vertex is rendered it is automatically multiplied first by the current modelview matrix and then by the projection matrix?!
So I don't have to transform all vertices by hand with my own matrices?!

But now I really want to compute a vector v1 from a given vector v2 and the modelview matrix.
How can I do this??? I think I have to read back the modelview matrix and then multiply it by the vector… but how is that done???
Please help!

Originally posted by TheBlob:
[b]Hi!
If I understand OpenGL correctly, when a vertex is rendered it is automatically multiplied first by the current modelview matrix and then by the projection matrix?!
So I don't have to transform all vertices by hand with my own matrices?!

[/b]

Yup, and it puts it all in hardware T&L if available as well.

How can I do this??? I think I have to read back the modelview matrix and then multiply it by the vector… but how is that done???

Two solutions…

Perform all matrix transformations yourself for the whole scene. You've then got the matrix to do your maths with, and can then use glMultMatrixf() to multiply the current modelview matrix by it.
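If you go this route, the matrix maths is easy to roll yourself. A minimal sketch of multiplying two 4x4 matrices in OpenGL's column-major layout (matMul4 is a hypothetical helper name, not an OpenGL call):

```c
#include <assert.h>

/* out = a * b, all three stored in OpenGL's column-major order,
 * i.e. element (row, col) lives at index col*4 + row.
 * matMul4 is a hypothetical helper, not part of OpenGL. */
static void matMul4(const float a[16], const float b[16], float out[16])
{
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a[k * 4 + row] * b[col * 4 + k];
            out[col * 4 + row] = sum;
        }
}
```

Once you've accumulated your own matrix this way, glMultMatrixf(yourMatrix) multiplies it onto the current modelview matrix, and you still have it around for your own calculations.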

Or carry on as you are, but use glGetFloatv(GL_MODELVIEW_MATRIX, …) to read the modelview matrix into a float array.
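That read-back plus a hand-rolled matrix*vector multiply is all the original question needs: v1 = modelview * v2. A minimal sketch in column-major order (matTransformVec4 is a hypothetical helper name; the glGetFloatv part is shown as a comment since it needs a current GL context):

```c
#include <assert.h>

/* out = m * v, with m in OpenGL's column-major order
 * (element (row, col) at index col*4 + row).
 * matTransformVec4 is a hypothetical helper, not an OpenGL call. */
static void matTransformVec4(const float m[16], const float v[4], float out[4])
{
    for (int row = 0; row < 4; ++row)
        out[row] = m[0*4 + row] * v[0] + m[1*4 + row] * v[1]
                 + m[2*4 + row] * v[2] + m[3*4 + row] * v[3];
}

/* In your render code (with a current GL context):
 *     float m[16], v2[4] = { x, y, z, 1.0f }, v1[4];
 *     glGetFloatv(GL_MODELVIEW_MATRIX, m);
 *     matTransformVec4(m, v2, v1);   // v1 = modelview * v2
 */
```

Use w = 1.0 for points so translations apply, and w = 0.0 for direction vectors so they don't.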

Both will work. I personally prefer the first one, because reading data back from OpenGL tends to be slow…