I am reading “Essential Mathematics for Games & Interactive Applications” by Van Verth and Bishop, where the authors say:
‘And even in OpenGL, despite the fact that the documentation is written using column vectors, the internal representation premultiplies the vectors; that is, it expects row vectors as well.’
OK, I understand that matrices in OpenGL are stored in column-major order, which effectively pre-transposes a matrix in its storage representation. But still, in OpenGL one always combines transformations in the opposite order, so how can it be that the ‘internal representation premultiplies the vectors’?
I would be grateful if someone could clear this up for me.