Up-Vector of gluLookAt( ... );

Please help me to resolve the following issue:

I'm trying to make the camera follow an object in 3D space.

Every DrawGLScene() frame, I calculate six positions relative to the object's coordinates (x, y, z), one for each of the six views of the object: Top, Bottom, Left, Right, Front, and Rear.
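For reference, the per-frame recomputation of the six view positions can be sketched like this (the Vec3 type, the function name, and the choice of which axis is "front" are my assumptions, not the post's actual code):

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// Six camera positions at a fixed distance from the object,
// one per view: Top, Bottom, Left, Right, Front, Rear.
std::array<Vec3, 6> sixViewPositions(const Vec3& obj, float dist)
{
    return {{
        { obj.x,        obj.y + dist, obj.z        },  // Top
        { obj.x,        obj.y - dist, obj.z        },  // Bottom
        { obj.x - dist, obj.y,        obj.z        },  // Left
        { obj.x + dist, obj.y,        obj.z        },  // Right
        { obj.x,        obj.y,        obj.z + dist },  // Front
        { obj.x,        obj.y,        obj.z - dist }   // Rear
    }};
}
```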

The user can "switch" (i.e. jump) between the different views.

I'm trying to linearly interpolate across these jumps so that the camera shifts "softly" between the two camera positions.
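The interpolation step itself is just a per-component lerp; a minimal sketch (the lerp helper and Vec3 type are my naming, not the post's code):

```cpp
struct Vec3 { float x, y, z; };

// Linear interpolation between two camera positions:
// t = 0 gives a, t = 1 gives b. Stepping t from 0 to 1
// over several frames makes the camera glide between views.
Vec3 lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}
```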

The problem is with the up vector, which I'm trying to calculate every frame for the Top and Bottom views: it is slightly tilted and "attached" to the object's coordinate system.

For the other four views, the up vector is simply (0, 1, 0).

To calculate this up vector, I perform the following operations and pass the result to gluLookAt(...):

Vector Camera(cx,cy,cz);
Vector Object(ox,oy,oz);

// Projection of the camera location onto the
// object's plane (parallel to the X-Z plane).
Vector Project(cx, oy, cz);

Vector Look = Camera - Object;

// This points from the object to the
// projected point.
Vector Base = Project - Object;

// This points from the projected point
// to the camera.
Vector Height = Camera - Project;

After calculating these vectors, I first compute the right vector and then the up vector ('^' is overloaded as the cross product):

Right = Height^Look;
Up = Right^Look;
Up.Normalize();

and pass (Up.x, Up.y, Up.z) as the up vector to gluLookAt(...).