Android: How to find the point where the mouse pointer hits an object in 3D space

I am having trouble finding the hit point in 3D space. I currently have the viewport data (the screen size), the depth from the camera to the object (the object is a plane parallel to the screen), and the mouse pointer position on the screen.

I am using OpenGL ES 2.0. How do I find the hit point on the object, which is parallel to the camera's view plane?

Setup for my camera:

App app = App.getInstance();

float widthDP = width;
float heightDP = height;

float ratio = widthDP / heightDP;

mShader = new SimpleShader();
Polygon poly = Polygon.Builder()
        .addVertex(new Point(widthDP*6, heightDP*2))
        .addVertex(new Point(widthDP*6, 0))
        .addVertex(new Point(0, heightDP*2))
        .addVertex(new Point(0, 0)).build();

Environment environment = Environment.getInstance();
environment.setScreenSize((int) widthDP, (int) heightDP);

Camera camera = Camera.getInstance();
camera.setEye(new Point3D(widthDP*2, heightDP, -heightDP));
camera.setLook(new Point3D(widthDP*2, heightDP, 0));
camera.setUp(new Point3D(0, 1, 0));

Matrix.frustumM(camera.getProjectionMatrix(), 0, -ratio, ratio, -1, 1, 1, 1000);

This is the code I followed from the link:

App app = App.getInstance();
Vector3D lookAt = new Vector3D(look.x, look.y, look.z);
Vector3D position = new Vector3D(eye.x, eye.y, eye.z);
Vector3D up = new Vector3D(this.up.x, this.up.y, this.up.z);

// Camera basis: view direction plus the horizontal and vertical
// axes of the view plane.
Vector3D view = Vector3D.sub(lookAt, position);
Vector3D h = view.cross(up);
Vector3D v = h.cross(view);

// Half-extents of the view plane at distance 1, for a 45-degree FOV.
// Cast to float to avoid integer division on the aspect ratio.
float rad = (float) (45.0f * Math.PI / 180.0f);
float vLength = (float) Math.tan(rad / 2) * 1;
float hLength = vLength * ((float) app.getWidth() / (float) app.getHeight());

// Move the origin to the center of the screen and flip the axes.
point.x -= app.getWidth() / 2;
point.y -= app.getHeight() / 2;
point.x *= -1;
point.y *= -1;
Log.w("point", "Width x=" + point.x + " y=" + point.y);

// Normalize to screen size.
point.x /= app.getWidth();
point.y /= app.getHeight();

// Build a ray from the eye and intersect it with the plane z = 0.
// (Note: h, v, hLength, and vLength are computed above but never
// applied to pos, so this ray always passes through the screen center.)
Vector3D pos = Vector3D.add(position, view);
Vector3D dir = Vector3D.sub(pos, position);
float s = -1 * pos.z / dir.z;
float[] newPoint = new float[2];
newPoint[0] = pos.x + dir.x * s;
newPoint[1] = pos.y + dir.y * s;

I added the two lines below because my screen coordinates start from the center of the screen, with the positive x-axis pointing left and the positive y-axis pointing up (toward the right and toward the bottom are negative).

	point.x *= -1;
	point.y *= -1;
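For what it's worth, the standard version of this ray construction applies h, v, hLength, and vLength to offset the ray through the cursor position; the snippet above computes them but never uses them. Here is a minimal self-contained sketch using plain float arrays (the class and method names are mine, not from the code above):

```java
public class RayPick {
    static float[] sub(float[] a, float[] b) {
        return new float[]{ a[0]-b[0], a[1]-b[1], a[2]-b[2] };
    }
    static float[] cross(float[] a, float[] b) {
        return new float[]{ a[1]*b[2]-a[2]*b[1],
                            a[2]*b[0]-a[0]*b[2],
                            a[0]*b[1]-a[1]*b[0] };
    }
    static float[] norm(float[] a) {
        float l = (float) Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
        return new float[]{ a[0]/l, a[1]/l, a[2]/l };
    }

    // Builds a normalized picking-ray direction from the camera eye
    // through the cursor at (ndcX, ndcY), both in [-1, 1].
    public static float[] rayDir(float[] eye, float[] look, float[] up,
                                 float ndcX, float ndcY,
                                 float fovYDeg, float aspect) {
        float[] view = norm(sub(look, eye));
        float[] h = norm(cross(view, up));   // view-plane horizontal axis
        float[] v = norm(cross(h, view));    // view-plane vertical axis
        float vLen = (float) Math.tan(Math.toRadians(fovYDeg) / 2.0);
        float hLen = vLen * aspect;
        // Point on the view plane at distance 1 in front of the eye,
        // offset by the cursor position -- this is the step the code
        // above is missing.
        return norm(new float[]{
            view[0] + h[0]*ndcX*hLen + v[0]*ndcY*vLen,
            view[1] + h[1]*ndcX*hLen + v[1]*ndcY*vLen,
            view[2] + h[2]*ndcX*hLen + v[2]*ndcY*vLen
        });
    }
}
```

With the cursor at the screen center (0, 0), the ray direction is just the view direction, which is a quick sanity check for the math.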

Hi, I have the same problem.
I fixed the screen-coordinate problem with this:

float x = mouseWindowPosition.x;
float y = mouseWindowPosition.y;

float windowWidth  = self.bounds.size.width  * 0.5;
float windowHeight = self.bounds.size.height * 0.5;

NSPoint newCoordinate = NSMakePoint((x - windowWidth) / windowWidth, (y - windowHeight) / windowHeight);
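On Android the same centering can be done in plain Java; note that screen y usually grows downward, so it is flipped here (a sketch; the class and method names are illustrative):

```java
// Maps a pixel position (origin top-left, y down) to normalized
// device coordinates in [-1, 1] (origin center, y up).
public class ScreenToNdc {
    public static float[] toNdc(float px, float py, float width, float height) {
        float x = (2.0f * px) / width - 1.0f;   // -1 at left edge, +1 at right
        float y = 1.0f - (2.0f * py) / height;  // -1 at bottom edge, +1 at top
        return new float[] { x, y };
    }
}
```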

I am able to select the “cube” when it is close to the camera, but as it moves farther from the camera the coordinates become very incorrect.
I have to shoot a ray, but I can't figure out how to retrieve the intersection.
I am trying to work it out with an inverse matrix from the mouse position to the position of the object; that way the matrix will not move toward the center of the screen as the object moves away from the camera, but will move in the other direction.
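If it helps, once you have a ray (an origin plus a direction), intersecting it with a plane at constant z, as in the question above, reduces to solving for a single parameter t. A sketch (names are mine):

```java
public class RayPlane {
    // Intersects the ray origin + t*dir with the plane z = zPlane.
    // Returns {x, y, z} of the hit point, or null if there is no hit.
    public static float[] hitZPlane(float[] origin, float[] dir, float zPlane) {
        if (Math.abs(dir[2]) < 1e-8f) return null; // ray parallel to the plane
        float t = (zPlane - origin[2]) / dir[2];
        if (t < 0) return null;                    // plane is behind the ray
        return new float[] {
            origin[0] + t * dir[0],
            origin[1] + t * dir[1],
            zPlane
        };
    }
}
```

For example, a ray starting at (0, 0, -5) pointing along +z hits the z = 0 plane at the origin.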

mousePosition = point(-0.3, 0.0);
for (ObjModel *obj in ObjArray) {
    mouseMatrix = vec3(-0.3, 0.0, obj.ZPos);
    // test for intersection here
}

This is just theory; I'm still working on it.
If you find something else, let me know.