I am having trouble constructing an orthographic matrix for 2D drawing. I am using this code to make my matrix:

```
float right = width;
float left = 0;
float top = 0;
float bottom = height;
orthomatrix[0].x = -2.0/(right-left);
orthomatrix[0].y = 0;
orthomatrix[0].z = 0;
orthomatrix[0].w = -(right+left)/(right-left);
orthomatrix[1].x = 0;
orthomatrix[1].y = 2.0/(top-bottom);
orthomatrix[1].z = 0;
orthomatrix[1].w = -(top+bottom)/(top-bottom);
orthomatrix[2].x = 0;
orthomatrix[2].y = 0;
orthomatrix[2].z = 2.0/(Zfar-Znear);
orthomatrix[2].w = -(Zfar+Znear)/(Zfar-Znear);
orthomatrix[3].x = -1.0;
orthomatrix[3].y = +1.0;
orthomatrix[3].z = 0;
orthomatrix[3].w = 1;
```

I would like to have it so that vertex float coordinates correspond perfectly to pixels, so the user can create a vertex array and use the pixel coordinates for vertex XY positions.

I can get correct values by adding 0.5 to the vertex positions and dividing by the graphics viewport width and height. Can anyone help me make a matrix so that a vertex at position 5,7,0 will appear at pixel 5,7, where the top pixel of the viewport is 0?

Thanks.

```
glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, width, 0, height, -1, 1);
glMatrixMode(GL_MODELVIEW);
```

You will have to tweak a bit more so that top is 0 instead of height-1.

I’m using OpenGL 3, so I have to do my own matrix multiplications. The formula I used appears to be correct, but it is giving weird results. Here’s my vertex shader:

```
#version 140

uniform mat4 projectionmatrix;

in vec3 in_Position;
in vec3 in_Color;
out vec3 ex_Color;

void main(void)
{
    gl_Position = projectionmatrix * vec4(in_Position, 1.0);
    ex_Color = in_Color;
}
```

Whoops, I had the matrix wrong. Should be like this, which makes more sense:

```
orthomatrix[0].x = 2.0/(right-left);
orthomatrix[0].y = 0;
orthomatrix[0].z = 0;
orthomatrix[0].w = 0;
orthomatrix[1].x = 0;
orthomatrix[1].y = 2.0/(top-bottom);
orthomatrix[1].z = 0;
orthomatrix[1].w = 0;
orthomatrix[2].x = 0;
orthomatrix[2].y = 0;
orthomatrix[2].z = 2.0/(Zfar-Znear);
orthomatrix[2].w = 0;
orthomatrix[3].x = -(right+left)/(right-left);
orthomatrix[3].y = -(top+bottom)/(top-bottom);
orthomatrix[3].z = -(Zfar+Znear)/(Zfar-Znear);
orthomatrix[3].w = 1;
```

A vertex at 5.5,6.5 will appear at pixel 5,6, where 0,0 is the upper-left corner.

I think the equations you want are these:

```
x_max = viewport_width - 1.0;
y_max = viewport_height - 1.0;

x_new = 2.0 * x / x_max - 1.0;
y_new = -(2.0 * y / y_max - 1.0);
z_new = 2.0 * (z - zNear) / (zFar - zNear) - 1.0;
w_new = 1.0;
```

So, your projection matrix could look like:

```
| 2/x_max   0          0                -1                        |
| 0         -2/y_max   0                1                         |
| 0         0          2/(zFar-zNear)   (zNear+zFar)/(zNear-zFar) |
| 0         0          0                1                         |
```

This should map a vertex at (5.0, 6.0) to the pixel at (5, 6), where (0, 0) is the upper-left corner. There is one difference between mine and yours: I subtracted one from the height and width (which should take care of the fractional bit you are adding to the vertex coordinates). I did that because viewport_width and viewport_height are the number of pixels wide and tall, but the pixels are addressed beginning with zero, so the lower-right corner is pixel (viewport_width - 1, viewport_height - 1).

I have found that the +0.5 value is needed for drawing lines. It should not be used when drawing solid rectangles.

Additionally, when I call my DrawPixel() command with width-1 and height-1 as the coordinates, a pixel is drawn in the very bottom-right corner. This indicates that using the full viewport width is correct.

Your code might work for lines, but the programmer will then have to subtract 0.5 when drawing solid rectangles.

I don’t know, it’s working and this stuff makes my head hurt. XD

> I have found that the +0.5 value is needed for drawing lines.

Is this with or without multisample antialiasing?