Basic Question: How to map NDC (1,1) to window coordinates

I am interested in how rasterization works. That’s why I started to program a software rasterizer. I found the following equations for mapping NDCs in the range [-1, 1] to window coordinates:

xw = (xndc + 1) * (width / 2) + x
yw = (yndc + 1) * (height / 2) + y

Both equations seem reasonable until I take the framebuffer into consideration.

For example, if I map a vertex with xndc=yndc=1 to window coordinates, this would result in:

xw = width + x
yw = height + y

If I used this coordinate to address the framebuffer, it would result in a buffer overflow, since the pixel address width * height is outside the framebuffer address range 0 <= address <= width * height - 1.

Can someone explain to me how OpenGL deals with NDC coordinates equal to 1?

What are the x and y values that don’t have a suffix in this example?

As far as I know, x and y are related to the window position on a screen/display. More precisely, the equations should be part of a viewport transformation function.

In those equations, x, y, width and height are the position and size of the viewport, in window coordinates. The lower-left corner of the viewport is [x,y], the upper-right corner is [x+width,y+height]. So the NDC range [-1,1]×[-1,1] is mapped to the viewport.
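In code, the transform is just those two equations applied per vertex. Here is a minimal sketch for a software rasterizer (the Viewport struct and function name are only illustrative, not anything OpenGL defines):

/* Map NDC to window coordinates using a glViewport-style
 * (x, y, width, height) viewport. */
typedef struct { float x, y, width, height; } Viewport;

static void ndc_to_window(const Viewport *vp,
                          float x_ndc, float y_ndc,
                          float *x_w, float *y_w)
{
    *x_w = (x_ndc + 1.0f) * (vp->width  * 0.5f) + vp->x;
    *y_w = (y_ndc + 1.0f) * (vp->height * 0.5f) + vp->y;
}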

The viewport isn’t inherently constrained to the bounds of the target (window, pixmap, framebuffer object), although it’s unusual to intentionally set a viewport which extends beyond the target; it may happen unintentionally if a window is resized and the viewport isn’t changed accordingly.

Writes to pixels which aren’t owned by the target are ignored. This includes pixels which are outside of the bounds of the target, but also includes parts of a window which are outside the bounds of the screen or which are obscured by another window.
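For a software rasterizer, the simplest equivalent is a bounds check on every framebuffer write; anything outside the target is silently dropped. A sketch, assuming a row-major 32-bit framebuffer (the function and parameter names are my own):

#include <stdint.h>

/* Ignore fragments the target does not own, analogous to the
 * pixel-ownership test: out-of-bounds writes are simply dropped. */
static void write_pixel(uint32_t *fb, int fb_width, int fb_height,
                        int px, int py, uint32_t color)
{
    if (px < 0 || px >= fb_width || py < 0 || py >= fb_height)
        return;
    fb[py * fb_width + px] = color;
}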

Thanks for the answer. Still, I am confused. NDC (1,1) should result in a pixel coordinate inside the target window. This also means that the corresponding framebuffer address should be part of the target’s address space. According to the viewport transformation, that address would be width * height, yet the valid address range is only 0 <= address <= width * height - 1.

So, does this mean that OpenGL actually does not render NDCs of value 1?

When you specify a viewport, you specify a width in window-space pixels. That width is a count of pixels: if you ask for 640 pixels across, starting from 0, that means all pixels from 0 to 639 inclusive. That’s 640 distinct pixels.

Pixel 639 will be drawn to; pixel 640 won’t be, because of how rasterization works. A vertex would have to have an NDC-space coordinate of exactly 1.0 for its window-space coordinate to be 640. But does that land on the center of pixel 640?

No. The center of that pixel has a window-space coordinate of 640.5. Therefore, it won’t be rasterized to (maybe multisampled rendering could hit it, but probably not). You would have to have a vertex with an NDC-space coordinate of 1 + 1/640… which can’t happen because that vertex would have been clipped.

So it’s not a problem.
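To make that concrete, here is a tiny check of the numbers for a 640-pixel-wide viewport at x = 0 (a sketch that assumes the usual sample-at-pixel-center convention):

#include <math.h>
#include <stdio.h>

int main(void)
{
    const float width = 640.0f, x = 0.0f;

    /* NDC 1.0 maps exactly onto the right viewport edge... */
    float x_w = (1.0f + 1.0f) * (width * 0.5f) + x;   /* 640.0 */

    /* ...but rasterization samples at pixel centers (px + 0.5),
     * so the right-most center that can be covered is 639.5,
     * i.e. pixel 639. Coordinate 640.0 covers no pixel center. */
    float last_center = width - 0.5f;                 /* 639.5 */

    printf("window x for NDC 1.0: %.1f\n", x_w);
    printf("right-most pixel center: %.1f (pixel %d)\n",
           last_center, (int)floorf(last_center));
    return 0;
}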

As I said, the viewport isn’t constrained to the target drawable. It’s entirely possible to generate window coordinates outside of the target, either because x or y is negative, or because x+width or y+height exceeds the dimensions of the target.

Aside from the possibility of the viewport extending beyond the bounds of the target, rasterisation can generate fragments outside of the viewport. The viewport determines the bounds for clipping geometry, but points and lines are centred on their geometry, not bound by it, so fragments generated when rasterising points and lines can fall outside of the viewport.
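As an illustration, a wide point centred exactly on the right viewport edge produces fragments on both sides of that edge (a legacy-GL sketch just to show the idea; point sizes above 1 depend on the implementation’s supported range):

glViewport(0, 0, 640, 480);
glPointSize(16.0f);        /* 16x16-pixel square around the point centre */

glBegin(GL_POINTS);
glVertex2f(1.0f, 0.0f);    /* NDC x = 1: the centre sits on the right
                              viewport edge, so the right half of the
                              square falls outside the viewport; those
                              fragments are only discarded where they
                              also fall outside the target. */
glEnd();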

OK, thanks for your explanations. That helped a lot. :slight_smile:
