# What coordinates does OpenGL use for GL_POSITION?

I was writing the code for a deferred rendering engine when I found something extremely odd.

In my vertex shader, I'm computing the position of the vertex in (what I thought was) normalized device coordinates and writing it to gl_Position.

In my fragment shader, I receive that value with smooth interpolation between vertices, and I've discovered that it goes well outside the bounds of [-1, 1] despite being visible on screen!

I'm on OpenGL 4.5, btw; none of this immediate-mode stuff.

```glsl
#version 330

//Vertex Attributes
layout( location = 0 ) in vec3 vPosition;
layout( location = 1 ) in vec2 intexcoord;
layout( location = 2 ) in vec3 Normal;
layout( location = 3 ) in vec3 VertexColor;

out vec2 texcoord;
out vec3 normout;
//IGNORE THIS: it is currently being phased out.
out vec3 vert_to_light; //The vector going from our vertex to our light, which will be interpolated to each fragment.
out vec3 vert_to_camera; //The vector going from our vertex to the camera. Interpolated over the fragments. 100% accurate.
out vec3 Smooth_Vert_Color;
flat out vec3 Flat_Vert_Color;
out vec3 ND_out;

vec3 worldpos; //Position of the vertex in the world.

uniform vec3 LightPosition; //The location of our light in WORLD SPACE!
uniform mat4 World2Camera; //The world-to-camera transform. I figure this is faster than calculating MVP separately per vertex.
uniform mat4 Model2World; //Model->World
uniform mat4 viewMatrix; //Result of glm::lookAt();

void
main()
{
    // The position of this vertex in the world coordinate system.
    worldpos = (Model2World * vec4(vPosition, 1.0)).xyz;

    vec4 outPos = World2Camera * Model2World * vec4(vPosition, 1.0);
    texcoord = intexcoord;
    gl_Position = outPos;
    // NOTE: this assumes Model2World has no non-uniform scale;
    // otherwise the normal needs the inverse transpose.
    normout = (Model2World * vec4(Normal, 0.0)).xyz;
    // This vertex to the light.
    vert_to_light = LightPosition - worldpos;
    // This vertex to the camera. Interpolated over the fragments.
    vert_to_camera = (inverse(viewMatrix) * vec4(0.0, 0.0, 0.0, 1.0)).xyz - worldpos;
    ND_out = outPos.xyz;
    Smooth_Vert_Color = VertexColor;
    Flat_Vert_Color = VertexColor;
}
```

```glsl
#version 330
//out vec4 fColor[2];

in vec2 texcoord;
in vec3 normout;
flat in vec3 Flat_Vert_Color;
in vec3 Smooth_Vert_Color;
in vec3 ND_out;

uniform sampler2D diffuse; //Bound to a texture unit (limit 32). This one happens to be the object's own texture.

uniform float ambient;
uniform float specreflectivity;
uniform float specdamp;
uniform float emissivity;

uniform float diffusivity;
uniform uint flags;

vec2 bettertexcoord;

void uncool_main()
{
gl_FragData[0] = vec4(Smooth_Vert_Color,0);
}

void main()
{
//Just normalizing our vectors...
vec3 UnitNormal = normalize(normout);

bettertexcoord = vec2(texcoord.x, -texcoord.y); //Flip V so it matches Blender's UV orientation.

//OPENGL PROGRAMMERS LOOK HERE! This is the line that is giving me
//trouble: on screen, ND_Modified's x, y, and z components all go well
//over 1 or well under 0 across most of the screen.
vec4 ND_Modified = vec4(ND_out * 0.5 + 0.5, 1.0);
// if (ND_Modified.x > 1 || ND_Modified.x < 0 ||
// ND_Modified.y > 1 || ND_Modified.y < 0 ||
// ND_Modified.z > 1 || ND_Modified.z < 0)
// ND_Modified = texture2D(diffuse, bettertexcoord);
if (ND_Modified.x > 1 || ND_Modified.y > 1 || ND_Modified.z > 1)
    ND_Modified = texture2D(diffuse, bettertexcoord);

// (diffuse component * texture) + specular
gl_FragData[0] = (texture2D(diffuse, bettertexcoord));
//gl_FragData[0] = (vec4(betterdiffuse,1.0) + vec4(ambient,ambient,ambient,1.0)) * vec4(Smooth_Vert_Color,0) + vec4(finalSpecular, 1.0);
gl_FragData[1] = vec4(UnitNormal,1.0);
//gl_FragData[2] = vec4(1.0,1.0,1.0,1.0); //2 is set by GkScene to be the Mask.
gl_FragData[2] = ND_Modified; //Attachment 2 is the CAMERA SPACE writeout.
gl_FragData[3] = vec4(
ambient,
specreflectivity,
specdamp/100.0, //This should give a wide range...
emissivity
);
gl_FragData[4] = vec4(
diffusivity,
1.0,
1.0,
1.0 //We will use this as the mask. No mask needed. That's why we only have 6 outputs.
);
gl_FragData[5] = vec4( //For cubemap reflect out. We have to make cubemaps first, don't we?
ND_Modified.x,
ND_Modified.y,
ND_Modified.z,
ND_Modified.w
);
}
```

Explain this, computer nerds

[QUOTE=Geklmin;1291150]What coordinates does OpenGL use for GL_POSITION?
In my vertex shader, i’m computing the location of the vertex in -what I thought was- normalized device coordinates, gl_Position[/QUOTE]

Clip space.

It’s the space before normalized device coordinates (NDC).
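For anyone following along, the step the GPU performs between those two spaces (after clipping) is the perspective divide. Here is a minimal sketch of that arithmetic in plain Python, just to illustrate; the function name is mine, not part of any API:

```python
# Minimal sketch of the clip-space -> NDC step that the hardware
# performs after the vertex shader runs and after clipping.
def clip_to_ndc(clip):
    """Divide clip-space x, y, z by w (the perspective divide)."""
    x, y, z, w = clip
    return (x / w, y / w, z / w)

# A point inside the view frustum satisfies -w <= x, y, z <= w in
# clip space, so its NDC components land in [-1, 1]:
print(clip_to_ndc((2.0, -1.0, 3.0, 4.0)))  # -> (0.5, -0.25, 0.75)
```

So gl_Position values well outside [-1, 1] are perfectly normal; they only fall into that range after the divide by w.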

Also, if the names of your vertex shader transforms are correct, you're passing an eye-space position down in gl_Position, not clip space or NDC: you never multiply by a projection matrix.
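Concretely: World2Camera * Model2World only gets you to eye space, and the shader never multiplies by a projection matrix. A sketch in plain Python of what the missing step does (the `perspective` helper here mimics a glm::perspective-style matrix and is illustrative, not from the original code):

```python
import math

# Illustrative: a glm::perspective-style projection matrix for a
# right-handed eye space looking down -Z, stored row-major.
def perspective(fovy, aspect, near, far):
    f = 1.0 / math.tan(fovy / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def mat_vec(m, v):
    """Row-major 4x4 matrix times column vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

proj = perspective(math.pi / 2.0, 1.0, 1.0, 100.0)  # 90-degree fov
eye_point = (1.0, 0.0, -5.0, 1.0)   # 5 units in front of the camera
clip = mat_vec(proj, eye_point)     # what gl_Position should hold
ndc = tuple(c / clip[3] for c in clip[:3])  # perspective divide
# ndc is now inside [-1, 1] on every axis for this visible point
```

In GLSL terms, the fix amounts to adding a projection uniform and writing gl_Position = Projection * World2Camera * Model2World * vec4(vPosition, 1.0); (uniform name illustrative).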

The legendary Dark Photon, guru of the OpenGL forums, has replied himself?

Thank you very much. You're right, and I fixed it. It now works perfectly.
(EDIT: I did have to do some refactoring, but that was because I needed the screen size.)

By the way…

I've seen you around these forums for a couple of years now, from before I actually got around to joining.

You’ve actually answered hundreds of my questions before I even asked them. You’re kind of a… role model? Yeah.

You really get around!

Could we chit-chat some time?

Regardless of whether or not you reply, thanks. Your comments have really helped me for a long time.

Glad to help! I’m just one of many here though.