Std::vector in GLSL?

I have a 3D terrain that I edit with a rectangular brush defined by a position, width, and length. I’m trying to make this brush show up as a green rectangle under the mouse. To do this, I create two std::vectors containing all the X coordinates and Z coordinates of the terrain covered by the brush. Every time the brush moves, I recalculate every terrain vertex’s position and color: if a vertex falls within the brush I set its color to green, otherwise to transparent. Then I upload all the terrain data to the GPU with glBufferData.

It works okay, but the trouble is that it’s really inefficient: I’m sending data about every single vertex to the GPU every time the mouse position changes by even one pixel, which can happen many times per second. As a result, no other code gets a chance to run unless I hold the mouse motionless.

This is the code for changing a vertex’s color:

// Check if this vertex's X and Z coordinates are in the lists of X and Z coordinates within the brush:
if ((std::find(brushXCoords.begin(), brushXCoords.end(), (float)xCoor) != brushXCoords.end()) &&
    (std::find(brushZCoords.begin(), brushZCoords.end(), (float)zCoor) != brushZCoords.end())) {
    this->backgroundSolidVertices[iterBase].RGBAdata = glm::vec4(0.0f, 1.0f, 0.0f, 0.7f);
} else {
    this->backgroundSolidVertices[iterBase].RGBAdata = glm::vec4(0.0f, 0.0f, 0.0f, 0.0f);
}

This code executes every time the mouse moves, for every single vertex. brushXCoords and brushZCoords are std::vectors containing the xCoords and zCoords within the brush. backgroundSolidVertices is an std::vector of all the vertices.

What I hope to do is move this into the vertex shader. If I provide the shader with the brush’s current position and dimensions, I could write code to determine whether a particular vertex is inside the brush, and if so, set its color to green before passing it to the fragment shader. The trouble is that brushXCoords and brushZCoords are std::vectors, and I don’t know whether GLSL has any data type that acts like an std::vector. I know it has arrays, but I can’t use those to hold the brush coordinates, since the dimensions aren’t guaranteed to be constant and the array might be overrun.

I’m using GLSL 1.3 (the most recent my machine can support) with OpenGL 3.1. Is there some way I could pull off std::vector-like functionality in this version of GLSL? Or is there some other way to compute this, without using std::vectors to hold the brush coordinates?

You’re thinking about this completely backwards.

On the CPU, you look at the entire vertex array, comparing each vertex in turn to some value.

On the GPU, OpenGL processes the entire vertex array, computing output values from the vertex shader for each. OpenGL handles the “for each vertex” part; you just provide the stuff to do for each vertex.

In short, your vertex shader doesn’t need to loop over each vertex, because it already is. It’s being called once for each vertex. So take the current vertex, compare it against the brush, and output a different color as needed.

I realize the vertex shader operates on each vertex. It’s the comparing-the-vertex-against-the-brush part I’m having trouble with. On the CPU I would construct a vector of X and Z coordinates that lie inside the brush based on the position of the brush’s center and its dimensions, then check whether the vertex’s X and Z coordinates are included in those vectors. When moving the code to the GPU, I’m not sure how to do the “construct a vector of X and Z coordinates” part, since I don’t know in advance exactly how many coordinates there’ll be. I can pass in the position of the brush’s center and the brush dimensions, but I’m not sure how to build lists of brush coordinates from that data to check the current vertex against.

On the CPU I would construct a vector of X and Z coordinates that lie inside the brush based on the position of the brush’s center and its dimensions

What does that mean? Presumably your brush has a size, right? Can’t you just check to see if the vertex position is within that particular size range in the X and Z dimensions?

Whatever you do to decide if a position lies “inside the brush” is what you put in your vertex shader. There’s no need for building an array of anything; you do the test for the current vertex you’re given.
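For instance, if the CPU computes the brush’s minimum and maximum corners and passes them as uniforms, the whole test is a few comparisons. A minimal sketch only; the uniform names `brushMin`/`brushMax` and the attribute/varying names `in_Position`/`ex_Color` are placeholders, not anyone’s actual code:

```glsl
// Sketch: assumes the CPU uploads the brush's corners as two uniforms.
uniform vec2 brushMin; // smallest X and Z covered by the brush (hypothetical)
uniform vec2 brushMax; // largest X and Z covered by the brush (hypothetical)

// ...inside main(), after computing gl_Position:
if (in_Position.x >= brushMin.x && in_Position.x <= brushMax.x &&
    in_Position.z >= brushMin.y && in_Position.z <= brushMax.y) {
    ex_Color = vec4(0.0, 1.0, 0.0, 0.7); // green highlight
} else {
    ex_Color = vec4(0.0, 0.0, 0.0, 0.0); // transparent
}
```

No arrays or loops anywhere: the shader is invoked once per vertex, so each invocation only needs the one containment test.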

Ah, you’re right, I can just check to see if it’s within the size range. I wrote up some GLSL code to do this:

	float minX;
	float maxX;
	float minZ;
	float maxZ;
	vec2 brushPos;
	if (((brushRealPos.x/8.0) - floor(brushRealPos.x/8.0)) < (ceil(brushRealPos.x/8.0) - (brushRealPos.x/8.0))){
		brushPos.x = floor(brushRealPos.x/8.0);
	} else {
		brushPos.x = ceil(brushRealPos.x/8.0);
	}
	if (((brushRealPos.z/8.0) - floor(brushRealPos.z/8.0)) < (ceil(brushRealPos.z/8.0) - (brushRealPos.z/8.0))){
		brushPos.y = floor(brushRealPos.z/8.0);
	} else {
		brushPos.y = ceil(brushRealPos.z/8.0);
	}
	brushRealPos /= 8.0;
	if (brushDimens.x % 2 == 1){
		minX = brushPos.x - float(brushDimens.x);
		maxX = brushPos.x + float(brushDimens.x);
	} else {
		minX = ((float(floor(brushRealPos.x))) - ((float(brushDimens.x)/2.0) - 1.0));
		maxX = ((float(floor(brushRealPos.x))) + (float(brushDimens.x)/2.0));
	}
	if (brushDimens.y % 2 == 1){
		minZ = brushPos.z - float(brushDimens.y);
		maxZ = brushPos.z + float(brushDimens.y);
	} else {
		minZ = ((float(floor(brushRealPos.z))) - ((float(brushDimens.y)/2.0) - 1.0));
		maxZ = ((float(floor(brushRealPos.z))) + (float(brushDimens.y)/2.0));
	}

	if (in_Position.x <= maxX && in_Position.x >= minX && in_Position.z <= maxZ && in_Position.z >= minZ){
		vec4 brushColor = vec4(0.0, 1.0, 0.0, 0.7);
		ex_Color = brushColor;
	}

brushDimens is an ivec2 and brushRealPos is a vec3. I calculate the minimum and maximum X and Z coordinates of the brush based on whether the brush dimension along that axis is odd or even. Then I check whether the vertex’s position (in_Position) is within that range, and if so, pass a green color to the fragment shader.
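(For reference, the floor/ceil block at the top of the shader is just rounding brushRealPos / 8.0 to the nearest integer. Here is a standalone C++ sketch of the same computation, with a hypothetical helper name, which can be handy for sanity-checking the GLSL port against the original CPU code:)

```cpp
#include <cmath>

// Hypothetical helper mirroring the shader's floor/ceil selection:
// divide by the 8-unit grid spacing, then pick whichever of floor or
// ceil is closer (ties go to ceil, exactly as in the GLSL above).
float snapToGrid(float realPos) {
    float scaled = realPos / 8.0f;
    if ((scaled - std::floor(scaled)) < (std::ceil(scaled) - scaled))
        return std::floor(scaled);
    else
        return std::ceil(scaled);
}
```

For positive positions this is the same as std::round(realPos / 8.0f).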

It’s not quite working; the vertices all show up white. I think I have a syntax error somewhere, but I’m not used to writing GLSL, so I’m not sure where it could be. I read on NeHe that you can’t assign a float to an integer or vice versa, so I tried to cast my brush dimensions (which are ints) to float before adding them to other floats. I’m not sure I did that correctly, or maybe there’s some other error I can’t see. The code itself is adapted from what I had CPU-side, which worked fine, so I think the problem is with my GLSL syntax.

The other possibility is that I’m uploading the uniforms wrong. I have this:

void terrain::updateBrushDataUniforms(glm::vec2 brushDimens, glm::vec3 brushRealPos){
    GLuint dimenUniLoc = glGetUniformLocation(this->progId, "brushDimens");
    GLuint posUniLoc = glGetUniformLocation(this->progId, "brushRealPos");
    glUniform2fv(dimenUniLoc, 1, &brushDimens[0]);
    glUniform3fv(posUniLoc, 1, &brushRealPos[0]);
}
Where brushDimens is a glm::vec2 holding ints and brushRealPos is a glm::vec3 of floats. The part I’m particularly worried about is glUniform2fv and glUniform3fv. I know there are a number of versions of glUniform; I’m not sure whether I used the right one, or used it correctly. The online documentation is confusing. :(
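(Editor’s note: each glUniform variant corresponds to a specific GLSL type, so the call has to match the declaration in the shader: a vec3 takes glUniform3fv with float data, while an ivec2 takes glUniform2iv, or glUniform2i, with int data. A sketch of what a type-matched upload might look like, assuming the shader declares `uniform ivec2 brushDimens;` and `uniform vec3 brushRealPos;`, and that progId holds the linked program; this is one plausible shape, not the thread’s actual fix:)

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/type_ptr.hpp>

void terrain::updateBrushDataUniforms(glm::ivec2 brushDimens, glm::vec3 brushRealPos){
    glUseProgram(this->progId); // glUniform* writes to the currently active program
    // glGetUniformLocation returns GLint, so -1 ("not found / optimized out")
    // can be detected; GLuint would silently wrap it.
    GLint dimenUniLoc = glGetUniformLocation(this->progId, "brushDimens");
    GLint posUniLoc   = glGetUniformLocation(this->progId, "brushRealPos");
    glUniform2iv(dimenUniLoc, 1, glm::value_ptr(brushDimens)); // "iv" matches ivec2
    glUniform3fv(posUniLoc,   1, glm::value_ptr(brushRealPos)); // "fv" matches vec3
}
```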

How do I use glUniform properly? And do I have the right GLSL syntax?

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.