Hello! I'm writing a basic voxel-based raytracing program in OpenGL 4.3. For now I'm keeping it as simple as possible: the raytracing algorithm is just a small fragment shader that repeatedly calculates the next voxel boundary the ray will cross into, checks whether there is a cube occupying that space, and colors the pixel black if so. When I implemented this, however, I got these crazy-looking interference patterns on the surface of the cube, suggesting that not all of the rays are properly tracking their location. Here's a screenshot: (the site wouldn't let me post more than one)

Here's the fragment shader code (I don't think the C++ code that sets up the shader is relevant in this case):

```
#version 430 core

in vec3 virtualScreenCoords; // Endpoints of the transformed virtual screen are passed
out vec4 fragColor;          // from the vertex shader so that I don't have to code the
                             // linear interpolation myself

uniform vec3 cameraPos;

void main() {
    vec3 rayDir = normalize(virtualScreenCoords - cameraPos); // Direction of the ray
    vec3 cubeStart = vec3(10.0, 3.0, 10.0);   // Start and end points of the cube to render
    vec3 cubeFinish = vec3(20.0, 13.0, 20.0);
    vec3 pos = cameraPos; // Position of the 'tip' of the ray as it gets moved forward
    vec3 temp;
    float t;
    fragColor = vec4(1.0);
    for (int iii = 0; iii < 64; iii++) {
        temp.x = (rayDir.x > 0.0) ? floor(pos.x + 1.0) : ceil(pos.x - 1.0);
        temp.y = (rayDir.y > 0.0) ? floor(pos.y + 1.0) : ceil(pos.y - 1.0);
        temp.z = (rayDir.z > 0.0) ? floor(pos.z + 1.0) : ceil(pos.z - 1.0);
        // Now, temp contains all of the possible cube walls that the ray could hit next
        temp = (temp - cameraPos) / rayDir;
        t = min(temp.x, min(temp.y, temp.z));
        pos = cameraPos + t * rayDir;
        // Finds the closest wall and moves onto it
        fragColor = (
            pos.x >= cubeStart.x && pos.x <= cubeFinish.x &&
            pos.y >= cubeStart.y && pos.y <= cubeFinish.y &&
            pos.z >= cubeStart.z && pos.z <= cubeFinish.z
        ) ? vec4(0.0, 0.0, 0.0, 1.0) : fragColor; // Checks if ray is now within cube bounds
    }
}
```

At first I assumed it was a floating-point precision issue, but I tried working around that, along with pretty much everything else I could think of, and I just can't seem to get rid of the problem. This was supposed to be the easiest part of the project, and it's driving me crazy that I can't get it to work correctly. I've been stuck on this forever, and if anyone more experienced could provide some insight, I'd greatly appreciate it!!

P.S. I know my code isn’t very well optimized, I mainly just wrote this shader as a proof of concept and I was planning on polishing it up later.