We are seeing incorrect output from atan(y, x) when it is called from a fragment shader using GLSL version 450. The y value is always hardcoded to 1.0, and the x value has a range of 0.1 to 15.0 inclusive. The same shader is run across multiple physical machines, and the machines return different values from atan(y, x) for the same inputs. Clearing our shader cache and running the application again produces yet another set of outputs. I believe this is a strong indication that garbage values are being returned.
We changed the code to use atan(y / x), and the output is now correct. Note that atan(y / x) only matches a correctly functioning atan(y, x) when x is positive, which always holds in our case since x is at least 0.1. We are going to proceed with that approach, but I wanted to throw this out there in case anyone else is hitting the same issue.
We’ve confirmed that the same issue is present in the following graphics drivers:
While this issue was observed in the context of a large system, we tested it thoroughly: we ran our application many times where the only difference was switching between atan(y, x) and atan(y / x). The results consistently showed atan(y, x) misbehaving.
We are using CentOS 6.4 and each machine has a GeForce GTX 980.
I’ve cross-posted this on the NVIDIA developer forums; here is the link for reference: