In OpenGL Reference Manual, it is written that
“Not all widths can be supported when line antialiasing is enabled. If an unsupported width is requested, the nearest supported width is used. Only width 1 is guaranteed to be supported; others depend on the implementation.”.
How can I see if a line width can be supported or not?
GL_ALIASED_LINE_WIDTH_RANGE, GL_SMOOTH_LINE_WIDTH_RANGE, GL_SMOOTH_LINE_WIDTH_GRANULARITY.
I can get values like 1.0–7.5 (range) and 0.125 (granularity), but what do these values mean? I can request very large line widths and they are drawn as is. Also, line widths are rounded to integer values.
I need a relation between the line width ranges and granularity values.
All line widths of the form antialiasedLineWidthRange[0] + i * granularity (for integer i >= 0) that fall in the closed interval [antialiasedLineWidthRange[0], antialiasedLineWidthRange[1]] are supported for antialiased lines and are distinguishable in the rasterization.
Aliased lines are only rasterized with integer widths. The granularity is normally smaller than 1.0, which means there are more distinguishable line widths available with antialiased lines.
Note that the minimum antialiased line width doesn’t need to be 1.0; it can be thinner.
Some lame implementations report a min and max of 1.0, i.e. they support no line width other than 1.0, which is the minimum the spec you cited requires.
Let’s say the aliased range is 1.0 to 10.0; then aliased line widths of 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 are rasterized.
Let’s say the antialiased range goes from 0.5 to 10.0 and the granularity is 0.125; then line widths
0.5, 0.625, 0.75, 0.875, …, 10.0 are possible.
Some implementations lie about the min/max/granularity, though; for example, the GF2MX/GF4MX under Mac OS X. So don’t depend on the reported values.