Occasional OpenGL errors under Windows OS

I’ve been working on a small 3D graphics library on top of OpenGL. I developed under Linux for some time but then had to move to Windows, where I’ve run into a few problems. One of them is a runtime error from the glBindTextures function, even though exactly the same code works correctly under Linux.

General info:
Language: Python 3.7.4
OS: Windows 10 (v10.0.18363.535)
Python interpreter: Anaconda
OpenGL version: 4.6

This is the code sample where the error happens:

    glUniform1i(glGetUniformLocation(main_context.shader_texture.get_shader(), "tex_sampler"), 0)
    for k in range(len(main_context.lights)):
        glUniform1i(glGetUniformLocation(main_context.shader_texture.get_shader(),
                                         "shadowMap[" + str(k) + "]"), k + 1)
        glBindTextures(k + 1, k + 2, main_context.lights[k].depth_map)

The “main_context.shader_texture” shader code (fragment shader; only the texture-related part is shown):

    const int MAX_LIGHTS = 10;

    in vec2 textures;
    in vec3 fragment_normal;
    in vec3 fragment_pos;
    in vec4 FragPosLightSpace[MAX_LIGHTS];

    out vec4 color;
    uniform sampler2D tex_sampler;
    uniform sampler2D shadowMap[MAX_LIGHTS];

Code explanation:
I’m attempting to implement shadows. For this, I use shadow maps, which are depth textures. In this code, I’m trying to pass the created depth (shadow) maps to the shader before drawing objects. Because a user may have multiple light sources, I use an array of sampler uniforms. I assign the texture units with glUniform1i and then bind the depth maps one by one with glBindTextures.
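For reference, the unit assignment my loop performs (unit 0 reserved for tex_sampler, unit k + 1 for shadowMap[k]) can be sketched as a small pure-Python helper; shadow_map_units is a hypothetical name for illustration, not part of my library:

```python
def shadow_map_units(num_lights):
    """Map each shadowMap[k] uniform to texture unit k + 1.

    Unit 0 stays reserved for the base "tex_sampler" texture.
    (Hypothetical helper mirroring the binding loop above.)
    """
    return [("shadowMap[%d]" % k, k + 1) for k in range(num_lights)]

# Three lights: shadowMap[0] -> unit 1, shadowMap[1] -> unit 2, ...
print(shadow_map_units(3))
```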

Problem:
This code crashes with this error output:

Traceback (most recent call last):
  File "C:\Users\artem\Anaconda3\lib\site-packages\OpenGL\latebind.py", line 41, in __call__
    return self._finalCall( *args, **named )
TypeError: 'NoneType' object is not callable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "\Desktop/TRON/test_code.py", line 21, in <module>
    bla.main_loop()
  File "\Desktop\TRON\TRON.py", line 976, in main_loop
    window.draw()
  File "\Desktop\TRON\TRON.py", line 945, in draw
    i.real_draw()
  File "\Desktop\TRON\TRON.py", line 676, in real_draw
    glBindTextures(k + 1, k + 2, main_context.lights[k].depth_map)
  File "\Anaconda3\lib\site-packages\OpenGL\latebind.py", line 45, in __call__
    return self._finalCall( *args, **named )
  File "\Anaconda3\lib\site-packages\OpenGL\wrapper.py", line 677, in wrapperCall
    pyArgs = tuple( calculate_pyArgs( args ))
  File "\Anaconda3\lib\site-packages\OpenGL\wrapper.py", line 438, in calculate_pyArgs
    yield converter(args[index], self, args)
  File "\Anaconda3\lib\site-packages\OpenGL\converters.py", line 135, in __call__
    return self.function( incoming )
  File "\Anaconda3\lib\site-packages\OpenGL\arrays\arraydatatype.py", line 149, in asArray
    return cls.getHandler(value).asArray( value, typeCode or cls.typeConstant )
  File "\Anaconda3\lib\site-packages\OpenGL\arrays\arraydatatype.py", line 53, in __call__
    typ.__module__, type.__name__, repr(value)[:50]
TypeError: ('No array-type handler for type numpy.type (value: 6) registered', <OpenGL.converters.CallFuncPyConverter object at 0x000001717E1BAC08>)

Note:
Some code, like shader binding, is not shown.
File locations are shortened

That call doesn’t make sense. glBindTextures(first, count, textures) binds count textures to consecutive units starting at unit first, so the second argument should probably be 1, and the third argument should be an array of texture names (the wrapper code might convert a scalar to a single-element array automatically, but it’s better not to rely on that).

Also: have you checked that all of the main_context.lights[k].depth_map values are valid?
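In PyOpenGL that shape would look roughly like the sketch below. The texture ids are made up for illustration, and the glBindTextures call itself is commented out because it needs a live GL context:

```python
# Hypothetical texture names, e.g. values returned by glGenTextures
depth_maps = [5, 6, 7]

# glBindTextures(first, count, textures) binds `count` textures from the
# array `textures` to consecutive units starting at `first`; the third
# argument must be an array of texture names, not a bare integer.
first_unit = 1                  # unit 0 is kept for tex_sampler
count = len(depth_maps)
# glBindTextures(first_unit, count, depth_maps)  # needs a current GL context
```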

I’ve checked the docs for glBindTextures - and… Well, yes - my code really didn’t make any sense. However, for some reason, it worked under Linux. And I think I know why.

If we assume that, on each iteration, glBindTextures simply reused the one texture I passed (main_context.lights[k].depth_map) for every slot in its count, then this is probably what happened:

This is how the array of textures looked after each call of glBindTextures:

-1 -1 -1 -1 -1 -1 -1 -1 -1 -1
 1  1 -1 -1 -1 -1 -1 -1 -1 -1
 1  2  2  2 -1 -1 -1 -1 -1 -1
 1  2  3  3  3  3 -1 -1 -1 -1

-1  - no texture specified
1 - first texture (shadow map for first light source)
2 - second texture
...

(example given for three light sources)

So some of the unused texture positions in the array were also filled. However, because I was using only up to 3 lights in my tests, the glBindTextures function never attempted to address an element outside the texture array, so everything worked ‘correctly’.
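This hypothesized behavior (each call writing count = k + 2 copies of the single texture it was given, starting at unit first = k + 1) can be replayed in plain Python. This is only a model of my guess above, not actual driver code:

```python
def simulate_buggy_binds(num_lights, max_units=10):
    """Replay the guessed Linux behavior of my original calls."""
    # -1 marks "no texture specified"; index i stands for texture unit i + 1
    units = [-1] * max_units
    for k in range(num_lights):
        first, count = k + 1, k + 2   # the arguments I was passing
        tex = k + 1                   # the k-th depth map, duplicated
        for i in range(count):
            units[first - 1 + i] = tex  # fill `count` consecutive units
    return units

print(simulate_buggy_binds(3))
# [1, 2, 3, 3, 3, 3, -1, -1, -1, -1]
```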

As for Windows -

Your suggested rewrite of the binding call:
glBindTextures(k + 1, 1, main_context.lights[k].depth_map)
gave the same error.

And, yes, I’ve checked texture maps - they are correct.

Found solution:

Yes, the third argument had to be an actual array.

I rewrote the code to the following:

    glUniform1i(glGetUniformLocation(main_context.shader_texture.get_shader(), "tex_sampler"), 0)
    for k in range(len(main_context.lights)):
        glUniform1i(glGetUniformLocation(main_context.shader_texture.get_shader(),
                                         "shadowMap[" + str(k) + "]"), k + 1)
    textures_array = []
    for i in main_context.lights:
        textures_array.append(i.depth_map)
    glBindTextures(1, len(main_context.lights), textures_array)

And everything worked fine. Thank you so much for your suggestions.

As I wrote before, it seems that under Linux the single texture I passed was simply duplicated to fill the array; the same didn’t happen under Windows.
