Signal 11 (SIGSEGV) crash using OpenTK

Hello

I have a 3D graphics Xamarin project, written in C# in Visual Studio 2019, that uses OpenTK 1.0 (Xamarin version 5.0.0.1905, OpenTK version 4.0.30319).
The app works perfectly well on some Android devices but crashes on others when the DrawArrays method is called with a texture applied to the surfaces. As all of the stack traces refer to ‘libGLESv2_adreno’, I’m guessing the affected devices all have Qualcomm Adreno GPUs.

Here is a typical stack trace:

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR)

backtrace:
  #00  pc 000000000001dc18  /system/lib64/libc.so (memcpy+232)
  #00  pc 00000000001d0d64  /vendor/lib64/egl/libGLESv2_adreno.so (EsxVertexArrayObject::UpdateInternalVbos(EsxDrawDescriptor const*, unsigned int, EsxAttributeDesc const*)+1644)
  #00  pc 00000000003374f4  /vendor/lib64/egl/libGLESv2_adreno.so (A5xVertexArrayObject::CalcVfdRegs(EsxDrawDescriptor const*, A5xVfdRegs*, int)+892)
  #00  pc 000000000034ff54  /vendor/lib64/egl/libGLESv2_adreno.so (A5xContext::ValidateState(EsxDrawDescriptor const*)+1212)
  #00  pc 000000000033c820  /vendor/lib64/egl/libGLESv2_adreno.so (A5xContext::HwValidateGfxState(EsxDrawDescriptor const*)+16)
  #00  pc 0000000000115178  /vendor/lib64/egl/libGLESv2_adreno.so (EsxContext::ValidateGfxState(EsxDrawDescriptor const*)+1968)
  #00  pc 00000000000fd240  /vendor/lib64/egl/libGLESv2_adreno.so (EsxContext::DrawArraysInstanced(EsxPrimType, int, unsigned int, unsigned int)+416)
  #00  pc 0000000000014f08  <anonymous>

And here are two code snippets that seem to be related to the problem. They have been extracted from the project in question, so some variables may need further explanation (please ask if so).

Firstly setting up the texture:

using OpenTK.Graphics.ES31;

    Bitmap b = null;
    if (modelTexture.ImageBitMap != null)
    {
        b = BitmapFactory.DecodeByteArray(modelTexture.ImageBitMap, 0, modelTexture.ImageBitMap.Length);
        int x = Convert.ToInt16(modelTexture.LeftCutoff * b.Width);
        int y = Convert.ToInt16(modelTexture.TopCutoff * b.Height);
        int w = b.Width - Convert.ToInt16(modelTexture.RightCutoff * b.Width) - Convert.ToInt16(modelTexture.LeftCutoff * b.Width);
        int h = b.Height - Convert.ToInt16(modelTexture.BottomCutoff * b.Height) - Convert.ToInt16(modelTexture.TopCutoff * b.Height);
        b = Bitmap.CreateBitmap(b, x, y, w, h);
    }

    int[] textures = new int[1];

    GL.GenTextures(1, textures);

    GL.BindTexture(TextureTarget.Texture2D, textures[0]);

    if (b != null)
    {
        //allocate memory on the graphics card for the texture.
        GLUtils.TexImage2D((int)All.Texture2D, 0, b, 0);
        GL.GenerateMipmap(TextureTarget.Texture2D);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)All.Nearest);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)All.NearestMipmapLinear);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)All.Repeat);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)All.Repeat);
        b.Recycle();
    }

    GL.BindTexture(TextureTarget.Texture2D, 0);

And secondly, drawing the surfaces:

    worldGLView.vertices = vertexLists[bufferIndex].ToArray();
    worldGLView.texture_coordinates = textureVertexLists[bufferIndex].ToArray();

    unsafe
    {
        fixed (Vector4* pvertices = worldGLView.vertices)
        {
            // Prepare the triangle coordinate data
            GL.VertexAttribPointer(worldGLView.tShaderProgram.mPositionHandle, Vector4.SizeInBytes / 4, VertexAttribPointerType.Float, false, 0, new IntPtr(pvertices));
        }

        fixed (Vector2* ptexturecoordinates = worldGLView.texture_coordinates)
        {
            // Prepare the triangle texture data
            GL.VertexAttribPointer(worldGLView.tShaderProgram.mTextureCoordinatesHandle, Vector2.SizeInBytes / 4, VertexAttribPointerType.Float, false, 0, new IntPtr(ptexturecoordinates));
        }
    }

    GL.DrawArrays(BeginMode.Triangles, 0, worldGLView.vertices.Length);

Can anyone suggest what might be causing the problem and how I can fix it?

Please let me know if more information is required.

Thanks in advance.

The most common cause of this type of crash is that you’re attempting to draw from a vertex attribute array that hasn’t been enabled. I see that you have glVertexAttribPointer calls, but I don’t see the corresponding glEnableVertexAttribArray calls.
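For reference, a minimal sketch of the call order the driver expects, per attribute (the handle and pointer names here are placeholders, not taken from your project):

```csharp
using OpenTK.Graphics.ES31;

// Assumed placeholders: programHandle, positionHandle, vertexPtr, vertexCount
GL.UseProgram(programHandle);

// The attribute must be enabled before the draw call that reads it
GL.EnableVertexAttribArray(positionHandle);
GL.VertexAttribPointer(positionHandle, 4, VertexAttribPointerType.Float, false, 0, vertexPtr);

GL.DrawArrays(BeginMode.Triangles, 0, vertexCount);

// Disable once finished so later draws don't read a stale pointer
GL.DisableVertexAttribArray(positionHandle);
```

A disabled attribute array makes the driver read from the attribute's constant value instead of your array, and on some drivers an enabled-but-stale pointer faults exactly as in your trace.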

Thanks very much for your quick response.

I am enabling (and disabling) the arrays, but outside the code fragments shown above:

    GL.UseProgram(worldGLView.tShaderProgram.mProgramHandle);

    // Enable a handle to the triangle vertices
    GL.EnableVertexAttribArray(worldGLView.tShaderProgram.mPositionHandle);

    // Enable a handle to the triangle textures
    GL.EnableVertexAttribArray(worldGLView.tShaderProgram.mTextureCoordinatesHandle);

    ///////////// code to draw arrays goes here 

    // Disable vertex array
    GL.DisableVertexAttribArray(worldGLView.tShaderProgram.mPositionHandle);

    // Disable texture array
    GL.DisableVertexAttribArray(worldGLView.tShaderProgram.mTextureCoordinatesHandle);
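One detail in the drawing snippet may also be worth checking: in C#, a pointer obtained in a `fixed` block is only pinned for the duration of that block, so by the time GL.DrawArrays runs, the garbage collector is free to move both arrays and the pointers handed to GL.VertexAttribPointer can dangle. That would match a crash inside the driver's memcpy when it copies client-side vertex data during the draw. A sketch of keeping the draw call inside the pinned scope, using the same variable names as the question (untested against the actual project):

```csharp
unsafe
{
    // Pin both arrays for the whole pointer-setup-plus-draw sequence
    fixed (Vector4* pvertices = worldGLView.vertices)
    fixed (Vector2* ptexturecoordinates = worldGLView.texture_coordinates)
    {
        GL.VertexAttribPointer(worldGLView.tShaderProgram.mPositionHandle, 4,
            VertexAttribPointerType.Float, false, 0, new IntPtr(pvertices));
        GL.VertexAttribPointer(worldGLView.tShaderProgram.mTextureCoordinatesHandle, 2,
            VertexAttribPointerType.Float, false, 0, new IntPtr(ptexturecoordinates));

        // Draw while both arrays are still pinned
        GL.DrawArrays(BeginMode.Triangles, 0, worldGLView.vertices.Length);
    }
}
```

Whether this is the cause would depend on GC timing, which could also explain why only some devices crash; uploading the data into a vertex buffer object with GL.BufferData would avoid pinning altogether.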