How do you interpolate OpenGL colors yourself?

It seems that my code is messing up a different texture I have for text in my application. Can you use more than one texture in an application?
However, my starting point now will be getting your 1D texture example working in C#. That will be a good place to begin.

Well, I was trying out the 1D texture example and just ran into this:

"Unsupported GLSL type layout" when I attempt to create the shader programs.

Otherwise everything was OK to this point!

I have the latest OpenGL drivers for the Intel HD 4000 on Windows: version 4.3, as determined by glGetString…
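For reference, this is roughly how that query can look from C#; the Gl.GetString method and StringName enum names here are assumptions about the wrapper I'm using, not its confirmed API:

    // Ask the driver which GL and GLSL versions it reports (wrapper method/enum names assumed).
    string glVersion = Gl.GetString(StringName.Version);                   // e.g. "4.3.0 - Build ..."
    string glslVersion = Gl.GetString(StringName.ShadingLanguageVersion);  // e.g. "4.30 - Build ..."
    Console.WriteLine("GL: " + glVersion + ", GLSL: " + glslVersion);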

Post the shader source code; otherwise we can only guess at the error.

It is identical to what you posted. Here it is:

    public static string VertexShader = @"
        #version 450 core

        in layout (location = 0) vec3 in_position;
        in layout (location = 1) float in_texcoord;
        out float texcoord;

        void main () {
            gl_Position = vec4(in_position, 1);
            texcoord = in_texcoord;
        }";

    public static string FragmentShader = @"
        #version 450 core

        in float texcoord;
        uniform layout (binding = 3) sampler1D tex1;
        out layout (location = 0) vec4 out_color;

        void main () {
            out_color = texture(tex1, texcoord);
        }";

So close… I also tried changing the version to 430, with no difference.

Hi,
Well, I've spent at least 20 hours on this, trying various examples from the web as well as yours, and I have had no luck. The closest I got is with your example, except for the error on creating the program. If I comment out the in layout (location) lines it will compile; otherwise it will not.
So if you can suggest something for this, I'm all ears. I'm not sure whether something is messed up in the libraries I use; the developer of the libraries is not available.
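For reference, this is roughly the form that does compile for me, with the layout qualifiers stripped (the locations are then assigned by the linker, and the wrapper binds the buffers to the attributes by name anyway):

    // Sketch of the declarations without layout qualifiers (the form that compiles here).
    in vec3 in_position;     // location assigned by the linker or bound from the application
    in float in_texcoord;
    uniform sampler1D tex1;  // texture unit set from the application instead of layout (binding = 3)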

However, I do have a 2D texture working the way I want. It colors the peaks in the 3D spectrum correctly. The only issue is that I have to send the texture coordinates for the color, which means sending all of the coordinates over, about 200,000 of them every few milliseconds. It works fine, but it is very inefficient. It does have one advantage: I can change the colors of the graph just by supplying a different bitmap.
I am wondering, however, if this is really necessary. The vertex shader already gets a vec3 with my vertices for the graph. The vertexPosition.y value is all I need to plot the data, since the x and z coordinates are always the same. Can I not just pass this to the fragment shader without sending all the vec2 coordinates over?

So here is a snippet of the code.

            Gl.UseProgram(plottingProgram);
            Gl.BindTexture(myTexture); //this is the 2D texture

            plottingProgram["model_matrix"].SetValue(Matrix4.CreateRotationY(yangle) * Matrix4.CreateRotationX(xangle));

            UpdateSpectrum();

            spectrumVBO.BufferSubData(vertexBuffer); // vertexBuffer is an array with x,y,z data; x is always the same
            myUV.BufferSubData(colorBuffer); // colorBuffer is a vector2 with x always the same and y equal to the vertexBuffer.y values

            Gl.BindBufferToShaderAttribute(spectrumVBO, plottingProgram, "vertexPosition");
            Gl.BindBufferToShaderAttribute(myUV, plottingProgram, "vertexUV");
            
            Gl.BindBuffer(spectrumIndicesVBO);
            Gl.DrawElements(BeginMode.TriangleStrip, spectrumIndicesVBO.Count, DrawElementsType.UnsignedInt, IntPtr.Zero);


And here are the vertex and fragment shaders:

        public static string VertexShader = @"
        #version 130
        in vec3 vertexPosition;
        in vec2 vertexUV;
        out vec2 uv;
        

        uniform mat4 projection_matrix;
        uniform mat4 view_matrix;
        uniform mat4 model_matrix;

        void main(void)
        {
            uv = vertexUV;
            gl_Position = projection_matrix * view_matrix * model_matrix * vec4(vertexPosition, 1);
            
        }
        ";

        public static string FragmentShader = @"
        #version 130

        uniform sampler2D texture;
        in vec2 uv;

        out vec4 fragment;

        void main(void)
        {
            fragment = texture2D(texture,uv);
        }
        ";

So is it possible not to send the vec2 containing the texture coordinates and instead just pass something from the vertex shader to the fragment shader? I could even derive the coordinates from vertexPosition.x and vertexPosition.y. I tried a bunch of things and it didn't work.
Thanks a bunch for your help…

"#version 450 core"

"I also tried changing the version to 430 - no difference."

"#version 130"

Why so many different versions? At some point you should settle on supporting a single version.

What OpenGL version are you running? I'm not totally sure, but I doubt the Intel HD 4000 can run OpenGL 4.5 (at least on a machine I have, I'm stuck with OpenGL 3.3, though that is under Linux).

Last thing: do things simply. Start again with a simple program, make it run, and then insert it into your bigger project.

Hi,
Still learning. I thought the #version in the shader was specific to each example and was just trying the code as provided.
I did start simple, just trying what John provided, but the shaders would not compile and the error message was "Unsupported GLSL type layout". There is NO reference to this error anywhere. Sigh…
Anyhow, see my next answer. I solved my requirements with a 2D texture, with no need to send texture coordinates from my application.

I finally have a solution for providing colors to my height map in C#. The code is quite simple: I use the vertex shader to send the texture coordinates to the fragment shader, so I no longer need to send 200k texture coordinates. Here is the code:


            Gl.UseProgram(plottingProgram);
            Gl.BindTexture(myTexture); //this is the 2D texture
 
            plottingProgram["model_matrix"].SetValue(Matrix4.CreateRotationY(yangle) * Matrix4.CreateRotationX(xangle));
 
            UpdateSpectrum();
 
            spectrumVBO.BufferSubData(vertexBuffer); // vertexBuffer is an array with x,y,z data; x is always the same

            // This is no longer needed:
            // myUV.BufferSubData(colorBuffer); // colorBuffer is a vector2 with x always the same and y equal to the vertexBuffer.y values

            Gl.BindBufferToShaderAttribute(spectrumVBO, plottingProgram, "vertexPosition");

            // This is no longer needed:
            // Gl.BindBufferToShaderAttribute(myUV, plottingProgram, "vertexUV");
 
            Gl.BindBuffer(spectrumIndicesVBO);
            Gl.DrawElements(BeginMode.TriangleStrip, spectrumIndicesVBO.Count, DrawElementsType.UnsignedInt, IntPtr.Zero);

And now the shaders:


        public static string VertexShader = @"
        #version 130

        in vec3 vertexPosition;
        
        out vec2 textureCoordinate;
       
        uniform mat4 projection_matrix;
        uniform mat4 view_matrix;
        uniform mat4 model_matrix;

        void main(void)
        {
            textureCoordinate = vertexPosition.xy;
            gl_Position = projection_matrix * view_matrix * model_matrix * vec4(vertexPosition, 1);           
        }
        ";

        public static string FragmentShader = @"
        #version 130

        uniform sampler2D texture;

        in vec2 textureCoordinate;

        out vec4 fragment;

        void main(void)
        {
            fragment = texture2D(texture,textureCoordinate);
        }
        ";

This works really nicely, with the added benefit that I can change the colors drawn just by changing the bitmap.
The whole program now draws 200,000 vertices in one OpenGL call. If I turn off vertical sync, it takes roughly 3 ms to plot and render a frame.
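For reference, that per-frame timing can be taken with a System.Diagnostics.Stopwatch around the draw; a rough sketch, with placeholder names for the work done inside the frame:

    var frameTimer = System.Diagnostics.Stopwatch.StartNew();

    UpdateSpectrum();   // update the vertex data for this frame
    DrawSpectrum();     // placeholder for the BufferSubData/DrawElements calls shown above

    frameTimer.Stop();
    Console.WriteLine("Frame took " + frameTimer.Elapsed.TotalMilliseconds + " ms");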

Many thanks to everyone who has helped out here, John, GClements and Silence, first for pushing me towards modern OpenGL and then for guiding me in the right direction.
Best regards, Tom

Maybe it's not done really "correctly" in my example, but here you can find another example:
https://www.khronos.org/opengl/wiki/Layout_Qualifier_(GLSL)#Binding_points

Maybe you have to put "layout (… some qualifiers etc. …)" right up front (before "uniform").

"layout" is not a type; it's a keyword that lets you provide some specifications, like the (texture) "binding" from which to sample.
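Something like this, for instance (just a sketch, reusing the names and binding value from your shader):

    // layout qualifier placed in front of the storage qualifier
    layout (location = 0) in vec3 in_position;
    layout (location = 1) in float in_texcoord;
    layout (binding = 3) uniform sampler1D tex1;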