I've read through the docs and forums, but I still have doubts about the relationship between shaders and framebuffers.
The question is: do the shaders only consume the framebuffer, or do they also change it at rendering time?
Starting from the beginning, what I'm looking for is simple: send data to the shaders, run some computations, draw pixels with some colors, and then read the result back through the WebGL API with readPixels (or any other way).
Quite easy, or so it seemed.
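To make the setup concrete, here is roughly what I mean (a simplified sketch, not my actual code: the already-compiled `program` and the attribute name `a_position` are placeholders):

```javascript
// Sketch of the test flow: upload vertex coordinates, draw into the
// default framebuffer (no FBO is ever bound), then read the pixels back.
function runTest(gl, program) {
  // Send the vertex coordinates to the shaders via an ARRAY_BUFFER.
  const coords = new Float32Array([-1, -1, 1, -1, 0, 1]); // one triangle
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, coords, gl.STATIC_DRAW);

  gl.useProgram(program);
  const loc = gl.getAttribLocation(program, "a_position"); // placeholder name
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  // No framebuffer is bound, so this draws into the default one (the canvas).
  gl.drawArrays(gl.TRIANGLES, 0, 3);

  // Read the result back from whatever framebuffer is currently bound.
  const pixels = new Uint8Array(gl.drawingBufferWidth * gl.drawingBufferHeight * 4);
  gl.readPixels(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight,
                gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return pixels;
}

// The symptom I'm describing: every byte that comes back is zero.
function isAllZeros(pixels) {
  return pixels.every((b) => b === 0);
}
```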
My tests always return only zero values from readPixels. Checking the docs, readPixels seems to read data only from the default framebuffer, or, if a non-default one is bound, from its GL_COLOR_ATTACHMENT0.
In my test I use an ArrayBuffer to send the vertex coordinates to the shaders; I don't bind any framebuffer, so the default one is used.
The shaders just take the vertices, do some calculations, and then set the pixel positions and colors.
The canvas is drawn correctly.
At this point I have the canvas and I can see the pixels (so they are there); I call readPixels and… nothing, only zeros.
Since the docs say readPixels only works on a framebuffer, I was wondering: is the default framebuffer filled by the rendering to the canvas (by the shaders), or does readPixels expect me to fill data into it myself? The color buffer of the default framebuffer should contain the canvas data I've generated, shouldn't it?
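For comparison, here is how I understand the non-default-framebuffer case from the docs (again just a sketch under my assumptions, not code I've verified): you render into a texture attached at COLOR_ATTACHMENT0, and readPixels reads from that attachment while the framebuffer is bound.

```javascript
// Sketch: create an FBO with a texture at COLOR_ATTACHMENT0, draw into it,
// then readPixels reads from that attachment (not from the canvas).
function readFromFbo(gl, width, height) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);

  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);

  // ... draw here while the FBO is bound ...

  const out = new Uint8Array(rgbaByteLength(width, height));
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, out);
  gl.bindFramebuffer(gl.FRAMEBUFFER, null); // back to the default framebuffer
  return out;
}

// Size of an RGBA/UNSIGNED_BYTE readback: 4 bytes per pixel.
function rgbaByteLength(width, height) {
  return width * height * 4;
}
```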