I want to make a game splash screen, one of those screens that appears when you start a game and the company logo shows up, etc. I’m using SDL and OpenGL 3, if that’s important.
One of my ideas was to simply draw an image the size of the screen, since I’m running the game in fullscreen mode, so I searched around and found out about “BlitFramebuffer”:
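Roughly, the blit I have looks like this (a sketch only; splashFbo, imgW/imgH and screenW/screenH are placeholder names):

// splashFbo has the splash texture attached as its color attachment
glBindFramebuffer(GL_READ_FRAMEBUFFER, splashFbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);        // 0 = the window framebuffer
glBlitFramebuffer(0, 0, imgW, imgH,               // source rectangle
                  0, 0, screenW, screenH,         // destination rectangle (stretched)
                  GL_COLOR_BUFFER_BIT, GL_LINEAR);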
It works, despite the image being drawn upside down (simple to solve: just flip the image in an image editor) and the alpha being drawn as black for some reason, something I couldn’t find a fix for. But OK, I guess.
The next step would be a fading effect, which is where I’m stuck, because I couldn’t find anything about how to make a fading effect using “BlitFramebuffer”…
There might be a way to achieve this with glBlitFramebuffer, but it is perhaps not the right tool for this job.
Draw your splash screen image with a rectangle mesh that fills the screen and the image applied as a texture. Then you can use a fragment shader that manipulates the image colors (or the alpha value, if you use blending) to achieve the fading effect (I assume you are using shaders and not the ancient fixed-function pipeline functionality).
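For illustration, a minimal sketch of such a fragment shader (uSplash, uFade and vTexCoord are placeholder names), written as a C++ raw string ready for glShaderSource:

const char* fadeFragSrc = R"GLSL(
#version 330 core
in vec2 vTexCoord;
out vec4 fragColor;
uniform sampler2D uSplash;  // the splash screen texture
uniform float uFade;        // 1.0 = fully visible, 0.0 = fully faded out
void main()
{
    vec4 c = texture(uSplash, vTexCoord);
    fragColor = vec4(c.rgb, c.a * uFade);  // fade via alpha; needs blending enabled
}
)GLSL";

Each frame you would upload a new fade value with glUniform1f and draw the quad with blending enabled (glEnable(GL_BLEND) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)). Alternatively, multiply c.rgb by uFade instead to fade towards black without any blending.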
“Per the spec”? If that means “according to the standard”, then I don’t care about the standard.
I guess I can handle a quad, if it’s really necessary, but what is a fragment pipeline? If it’s a shader, I already said that I’m not going to use shaders, unless it’s the only way to do this.
I read somewhere that BlitFramebuffer “renders the texture directly to the screen/window”, “copies the pixels directly to the screen/window”, or something like that. But I could copy it somewhere else, edit it, and then render it, and do that in a loop to achieve the fading effect; of course, as always, the person didn’t explain how to do it…
The people implementing OpenGL care about the standard. It’s safe to assume that an implementation is going to closely follow the specification. So if the specification says (or implies) that you can’t do something, then it’s likely that you can’t actually do that with an actual implementation.
The set of operations which are performed to arrive at the actual colour value written to the framebuffer for each pixel. In legacy OpenGL (before shaders), this is a fixed set of operations, some of which can be disabled and most of which can be controlled by setting parameters. In modern OpenGL this has largely been replaced by shaders, although some fixed steps (e.g. blending) remain. But in the OpenGL 3.1+ core profile, a shader is mandatory for all draw calls (glDraw*). In versions prior to 3.1 and in the compatibility profile, shaders are optional; in that case, you can achieve the desired result simply using glTexEnv or blending.
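To make that last point concrete, here is a rough sketch of a shader-free fade in the compatibility profile (splashTex and fade are placeholder names; fade runs from 1.0 down to 0.0):

// Shader-free fade (compatibility profile / OpenGL < 3.1 only)
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, splashTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // texel * current color
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(1.0f, 1.0f, 1.0f, fade);       // alpha modulates the whole image
glBegin(GL_QUADS);                       // fullscreen quad in clip space
glTexCoord2f(0.f, 0.f); glVertex2f(-1.f, -1.f);
glTexCoord2f(1.f, 0.f); glVertex2f( 1.f, -1.f);
glTexCoord2f(1.f, 1.f); glVertex2f( 1.f,  1.f);
glTexCoord2f(0.f, 1.f); glVertex2f(-1.f,  1.f);
glEnd();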
Apparently I can read pixels from the framebuffer. I searched for some code and found this, but it doesn’t work:
std::vector<GLubyte> pixels(width * height * 4);  // flat RGBA buffer (needs <vector>); a variable-length 2D array is not valid C++
glPixelStorei(GL_PACK_ALIGNMENT, 1);              // rows are tightly packed
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
My plan was to read the pixels, modify them, somehow convert them back into an image or texture, and then draw them again with glBlitFramebuffer()…
std::cout << "r: " << static_cast< int >(pixels[0]) << ‘\n’;
std::cout << "g: " << static_cast< int >(pixels[1]) << ‘\n’;
std::cout << "b: " << static_cast< int >(pixels[2]) << ‘\n’;
std::cout << "a: " << static_cast< int >(pixels[3]) << std::endl;
x = 480, y = 377 is a white pixel, and this prints “255, 255, 255, 255”. With x = 0, y = 0 (a transparent pixel in the image) it prints “0, 0, 0, 0”, which should be a transparent pixel, as in the texture, but for some reason it is rendered as black. Anyway, all I need to figure out now is how to read the whole screen, edit each pixel, and draw it again…
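Roughly, the loop I’m aiming for looks like this (only a sketch of the idea; splashTex is the texture attached to splashFbo, and fade is the current fade factor, all placeholder names):

// One-time readback of the splash image (GPU -> CPU):
std::vector<GLubyte> original(width * height * 4);
glBindFramebuffer(GL_READ_FRAMEBUFFER, splashFbo);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, original.data());

// Each frame: darken a copy by the current fade factor and re-upload it.
std::vector<GLubyte> frame = original;
for (std::size_t i = 0; i < frame.size(); i += 4)
{
    frame[i + 0] = static_cast<GLubyte>(frame[i + 0] * fade);  // r
    frame[i + 1] = static_cast<GLubyte>(frame[i + 1] * fade);  // g
    frame[i + 2] = static_cast<GLubyte>(frame[i + 2] * fade);  // b
}
glBindTexture(GL_TEXTURE_2D, splashTex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RGBA, GL_UNSIGNED_BYTE, frame.data());
// ...then glBlitFramebuffer to the window as before. Note this round-trips
// through the CPU every frame, so it is much slower than blending or a shader.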
Edit 2: I managed to change the image, but I still can’t fix the problem of the alpha being rendered as black that I mentioned at the beginning…