Can anyone please advise me on how to apply 2x, 3x, or 4x nearest-neighbour filtering (or something else?) to make it pixel-sharp?
You need to use GL_NEAREST and make sure your fragments fall on whole pixel coordinates. To do that, translate everything by 0.375f in the x and y directions. Obviously, you’ll need to use a pixel-perfect orthographic projection.
If you are still getting blur, then the problem is something else entirely. Are you using FBOs?
GL_NEAREST should do what you want.
I would check that you have the correct texture bound when you set the texture settings.
I would also check for OpenGL errors with a glGetError() somewhere in the code.
I bind my texture right before I set the texture settings (and then call glTexImage2D), and all of them have the same filter setting. I know the binding works because I get white blocks instead of the images when the filter is set to GL_NEAREST_MIPMAP_LINEAR and the like, plus at 640x480 no blur is visible.
I checked with glGetError() now and no error is reported when loading or rendering the textures, nor when initializing the projection and GL state. I also translated everything by 0.375f as suggested. It didn't work.
I must be doing something silly somewhere. Could alpha textures be the problem? I load my textures as transparent PNGs (with power-of-two width and height). I call
Well, when I print-screen the program and paste it into a graphics program, it turns up pixel-sharp. It's only blurry while the program is running. When I up the resolution to 640x480, it's as clear as day. Maybe I should write a screenshot to file from inside my app instead of using the keyboard screen capture? Would that capture the blurriness?
I played around with the NVIDIA control panel settings, especially forcing mipmaps, but to no avail. I tried a game engine called AGS: a game below 320x240 is also blurry there, but you can turn nearest-neighbour filtering on (2x, 3x, or 4x), and that works perfectly on my PC, so I assume it's not a card problem but something in my code. I've mailed the creator (though I think he uses DirectX).
Anyway, if I ever turn up an answer I'll report it here. Thanks for your responses.
The only thing I can think of is that you might be running some sort of “override” filter effect from the driver. Nvidia and ATI have previously supplied options in their drivers to adjust the final render with post FX.
OK, do you set up a fullscreen window at 320x240 resolution? If so, be advised that most modern video cards and monitors don't support anything lower than 640x480 and will upscale.
Your best bet is to use 640x480 and adjust your projection matrix to account for that (trivial). Even that will entail some amount of blurring on LCD monitors. Best solution? Keep the current resolution and make a “best effort” fit to the aspect ratio.
Yes, I run my app fullscreen at 320x240, and at first I also tried 400x300, which is the actual resolution I'm going for. I use OpenGL in combination with SDL; I don't think SDL makes the difference, but I'll take a look in that direction.
I have changed to 640x480 and then scaled the graphics, which works, but I want a generic solution that behaves consistently on all systems. There must be a solution to my problem. Hopefully the maker of AGS gets back to me. Someone must have run into this before, or does nobody else use OpenGL for such small-resolution apps?
bdude, did you actually read what Stephen A just said?
This blurring is not caused by OpenGL, but by your LCD monitor.
To avoid it, either run your program at the native LCD resolution, or never use fullscreen, or play with your monitor settings to disable blurring.
In the general case, the best solution is the first.
You can do your rendering on a 400x300 viewport inside your bigger fullscreen window, then glCopyTexSubImage2D it to a texture, then render that texture as a fullscreen quad with GL_NEAREST filtering. Guaranteed without blur. Ideally the fullscreen window size should be an integer multiple of the width and height, i.e. for 400x300, 800x600 will do perfectly, and 1200x900 too (but as that is non-standard, you will have to add black borders to fill up to 1280x1024).
Also, beware of aspect ratio :
400x300 is 4:3, only seen on CRTs, AFAIK.
LCDs are often 5:4 (1280x1024).
And more and more monitors are widescreen nowadays, sometimes with 16:9 or 16:10 aspect ratios.
Yes, I did read it, and I do understand it. But I'm running the program on my old 17-inch CRT screen (I even disable the LCD screen; my bad for not mentioning that clearly). I understand completely that LCD monitors and new graphics cards upscale resolutions smaller than 640x480. I just figured that if someone else's 320x240 program runs unblurred on my 17-inch, there must be a code-wise solution, since that disproves that the monitor is responsible, unless the other program uses tricks like you mention (note that his program also blurs if I turn his filters off).
Thanks for your comments, I will do as you say.
(I did attempt to play with the monitor settings to disable the blurring, and will continue to do so, but no success yet.)
In my NVIDIA driver settings there is an option that "doubles lines for low-resolution modes", as was done on 8/16-bit consoles and old-school 320x200 MCGA. Maybe that is the effect you are looking for.
Just out of curiosity, how do you know that someone else's program is running the monitor at 320x240? (i.e. how do you know they are actually switching the monitor to this mode and not rendering to a small target and upscaling with nearest filtering?)
Thanks zbuffer, I’m at work now but will try this when at home.
sqrt: I have no idea; that's why I've tried to contact him. Hopefully he will reveal a bit about his process. All I know is that I can create a 320x240 game in his engine and it blurs. Once the game is compiled and distributed, a settings file goes with it where windowed mode and things like that can be set. The filtering options are there too, and setting them fixes the blurriness. I have no idea what he does at that point.
Thanks for all of the suggestions, you’ve given me a lot of insight into this.