Originally posted by Korval:
[quote][quote]shaders would be so much more powerful and interesting if vertex shaders could sample textures as fluidly as pixel shaders… why is this not possible?[/quote]
For the same reason that the initial vertex and fragment shaders couldn't handle conditional branching, or only had a few instructions, or had limited opcodes, or why we don't have primitive programs: because the hardware isn't done yet.
Texture units were never meant to be accessed from vertex programs, so building that hardware is non-trivial. And the initial implementations of it will be slow. Just accept that what you're trying to do won't be possible for the time being. Spend the time using the functionality that is there rather than pining for functionality that will eventually be there.[/quote]

Yes, naturally… I'm still interested in any best-guess projections, though.
Edit: all I was really saying is that modern cards appear to have eight or more pixel shader units running in parallel and about three vertex shader units, as I read somewhere. If eight pixel shaders can be poking at the same texture at the same time, it seems the vertex shaders should be able to do so as well relatively easily. I wouldn't even care if the texture had to be duplicated so it could sit closer to the vertex shader, assuming of course the vertex and pixel shaders are poking at the same textures.
The thing for me, though, is that I always assumed this was possible. Even though I make it a point not to get bogged down working with things like shaders, and certainly not to depend on them, I have given some thought to different shaders in the sense of 'I could do this if I wanted to', and I never ruled out sampling textures from vertex shaders. Anyhow, silver lining: that functionality, it seems, will get here soon enough, so all is not lost.
I'm actually fairly relieved. I was looking at shelling out around $800–1000; now I will probably wait until this functionality is up to snuff so I can get everything I want in one swoop, rather than thinking about cannibalizing perfectly good setups. I wasn't ready to make the jump, and now I don't have a good enough reason to, so I'm happy with that in the end.
For the application I have immediately in mind, I will probably just settle for a slightly cheesier effect, or lower performance: do the filtering myself on the CPU and settle on one float per vertex, which I will probably try to find some way to stick in the w component. I will have to do the rest in the pixel shader, which is not good for my overall plans, but it should produce cleaner images.
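For reference, a minimal sketch of what recovering that per-vertex float might look like, assuming the CPU-computed value is smuggled in through gl_Vertex.w (the varying name cloudValue is just a placeholder):

[code]
// Hypothetical GLSL vertex shader: the CPU packs one precomputed
// float per vertex into position.w; recover it, forward it to the
// pixel shader, and restore w to 1.0 for the transform.
varying float cloudValue;

void main()
{
    cloudValue = gl_Vertex.w;              // CPU-packed per-vertex float
    vec4 pos = vec4(gl_Vertex.xyz, 1.0);   // rebuild a proper position
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}
[/code]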
I was basically planning on doing a two-pass cloud rendering algorithm in the immediate time frame. The geometry is basically a central tessellated plane. On the first pass, the vertex program would grab a couple of displacement offsets from a texture, then take another couple of textures, get the width of the cloud at that vertex along the positive and negative world axes, and interpolate between them relative to the camera to get the alpha blending component of the vertex.
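Just to make the idea concrete, here is a rough sketch of what that first-pass vertex program might look like if vertex texture lookups worked; the sampler names and the camera blend factor are made up, and this won't run on current hardware:

[code]
// Hypothetical vertex shader for the vertex-texture-fetch version.
// Requires texture sampling in the vertex stage, which is exactly
// what current hardware can't do. All names are placeholders.
uniform sampler2D displaceMap;  // vertical displacement offsets
uniform sampler2D widthPosMap;  // cloud width along the +axis
uniform sampler2D widthNegMap;  // cloud width along the -axis
uniform vec3 viewDir;           // normalized world-space view direction

varying float alpha;

void main()
{
    vec2 uv = gl_MultiTexCoord0.xy;

    // displace the tessellated plane
    vec4 pos = gl_Vertex;
    pos.y += texture2D(displaceMap, uv).r;

    // blend the two width samples by how the camera faces the axis
    float wPos = texture2D(widthPosMap, uv).r;
    float wNeg = texture2D(widthNegMap, uv).r;
    float t = 0.5 + 0.5 * viewDir.y;   // placeholder blend factor
    alpha = mix(wNeg, wPos, t);

    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}
[/code]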
Now I will probably do the first displacement on the CPU instead, settle for clouds that are symmetrically fluffed up and down (though to different scales), look the densities up in a pixel program, and do the alpha per-pixel there, which will probably look a lot better than vertex colouring anyhow.
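The per-pixel fallback is simple enough; something like this, assuming a single-channel density texture (densityMap is a placeholder name):

[code]
// Minimal fragment shader sketch for the fallback path: look the
// cloud density up per pixel and use it directly as the alpha.
uniform sampler2D densityMap;

void main()
{
    float density = texture2D(densityMap, gl_TexCoord[0].xy).r;
    gl_FragColor = vec4(1.0, 1.0, 1.0, density);
}
[/code]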
The vertex program might also be responsible for slowly looping the texture coordinates to create a cheesy effect of clouds crawling through the sky.
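The crawling effect would just be a time-based UV offset in the vertex program, something like this (time and windDir being assumed application-supplied uniforms):

[code]
// Sketch: drift the cloud texture coordinates over time so the
// clouds appear to crawl across the sky. "time" and "windDir"
// are assumed uniforms fed in by the application.
uniform float time;
uniform vec2 windDir;   // UV offset per second

void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0 + vec4(windDir * time, 0.0, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
[/code]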
If I could do the texture lookup in the vertex shader, though, I could potentially animate the actual form of the clouds slowly by uploading a few lines at a time to the cloud textures. Presumably the animation would be subtle enough that the moving scan line would not be noticeable. I can't do that yet, though, because the offsets are done on the CPU for now. Not that I know of any great way of producing cloud animations (offline, more than likely).
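On the host side, the scan-line upload would just be glTexSubImage2D a few rows at a time; a rough C sketch, where the texture size, row count, and names are all assumptions:

[code]
/* Sketch: stream a few scanlines of the next cloud frame into the
   density texture each frame, so the change is spread over many
   frames. TEX_SIZE, ROWS_PER_FRAME, and nextFrame are placeholders. */
#include <GL/gl.h>

#define TEX_SIZE       256
#define ROWS_PER_FRAME 2

void stream_cloud_rows(GLuint cloudTex, const GLubyte *nextFrame, int *row)
{
    glBindTexture(GL_TEXTURE_2D, cloudTex);
    glTexSubImage2D(GL_TEXTURE_2D, 0,
                    0, *row,                  /* xoffset, yoffset */
                    TEX_SIZE, ROWS_PER_FRAME, /* width, height    */
                    GL_LUMINANCE, GL_UNSIGNED_BYTE,
                    nextFrame + *row * TEX_SIZE);
    *row = (*row + ROWS_PER_FRAME) % TEX_SIZE;  /* wrap around */
}
[/code]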