Possible improvements: increased resolution, antialiasing, lighting, global illumination, high-fidelity BRDFs, caustics, atmospherics and volumetric illumination, weather, motion blur, HDR and adaptation, depth of field, hair and cloth with full collision, global physics and materials, fluid dynamics, all of the above interacting. The list could go on ad nauseam, and all of these are problems that may be solved on the graphics cards of the future, but always with compromises, because the computational requirements are effectively limitless (see the rough sketch below).
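To put some rough numbers on "effectively limitless", here is a back-of-the-envelope sketch showing how these features multiply the per-pixel cost rather than adding to it. The resolution, sample counts, and frame rate are entirely made up for illustration, not measurements of any real renderer:

```c
/* Rough cost sketch with hypothetical numbers: each added effect scales the
 * per-pixel work, so stacking them multiplies the per-frame budget. */
#include <stdio.h>

int main(void)
{
    double pixels             = 1920.0 * 1080.0; /* hypothetical target resolution */
    double aa_samples         = 8.0;             /* supersamples for antialiasing */
    double gi_rays_per_sample = 64.0;            /* rays for global illumination */
    double motion_blur_taps   = 4.0;             /* temporal samples for motion blur */
    double dof_taps           = 4.0;             /* lens samples for depth of field */
    double fps                = 60.0;

    double shades_per_frame =
        pixels * aa_samples * gi_rays_per_sample * motion_blur_taps * dof_taps;

    printf("shading evaluations per frame : %.3e\n", shades_per_frame);
    printf("shading evaluations per second: %.3e\n", shades_per_frame * fps);

    /* On the order of 10^10 evaluations per frame and 10^12 per second --
     * and this ignores physics, fluids, hair/cloth collision, volumetrics,
     * and so on. Every new effect multiplies the budget again. */
    return 0;
}
```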
Do not assume that LOTR is the pinnacle of graphics achievement; it won't be, however impressive we all find it. Movies take all sorts of shortcuts that true 3D environments cannot; in addition, they are often hand-crafted in many ways, rendered in pieces, and composited.
More importantly, LOTR was a movie. It was not ray traced or rendered; it was predominantly FILMED, and it could not have been made entirely with CGI even with today's best technology. Looking at a movie and citing it as an example of where 'real-time' graphics can go shows that even current offline rendering technology doesn't meet your standard for 'viability'. It reinforces the belief that it probably never will.
I'm not even going to comment on the jaded gameplay criticism, except to say that you don't get a lot of people playing Space Invaders these days. There's a good reason for that.
It just boggles the mind that someone would say we will be waiting on advances in the CPU rather than in graphics, and that the CPU will outstrip GFX performance. Ignoring the contradiction, we already wait on CPU memory and bus advances. See all the comments above as to why this is unlikely to happen any time soon, none of which you've directly addressed.