This is likely really OT, but I hope the moderators cut me some slack based on my previous posts in this thread.
Originally posted by Ysaneya:
There is no doubt UE3 is an excellent and impressive engine. But categorizing Sweeney, as much as i respect him, as a programming god is IMO going a step too far.
I agree 100%. As far as “programming $DEITY” goes, at least in the gaming arena, I could count the candidates on the fingers of one hand. Carmack is one of them, for obvious reasons. I also think some people from Looking Glass, and maybe even Bethesda, are worth mentioning (but as I don’t know their names, I’ll leave it at that). Going further back, I could mention DMA, but that would put us back in the 2D era (though just for kicks, while in 2D mode, think “parallax scrolling”).
They were the ones making the “leaps” I think are worth mentioning, and now (after “standing on the shoulders of giants”) we are at a level where just about anyone (an obvious simplification) can create a 3D engine for just about any 3D game.
This is obviously an overly broad simplification, but once an effect has been discovered and implemented, it’s mostly a matter of putting it in context - something level designers and artists are much better suited to do. We have become more like database designers and data shufflers. Has anyone ever considered the creator of an especially efficient database a “programming $DEITY”?
So what do I think is impressive about the UE3 engine? To be honest, I don’t think any single feature by itself is that much to gaze upon. What I think is worth attention is what the engine can do in combination with its tools. If the design tools weren’t there, or sucked, the engine might be able to produce real-time physics at the atom level and visuals where a close-up of a tree could pass for a photo - but without the tools to edit that media, it would quite possibly be an engine able to render a Q1 level, just at 40-500 times the CPU, memory and gfx card requirements…
I mention this last part because I know from my own research and proof-of-concept work how much fun it can be to go overboard with effects - effects that would force all potential users to buy the “latest and greatest” $500 gfx card, when, had the programmer (me) stopped and given some thought to what they were doing, the game would have run perfectly fine, with fewer visual bells, on a much older gfx card (which, I’m told, is what most users except the most hard-core gamers have, and which translates into reaching a much larger audience).
Personally, on one of my computers at home, I just upgraded to an ATI 9250 (I really need to get in touch with ATI re. their drivers…), and I use such “outdated” hardware for the sole reason of forcing myself to write code paths that let lower-end hardware work decently too (not to mention that many potential users would consider a 92xx card high-end…).