Ptex API released as open source

I haven't dived very deep into it, but the concept of getting rid of explicit UV texcoords seems very interesting.

This is targeted at offline software rendering, but they mention that the OpenGL real-time interface uses nearest texture filtering to avoid filtering artifacts across face textures… but it should be doable!
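To make the idea concrete, here is a toy sketch (all names hypothetical, not the actual Ptex API) of the core lookup: instead of a shared UV atlas, each face owns its own small texture and is addressed by a face id plus face-local (u, v). Nearest filtering, as used by the OpenGL interface mentioned above, is trivial within one face:

```python
# Hypothetical sketch of Ptex-style per-face texture lookup.
# Each face owns a small texel grid and is addressed by
# (face_id, u, v), with u and v in [0, 1] local to that face.

class PerFaceTexture:
    def __init__(self, face_textures):
        # face_textures: list of 2D grids (rows of rows) of texel values
        self.faces = face_textures

    def sample_nearest(self, face_id, u, v):
        """Nearest-neighbour lookup within a single face's texture."""
        tex = self.faces[face_id]
        h, w = len(tex), len(tex[0])
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        return tex[y][x]

# Each face gets its own tiny texture; no shared UV layout is needed,
# and faces can have completely different resolutions.
tex = PerFaceTexture([
    [[10, 20], [30, 40]],   # face 0: a 2x2 texel grid
    [[5]],                  # face 1: a single texel
])
print(tex.sample_nearest(0, 0.9, 0.1))  # -> 20
print(tex.sample_nearest(1, 0.5, 0.5))  # -> 5
```

The hard part (and the reason the real-time path falls back to nearest filtering) is blending texels across two different per-face textures at an edge.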

With that and Megatexture, it really looks like explicit UVs will go the way of the dodo in the coming years…

Thoughts?

I also found it very interesting and seeing it in action in a real-time app would be more so.

But I think it won't be anywhere near id's Megatexture, simply because it is meant for offline renderers: it almost certainly does no texture compression, and in particular it does not pre-load mipmaps or delay loading them (i.e. give you whatever lower mipmap level is already available and keep loading the rest in the background), because in an offline renderer you don't want such behaviour.

Though it could most certainly be extended to do such a thing.
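For illustration, the "serve whatever coarse level is resident, stream the rest later" behaviour described above could be sketched like this (a toy model with made-up names, not anything Ptex actually ships):

```python
# Toy sketch of streamed mipmap residency: level 0 is the finest,
# only the coarsest level is guaranteed resident at startup, and a
# background loader gradually fills in finer levels.

class StreamedMipChain:
    def __init__(self, num_levels):
        self.num_levels = num_levels
        self.resident = {num_levels - 1}       # coarsest is always loaded
        self.pending = set(range(num_levels - 1))

    def best_available(self, wanted_level):
        """Return wanted_level if resident, else the nearest coarser resident level."""
        level = wanted_level
        while level not in self.resident and level < self.num_levels - 1:
            level += 1  # fall back toward the coarsest level
        return level

    def background_load(self, level):
        """Simulate the async loader finishing one level."""
        if level in self.pending:
            self.pending.discard(level)
            self.resident.add(level)

chain = StreamedMipChain(num_levels=4)
print(chain.best_available(0))  # only the coarsest is resident -> 3
chain.background_load(1)
print(chain.best_available(0))  # level 1 is now resident -> 1
```

An offline renderer would instead block until the exact level it needs is loaded, which is the behaviour difference being pointed out.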

Also, I don't really want to know the storage overhead right now; those textures must be HUGE.

Anyway, since it is open source, we will certainly see some very interesting projects that use or integrate this technology.


Mighty snazzy.

If I’ve got the gist of it…

  • Quad subdivision meshes only
  • Texture parameterization implicit in subdivision
  • Each control mesh face gets its own texture
  • Huge filters supported via cached face connectivity info in efficient new data structures & file format

Definitely a step in the right direction, as texture parameterization is for the birds (to which I'm sure any artist would attest).

Quad subdivision meshes or triangle subdivision meshes, but no mix-and-match within a given object.
Quite a limitation, but quad subdivision is preferred among artists AFAIK.

Looks like they have an algorithm that assigns a texture surface and then you get to paint on the texture by painting on the surface. It’s nothing new. Why should we get excited?

Dunno. Maybe the novelty is that there is no algorithm to assign textures, as each quad gets its own texture?
The only slight cleverness is about how to filter across edges, and currently this is still on the TODO list for realtime rendering.
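The cross-edge filtering relies on the cached face connectivity mentioned earlier in the thread. A toy sketch of that lookup (the data layout here is assumed, not Ptex's real file format):

```python
# Sketch of using cached face adjacency to reach texels on the other
# side of an edge, the hard part of filtering across face boundaries.

# adjacency[face_id][edge] = (neighbor_face, neighbor_edge),
# or None where the edge lies on a mesh border.
adjacency = {
    0: {0: (1, 2), 1: None, 2: None, 3: None},
    1: {0: None, 1: None, 2: (0, 0), 3: None},
}

def neighbor_across_edge(face_id, edge):
    """Look up the (face, edge) pair on the far side of an edge, if any."""
    return adjacency[face_id][edge]

# A wide filter kernel that runs off face 0's edge 0 continues into
# face 1 through its edge 2; at a border the filter must clamp instead.
print(neighbor_across_edge(0, 0))  # -> (1, 2)
print(neighbor_across_edge(0, 1))  # -> None (mesh border)
```

With this table precomputed and stored alongside the textures, even a large filter footprint can walk into neighbouring faces without any global UV layout.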

You know, any CG technique of the last 40 years is just about some algorithm that assigns colors.

Did anyone read

What is a subd? A subdivision surface? Are they using NURBS?

TODO list for realtime rendering

That is an insane number of textures. Thousands of textures for each model. Why not just have an algorithm that builds a texture atlas automatically, which also exists in certain CA software?
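For comparison, the automatic-atlas alternative is essentially a rectangle-packing problem. A minimal shelf-packing sketch (illustrative only; real atlas tools add padding, rotation, and better packing heuristics):

```python
# Toy shelf-packing atlas builder: place per-face texture rectangles
# left-to-right in rows ("shelves"), starting a new shelf when a
# rectangle no longer fits in the current row.

def pack_atlas(sizes, atlas_width):
    """Return per-rectangle (x, y) offsets and the total atlas height."""
    offsets = []
    x = y = shelf_h = 0
    for w, h in sizes:
        if x + w > atlas_width:       # start a new shelf
            y += shelf_h
            x = shelf_h = 0
        offsets.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)
    return offsets, y + shelf_h

offsets, height = pack_atlas([(4, 4), (4, 2), (2, 2)], atlas_width=8)
print(offsets)  # [(0, 0), (4, 0), (0, 4)]
print(height)   # 6
```

The trade-off versus per-face textures: an atlas still needs seams, padding, and a UV indirection, which is exactly what the per-face approach avoids at the cost of many small textures.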