Why Does My glTF Model Look Different Depending on Where I Am?

The glTF format aims to characterize the physical properties of a surface (is it metallic or dielectric? smooth or rough? which parts of the visible spectrum does it reflect under diffuse lighting?). The formation of the image actually presented on your device’s screen is, I believe intentionally, beyond the scope of the data format: it is subject to the decisions and preferences of the rendering engine and the operating system. The human perceptual system certainly also plays a part.
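For concreteness, here is a sketch of the kind of material data a glTF 2.0 file does carry. glTF is JSON; the dict below mirrors the spec's `pbrMetallicRoughness` block, with illustrative values I've made up:

```python
# Illustrative glTF 2.0 material (JSON shown as an equivalent Python dict).
# The factor values here are made up for the example.
material = {
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.8, 0.1, 0.1, 1.0],  # which wavelengths reflect diffusely
        "metallicFactor": 0.0,                    # dielectric (0.0) vs. metal (1.0)
        "roughnessFactor": 0.4,                   # smooth (0.0) vs. rough (1.0)
    }
}

# Note what is absent: no exposure, tone curve, white point, or display
# transform. Those decisions are left to the renderer and operating system.
print(sorted(material["pbrMetallicRoughness"]))
```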

As examples:

  • a rendering engine may change the exposure, tone mapping, or color grading of the image based on changes in the rendered scene (e.g. a bright light source entering the frame may drive auto-exposure) or in the viewing environment (a change in room brightness may call for different tone mapping);
  • an operating system may modify the image in response to ambient light (e.g. True Tone on iOS);
  • you may perceive more or less contrast and colorfulness in response to the same stimulus from the screen, depending on the surrounding environment (e.g. the Hunt effect and the Stevens effect, among other color appearance phenomena).
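To make the first point concrete, here is a minimal sketch of how two common tone mapping choices map the same linear scene radiance to different display values. Both operators (a plain exposure-and-clamp, and the Reinhard curve) are standard, but which one a given engine uses, and at what exposure, is exactly the kind of decision glTF leaves open:

```python
def linear_to_srgb(c: float) -> float:
    """sRGB transfer function for a linear value in [0, 1]."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def clamp_tonemap(radiance: float, exposure: float = 1.0) -> int:
    """Scale by exposure, clip to [0, 1], then sRGB-encode to 8 bits."""
    c = min(radiance * exposure, 1.0)
    return round(255 * linear_to_srgb(c))

def reinhard_tonemap(radiance: float, exposure: float = 1.0) -> int:
    """Reinhard operator x / (1 + x): compresses highlights smoothly."""
    x = radiance * exposure
    return round(255 * linear_to_srgb(x / (1.0 + x)))

# The same linear radiance lands on different 8-bit pixel values
# depending on the operator the engine happens to use:
print(clamp_tonemap(0.5), reinhard_tonemap(0.5))  # prints: 188 156
```

The material definition never changed; only the display transform did, and the rendered gray shifted by roughly 30 code values.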