Best practices for real-time sensor data visualization with OpenGL/WebGL?

Hi everyone, I’m working on a small project where I want to visualise real-time sensor data (e.g., motion or environmental data) in a graphics application using OpenGL/WebGL, and I’m wondering what approaches work best for streaming and rendering updates without stalling the main render loop. For context, I’ve been following this ESP32 IoT motion example that shows how an ESP32 can send sensor data over HTTP: https://www.theengineeringprojects.com/2022/03/iot-based-motion-detection-with-email-alert-using-esp32.html. I’ve also seen Arduino forum threads and Raspberry Pi community projects where people stream sensor values to dashboards, and some IoT discussions about using WebSockets or MQTT for real-time feeds. For those experienced with OpenGL or Vulkan: do you recommend double buffering the data, using compute shaders, or offloading processing to a separate thread for smooth visualisation? What patterns have worked best for you?

Assuming your sensors produce data at a high enough volume to make this worthwhile, you may want to look at Buffer Object Streaming.
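A minimal sketch of the cursor/orphaning bookkeeping behind buffer streaming, assuming a fixed-size `GL_STREAM_DRAW` buffer you append into each frame. The struct and function names (`stream_buffer`, `stream_reserve`) are my own, and the actual GL uploads are left as comments since they need a live context:

```c
#include <assert.h>
#include <stddef.h>

/* CPU-side bookkeeping for streaming sensor/vertex data into a
 * fixed-size GPU buffer.  The real uploads would be glBufferSubData
 * (or writes through a persistently mapped pointer). */
typedef struct {
    size_t capacity;   /* total size of the GPU-side buffer in bytes  */
    size_t cursor;     /* next free byte offset                       */
    int    orphans;    /* how many times the storage was re-specified */
} stream_buffer;

/* Returns the offset at which `bytes` of new data should be uploaded,
 * orphaning the buffer when the write would not fit. */
size_t stream_reserve(stream_buffer *sb, size_t bytes)
{
    assert(bytes <= sb->capacity);
    if (sb->cursor + bytes > sb->capacity) {
        /* glBufferData(GL_ARRAY_BUFFER, sb->capacity, NULL, GL_STREAM_DRAW);
         * Passing NULL "orphans" the old storage: the driver hands back
         * a fresh allocation while the GPU may still be reading the old
         * one, so the render loop never hits a sync point. */
        sb->cursor = 0;
        sb->orphans++;
    }
    size_t offset = sb->cursor;
    /* glBufferSubData(GL_ARRAY_BUFFER, offset, bytes, src); */
    sb->cursor += bytes;
    return offset;
}
```

Each frame you reserve space for the newest samples, upload at the returned offset, and draw with that offset as the base of the draw call; the same idea carries over to WebGL with `gl.bufferData`/`gl.bufferSubData`.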
For Vulkan you have lower-level synchronization primitives (fences, semaphores, per-frame resources), so you can set up a system that updates data without stalling the GPU — but of course you have to assemble all the necessary pieces yourself.
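For the "separate thread" part of the question, one API-agnostic pattern is a double-buffered handoff: the sensor/network thread fills a back buffer, and the render loop swaps it for the front buffer under a short lock, so neither side ever blocks on a long copy or a GPU upload. A hedged sketch, with names (`sensor_exchange`, `SAMPLES`) invented for illustration:

```c
#include <pthread.h>
#include <string.h>

#define SAMPLES 256

/* Double-buffered handoff between a producer thread and the render loop. */
typedef struct {
    float buf[2][SAMPLES];
    float *front;          /* read by the render loop           */
    float *back;           /* written by the sensor thread      */
    int    fresh;          /* set when `back` holds unseen data */
    pthread_mutex_t lock;
} sensor_exchange;

void exchange_init(sensor_exchange *ex)
{
    ex->front = ex->buf[0];
    ex->back  = ex->buf[1];
    ex->fresh = 0;
    pthread_mutex_init(&ex->lock, NULL);
}

/* Sensor thread: publish a completed batch of samples. */
void exchange_publish(sensor_exchange *ex, const float *samples)
{
    pthread_mutex_lock(&ex->lock);
    memcpy(ex->back, samples, sizeof(float) * SAMPLES);
    ex->fresh = 1;
    pthread_mutex_unlock(&ex->lock);
}

/* Render loop: swap in the newest batch if there is one.  Returns the
 * buffer to upload (e.g. via glBufferSubData) and draw this frame. */
const float *exchange_acquire(sensor_exchange *ex)
{
    pthread_mutex_lock(&ex->lock);
    if (ex->fresh) {
        float *tmp = ex->front;
        ex->front = ex->back;
        ex->back  = tmp;
        ex->fresh = 0;
    }
    pthread_mutex_unlock(&ex->lock);
    return ex->front;
}
```

The lock is held only for a pointer swap or a `memcpy` of one batch, so the render loop stays smooth even if the producer stalls; if the producer outpaces rendering, stale batches are simply overwritten rather than queued.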