Please excuse the possible simplicity and vagueness of my question, as I have little to no experience in this field.
I’ve created a LabVIEW (National Instruments) program that processes data in real time from my hardware/instruments. This data is then manipulated to obtain a series of normal vectors and other dimensions.
Say I have a file containing a series of points or vectors that is constantly updated (more than once per second). These points dictate the shape of a cylindrical object with a known diameter; as the points change, the shape of the cylinder changes (i.e., it bends).
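To make the setup concrete, here is a rough Python sketch of how that file might be polled and parsed. The file name, the plain "x y z per line" format, and the polling interval are purely illustrative assumptions; my real data comes out of LabVIEW:

```python
import time

def read_points(path):
    """Parse one snapshot of the centerline: one 'x y z' triple per line."""
    with open(path) as f:
        return [tuple(float(v) for v in line.split()) for line in f if line.strip()]

def poll_points(path, interval=0.2):
    """Poll the file several times per second and yield each new snapshot.

    A renderer could consume these snapshots to update the 3D model.
    """
    last = None
    while True:
        points = read_points(path)
        if points != last:
            last = points
            yield points  # hand the changed snapshot to the renderer
        time.sleep(interval)
```

In practice a socket or pipe between the acquisition program and the renderer would likely have lower latency than re-reading a file, but the idea is the same: a stream of point snapshots arriving faster than once per second.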
If I create a model of this “cylinder” in, say, 3ds Max or any other program, how do I go about letting these coordinates dictate its shape in a real-time/low-latency virtual environment?
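To illustrate what I mean by the points dictating the shape: given the centerline points and the known diameter, a tube mesh could in principle be regenerated on every update, roughly like this Python sketch (function names and the frame construction are my own assumptions, not anyone’s actual API):

```python
import math

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / n for c in v)

def cylinder_rings(points, radius, segments=16):
    """For each centerline point, place `segments` vertices on a circle of
    `radius` perpendicular to the local tangent. A renderer can then join
    consecutive rings with triangles to form a bendable tube."""
    rings = []
    for i, p in enumerate(points):
        a = points[max(i - 1, 0)]                # backward neighbor (clamped at ends)
        b = points[min(i + 1, len(points) - 1)]  # forward neighbor (clamped at ends)
        t = _normalize(tuple(b[k] - a[k] for k in range(3)))  # local tangent
        # Build an orthonormal frame (u, v) spanning the plane perpendicular to t.
        ref = (1.0, 0.0, 0.0) if abs(t[0]) < 0.9 else (0.0, 1.0, 0.0)
        u = _normalize(_cross(t, ref))
        v = _cross(t, u)
        rings.append([
            tuple(p[k] + radius * (math.cos(th) * u[k] + math.sin(th) * v[k])
                  for k in range(3))
            for th in (2 * math.pi * s / segments for s in range(segments))
        ])
    return rings
```

So each incoming snapshot of points would produce a new set of vertex rings, and only the vertex buffer of the model would need updating, not the whole scene.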
At the moment, I have a simple graph in LabVIEW, contained within a loop, that plots a curve through the points to simulate my cylinder. This is subpar, however: the visual quality is poor, and I would like to customize the appearance of my model.
Come to think of it, my system is essentially the same as a video game controller: the controller sends a signal to the processor, and the processor drives the graphics. My case is no different: my data is a representation of a physical system, and I want that data to drive my 3D graphics.
Thank you for any help!