I am working on a project in which I have to create a virtual hand model and sync it with a robotic hand, so that whenever I move my robotic hand the virtual hand follows the same motion (e.g. grab, hold), and then send haptic feedback back to the robotic hand for any obstacle the virtual hand encounters.
But first I need to create the virtual human hand. I have googled through many websites but haven't found any clue how to start making one in OpenGL. Do I have to make my hand model in some other software like Maya first?
Please let me know what I can try; I have been stuck on this for way too long.
So first of all: do you have a program that can render model files (e.g. OBJ files)? If so, export your hand as a model file, import it, and render it.
If you don't have such a program: make one. There are really good YouTube tutorials on doing this by a person called "ThinMatrix".
Now, you probably want a skeletal rig for your hand: in that case, implement skeletal animation in your program.
Then take the values from your angle sensors, or whatever you are using, and feed them into your program via serial-over-USB or similar.
Use these values to control the rotations of the "bones" in your skeletal rig.