I suspect it’s unlikely you’ll be able to access that kind of information from proprietary runtimes without working closely with the runtime vendor. There might be ways you could use the API better (such as choosing timestamps carefully, etc.), but fundamentally, if you put a tracker designed for a hand on a bat, you’re likely to see some mismatch in motion tracking, because the vendor most likely builds assumptions about the expected kinds of motion into their filtering and prediction. My favorite example link for this type of thing is here: Valve Updated SteamVR Tracking Because 'Beat Saber' Players Were Too Fast
The only way OpenXR plays into this is in the API for getting a pose, which requires a time. The choice of timestamp matters: if you want the pose for rendering purposes, use the predicted display time. If you want it for interaction purposes, you can use the timestamp of another event (a button state change, for instance) if applicable; otherwise the predicted display time is likewise probably your best bet. (If you have a fixed-rate physics simulation you might use a timestamp from that instead, but in any case you should have timestamps flowing through your whole system; there is intentionally no simple “now” in OpenXR.)
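To make that concrete, here is a minimal sketch in C of the two timestamp choices with xrLocateSpace. The handle names (appSession, trackerSpace, stageSpace, selectAction) are placeholders for objects your application would already have created, and the predicted display time is assumed to come from the XrFrameState returned by xrWaitFrame for the current frame; error handling is omitted.

```c
#include <openxr/openxr.h>

/* Sketch: locating a tracker pose at two different kinds of timestamps. */
void locate_tracker_example(XrSession appSession,
                            XrSpace trackerSpace,
                            XrSpace stageSpace,
                            XrAction selectAction,
                            XrTime predictedDisplayTime /* from xrWaitFrame */)
{
    /* Case 1: pose for rendering -> use the frame's predicted display time. */
    XrSpaceLocation renderLocation = {XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(trackerSpace, stageSpace, predictedDisplayTime, &renderLocation);

    /* Case 2: pose for interaction -> if an input event carries its own
       timestamp, use that. A boolean action state reports lastChangeTime
       for the most recent press/release. */
    XrActionStateGetInfo getInfo = {XR_TYPE_ACTION_STATE_GET_INFO};
    getInfo.action = selectAction;
    XrActionStateBoolean buttonState = {XR_TYPE_ACTION_STATE_BOOLEAN};
    xrGetActionStateBoolean(appSession, &getInfo, &buttonState);

    if (buttonState.changedSinceLastSync) {
        /* Where was the tracker when the button actually changed state? */
        XrSpaceLocation pressLocation = {XR_TYPE_SPACE_LOCATION};
        xrLocateSpace(trackerSpace, stageSpace, buttonState.lastChangeTime, &pressLocation);
    }

    /* In either case, check XR_SPACE_LOCATION_POSITION_VALID_BIT and
       XR_SPACE_LOCATION_ORIENTATION_VALID_BIT in locationFlags before
       trusting the pose. */
}
```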
In terms of motion model mismatch, it seems like it might be useful for the tracker vendor to provide some way (whether through OpenXR or an outside application akin to binding setup tools) to indicate, or at least choose among a few options for, the motion model used with a generic tracker. They might also offer some advice on tracker placement relative to the tracked object: a good rule of thumb is probably that the closer the tracker is to the object’s center of gravity, the better.