While working today I ran into a problem I can’t seem to resolve. I exported a model to test my transform hierarchy and animation code, and I discovered that some of the nodes in the resulting DAE file have transforms arranged in an order I haven’t seen before (and, unfortunately, hadn’t planned for).
As you can see, this node has a couple of <translate/> elements, then the <rotate/> elements, and then some more <translate/> elements. In the past I’ve typically just seen a single translate followed by three rotates (and maybe a scale element if I’ve modified the model scale). I understand why the exported model has the above sequence of transforms (I changed the rotation pivots on some of the nodes in the model), but my problem is determining what the ordering is through the DOM API. I know I can use domNode::getTranslate_array and domNode::getRotate_array, but how can I tell when the translations and rotations are interleaved like they are above? I guess I could just use getContents and hope the contents array is ordered properly, but I hate to do that if there’s a real solution I just can’t find.
Actually, getContents is exactly what you’re supposed to use.
And yes, it is ordered correctly. The contents array is how the DOM keeps ordering for elements with xs:choice groups (like node). On load, the DOM will even fix the order of elements that are incorrect, so when you work with the document they are in valid order and will save out in valid order.
When you iterate over the elements in the contents array, it is pretty simple to figure out what they are. You can switch on element->getElementType(); the values returned are in the COLLADA_TYPE namespace, e.g. COLLADA_TYPE::ROTATE. Or you can strcmp the element’s name or type, or compare metas with daeSafeCast< type >( element ). The latter works like dynamic_cast< type > in that it will return NULL if the cast would fail.
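To make the ordering point concrete, here is a minimal, self-contained sketch in plain C++ (not the DOM API — Xform, localMatrix, and friends are hypothetical stand-ins) of walking an interleaved transform list in document order and post-multiplying. It has the same shape as the loop you would write over node->getContents() with a switch on getElementType(), and it shows why the pivoted translate/rotate/translate interleaving matters:

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <vector>

using Mat4 = std::array<double, 16>; // row-major 4x4

// Hypothetical stand-in for a node's transform elements. In the real DOM
// you would walk node->getContents() and switch on getElementType().
struct Xform {
    enum Kind { Translate, Rotate } kind;
    double v[4]; // Translate: x,y,z,(unused); Rotate: axis x,y,z + angle in degrees
};

Mat4 identity() {
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0;
    return m;
}

Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                c[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
    return c;
}

Mat4 translation(double x, double y, double z) {
    Mat4 m = identity();
    m[3] = x; m[7] = y; m[11] = z;
    return m;
}

// Axis-angle rotation (Rodrigues' formula), matching <rotate>'s
// axis-plus-degrees semantics.
Mat4 rotationDeg(double x, double y, double z, double deg) {
    const double kPi = 3.14159265358979323846;
    double rad = deg * kPi / 180.0;
    double c = std::cos(rad), s = std::sin(rad), t = 1.0 - c;
    Mat4 m = identity();
    m[0] = t*x*x + c;   m[1] = t*x*y - s*z; m[2]  = t*x*z + s*y;
    m[4] = t*x*y + s*z; m[5] = t*y*y + c;   m[6]  = t*y*z - s*x;
    m[8] = t*x*z - s*y; m[9] = t*y*z + s*x; m[10] = t*z*z + c;
    return m;
}

// Compose the node's local matrix by post-multiplying in document order --
// exactly why the interleaving in the contents array must be preserved.
Mat4 localMatrix(const std::vector<Xform>& contents) {
    Mat4 m = identity();
    for (const Xform& x : contents) {
        switch (x.kind) { // cf. switch (element->getElementType())
            case Xform::Translate:
                m = mul(m, translation(x.v[0], x.v[1], x.v[2]));
                break;
            case Xform::Rotate:
                m = mul(m, rotationDeg(x.v[0], x.v[1], x.v[2], x.v[3]));
                break;
        }
    }
    return m;
}

std::array<double, 3> applyPoint(const Mat4& m, std::array<double, 3> p) {
    std::array<double, 3> out{};
    for (int i = 0; i < 3; ++i)
        out[i] = m[i*4+0]*p[0] + m[i*4+1]*p[1] + m[i*4+2]*p[2] + m[i*4+3];
    return out;
}
```

With a pivot at (1,0,0) and a 90° rotation about Z, the sequence translate(1,0,0), rotate(0,0,1,90), translate(-1,0,0) leaves the pivot fixed while moving the origin to (1,-1,0) — collapsing the translates into one would give a different result, which is why the contents array’s document order is the thing to trust.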
Glad to hear I’m not too far off the mark, then. This morning I ended up implementing a loop over all the contents, with a dynamic_cast to determine each object’s type.
Now I’ve got another issue; or rather, I think I’ve solved another problem. It’s a little off topic for this thread, but since I’ve got you here: I’ve been working to integrate the Refinery in command-line mode with some Makefiles to automate some of my asset processing. I tried executing a macro from the command line like this:
Even though the Refinery never printed any errors, it also never created the file output.dae. Stepping through in the debugger with breakpoints in LibLoader::loadDocument() and saveDocument(), I found that saveDocument was failing because the docStr argument was always the same as the toStr argument. Since no document named output.dae existed in the database, this always failed.
I checked Refinery.java, where this gets called from, and found that in the macro-parsing case the code was always passing the output file name as both arguments to saveDocument. To fix this I copied some of the code from the case that handles execution of an individual conditioner. Here’s the relevant code with my changes highlighted (sort of):
macro is of type PipelineMacro, and PipelineMacro.setOutput does nothing but return false.
The second change is, I think, what did it. But I see a situation where it may assert or throw an exception: specifying more output files on the command line than are needed. You should break after macro.getOutputNumber() outputs have been assigned. For the inputs it doesn’t matter, because any extra inputs specified don’t do anything; the function just returns false.
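For what it’s worth, the fix pattern described above is just a loop bound. Here is a schematic sketch in plain C++ (the real code lives in Refinery.java; MacroStub, assignOutputs, and the setter semantics are hypothetical stand-ins modeled only on this post): stop assigning output names once the macro’s declared count is reached, while surplus names are simply rejected by the setter.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for PipelineMacro: it declares how many outputs
// it needs, and its setter rejects anything past that count.
struct MacroStub {
    size_t declared;
    std::vector<std::string> outputs;

    size_t getOutputNumber() const { return declared; }
    bool setOutput(size_t i, const std::string& name) {
        if (i >= declared) return false; // surplus names are rejected
        outputs.resize(std::max(outputs.size(), i + 1));
        outputs[i] = name;
        return true;
    }
};

// The fix: bound the assignment loop by the macro's declared output
// count, not by however many names were given on the command line.
size_t assignOutputs(MacroStub& macro, const std::vector<std::string>& args) {
    size_t assigned = 0;
    for (size_t i = 0; i < args.size(); ++i) {
        if (i >= macro.getOutputNumber())
            break; // stop after getOutputNumber() outputs
        if (macro.setOutput(i, args[i]))
            ++assigned;
    }
    return assigned;
}
```

With this guard, a command line that names more output files than the macro declares just ignores the surplus instead of tripping an assert further down the pipeline.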
Thanks for finding this. I’ll make the change and it will be updated on SF sometime soon.
I recently downloaded the new Refinery version in order to update my asset pipeline, and I just realized that the aforementioned problem still occurs: using the Refinery from the command line to run a macro, the output is not written.
Is there a quick turnaround for this particular problem?
Has anyone actually added this fix to the distro yet? I just wasted a few hours baffled that it wasn’t working, only to find this post from a year ago. Unless I’m missing how people are using the Refinery, command-line macro support seems like an essential component if you want the Refinery to be part of a content pipeline.
While admittedly it looks like a minor fix, patching it ourselves means maintaining our own version of the code, and I can no longer just hand people on my team the standard Refinery installer.