Saving data with the new format
Vasily Levin
What is the correct way to save custom objects that form a hierarchy in the new format?
It cannot be deduced from the documentation and headers, so I have to do some guessing here; please correct me if I am wrong.
Say I have a custom node class:
class Alpha : public DzNode {};
a helper class:
class AlphaIO : public DzExtraNodeIO {};
and a derived node:
class Beta : public Alpha {};
I guess that to implement saving of Beta I have to make BetaIO a child of AlphaIO and explicitly call the parent methods.
Is that how it works, or is there some underlying magic that calls all the IO helpers, so that I just have to implement independent IO classes?
Also, it is unclear how save/load should work if several different objects reference the same one. Previously this worked through ReadPointer; now I do not see a mechanism for that.
Comments
Here is a JPEG of the error inside of DS...
The interesting thing here is that I can save in the ParticleFX file format, aka constructor (which Vasily programmed into the export options), and everything is fine. But when we try to save in .duf format you see the errors in the JPEG... I have a zipped log file from DS4.5 if needed...
The old way was quite a bit easier. Simply inherit from DzBase, implement your save and load, and use WritePointer and ReadPointer for your custom objects.
The .duf way requires serialization to a text format; in other words, no pointers. It is significantly more work. You write all your objects and give each one a unique token of some sort, then write that token wherever you have a pointer to that object. On load, you read in all your objects and then resolve the pointers from the tokens.
Studio itself usually uses the IDs. Here is how to write a URL to a DzNode named dazNode; this would go in your writeExtraInstance:
io->startObject();
io->addMember("node", DzAssetIOMgr::getAssetInstanceUri(dazNode).toString());
io->finishObject();
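For context, here is a minimal sketch of how those calls might sit inside a custom IO class. The exact writeExtraInstance signature, the CustomNode class, and its linkedNode() accessor are assumptions for illustration, not confirmed SDK API; only the io calls and getAssetInstanceUri come from the snippet above.
// Hedged sketch; signature and helper names are assumptions.
bool CustomNodeIO::writeExtraInstance( QObject *object, IDzJsonIO *io,
    const DzFileIOSettings *opts ) const
{
    CustomNode *node = qobject_cast<CustomNode*>( object );
    if ( !node || !node->linkedNode() ) // linkedNode() is hypothetical
        return false;
    io->startObject();
    // Write a URI string instead of a raw pointer to the referenced node.
    io->addMember( "node",
        DzAssetIOMgr::getAssetInstanceUri( node->linkedNode() ).toString() );
    io->finishObject();
    return true;
}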
And is the save/load initiated automatically for every node, or even for DzBase objects, or do I have to initiate it somehow?
Anything that derives from DzNode and is added to the scene via dzScene->addNode will get saved by the scene; samples\modifiers\DzBlackHole is an example of that.
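A minimal sketch of that (CustomNode and the name string are illustrative; setName and addNode are the calls mentioned in this thread):
// Create the custom node, name it, and hand it to the scene;
// the scene will then initiate save/load for it automatically.
CustomNode *node = new CustomNode();
node->setName( "myCustomNode" );
dzScene->addNode( node );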
What are you deriving from and who owns it? Scene, another node, etc?
In addition to nodes that have DzNode as an ancestor, I have objects derived from DzModifier and DzBase. The modifiers are owned by objects that are nodes and are added to them by a call.
The DzBase-based objects are owned by one of the plugin classes and stored in a container.
Every node added to the scene and every modifier that is a child of that node will get the write initiated on it.
Do you have custom data on the derived DzModifier?
Yes, I have.
Then you will need to do it like MyCustomModifier does. How far are you getting? In other words, looking at the duf file, what is getting written out that looks correct and what does not?
I do not know yet; it is halfway to being ready to run. For now I am trying to understand the correct way to handle saving when you have a hierarchy of objects. I will get back to you if I have difficult problems after getting it running.
For DzNode and DzModifier objects that are known to the system and have their own custom data, you will need to do it like the samples. For your own custom classes, it's more or less up to you. So suppose your DzNode subclass CustomNode has a pointer to an object of type Happy, where Happy is your own class and not known to Studio.
- If Happy is only ever known to your CustomNode, then you could have the writer and reader for CustomNode handle Happy in its entirety.
- If other objects might have pointers to Happy, then you have to decide who owns it and who is just referencing it (see the sketch after this list).
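A sketch of the referencing case, reusing the hypothetical Happy class from above. The token() accessor and the writeHappyData helper are illustrative names; only the io calls mirror the snippet earlier in the thread:
// Owner's writer: serialize Happy in full, keyed by a unique token.
io->startObject();
io->addMember( "happy_id", happy->token() ); // a token you assign yourself
writeHappyData( io, happy );                 // write all of Happy's members
io->finishObject();

// Referencer's writer: store only the token; resolve it back to the
// owner's Happy instance after everything has been read in.
io->startObject();
io->addMember( "happy_ref", happy->token() );
io->finishObject();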
For me the most interesting case is when you have CustomNode2 derived from CustomNode; that case is not in the examples.
Should the serialization helper classes for CustomNode2 be derived from the corresponding helpers of CustomNode, or will it work if they are independent?
Either way is fine; each has pros and cons.
Some confusion can be cleared up by thinking of a DzNode as the following:
- The core part
- The extra part
The core part gets read and written automatically. CustomNodeIO is an ExtraNodeIO, so it only writes the extra part. That is why you don't have to call DzNode's reader or writer in your own derived functions like you used to. Here is an example of chaining up the call that is no longer required:
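A sketch of what that kind of legacy chaining looked like; the save signature and DzOutFile are assumptions about the old .daz API, not confirmed declarations:
// Old .daz style (no longer required in the .duf scheme):
// the derived writer had to explicitly chain to the base class.
void CustomNode::save( DzOutFile *file ) const
{
    DzNode::save( file ); // chain up to write the base class data
    // ... then write CustomNode's own data ...
}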
But if you make your CustomNode2IO derive from CustomNodeIO, then you will want to chain up from CustomNode2IO to CustomNodeIO.
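A sketch of that chaining, with the same caveat that the exact writeExtraInstance signature is an assumption:
bool CustomNode2IO::writeExtraInstance( QObject *object, IDzJsonIO *io,
    const DzFileIOSettings *opts ) const
{
    // Chain up so the CustomNode extra part is written first...
    if ( !CustomNodeIO::writeExtraInstance( object, io, opts ) )
        return false;
    // ...then write the data that CustomNode2 adds on top.
    return true;
}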
That resolved most of my problems, but I still have several to solve.
1. How should I implement save/load for objects derived from DzBase? In my case they are owned by another object of DzNode type. I would like their loading to follow the same architecture, but I do not see how to make the DS loader initiate writing/reading of those objects. Maybe I should base them on another class? I first thought about DzSceneData, but the examples have a bogus comment saying that DzSceneData of a given type is a singleton and can have only one instance. It is also not clear whether old scenes would still load if I based the class on another type.
2. What is the difference between instance write/read and definition write/read? When is each called, and what is supposed to be loaded at each stage?
3. How can I prevent DS from saving some data? I have nodes with auto-generated geometry and do not want to keep it in the file. I am getting a compressed save file of about 10MB (roughly 200MB uncompressed), when a few kilobytes would actually be enough to store my data.
And 4. is just a complaint: please write documentation for your new serialization system. Even comments on the parameters and return values in the declarations would be a huge help, as would an architecture overview describing the stages and which objects are created and initialized at each step. Without that, implementing serialization to the new format is a very discouraging process.
If you create a generic DzFacetMesh and put it on a DzNode, it's going to save with the scene.
I know for sure you can do it if you implement your own Shape/Mesh. You may be able to do it by subclassing DzFacetMesh and overriding some of IDzSceneAsset; I will look into that.
Or you could watch for the following signals on dzScene and remove your geometry when the save happens:
void sceneSaveStarting( const QString &filename );
void sceneSaved( const QString &filename );
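A sketch of that signal approach; MyPlugin and the slot and helper names are illustrative, while the two signals are the ones listed above:
// Connect once, e.g. in your plugin's constructor (Qt4-style connect).
connect( dzScene, SIGNAL(sceneSaveStarting(const QString&)),
    this, SLOT(onSaveStarting(const QString&)) );
connect( dzScene, SIGNAL(sceneSaved(const QString&)),
    this, SLOT(onSaveFinished(const QString&)) );

void MyPlugin::onSaveStarting( const QString &filename )
{
    removeGeneratedGeometry(); // strip the auto-generated mesh before writing
}

void MyPlugin::onSaveFinished( const QString &filename )
{
    rebuildGeneratedGeometry(); // regenerate it once the save is done
}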
The current save and load paradigm is at least a 10-fold increase in complexity. The old .DAZ paradigm was great for the programmer and very orthogonal, but it had severe limitations for the user.
Again, the best place to understand it is to read the spec while looking at some duf and dsf files.
- http://docs.daz3d.com/doku.php/public/dson_spec/start
Hopefully we will figure out a better way to convey the architecture and best practices.
Thanks a lot, that helps.
One more question on point 1.
The most convenient way for me is the MyCustomModifier way, but what asset type should I set for it? Modifier has a special constant for it, but my objects are custom, and it seems the existing constants do not fit the model.
return ModifierAsset;
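In context, that return would sit in an override like the following sketch; the override name getAssetType, the enum scope, and CustomModifier are assumptions based on the line above:
// Report the asset type so the asset IO manager initiates writes for it.
IDzSceneAsset::AssetType CustomModifier::getAssetType() const
{
    return ModifierAsset; // use the modifier asset type, per the reply above
}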
Still a problem: I return owned assets, but the write functions are never called. I register my custom type with
What am I missing?
One thing to note: when it was registered with DZ_PLUGIN_REGISTER_NODE_EXTRA_OBJECT_IO, DS crashed on saving the scene, somewhere during creation of the IO object for the extra modifier, and none of my functions were on the stack trace. When I changed this to the DZ_PLUGIN_REGISTER_MODIFIER_EXTRA_OBJECT_IO macro the crashes were gone, but there is still no writing for those objects.
Can I treat finalizeInstance as PostLoadFile in the previous saving scheme?
It is sort of implied by the header file, but yes, finalizeInstance is the last thing called, like the following:
- for all instances: applyInstanceToObject
- for all instances: resolveInstance
- for all instances: finalizeInstance
And what about my previous question about the write functions never being called?
Maybe you need more details?
That I will have to investigate.
One more strange thing: after loading a saved file the hierarchy structure is lost. Do I have to store it by hand and restore it on load?
I am unsure what you mean by "saved file hierarchy structure"?
I mean that the scene I save has nodes, child nodes, etc.; when I load it, it flattens to a list of nodes. I do not see how I could have broken it, since no reparenting is done on load.
So the question is: is the structure kept automatically and I am somehow destroying it on load, or should I do something specific to restore the parent-child relations?
I just double-checked MyCustomNode placed in a hierarchy of nodes; it is working.
Are you setting the names of your nodes to be unique? Use setName on your DzNode.
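For example (the name value is arbitrary):
node->setName( "particleEmitter_01" ); // give every node you create a name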
It might be helpful to PM me your duf file so I can take a look at it.
Even for the smallest particle scene the resulting file is still big and would not fit into a PM; here is a Dropbox link.
https://dl.dropbox.com/u/21104304/particles.duf
It is big, but with most of the auto-generated data folded it is OK to explore.
Search for the lines in the file that look like:
The first node is the camera and looks fine.
The second node looks fine.
The third node has no id or name. I am pretty sure that is the problem. Please give all the nodes you create a name; it does not have to be unique.
I managed to move significantly further in my save/load code.
Now I have a question: can I control the order of the finalize calls for sibling nodes? Or is there some guaranteed order of the calls to applyInstanceToObject and finalizeInstance?