Red9_PoseSaver
Red9_PoseSaver is designed as a generic module for storing and dealing
with poses inside Maya. The PoseData object is the base for all of this
and is what’s wrapped by the AnimUI and the MetaData.poseCacheLoad() calls.
There’s also a powerful setup for testing a rig’s current pose against a
previously stored pose file; you can also test poseObjectA==poseObjectB
or even poseFileA==poseFileB.
This is a new implementation of the PoseSaver core; it uses the same file format
and ConfigObj, but now supports relative pose data handled via a
PosePointCloud and the snapping core.
Note
I use the node short name as the key in the dictionary, so
ALL NODES must have unique names or you may get unexpected results!
 
Core Pose Classes
DataMap([filterSettings])
New base class for handling data storage and reloading with intelligence.
PoseData([filterSettings])
The PoseData is stored per node inside an internal dict.
PosePointCloud(nodes[, filterSettings, meshes])
PosePointCloud is the technique inside the PoseSaver used to snap the pose into relative space.
PoseCompare(currentPose, referencePose[, ...])
This is aimed at comparing a rig’s current pose with a given one, be that a pose file on disc, a pose class object, or even a poseObject against another.
- 
getFolderPoseHandler(posePath) 
Check if the given directory contains a poseHandler.py file;
if so, return the filename. PoseHandlers are a way of extending or
overloading the standard behaviour of the poseSaver; see Vimeo for
a more detailed explanation.
TODO: have this also accept a pointer to a handler file rather than a direct
poseHandler.py file in each folder. This means we could point a folder to a generic handler 
inside our presets folder rather than having the code logic in each folder.
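As a hedged sketch of calling this (the import matches the r9Pose alias used in the examples further down; the folder path is hypothetical):
>>> from Red9.core import Red9_PoseSaver as r9Pose
>>> 
>>> # check a pose folder for an overloading poseHandler.py file
>>> handler = r9Pose.getFolderPoseHandler('D:/poses/myCharacter')
>>> print(handler)  # the handler filename if one was found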
- 
class 
DataMap(filterSettings=None, *args, **kws) 
Bases: object
New base class for handling data storage and reloading with intelligence
The idea of the DataMap is to make the node handling part of any system generic.
This allows us to use this baseClass to build up things like poseSavers, where all
we have to worry about is the data save / extraction part; all the node handling
and file handling is already done by this class ;)
Note that we’re not passing any data in terms of nodes here; we’ll deal with
those in the Save and Load calls.
- 
metaPose 
This flag adds in the additional MetaData block used by the matching code and info extraction.
True if self.metaRig is filled, self.settings.metaRig=True or self.metaPose=True.
- 
filepath 
- 
setMetaRig(node) 
Master call to bind and set the mRig for the DataMap
| Parameters: | node – node to set the mRig from or instance of an mRig object | 
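A minimal sketch, assuming the r9Pose alias from the earlier example and a hypothetical mRig node name:
>>> data = r9Pose.PoseData()
>>> # bind the mRig before building, saving or loading any data
>>> data.setMetaRig('MyRig:MetaRig')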
- 
hasFolderOverload() 
Modified so you can now prefix the poseHandler.py file, which
makes it easier to keep track of in a production environment.
- 
getNodesFromFolderConfig(rootNode, mode) 
If the poseFolder has a poseHandler.py file, use that to
return the nodes to use for the pose instead.
| Parameters: | 
- rootNode – rootNode passed to the poseHandler’s 
poseGetNodesLoad or poseGetNodesSave functions
 
- mode – ‘save’ or ‘load’
 
 
 | 
- 
getNodes(nodes) 
get the nodes to process
This is designed to allow for specific hooks to be used from user
code stored in the pose folder itself.
| Parameters: | nodes – nodes passed to the filter calls | 
Note
Update: Aug 2016
Because this calls FilterNode to get the data when useFilter=True, it
allows any mRig with internal filterSettings bound to it to dynamically
modify the settings on the fly to suit. This is a big update in handling,
allowing ProPack to integrate without having to massively restructure.
 
- 
getSkippedAttrs(rootNode=None) 
the returned list of attrs from this function will be
COMPLETELY ignored by the pose system. They will not be saved
or loaded.
Note
Currently only supported under MetaRig
 
- 
getMaintainedAttrs(nodesToLoad, parentSpaceAttrs) 
Attrs returned here will be cached prior to pose load, then restored intact afterwards.
| Parameters: | 
- nodesToLoad – nodes that the pose is about to load the data to; 
this is the already processed nodeList
 
- parentSpaceAttrs – attributes we want to be ignored by the load system
 
 
 | 
- 
buildBlocks_fill(nodes=None) 
To Be Overloaded: the capture routines to run in order to build the DataMap up.
Note that self._buildBlock_poseDict(nodes) calls self._collectNodeData per node
as a way of gathering the info to be stored against each node.
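As a hedged sketch of the overload pattern (the subclass name is hypothetical; _buildBlock_poseDict is the capture routine mentioned above):
>>> class MyDataMap(r9Pose.DataMap):
...     def buildBlocks_fill(self, nodes=None):
...         # run only the standard per-node capture routine, which in
...         # turn calls self._collectNodeData for each node
...         self._buildBlock_poseDict(nodes)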
- 
buildDataMap(nodes) 
build the internal dataMap dict, useful as a separate func so it
can be used in the PoseCompare class easily. This is the main internal call
for managing the actual pose data for save
Note
this replaces the original pose call self.buildInternalPoseData()
 
- 
processPoseFile(nodes) 
Pre-loader function that processes all the nodes and data prior to
actually calling the load. Why? This is for the poseMixer, for speed.
This reads the file, matches the nodes to the internal file data and fills
up the self.matchedPairs data [(src, dest), (src, dest)...]
Note
this replaced the original call self._poseLoad_buildcache()
 
- 
matchInternalPoseObjects(nodes=None, fromFilter=True) 
This is a throw-away and only used in the UI to select for debugging!
From a given poseFile, return or select the internally stored objects.
- 
saveData(nodes, filepath=None, useFilter=True, storeThumbnail=True, force=False) 
Generic entry point for the Data Save.
| Parameters: | 
- nodes – nodes to store the data against OR the rootNode if the 
filter is active.
 
- filepath – posefile to save - if not given the pose is cached on this 
class instance.
 
- useFilter – use the filterSettings or not.
 
- storeThumbnail – save a thumbnail or not
 
- force – force write the data even if the file is read-only
 
 
 | 
- 
loadData(nodes, filepath=None, useFilter=True, *args, **kws) 
Generic entry point for the Data Load.
| Parameters: | 
- nodes – if given load the data to only these. If given and filter=True 
this is the rootNode for the filter.
 
- filepath – file to load - if not given the pose is loaded from a 
cached instance on this class.
 
- useFilter – If the pose has an active Filter_Settings block and this 
is True then use the filter on the destination hierarchy.
 
 
 | 
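A hedged sketch of the cache round trip through these generic entry points, shown on the PoseData subclass documented below; omitting filepath caches the data on the instance as described above:
>>> import maya.cmds as cmds
>>> 
>>> data = r9Pose.PoseData()
>>> 
>>> # cache the data against this instance (no filepath given)
>>> data.saveData(cmds.ls(sl=True))
>>> 
>>> # ...pose the rig differently...
>>> 
>>> # restore the cached data back onto the same nodes
>>> data.loadData(cmds.ls(sl=True))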
- 
class 
PoseData(filterSettings=None, *args, **kws) 
Bases: Red9.core.Red9_PoseSaver.DataMap
The PoseData is stored per node inside an internal dict as follows:
>>> node = '|group|Rig|Body|TestCtr'
>>> poseDict['TestCtr'] 
>>> poseDict['TestCtr']['ID'] = 0   # index in the Hierarchy used to build the data up 
>>> poseDict['TestCtr']['longName'] = '|group|Rig|Body|TestCtr' 
>>> poseDict['TestCtr']['attrs']['translateX'] = 0.5 
>>> poseDict['TestCtr']['attrs']['translateY'] = 1.0 
>>> poseDict['TestCtr']['attrs']['translateZ'] = 22 
>>> 
>>> # if we're storing as MetaData we also include:
>>> poseDict['TestCtr']['metaData']['metaAttr'] = 'CTRL_L_Thing'    # the attr that wires this node to the MetaSubsystem
>>> poseDict['TestCtr']['metaData']['metaNodeID'] = 'L_Arm_System'  # the metaNode this node is wired to via the above attr
 
 
Matching of nodes against this dict is via either the nodeName, nodeIndex (ID) or
the metaData block.
New functionality allows you to use the main calls to cache a pose and reload it
from this class instance, which wraps things up nicely for you:
>>> pose=r9Pose.PoseData()
>>> pose.metaPose=True
>>>
>>> # cache the pose (just don't pass in a filePath)
>>> pose.poseSave(cmds.ls(sl=True))
>>> # reload the cache you just stored
>>> pose.poseLoad(cmds.ls(sl=True))
 
 
We can also load in a percentage of a pose by running PoseData._applyData(percent) as follows:
>>> pose=r9Pose.PoseData()
>>> pose.metaPose=True
>>> pose.filepath='C:/mypose.pose'
>>> 
>>> # processPoseFile does just that: it processes and builds up the list of data but doesn't apply it
>>> pose.processPoseFile(nodes='myRootNode')
>>> 
>>> # now we can dial in a percentage of the pose, we bind this to a floatSlider in the UI
>>> pose._applyData(percent=20)
 
 
Note
If the root node of the hierarchy passed into the poseSave() has a message attr 
‘exportSkeletonRoot’ or ‘animSkeletonRoot’ and that message is connected to a 
skeleton then the pose will also include an internal ‘skeleton’ pose, storing all 
child joints into a separate block in the poseFile that can be used by the 
PoseCompare class/function.
For metaData based rigs this calls a function on the metaRig class getSkeletonRoots()
which wraps the ‘exportSkeletonRoot’ attr, allowing you to overload this behaviour
in your own MetaRig subclasses.
 
I’m not passing any data in terms of nodes here; we’ll deal with
those in the PoseSave and PoseLoad calls. This leaves things open for
expansion.
- 
buildBlocks_fill(nodes=None) 
What capture routines to run in order to build the poseDict data
- 
poseSave(nodes, filepath=None, useFilter=True, storeThumbnail=True) 
Entry point for the generic PoseSave.
| Parameters: | 
- nodes – nodes to store the data against OR the rootNode if the 
filter is active.
 
- filepath – posefile to save - if not given the pose is cached on this 
class instance.
 
- useFilter – use the filterSettings or not.
 
- storeThumbnail – generate and store a thumbnail from the screen to go alongside the pose
 
 
 | 
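A hedged sketch of saving to disk rather than caching on the instance (the pose file path is hypothetical):
>>> pose = r9Pose.PoseData()
>>> pose.metaPose = True
>>> 
>>> # save the selected rig's pose to file, grabbing a thumbnail from the screen
>>> pose.poseSave(cmds.ls(sl=True), filepath='C:/poses/standA.pose',
...               useFilter=True, storeThumbnail=True)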
- 
poseLoad(nodes, filepath=None, useFilter=True, relativePose=False, relativeRots='projected', relativeTrans='projected', maintainSpaces=False, percent=None) 
Entry point for the generic PoseLoad.
| Parameters: | 
- nodes – if given load the data to only these. If given and filter=True 
this is the rootNode for the filter.
 
- filepath – posefile to load - if not given the pose is loaded from a 
cached instance on this class.
 
- useFilter – If the pose has an active Filter_Settings block and this 
is True then use the filter on the destination hierarchy.
 
- relativePose – kick in the posePointCloud to align the loaded pose 
relatively to the selected node.
 
- relativeRots – ‘projected’ or ‘absolute’ - how to calculate the offset.
 
- relativeTrans – ‘projected’ or ‘absolute’ - how to calculate the offset.
 
- maintainSpaces – this preserves any parentSwitching mismatches between 
the stored pose and the current rig settings; current spaces are maintained. 
This only checks those nodes in the snapList and only runs under relative mode.
 
- percent – percentage of the pose to apply, used by the poseBlender in the UIs
 
 
 | 
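A hedged sketch of a relative load using the arguments above (the pose file path is hypothetical):
>>> pose = r9Pose.PoseData()
>>> pose.metaPose = True
>>> 
>>> # load the pose relative to the current selection, projecting the offsets
>>> pose.poseLoad(cmds.ls(sl=True), filepath='C:/poses/standA.pose',
...               relativePose=True, relativeRots='projected',
...               relativeTrans='projected', maintainSpaces=False)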
- 
class 
PosePointCloud(nodes, filterSettings=None, meshes=[], *args, **kws) 
Bases: object
PosePointCloud is the technique inside the PoseSaver used to snap the pose into
relative space. It’s been added as a tool in its own right as it’s sometimes
useful to be able to shift poses in global space.
| Parameters: | 
- nodes – feed the nodes to process in as a list; if a filter is given
then these are the rootNodes for it
 
- filterSettings – pass in a filterSettings object to filter the given hierarchy
 
- meshes – this is really for reference; rather than making a locator, pass in a reference geo
which is then shapeSwapped for the PPC root node, giving great reference!
 
 
 | 
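A hedged sketch of one possible standalone workflow for shifting a pose in global space, using only the calls documented below:
>>> ppc = r9Pose.PosePointCloud(cmds.ls(sl=True))
>>> ppc.buildOffsetCloud()
>>> 
>>> # move / rotate the cloud's root transform to reposition the pose,
>>> # then push the Maya nodes back onto their respective cloud points
>>> ppc.snapNodestoPosePnts()
>>> 
>>> # remove the cloud when done
>>> ppc.delete()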
- 
syncdatafromCurrentInstance() 
pull existing data back from the metaNode
- 
getInputNodes() 
handler to build up the list of nodes to generate the cloud against.
This uses the filterSettings and the inputNodes variables to process the 
hierarchy and is designed for overloading if required.
- 
getPPCNodes() 
return a list of the PPC nodes
- 
getTargetNodes() 
return a list of the target controllers
- 
static 
getCurrentInstances() 
get any current PPC nodes in the scene
- 
snapPosePntstoNodes() 
snap each pntCloud point to its respective Maya node
- 
snapNodestoPosePnts() 
snap each MAYA node to its respective pntCloud point
- 
generateVisualReference() 
Generic call that’s used to overload the visual handling 
of the PPC in other instances such as the AnimationPPC
- 
buildOffsetCloud(rootReference=None, raw=False, projectedRots=False, projectedTrans=False) 
Build a point cloud up for each node in nodes
| Parameters: | 
- nodes – list of objects to be in the cloud
 
- rootReference – the node used for the initial pivot location
 
- raw – build the cloud but DON’T snap the nodes into place - an optimisation for the PoseLoad sequence
 
- projectedRots – project the rotates of the root node of the cloud
 
- projectedTrans – project the translates of the root node of the cloud
 
 
 | 
- 
shapeSwapMeshes(selectable=True) 
Swap the mesh Geo so it’s a shape under the PPC transform root
TODO: Make sure that the duplicate message link bug is covered!!
- 
applyPosePointCloud() 
- 
updatePosePointCloud() 
- 
delete() 
- 
deleteCurrentInstances() 
delete any current instances of PPC clouds
- 
class 
PoseCompare(currentPose, referencePose, angularTolerance=0.1, linearTolerance=0.01, compareDict='poseDict', filterMap=[], ignoreBlocks=[], ignoreStrings=[], ignoreAttrs=[]) 
Bases: object
This is aimed at comparing a rig’s current pose with a given one, be that a
pose file on disc, a pose class object, or even a poseObject against another.
It will compare either the main [poseData].keys or the [‘skeletonDict’].keys 
and, for each key, compare the [attrs] block with tolerance.
>>> #build an mPose object and fill the internal poseDict
>>> mPoseA=r9Pose.PoseData()
>>> mPoseA.metaPose=True
>>> mPoseA.buildDataMap(cmds.ls(sl=True))
>>> mPoseA.buildBlocks_fill()
>>> 
>>> mPoseB=r9Pose.PoseData()
>>> mPoseB.metaPose=True
>>> mPoseB.buildDataMap(cmds.ls(sl=True))
>>> mPoseB.buildBlocks_fill()
>>> 
>>> compare=r9Pose.PoseCompare(mPoseA,mPoseB)
>>> 
>>> #.... or ....
>>> compare=r9Pose.PoseCompare(mPoseA,'H:/Red9PoseTests/thisPose.pose')
>>> #.... or ....
>>> compare=r9Pose.PoseCompare('H:/Red9PoseTests/thisPose.pose','H:/Red9PoseTests/thatPose.pose')
>>> 
>>> compare.compare() #>> bool, True = same
>>> compare.fails['failedAttrs']
 
 
Make sure we have 2 PoseData objects to compare.
| Parameters: | 
- currentPose – either a PoseData object or a valid pose file
 
- referencePose – either a PoseData object or a valid pose file
 
- angularTolerance – the tolerance used to check rotate attr float values
 
- linearTolerance – the tolerance used to check all other float attrs
 
- compareDict – the internal main dict in the pose file to compare the data with
 
- filterMap – if given this is used as a high-level filter; only matching nodes get compared, 
others get skipped. Good for passing in a master core skeleton to test whilst ignoring extra nodes
 
- ignoreBlocks – allows the given failure blocks to be ignored. We mainly use this for [‘missingKeys’]
 
- ignoreStrings – allows you to pass in a list of strings; if any of the keys in the data contain 
that string it will be skipped. Note this is a partial match so you can pass in wildcard searches [‘_’, ‘_end’]
 
- ignoreAttrs – allows you to skip given attrs from the poseCompare calls
 
 
 | 
Note
In the new setup, if the skeletonRoot jnt is found we add a whole
new dict to serialize the current skeleton data to the pose. This means that
we can compare a pose on a rig via the internal skeleton transforms as well
as the actual rig controllers... it makes validation a lot more accurate for export.
- ‘poseDict’     = [poseData] main controller data
 
- ‘skeletonDict’ = [skeletonDict] block generated if exportSkeletonRoot is connected
 
- ‘infoDict’     = [info] block
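 
A hedged sketch of validating the skeleton block described above, reusing the mPoseA object and pose file path from the earlier example:
>>> # compare the rig's internal skeleton data against the stored pose,
>>> # ignoring any missingKeys failures
>>> compare = r9Pose.PoseCompare(mPoseA, 'H:/Red9PoseTests/thisPose.pose',
...                              compareDict='skeletonDict',
...                              ignoreBlocks=['missingKeys'])
>>> compare.compare()  # >> bool, True = same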
 
 
- 
compare() 
Compare the 2 PoseData objects via their internal [key][attrs] blocks and
return a bool. After processing, self.fails is a dict holding all the fails
for processing later if required.
- 
batchPatchPoses(posedir, config, poseroot, load=True, save=True, patchfunc=None, relativePose=False, relativeRots=False, relativeTrans=False) 
What’s this? A fast method to run through all the poses in a given directory and update
or patch them. If patchfunc isn’t given it’ll just run through and resave the poses, updating
the systems if needed. If it is given, it gets run between the load and save calls.
| Parameters: | 
- posedir – directory of poses to process
 
- config – hierarchy settings cfg used to ID the nodes (hierarchy tab preset = filterSettings object)
 
- poseroot – root node for the filters - poseTab rootNode / MetaRig root
 
- load – should the batch load the pose
 
- save – should the batch resave the pose
 
- patchfunc – optional function to run between the load and save call in processing, great for 
fixing issues en masse with poses. Note we now pass the pose file back into this func as an arg
 
 
 |
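A hedged sketch of a batch run with a patch function, assuming batchPatchPoses is called at module level; the paths, cfg name and patch function are hypothetical, and per the note above the pose file is passed back into patchfunc:
>>> def myPatch(posefile):
...     # runs between the load and the save for every pose in the folder
...     print('patching: %s' % posefile)
>>> 
>>> r9Pose.batchPatchPoses('D:/poses', 'myRig_hierarchy.cfg', 'MyRig:World_Ctrl',
...                        load=True, save=True, patchfunc=myPatch)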