I am using Python in Maya 2012 (64-bit) to iterate over large amounts of data, and Maya appears to be just crapping out at some point during the process.
Where and when Maya hangs seems to be inconsistent, and when I look at where Maya crashes and perform the operations on just that data, everything works OK, so it doesn't (superficially at least) appear to be bad content causing the hang-up.
Anyone got any advice on how to better manage resources, memory, etc. on very large data sets?
If you use Python, watch out for references hanging around taking up memory. Python + Qt in particular can lead to massive leaks if you don't know what you're doing (looking at you, QImage and QPixmap!).
Look at the places in your code where you create objects that allocate memory - sometimes re-using a single object leads to less waste than making a new one inside a loop (especially when, for some reason, a reference to that thing keeps hanging around!).
Python garbage collection is a bit slow sometimes - I had a batch image processing tool (PyQt) where memory use would spike before the GC kicked in. In the worst case, use del to drop your references so the object can actually be collected - but be careful with that! (Some people consider it bad style, but it did the trick.)
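Roughly the pattern, anyway (just a sketch - load_image and work_on stand in for whatever your loop actually allocates and does):

[CODE]
import gc

def batch_process(paths, load_image, work_on):
    for path in paths:
        img = load_image(path)   # placeholder: whatever allocates the big object
        work_on(img)             # placeholder: the actual per-item processing
        del img                  # drop our reference as soon as we're done with it
        gc.collect()             # force a collection so memory doesn't spike; costs some speed
[/CODE]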
Are you working on very large lists or big objects? Maybe you could switch to generators and use yield to work on smaller bits. That's more memory efficient than having a big wad of data all in memory at once, and it has the nice side effect of letting you bail at any time without leaving a big mess of data still in memory, though it might cost you more in setup and IO.
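For example, something like this (just a sketch - process_object stands in for whatever per-object work you're doing):

[CODE]
def chunks(items, size):
    """Yield successive slices of items so only one slice is being worked on at a time."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def run_in_batches(all_objects, process_object, size=100):
    for batch in chunks(all_objects, size):
        for obj in batch:
            process_object(obj)   # your per-object operations go here
[/CODE]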
And yes, del might be your friend if you have large objects with lots of other objects nested inside them.
If you're doing something really processing intensive, maybe it's a good candidate for compiled code instead of Python.
Thanks for the advice. Here's a little more information about what I am doing:
I have a Maya scene which contains over 2000 individual objects, and for each object I am running a series of operations such as Planar Projection, Unfold, and Layout.
I’m also iterating over the meshes collecting information about UV Shells, enabling and disabling selection constraints, converting components into other components, etc.
I'll try turning off the undo queue and see how much that helps. I've been considering how to break the single list of 2000 objects into smaller chunks, but I am not sure how to do that from a single click. Generators and yield sound intriguing; I'll look into that.
[QUOTE=Theodox;15348]Are you working on very large lists or big objects?[/QUOTE]
I am working with a large list of objects both very big and very small.
Multiple serial operations on 2000 objects are very likely to be running into the undo stack. The generator option might help just because it encourages good scoping, but this sounds like Maya is your enemy more than Python.
Turn off undo before running your operation (remember to put the re-enable in a finally block so it always gets reset, even if the script cacks), something like the sketch below. While processing, try to run all your operations with ch=False so you don't generate more construction history as you go along. If you're doing some kind of complex analysis or data collection, dump it to disk immediately or print it to the listener or whatever instead of collecting it all in memory.
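A rough sketch of what I mean (the planar projection call is just standing in for whatever you run per object):

[CODE]
import maya.cmds as cmds

def run_uv_pass(objects):
    cmds.undoInfo(state=False)                 # undo queue off for the whole batch
    try:
        for obj in objects:
            # ch=False so the projection doesn't pile up construction history
            cmds.polyPlanarProjection(obj + '.f[*]', ch=False)
            # ... unfold / layout / data collection would go here ...
    finally:
        cmds.undoInfo(state=True)              # always restore undo, even if the script cacks
[/CODE]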
It turns out the source of my problem wasn't the size of the data set, but rather that Maya's Unfold command doesn't always generate valid UV coordinates, and it fails non-deterministically.
Under certain circumstances, Unfold produces coordinates with a value of "nan". When these coordinates are passed to the Layout command, Maya crashes.
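A rough way to catch the bad UVs before Layout runs (a sketch only; it queries every UV individually, so it's slow on dense meshes):

[CODE]
import math
import maya.cmds as cmds

def has_nan_uvs(mesh):
    """Return True if any UV coordinate on the mesh is NaN."""
    uv_count = cmds.polyEvaluate(mesh, uvcoord=True)
    for i in range(uv_count):
        u, v = cmds.polyEditUV('%s.map[%d]' % (mesh, i), query=True)
        if math.isnan(u) or math.isnan(v):
            return True
    return False
[/CODE]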