I’m curious how others handle the hierarchy of Python files as it pertains to Maya-dependent modules and stand-alone modules.
Back in the day before Python, our MEL scripts folder was fairly simple:
./tools/maya/scripts/<folders for various projects>
Introduce Python, and now we have an added:
./tools/maya/python/<folders for various projects>
And then we realized that Python modules which aren’t Maya-dependent should live outside the ‘maya’ folder, so we added:
./tools/python/<folders for various projects>
Keep in mind that this is a central tools area, independent of the actual project code and assets. How many of you have experience with organizing these folders and have any advice for what should go where? Let’s assume a blank slate: how would you organize assets, Maya-specific Python, stand-alone Python and MEL to be used across three or four separate ongoing projects (meaning some of the scripts might be shared across projects)?
For projects that share the same scripts and don’t need any customizing of existing scripts:
./global/tools/maya/scripts/*.mel
./global/tools/maya/python/*.py
./global/tools/python/*.py
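For what it’s worth, wiring a layout like that into Maya is mostly a matter of path setup. Here’s a minimal sketch of what a `userSetup.py` could do; the `TOOLS_ROOT` location and the priority ordering are assumptions, not a prescription:

```python
# Hypothetical userSetup.py sketch: put the shared tool folders on
# sys.path so both Maya-specific and stand-alone modules import.
import os
import sys

TOOLS_ROOT = "./global/tools"  # assumed central tools location


def tool_paths(root):
    """Return Python tool paths, Maya-specific modules first so they
    can shadow stand-alone ones if names ever collide."""
    return [
        os.path.join(root, "maya", "python"),  # Maya-dependent modules
        os.path.join(root, "python"),          # stand-alone modules
    ]


for p in tool_paths(TOOLS_ROOT):
    if p not in sys.path:
        sys.path.insert(0, p)

# MEL scripts are found via MAYA_SCRIPT_PATH rather than sys.path:
mel_dir = os.path.join(TOOLS_ROOT, "maya", "scripts")
os.environ["MAYA_SCRIPT_PATH"] = os.pathsep.join(
    [mel_dir, os.environ.get("MAYA_SCRIPT_PATH", "")]
).rstrip(os.pathsep)
```

The nice part of keeping this in one bootstrap file is that the folder hierarchy can change without touching any tool code.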
Our Perforce depot is divided up into ‘projects’ (roughly, products) and each project is divided into branches. Branches come and go - some exist for short duration (say, a prototype team working on a new feature) and some hang around forever in the ‘main’ branch. At the current stage of the codebase I fork the code with a perforce integration at the start of a new project, since we have to support multiple projects at the same time.
The source code looks like this:
//depot/project/branch/tools/python for common code
//depot/project/branch/tools/python/maya for maya specific code.
I avoid MEL like the plague it is - when I have to distribute it, it’s a resource file inside the python branch. I think I have 4 MEL files total in the toolset right now. Other resources – icons, binary files, prefs or settings – get included in a ‘resources’ directory in the python tree.
We build a zip file of .pyc files for each combination of project and branch; Maya knows which project/branch a user is in at a given moment and grabs the latest one on startup. Resources – MEL files, etc. – are branched and included in the zip files the same way.
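To make the startup side of that concrete, here’s a rough sketch of how such a bootstrap could pick the right zip and put it on `sys.path` (Python’s zipimport machinery handles importing .pyc files from inside a zip). The `PROJECT`/`BRANCH` environment variable names, the zip naming scheme, and `TOOLS_DIR` are all my assumptions, not the actual setup described above:

```python
# Hypothetical startup sketch: choose the tools zip for the user's
# current project/branch and make it importable via zipimport.
import os
import sys

TOOLS_DIR = "/tools/builds"  # assumed location of the built zips


def tools_zip(project, branch, tools_dir=TOOLS_DIR):
    """Compose the zip path for one project/branch combination."""
    return os.path.join(tools_dir, "%s_%s_tools.zip" % (project, branch))


def bootstrap(environ=os.environ):
    """Swap the current tools zip onto the front of sys.path."""
    project = environ.get("PROJECT", "main")
    branch = environ.get("BRANCH", "main")
    zip_path = tools_zip(project, branch)
    # Drop any previously loaded tools zip so stale modules can't leak:
    sys.path = [p for p in sys.path if not p.endswith("_tools.zip")]
    sys.path.insert(0, zip_path)
    return zip_path
```

Because the whole zip is swapped atomically, this also gives you the “no leftover files” property described below: switching branches is just different env vars plus a Maya restart.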
Good things:
We can support multiple concurrent setups easily: a special test exporter for the prototype group over here doesn’t affect the regular exporter over there. Users can switch back and forth by just changing an environment variable and restarting Maya.
No worries about deleting old stuff: the new zip file completely replaces the old so no problems with moved/renamed modules, leftover files on this machine but not that, and so on.
The project-to-project and branch-to-branch relationships are explicitly recorded in Perforce: if I make a change in this branch’s version of file X, I can see what the change is and decide whether I want to push it back up to the main line or not.
The downsides:
1) It’s Perforce, not Git or Mercurial or some other DVCS, so pushing changes from the branches back toward the main line is a bit tedious. I have to be pretty disciplined about pushing changes back early instead of letting them accumulate. I use a lot of unit tests to make it easier to do this safely – but if a branch hits a snag that requires a big refactor, I have to change the tests in the branch and then push the code and the tests back into the main line, which is a pain.
2) Doing the integrations demands a bit of care. It’s not bad at the scale I have to support (2 projects, maybe 5 branches total), but a big team with more than that (and more people doing commits) would be harder to manage.
Theodox, what do you do with code that is not specific to a project? Maybe just some general modeling or rigging tools? Are they cloned to each project, or do you have a centralized repository for those?
In my case I just clone them - again, I’m only supporting two products and a handful of branches in a single Perforce depot, and I don’t have a big team of people all working in parallel. I just use p4 integrate to push fixes back and forth between branches. I haven’t looked into the Perforce-Git integration, which might be a better way to handle the case where each code line is evolving in parallel and needs to be reconciled. Disk space is cheap, and it’s a lot easier to build toolsets out of a unified repository.
A key element here is that you need to practice very strong encapsulation so that you don’t break lots of stuff for the sake of a local change. It’s good practice, for example, to inherit from a class in the base libraries and tweak it for your project rather than changing the base class itself wherever you can - that makes it super clear what’s a project-specific change and what’s supposed to be global. And of course the usual stuff - like isolating constants and magic numbers into methods or classes that can be swapped out quietly, etc.
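The subclassing idea can be sketched in a few lines. All the names here are hypothetical, just to illustrate the pattern: the base class lives in the shared code line and never gets edited by a project; the project branch carries only the subclass, so the diff against the base class *is* the project-specific change:

```python
# Illustrative sketch of encapsulating project-specific changes
# behind a subclass instead of editing shared code. All class and
# path names are made up for the example.

class BaseExporter(object):
    """Lives in the shared code line; projects don't touch it."""
    EXPORT_SCALE = 1.0  # magic number isolated so projects can override

    def export_path(self, asset_name):
        return "/exports/%s.fbx" % asset_name

    def run(self, asset_name):
        # Stand-in for the real export work.
        return (self.export_path(asset_name), self.EXPORT_SCALE)


class ProtoExporter(BaseExporter):
    """Prototype-branch override: different scale and output folder.
    Everything not overridden falls through to the shared base."""
    EXPORT_SCALE = 0.01  # e.g. cm -> m for this project's engine

    def export_path(self, asset_name):
        return "/proto_exports/%s.fbx" % asset_name
```

When the prototype branch is pushed back toward main, only `ProtoExporter` needs review; `BaseExporter` is guaranteed untouched, which keeps the integrations described above much safer.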