There’s a more detailed description of what I do here.
Basically, I have a server that updates on every GitHub check-in, runs the unit tests, and then compiles all of the .py files in my toolset to .pyc inside a zip file. userSetup.py grabs the zip if the local version is out of date – we currently keep it on a network share, but I'm moving that to a web server for more control and security in the future. I'm very fond of this system because it guarantees the state on users' machines is monolithic: there is no chance of leftover files or stale .pycs messing things up.
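As a rough illustration, the build-and-fetch loop above could look something like this. This is just a sketch, not our actual code: the paths, the `build_zip`/`update_toolset` names, and the timestamp-based staleness check are all assumptions I'm making for the example (a real deployment would likely compare version numbers instead).

```python
import os
import py_compile
import shutil
import sys
import zipfile


def build_zip(src_dir, out_zip):
    """Server side (sketch): compile every .py under src_dir to .pyc
    and pack the compiled files into a single zip."""
    with zipfile.ZipFile(out_zip, "w") as z:
        for root, _, files in os.walk(src_dir):
            for name in files:
                if name.endswith(".py"):
                    src = os.path.join(root, name)
                    pyc = src + "c"
                    py_compile.compile(src, cfile=pyc)
                    z.write(pyc, os.path.relpath(pyc, src_dir))


def local_is_stale(share_zip, local_zip):
    """True if the local copy is missing or older than the shared one.
    (Timestamp comparison is an assumption; versions would be more robust.)"""
    if not os.path.exists(local_zip):
        return True
    return os.path.getmtime(share_zip) > os.path.getmtime(local_zip)


def update_toolset(share_zip, local_zip):
    """Client side (what userSetup.py would do): copy the zip down
    when stale, then put it on sys.path so the tools import from it."""
    if local_is_stale(share_zip, local_zip):
        shutil.copy2(share_zip, local_zip)
    if local_zip not in sys.path:
        sys.path.insert(0, local_zip)
```

The nice property this models is the monolithic state: everything imports from one zip on `sys.path`, so there is no way for an orphaned .pyc on disk to shadow a newer module.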
I’ve been super happy with aggressive unit testing to make sure that bad things don’t go out into the wild, and git is great because the guys writing tools can still have local version control without triggering builds just because they want some version history on a file they’re working on. The server only watches the remote repo, so I can commit dozens of changes locally and run them myself, but only trigger a build when I push all of those commits up to GitHub.
The single zip file is also outsource friendly: just drop our userSetup and zip file into the scripts folder and you have the entire tool set, and when you’re done just delete them and you have vanilla Maya again.
Previous TAO discussions here: