What are you guys doing for tool deployment? Do you set up a network share and just have Max load any scripts/plugins directly from the share? … Or would it be better to write a deployment tool, called on startup, that copies content to the artist’s machine? I’d guess that copying network plugins/scripts locally is the more robust way to do things, but I thought I’d throw the question out to all you code geniuses and see what flies.
Obviously the size of your studio/firm will dictate the complexity of your solution, but I’d definitely like to see how you all solve this age-old problem…
Personally I’m using the AeronSetup package that was posted on CGTalk many months ago by Aearon at Soulpix; it copies scripts from a network location to the local macroscripts dir, and builds a submenu in the main menu for each of the scripts copied.
We’ve had all sorts of hare-brained schemes over the years, but a couple of years ago we decided to bite the bullet and write an in-house tool to take care of it.
The tool runs when Windows starts and first updates itself. All the scripts/plugins live in a mapped folder in Perforce, so the tool sets up the Max paths directly in plugin.ini. It then gets latest on all our tools, the Max plugins/scripts and the shaders.
We have the scripts packaged in a couple of .mzps and let Max take care of copying the .mcrs.
The tool also allows everyone to add their own get-latest paths. For example, I have the game code, the scripts and tools source, and the game assets set up to update while I’m making a cuppa.
I’d definitely recommend building something like this in your studio. It makes pushing out updates so much easier, and we can make some quite complex Windows and toolchain setup changes automatically.
We wrote our own in-house tool that packages tools (well, anything really) and copies them locally. It was written on top of MSBuild.
For instance, our 3dsmax tools are a project that lives in Perforce. Whenever a tool in that project needs to be updated, we re-package it, and the package gets distributed to the network (or wherever the target archive is). The next time someone runs 3dsmax, or reboots their machine, our tool runs and updates their local copy of the project: in this case, our 3dsmax tools.
I’m glossing over many of the details that make this tool so flexible for us, but it has become so popular internally that we use it for all tools distribution. We’ve even used it for outsourcing, which has been a big plus.
Sadly, since there’s only 3 of us using the tools on a daily basis, it’s the old method of copy/paste. :D:
But I love the network script update idea. It would save me a hassle every time I tweak a tool. It’s quite sad that I didn’t think of it or know of it before!
We’ve a simple network share that’s synced by a Max startup script… yeah, no plugin updates… The advantage over a system-level script is that every time Max starts it has the latest of everything.
I’ve recently moved the syncing to an ini setup so I can control exactly what gets updated, and it’s much faster than comparing all the dates on the server against all the local dates: it’s just the ini vs. the local times, and if the ini’s date hasn’t changed, the update doesn’t run at all.
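That ini-vs-local date check is simple enough to sketch; here’s a rough Python version (the manifest/stamp file layout is my assumption, not the poster’s actual setup):

```python
import os

def needs_update(server_ini, local_stamp):
    """Return True only if the server-side manifest has changed since the
    last sync, so the (slow) per-file date comparison can be skipped."""
    server_time = os.path.getmtime(server_ini)
    try:
        with open(local_stamp) as f:
            last_seen = float(f.read().strip())
    except (OSError, ValueError):
        return True  # never synced before, or the stamp is unreadable
    return server_time != last_seen

def record_sync(server_ini, local_stamp):
    """After a successful sync, cache the manifest's timestamp locally."""
    with open(local_stamp, "w") as f:
        f.write(str(os.path.getmtime(server_ini)))
```

The point is the early-out: one stat call against the network share instead of comparing every server file date against every local one.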
I’ll have to look at AeronSetup, that’s cool to have dynamic menus that update when you add things!
Paired with this is a publishing / versioning script which updates the ini when you have something new to push out and auto versions it to a separate network share.
In a similar vein to the Aearon package, there is another script by Andrei Kletskov at http://andklv2.narod.ru/maxscripts/ak-maxscripts.html called SmartMenu that just does the menu creation, if you already have your scripts managed and mirrored.
I don’t really have a good solution for plugins… I simply keep a working installed set of plugins on the network, and have placed a batch file in each machine’s Startup folder with an “xcopy /s /y /D” to propagate any changes to them at boot-up.
Recently I implemented a deployment system, with the rollout of a new version of Max, that works successfully. The previous process required the artists to click a link to a .bat file that synced Perforce to the local machine and then copied files to the correct location. The problem I found was that the artists, both internal and external, were not running this script on a regular basis, and more than once time was spent trying to debug something that was simply out of date.
The new system means that every time they start Max they get the latest. Additionally, I set up discipline-specific loading of scripts and plugins, because several of the plugins the animators used were taking a “long” time to load. This is done through a single configuration file that is parsed at startup; based on the first 4 numbers and an environment variable, an entry is evaluated or not. Below is an example:
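(The original example didn’t survive in this archive; from the description, the file presumably looks something like the following guessed reconstruction, where the version prefix, discipline tags, env-var convention and paths are all hypothetical:)

```ini
; hypothetical format: <4-digit Max version> <discipline tag> <entry>
; an entry is evaluated only when the version matches the running Max
; and the tag matches the artist's discipline environment variable
2008 ALL  fileIn "Z:\maxtools\startup\common.ms"
2008 ANIM fileIn "Z:\maxtools\startup\animTools.ms"
2008 ENV  fileIn "Z:\maxtools\startup\envTools.ms"
```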
One obvious side effect of this implementation is that scripts and UI macros are not recursively sourced but have to be listed manually.
One other thing I changed from the previous startup environment was to stop copying files into the C:\Program Files\Autodesk\3ds Max 2008 directory once synced. They are synced to Z: and sourced from there as well. One handy thing I ran across was the use of an [Include] section in plugin.ini.
This allows individual users to have their own plugins listed in the .ini file, but lets the TAs modify the BioWare-specific includes to add new plugin/script directories that can be recursively sourced.
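For reference, the plugin.ini include mechanism looks roughly like this (from memory; check the exact syntax against your Max version’s docs, and the paths here are placeholders):

```ini
[Include]
StudioPlugins=Z:\maxtools\studio_plugin.ini

[Directories]
My Personal Plugins=C:\myplugins
```

The user’s own [Directories] entries stay untouched, while the included studio .ini can be edited centrally to add new plugin/script directories.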
It’s a beautiful thing to only have to restart Max to get the latest. Now if only I could make it push dynamically…
Thanks for replying, David. I just want to chime in and say, I love the script deployment system we have. David wrote it, so I asked him to explain it, because I still don’t really understand the nitty-gritty (there is a lot of DOS scripting going on in the initial launch BAT). BTW, we got the idea from one of the Tech Art Roundtables at GDC (the second, I think); someone mentioned having the artists run Max via another script file instead, one that synced the latest tools, THEN launched Max.
Dynamically pushing would be nice, but it would require taking control of the artist’s Max and evaluating any new macros. It would be cool to have some sort of ‘listener’ program that would allow us to update an artist’s Perforce, though… but not for me; the temptation to abuse that power would be too great!
My super-low-tech solution: I have a batch file which syncs the scripts/plugins from Perforce, copies them to the correct locations and then loads the program.
Basically, I replace the user’s desktop shortcut to Maya/ZBrush/Photoshop/etc. with a link to a batch file (I then use the original icon, so they really can’t tell the difference). It does a quick sync & copy in the background, then calls the software. It’s great, because if we make a change to any script, I just tell them to “reload Maya” and it works.
There’s all sorts of batch file trickery going on, like asking Perforce where it’s copied the files to, pinging the registry to see where Maya is installed, using the environment variables to see where the user’s prefs are stored, etc.
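Stripped of the batch trickery, the wrapper pattern is just “sync, then hand off”; a minimal Python sketch (the command lines are placeholders, and the runner parameter exists only so the sequencing can be exercised without p4 or Maya installed):

```python
import subprocess

def sync_and_launch(sync_cmd, launch_cmd, runner=subprocess.run):
    """Pull the latest tools, then start the real application.
    A sync failure is deliberately non-fatal: a stale toolset is
    better than an artist who can't open Maya at all."""
    try:
        runner(sync_cmd)
    except Exception:
        pass  # log this in a real setup; don't block the launch
    runner(launch_cmd)
```

The ordering is the whole trick: because the sync always runs before the app starts, “reload Maya” doubles as “get the latest tools”.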
The only downside is that it requires a bit of manual setup at the beginning: a sync of the tools to get the batch file, and replacing the icons. Also, updating the batch file itself is tricky, because it gets messed up mid-run: cmd reads batch files by byte offset, so when the running batch syncs a new copy over itself, execution continues from whatever offset it was at, which can break the flow.
It’s handy though, and I used to do fun things like change the Maya splash screen every week or so with some funny image or comic.
We also use a low-tech solution of just having a bat file that syncs the tools from Perforce and xcopies the files from the P4 root to c:\3dsmax9 (or wherever). Problems that come with that approach:
Max seems to keep the files locked for a “while” after it has been shut down, so people have learnt to wait a few seconds after quitting Max before they run the update bat file.
Deleting files from P4 is ugly: now I need to manually add their deletion to the update bat file.
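The post-shutdown lock window can also be absorbed by the updater itself with a short retry loop, instead of asking people to count seconds (Python sketch; the attempt count and delay are arbitrary):

```python
import shutil
import time

def copy_with_retry(src, dst, attempts=10, delay=1.0):
    """Retry the copy while Max still holds the destination file locked.
    Returns True once the copy succeeds, False if it never does."""
    for _ in range(attempts):
        try:
            shutil.copy2(src, dst)
            return True
        except PermissionError:
            time.sleep(delay)  # Max hasn't released the file yet
    return False
```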
I’ve sometimes thought of just setting up a separate P4 clientspec for the artists that deploys the scripts directly to their C:\ drive (like we have for the programmers/tech artists who develop the stuff), but having multiple clientspecs as a concept seemed a bit confusing to the artists. Sure, I could “automate” that by hiding it in a script that selects the correct clientspec and syncs those files… Hmm, perhaps I should just do that.
We have a deployment package for the Max installs so everyone has their copy of Max always at the same location…
One thing that is worth its weight in gold is a global version number for the Max tools. I think someone who’s also here (Jeff Hanna?) mentioned it to me a few years back.
There’s a g_rmdMaxScriptToolsVersion = 57 global variable in the stdinit scripts, and there is a separate file (T:\Software\Art_Tools\3dsmax\MaxScriptToolsVersion.ini) which is obviously always “live”.
Thus, when I want to “push” an update to all users, I just increment the version numbers, and all users get a MOTD nag-box at Max startup telling them they need to upgrade (with a short description of what was fixed/added, etc.).
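The check itself is tiny; in Python rather than MAXScript it amounts to something like this (the Version= key name is my assumption about the ini’s contents):

```python
def tools_out_of_date(local_version, version_ini_text):
    """Compare the version baked into the local stdinit scripts against
    the 'live' number read from the shared version ini file."""
    for line in version_ini_text.splitlines():
        if line.strip().lower().startswith("version"):
            live = int(line.split("=", 1)[1])
            return live > local_version
    return False  # no version line found; assume up to date
```

If it returns True, pop the MOTD dialog with the what’s-new notes.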
Works like a charm and artists don’t even complain about the setup…
Sami, that MOTD nag-box for updating on version change sounds pretty good. I was thinking of doing something similar for our Maya tools, just to notify the artists of an update automatically (instead of emailing it out and expecting them to read email / follow instructions), then automatically get the latest version from either SVN or a network location.
At BWE, we originally used rsync to cut down on the transfer time instead of xcopy. Ultimately, though, our maxscripts were integrated into the distribution system the rest of the toolset uses (a standalone app that performs rsync-like ops on a network share).
I’m probably one of the few programmers out there who hates dynamic pushing, though. Don’t interrupt a user’s workflow. If you really need to, make a resident tray app that alerts users of critical bugs and lets them shut down/update/restart manually. It’s rather trivial to do tray apps in C#.
Yes, I’ve had users go out of sync on me, but in those cases the first thing I always do is check what version their machine/output was at. It takes half a second and gives me a reason to sit down and talk to them. Firewalling myself behind auto-updaters is the last thing I want to do. There’s an advantage to having users in-house, and we might as well make the best of it.
I have integrated SVN into all our art applications and added hooks for loading our tools. I’m currently moving the system over to Perforce using the same concept, but now we can easily release/control our tools for third parties.
I also included a way to version-control application data, e.g. Maya scripts, so that I can fix any bugs/integration issues with the application. I wrote a standalone application that lets you customize project configurations and have staff move from one project to another without worrying about project-specific exporters, tools or game connections.
We have a program that launches a picker where you pick the project you want to work on; it then updates the correct version of Max for that project with the latest plugins and scripts (stored in a zip on a share) and launches that version of Max.
Wow, I was just starting to struggle with the best way to get these out, and then this thread popped up.
Currently we are using a batch file stored in a directory in Perforce; each artist runs it and it places the scripts in the plugins folder. Though I am starting to see a lot of issues from artists not updating all the time. The Max-startup load sounds like a much better solution; I’ll look more into that after crunch.
Awesome information here. It seems like a slick built-in tool deployment option isn’t Max’s strongest point (I love Maya’s Modules, they really work for me…).
Right now I’ve got a test setup using Perforce to sync a tool hierarchy. I can load my tools just fine, and it will be pretty simple to have a tool “switchboard” up so the animators don’t need to load modeler tools or vice versa.
I haven’t figured out one thing though - does anyone know a good way of having the macroscript icons work correctly without having to copy them to the user’s icons directory? I’d really like the icons to live (and die) in the client-view of the Perforce depot – I hate “co-mingling” the files of 3dsMax, 3rd party tools, and our own scripts and plugins.
Let me describe my setup. I have built a dynamic menu that can be updated easily.
For first-time setup you have to run a script from the server.
If you press update, a local script runs that removes everything installed (tracked in ini files), including the declared menu items and macroscripts. (Since a script cannot remove itself, I have the cleanup script spawn a callback that removes and updates the cleanup script.) After that it copies files from the server and builds a new menu structure from them.
I don’t make macroscripts anymore, since I cannot copy a macroscript and easily declare it to a menu, so right now all my scripts are regular scripts spawning dialogs or tools etc. The menu script makes a macroscript wrapper file that does a fileIn on each script. I have a single-line declaration at the top of each script determining in what category a menu item (the macro-wrapped script) is placed.
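Generating those wrappers is mechanical enough to script. A rough Python sketch of the idea (the “-- category:” declaration form and the .mcr template are assumptions about this setup, not the actual code):

```python
import os

# Template for a thin MAXScript macroscript that just fileIn's the real script.
MCR_TEMPLATE = '''macroScript {name} category:"{category}"
(
    fileIn @"{script_path}"
)
'''

def make_wrapper(script_path):
    """Read the single-line category declaration at the top of a script
    (assumed form: -- category: Rigging) and emit a macroscript wrapper."""
    with open(script_path) as f:
        first = f.readline()
    if "category:" in first:
        category = first.split("category:", 1)[1].strip()
    else:
        category = "Tools"  # fallback when no declaration is present
    name = os.path.splitext(os.path.basename(script_path))[0]
    return MCR_TEMPLATE.format(name=name, category=category,
                               script_path=script_path)
```

Because each generated .mcr is only a shim, updating the real script on the share takes effect without re-declaring any macros.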
The update function at the bottom is easy, and I can just tell people to press update and they’re up to date. And people do it by themselves too, but we are a relatively small shop. I’m going to add the ability to handle plugin scripts, script libs, etc. as well, but haven’t gotten around to it yet.
I’m pretty happy with the setup so far, because some people simply refuse to change shortcuts or assign macros to custom menus etc., and so miss the benefits of the tools.
“(I love Maya’s Modules - they really work for me…)” - ObsidianPC
How do you get the Maya Modules to use system environment variables? In fact, can you? I wanted to use the system env vars within the Modules but can’t find any syntax to do it.
We’re in Maya so…
I put a palette system together that contains all our internal tools; the palette takes care of all the syncing and makes sure all the required plugins/scripts are up to date, etc. It has functionality for switching between teams with different tool requirements as well, although this is not in use currently.
It’s easy to update and allows the artists to build their own shelves from the tear-off buttons for the tools they use a lot.
It’s been really handy and well worth the time investment.