Hi there. I’m hoping to launch a headless Maya instance via a Python subprocess and feed tasks to it in the same way you would dispatch jobs to a worker thread, with the worker listening for new jobs once it becomes idle. I’m looking for the best way to go about this. Any suggestions would be greatly appreciated, thanks!
I think the most straightforward way to do that would be to use the mayapy.exe that ships with Maya, and the standard Python multiprocessing library. You’d probably want to set up a multiprocessing.Pool that uses an initializer which sets up Maya by running maya.standalone.initialize().
https://knowledge.autodesk.com/support/maya/learn-explore/caas/CloudHelp/cloudhelp/2016/ENU/Maya/files/GUID-83799297-C629-48A8-BCE4-061D3F275215-htm.html
https://docs.python.org/2.7/library/multiprocessing.html#multiprocessing.pool.multiprocessing.Pool
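To show the shape of that pattern without needing mayapy, here’s a minimal sketch: the initializer runs once per pool process before any tasks, so anything it sets up (in the real thing, the Maya standalone session) is visible to every task that process later handles. The `_initialized` flag here is just a stand-in for the Maya startup.

```python
from multiprocessing import Pool

_initialized = False


def initializer():
    # Runs once per pool process, before any tasks are dispatched to it.
    # In mayapy this is where you'd call:
    #   import maya.standalone
    #   maya.standalone.initialize(name="python")
    global _initialized
    _initialized = True


def worker(x):
    # Every task runs inside a process the initializer already set up.
    return x * x, _initialized


if __name__ == '__main__':
    pool = Pool(2, initializer)
    print(pool.map(worker, [1, 2, 3]))  # [(1, True), (4, True), (9, True)]
    pool.close()
    pool.join()
```

The same skeleton works under mayapy: only the initializer body changes, and the worker imports maya.cmds and does scene work instead of arithmetic.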
Thanks tfox_TD! I’ll look into it.
@tfox_TD Worked like a charm, thanks for your help!
Here’s an example snippet I got running in case anyone else is curious. Pass a queue to the worker.
# Python 2.7
from multiprocessing import Process, JoinableQueue
from functools import partial
import os


def worker(q):
    """
    :type q: JoinableQueue
    """
    # Process ID
    pid = os.getpid()

    # Initialize Maya
    import maya.standalone
    maya.standalone.initialize(name="python")

    # Keep pulling tasks until the queue is empty.
    while True:
        if q.empty():
            print("\nQueue is empty, I'm out! PID: {}\n".format(pid))
            return

        # Keep trying until we get an item.
        item = q.get()
        if not item:
            continue

        try:
            print("\nPid {} working on item {}\n".format(pid, item.id))
            item.run()
        except Exception as ex:
            print("\nPid {} failed on task # {}\n exception: {}\n".format(pid, item.id, ex))
        finally:
            # Mark the item as done.
            q.task_done()


class Task(object):
    def __init__(self, task_id):
        self.id = task_id

    @staticmethod
    def run():
        # Example task: list the Maya cameras in the scene.
        import maya.cmds as cmds
        print(cmds.ls(type='camera'))


if __name__ == '__main__':
    my_queue = JoinableQueue()

    # Create 10 tasks.
    for task_id in range(10):
        my_queue.put(Task(task_id))

    # Create 3 processes to work on them.
    for i in range(3):
        p = Process(target=partial(worker, my_queue))
        p.start()

    my_queue.join()  # Wait until all tasks are done.
    print('\nAll tasks done!\n')
I HIGHLY recommend not dealing with Process and Queue objects if you can help it. It’s incredibly easy to make very subtle bugs. Pool will very likely do everything you need it to. It allows passing arguments and getting return values, all while taking care of all the complicated mutexing/joining/process management.
import os
import time
from multiprocessing import Pool


def initializer():
    # This is run once per pool process
    import maya.standalone
    maya.standalone.initialize(name="python")


def worker(allArgs):
    import maya.cmds as cmds
    # TODO: This is *NOT* necessarily a new maya file,
    # so make sure to clean up after yourself
    arg1, arg2 = allArgs
    cmds.createNode('transform', name=arg1)
    time.sleep(1)
    tfms = cmds.ls(type='transform')
    return arg2, tfms, os.getpid()


if __name__ == '__main__':
    pool = Pool(3, initializer)
    argses = [
        ("test1", "tfox"), ("test2", "Narwhal"),
        ("test3", "bob.w"), ("test4", "theodox"),
    ]
    print("Starting tasks")
    for user, tfms, pid in pool.imap_unordered(worker, argses):
        print(user, tfms, pid)
    print('\nAll tasks Done!\n')
Note: I didn’t clean up after myself like I say to do in the # TODO. That means if you run this, you will see that one of the workers prints out a transform that was created by a previous task in the same process.
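If you want a guaranteed-fresh scene per task, one option (beyond opening a new file at the top of the worker) is Pool’s maxtasksperchild argument, which recycles each process after a set number of tasks, at the cost of re-running the initializer (and so the Maya startup) each time. A sketch without Maya, just to show the recycling:

```python
import os
from multiprocessing import Pool


def worker(_):
    # Report which process handled this task.
    return os.getpid()


if __name__ == '__main__':
    # maxtasksperchild=1: each pool process is thrown away after a
    # single task, so no state can leak between tasks. The trade-off
    # is paying the initializer (i.e. Maya startup) cost per task.
    pool = Pool(2, maxtasksperchild=1)
    pids = pool.map(worker, range(4))
    print(len(pids))  # 4 tasks completed
    # In practice the four PIDs are all distinct, because every
    # child process exits after its single task.
    pool.close()
    pool.join()
```

With a Maya worker this is the heavyweight option; for cheap cleanup, starting the worker from a fresh file is usually enough.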
Oh that’s way cleaner, thanks again!