A somewhat manual cut-and-paste of a Slack chat about using PDG beyond just Houdini
Pierre Augeard Today at 3:29 AM
Hello!
Does anyone have experience with PDG as a standalone pipeline tool? Are there any postmortems from professionals available out there? Thanks
+1, I’ve been curious but cautious about approaching this thing
I’ve been curious too but haven’t been able to find any answers
Ironically, because I’m building a node-based tool, everyone keeps saying “why aren’t you using PDG?”
And yet the same people can’t tell me how it’s been working, or how it embeds into other DCCs, etc…
it works great
Let me ask if we have any customer stories
The gist of it is that PDG is simply a command builder, so it just calls other DCCs’ command lines. For things like Maya we even have the ability to make a loop where you can open Maya once and issue a bunch of commands as a single little burst
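To illustrate the “command builder” idea, here is a rough Python sketch of the kind of per-item command PDG might wrap for a Maya batch task (this is not the actual PDG API; export_fbx.py and its flags are hypothetical stand-ins):
```python
# Not the PDG API, just the shape of the work: one command line per work item.
# export_fbx.py and its flags are hypothetical stand-ins for a real batch script.
import subprocess

scenes = ["charA.mb", "charB.mb", "propC.mb"]  # stand-ins for upstream work items

for scene in scenes:
    cmd = [
        "mayapy", "export_fbx.py",           # mayapy runs Maya headless
        "--scene", scene,
        "--out", scene.replace(".mb", ".fbx"),
    ]
    subprocess.run(cmd, check=True)          # a scheduler would normally farm these out
```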
does it integrate inside yet? Like if a user wanted to use it inside an open Maya session to automate tasks
I have full faith in it as an external scheduler. TOPs is already great
I’ve only used it standalone to run jobs. Never inside another DCC. I guess I’ve used Houdini Engine for that stuff. Albeit I think we just extended it as a farm manager more than anything.
I think the Unity Engine is the only one that supports it right now
PDG as a Pipeline Tool for Small Teams | Pavel Smirnov | SIGGRAPH 2019
This talk is about how a small team could leverage PDG with a mixed Maya-Houdini pipeline. We will examine a ROP-based pipeline Griot Groove has been using to get data in and out of Maya, and how the same goals could be easily achieved with PDG. Pavel Smirnov is a Houdini artist and educator based in Tokyo with twelve years of experience. He started his career in Moscow, Russia as a Renderman lighting TD working on commercials and feature films. After moving to Japan in 2011, he created and leads a team of Houdini artists, while developing pipeline tools at Griot Groove, a Tokyo-based VFX studio. In 2014-2017 he taught a Houdini course at Digital Hollywood Tokyo School.
This video is a pretty good overview
Open Firehawk: Hybrid Open Cloud Infrastructure & PDG | Andrew Graham | SIGGRAPH Asia 2019
Open Firehawk is an initiative by Andrew Graham to automate direct access to cloud infrastructure at lower cost and with more control than prior solutions were able to offer. It also seeks to resolve issues of vendor lock-in – it is open source, community focused, and seeks to provide as much render power to artists as efficiently as possible. Particularly for the Houdini user, it approaches heavy VFX scenarios by leveraging SideFX PDG to handle dependencies across multiple locations and avoids transport of heavy data unless necessary. Open Firehawk automates creation of VFX architecture with open source infrastructure as code. The design is fully automatable, and uses tools like Hashicorp Terraf…
This one touches on the cloud bits too
Thanks!
thanks @lkruel
Do you have any knowledge of workflow management platforms such as Apache Airflow?
Because the scheduler and cloud backend are integrated
I am curious about the overlap in scope between the two frameworks
Apache Airflow doesn’t really work inside a DCC or locally per user, right? Similar problems but different domain areas
no, not inside a DCC, that’s not my use case actually
Oh, could you go into what your use case is? Pipeline’s an overloaded term
hey folks, when this discussion wraps up it would be a great idea to use /discourse post
to copy it up to the website for archiving!
@dhruv the generic use case is simple: have a platform to define tasks and their dependencies (as a graph), and to execute them in the cloud
When the task is “render an image” we call it a render farm; the idea here is to cover not just that specific use case but any other, such as 3D asset conversion (HD → LD, FBX → glTF/USDZ), asset variants, scene conforming, assembly, quality checks, …
So I was aware of solutions such as OpenCue, which was designed as a render farm at first but can be tweaked to accept any kind of task, or Apache Airflow, which I think was designed to handle a graph of task dependencies for data science
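For comparison, that kind of graph in Airflow is just a Python DAG; a minimal sketch, assuming Airflow 2.x, with convert_asset.py and qc_asset.py as hypothetical conversion and QC scripts:
```python
# Minimal Airflow 2.x DAG sketch; convert_asset.py and qc_asset.py are
# hypothetical scripts standing in for real conversion / QC tools.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="asset_conversion",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,   # run on demand rather than on a timetable
    catchup=False,
) as dag:
    convert = BashOperator(
        task_id="fbx_to_gltf",
        bash_command="python convert_asset.py --src /assets/prop.fbx --fmt gltf",
    )
    qc = BashOperator(
        task_id="quality_check",
        bash_command="python qc_asset.py --src /assets/prop.gltf",
    )
    convert >> qc   # QC only runs once the conversion has succeeded
```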
So it’s sort of apples and oranges IMHO.
- PDG could work great for both local and farm
- Airflow and OpenCue only work on the farm
- Airflow and PDG provide UIs for managing the flow of tasks
- Only OpenCue AFAIK allows you to have full resource management etc., whereas Airflow and PDG would delegate it to the execution systems
I’m most familiar with OpenCue, since I worked at Sony and we used the in-house version (Cue3) for 5+ years. It’s pretty easy to use it as a generic task manager. It just doesn’t have a dependency graph view to it.
But renders and tasks are the same thing for it. It’s just an outline script that gets run on a node
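To make that concrete, a rough pyoutline sketch of a generic (non-render) submission; the conversion and QC scripts are hypothetical, and the layer-dependency call is from memory, so check it against the OpenCue docs:
```python
# Rough OpenCue pyoutline sketch; assumes an OpenCue deployment is reachable
# and that convert_asset.py / qc_asset.py exist on the render nodes.
import outline
import outline.cuerun
from outline.modules.shell import Shell

ol = outline.Outline("asset_conversion", shot="dev", show="pipeline", user="pipe")

convert = Shell("fbx_to_gltf", command=["python", "convert_asset.py", "--fmt", "gltf"])
qc = Shell("quality_check", command=["python", "qc_asset.py"])
qc.depend_on(convert)   # layer-on-layer dependency (call name from memory)

ol.add_layer(convert)
ol.add_layer(qc)

outline.cuerun.launch(ol, use_pycuerun=False)
```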
Yeah, so PDG still needs a scheduler; we ship HQueue and a Local Scheduler, but PDG essentially just builds graphs of command-line tasks
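And on the PDG side, a minimal hython sketch of building and cooking a tiny TOP graph with the shipped Local Scheduler; this assumes a recent Houdini build, and the Generic Generator parameter names are from memory, so verify them against the node before relying on it:
```python
# Minimal hython sketch; node type names should be right, but the Generic
# Generator parameter names below are assumptions worth double-checking.
import hou

# A fresh TOP network typically comes with a Local Scheduler wired in as
# its default scheduler, so no explicit scheduler hookup is done here.
topnet = hou.node("/obj").createNode("topnet", "pipeline_tasks")

# Generic Generator: emits N work items, each wrapping one command line.
gen = topnet.createNode("genericgenerator", "make_commands")
gen.parm("itemcount").set(4)                          # "Item Count" (assumed parm name)
gen.parm("itemcommand").set("echo hello from PDG")    # "Item Command" (assumed parm name)

# Cook the work items in-process; an HQueue scheduler node could be dropped
# in instead of the Local Scheduler to send the same commands to a farm.
gen.cookWorkItems(block=True)
```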
Seems like Airflow is both the task graph and the farm manager?
AFAIK the Airflow scheduler isn’t really a farm manager
I think you can just give it machines to work with but it’s not at the same level as OpenCue etc for example
yeah I think it’s more of a basic task scheduler
does seem like it has some scheduling capabilities?
but that was one of the pluses we saw in PDG; the earliest versions had us typing up graphs in Python, but the UI is the game-changing part of it