Hey all,
I've been thinking a lot recently about USD workflows and how they could be best used across the whole VFX pipeline.
The idealized scenario I'd like to see, given the vast lead SideFX has with USD implementation, is Solaris being a hub for every department in a VFX company and even used as a real-time tool.
My hope would also be that other software developers and companies would embrace this to the point where hydra render delegates could be made for things like the Unreal Engine or even software like Notch/EmberGen that could run inside of Solaris.
Unless I'm fundamentally misunderstanding how the technology works (and PLEASE let me know if that is the case), in Solaris non-USD data (like that directly imported from SOPs) either gets converted live to USD prim data OR is left in its "native" format, which is then up to the hydra render delegate to interpret.
If that is correct it opens up a massive opportunity for Solaris to revolutionize workflows and give artists more options than ever before while drastically speeding up existing ones.
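To make the two paths I'm describing concrete, here's a toy stdlib-only Python sketch. Everything in it (the names, the `ToyDelegate` class, the `sop_geometry` kind) is made up for illustration; it is NOT the real Hydra or pxr API, just the shape of the idea: data a delegate understands natively passes straight through, everything else is converted to USD prim data first.

```python
# Toy sketch of the two data paths: convert to USD prim data up front,
# or hand "native" data straight to the render delegate to interpret.
# All names here are hypothetical, not the real Hydra API.

def to_usd_prim(item):
    """Convert a native scene item into a (very simplified) USD-like prim dict."""
    return {"path": "/" + item["name"], "type": item["kind"], "attrs": item.get("attrs", {})}

class ToyDelegate:
    """A toy 'render delegate' that understands one native format directly."""
    native_kinds = {"sop_geometry"}  # kinds it can consume without conversion

    def ingest(self, item):
        if item["kind"] in self.native_kinds:
            return ("native", item)           # pass through untouched
        return ("usd", to_usd_prim(item))     # convert to USD prim data first

delegate = ToyDelegate()
route, data = delegate.ingest({"name": "fluidSim", "kind": "sop_geometry"})
print(route)  # "native": the delegate interprets this data itself
```

The interesting part is that the hub (Solaris, in my dream) never needs to understand the native format itself; it just needs to route it to a delegate that does.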
As I'll be referring to this a lot, I'm going to abbreviate "hydra render delegate" as HRD.
I'm imagining a workflow that could look something like this:
- models get authored in whatever application works best here and exported as USD files
- artists can then start viewing those in Solaris with a GPU-based HRD
- layout can be done directly in Solaris, or even in Unreal Engine for example; using USD in/out keeps everything very clean and easy
- the layout USD scenes (regardless of where they are authored) then get viewed through Solaris, ideally again with a GPU HRD for game-like real-time quality
- animation is an interesting one; ideally it would be great to get animators into Houdini, but for now that is still mainly Maya's domain
- some companies have already shown you can replace Maya's viewport with an HRD, so is it too much of a stretch to imagine a Maya and Houdini session running side-by-side, with Solaris being used as the sole viewport via a handy "Maya Import" LOP?
- an HRD, either one already developed as a Viewport 2.0 replacement or a new one, could then be selected in Solaris to render this data; no need for any slow data conversion into Houdini, just pass it through!
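The glue holding that whole list together is USD composition: each department contributes a layer, and stronger layers override weaker ones. Here's a rough stdlib-only sketch of that "strongest opinion wins" rule. It's a deliberate simplification of real USD sublayering (no real composition arcs, just dicts), and all the paths and attribute names are invented:

```python
# Toy model of USD sublayer composition: layers are ordered strongest-first,
# and for each attribute the strongest layer with an opinion wins.
# This is a simplification for illustration, not the pxr API.

def compose(layers):
    """layers: list of {prim_path: {attr: value}} dicts, strongest first."""
    composed = {}
    for layer in reversed(layers):            # apply weakest first...
        for path, attrs in layer.items():
            composed.setdefault(path, {}).update(attrs)  # ...so stronger layers override
    return composed

modeling = {"/set/chair": {"points": "chair_v3", "color": "gray"}}
layout   = {"/set/chair": {"xform": (4.0, 0.0, -2.5)}}
anim     = {"/set/chair": {"color": "red"}}   # a stronger layer overrides the color

stage = compose([anim, layout, modeling])     # strongest to weakest
print(stage["/set/chair"]["color"])  # red
```

This is why it doesn't matter where each layer was authored (Maya, Unreal, Houdini): as long as each tool reads and writes USD, the composed result is the same.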
I could keep expanding that list, but I think most people should see now where I'm going and how this benefits everyone.
I outlined several separate HRDs above and the ability to switch between them for different tasks. While that alone is great, what would be even better is compositing them in real time in a single viewport: a "compositing" HRD! I briefly asked about this at the LA SIGGRAPH launch and it sounded do-able in principle.
So you have your layout environment done; it's being rendered in Solaris through something like a game engine and looks amazing. On top of that you have your Maya viewport HRD with your animation rig (straight A-over-B with an alpha mask). Now your animators can animate their low-poly rig in real time, but in the context of the beautiful environment. Perhaps some trickery could even be done here to get shadows working.
That idea could be expanded in a lot of ways, with many different layers all being combined in a single live view. Beyond basic layering, running the other COP operations over each layer would be cool too, but that's for extra bonus credit!
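The "straight A-over-B with an alpha mask" layering I described is essentially the classic per-pixel over operation. A minimal sketch on premultiplied-alpha RGBA values (pure Python, no image I/O; the pixel values are just illustrative):

```python
# Classic "A over B" compositing on premultiplied-alpha RGBA tuples:
# out = top + bottom * (1 - top_alpha). This is the core op a hypothetical
# "compositing HRD" would run per pixel over each delegate's output buffer.

def over(top, bottom):
    r1, g1, b1, a1 = top
    r2, g2, b2, a2 = bottom
    k = 1.0 - a1  # how much of the bottom layer shows through
    return (r1 + r2 * k, g1 + g2 * k, b1 + b2 * k, a1 + a2 * k)

# Maya animation layer (half-transparent red) over the environment (opaque blue)
anim_px = (0.5, 0.0, 0.0, 0.5)   # premultiplied alpha
env_px  = (0.0, 0.0, 1.0, 1.0)
print(over(anim_px, env_px))  # (0.5, 0.0, 0.5, 1.0)
```

Running more interesting COP-style operations per layer would just mean inserting extra per-pixel functions before the over step.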
Later, all that data can be cooked out accordingly and rendered properly with an offline production renderer, but in the meantime you have the best of all worlds being displayed contextually together.
It would be even more amazing if tools like Notch, EmberGen, Unity, Unreal Engine, TouchDesigner, etc. could be incorporated as HRDs through Solaris the same way I described the Maya workflow. Obviously that is entirely up to those developers, but it's the dream.
PDG is also a very powerful tool in this scenario that could make managing what I've described a lot easier; I'm specifically thinking of when it's time to take all this live setup and start baking things out for final render.
USD is an amazing technology, and what SideFX has done so far with Solaris is incredible. I'd really like to hear other people's thoughts on what I've outlined as a potential step to take it further!