SideFX: Solaris and Karma
https://www.sidefx.com/forum/82/
Updated: 2024-03-29T05:56:08+00:00

Render gallery can't save when using linux
2024-03-29T05:56:08+00:00 cuihaifu #418278: I can save the hip file, render images to the server, and cache USD or bgeo to the server, but I can't save the render gallery.
Render gallery can't save when using linux
2024-03-29T05:55:24+00:00 cuihaifu #418277: Linux: CentOS 8.5, Houdini 20.0.590; server: Windows Server.<br><br><img src="/forum/attachment/f885a4178aaa387de23bad768fb71f43d647288d/">
Opacity difference XPU vs CPU
2024-03-28T22:12:20+00:00 Ackiparait #418233: Hey folks!<br>A problem here is driving me a bit crazy. I am trying to get a proper leaf material for different kinds of trees, and as soon as I use a "thin" opacity map (for pine needles, for example), I get a very annoying issue...<br><br>Not only does the opacity not react exactly the same, but my leaves vanish when zooming out. This only happens when rendering in XPU; there is no problem in CPU.<br><br>CPU renders:<br><img src="/forum/attachment/427053f5612a23e67c7b3b00a2ae4d6a60d24e5f/"> <img src="/forum/attachment/bf057eb96317744f4cd549036f837bb484bf700a/"><br><br><br>XPU renders:<br><img src="/forum/attachment/8b13b91e14b5dfe3f6a4d55598bc02192eedafc6/"> <img src="/forum/attachment/28e6337595eac36c20c0154c0e7b6d16c4fe112d/"><br><br>My NVIDIA Studio drivers are up to date, Houdini 20.0.653.<br><br>I would suspect some kind of mipmapping issue, but I'm not sure.<br><br>-Coco
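[Editor's note] The mipmapping suspicion above can be illustrated with a toy model (plain Python; the map, the averaging, and the cutoff threshold are all made up for illustration and are not Karma XPU's actual texture pipeline): averaging a sparse "thin" opacity map down to coarser mip levels dilutes the alpha, so a renderer that thresholds the filtered value can lose thin geometry at distance.

```python
# Toy model of the mipmapping hypothesis: averaging a sparse binary
# opacity map down to coarser mip levels dilutes the alpha, so a
# renderer that applies an opacity cutoff to the filtered value can
# lose thin geometry. Purely illustrative, not Karma XPU's behavior.

def downsample(tex):
    """Average each 2x2 block into one texel (one mip level down)."""
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

# 4x4 opacity map: thin, needle-like strands (mostly transparent).
mip0 = [
    [1, 0, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 0, 0],
]

mip1 = downsample(mip0)   # 2x2, every texel averages to 0.25 or 0.0
mip2 = downsample(mip1)   # 1x1, the whole map averages to 0.1875

cutoff = 0.5  # hypothetical opacity cutoff applied to filtered alpha
visible_mip0 = any(a >= cutoff for row in mip0 for a in row)
visible_mip2 = any(a >= cutoff for row in mip2 for a in row)

print(visible_mip0, mip2[0][0], visible_mip2)  # True 0.1875 False
```

At the finest mip the needles survive the cutoff, but once the map is averaged down far enough, nothing does, which matches the "leaves vanish when zooming out" symptom.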
Correct way to get instancer using variants
2024-03-28T18:47:36+00:00 mtucker #418215: <blockquote><em>am_wilkins</em><br>@mtucker: The goal is just to ensure that the incoming "prototypes" randomize between the variants, while keeping an intuitive/neat workflow. Because right now, spamming the same asset ref USD multiple times seems like a really strange workflow.</blockquote><br>I'm not sure what might be going wrong in your setup, but it's certainly possible to make things work as expected (see attached file). The "trickiest" part was realizing that the instancer LOP has to turn off the option to only import the prototype prims... Because the per-variant prims are references to another prim in a different part of the scene graph, you can't _just_ copy the per-variant prims onto the instancer stage.
Correct way to get instancer using variants
2024-03-28T15:35:48+00:00 mtucker #418196: <blockquote><em>flipsideza</em><br>I might be doing this wrong, but something I have been playing with is using a `for each` loop.</blockquote><br>The problem here is that you only have one "copy" of the asset in your scene. So in the for each loop, each iteration is setting the variant selection _on that same scene graph location_ to different values, so the last one wins. If you move the asset reference _inside_ the for each loop, and use the ITERATION context option to make a unique asset reference location for each iteration, then you will end up with multiple references to the asset, each with a different variant chosen.<br><br>If you take this approach, then you also don't want to feed the asset to the for each's first input, because then you'll end up with one too many copies of your asset in the scene (the stage connected to the first input is "passed through"). Instead, connect the asset reference to the _second_ input of the for each LOP, and iterate over variants on the second input. The stage connected to the second input is not made part of the output of the for each LOP.<br><br>I've attached a hip file that demonstrates this.
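[Editor's note] The "last one wins" behavior described above can be sketched with a toy stage model (plain Python, not the real USD API; the prim paths and variant names are invented for illustration): a variant selection is an opinion stored per prim path, so selecting variants repeatedly on the same path overwrites the previous choice, while a unique per-iteration path keeps one selection each.

```python
# Toy model of variant selection on a composed stage (NOT the USD API):
# a stage here is just a dict mapping prim path -> selected variant.

variants = ["treeA", "treeB", "treeC", "treeD"]

# Wrong: every loop iteration targets the same scene graph location,
# so each assignment overwrites the previous one - last one wins.
stage_wrong = {}
for v in variants:
    stage_wrong["/asset"] = v

# Right: an ITERATION-style unique reference location per iteration,
# so every variant selection survives on its own prim.
stage_right = {}
for i, v in enumerate(variants):
    stage_right[f"/asset_{i}"] = v

print(stage_wrong)        # {'/asset': 'treeD'}
print(len(stage_right))   # 4
```

The fix in the post is exactly the second pattern: move the reference inside the loop and key its location on the iteration index, so four distinct prims (each with its own variant) reach the instancer.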
Custom Husk Procedurals
2024-03-28T15:23:11+00:00 Ben Toogood Aardman #418193: Thanks Goldleaf - I'll take a look and see if we can bodge our way there.<br><br>If it helps for future ideas/plans, our specific use case is deforming costume pieces composed of many curves (woven cloth, knitted wool, fuzzy felt, etc.) where it's not feasible to cache them directly from the rig.<br><br>The 'Arbitrary Geometry' mode in the SOP-land GuideDeform does the job - it'd just be a fantastic streamliner to have it happen as a render-time procedural.<br><br>Many thanks<br>Ben
Correct way to get instancer using variants
2024-03-28T08:30:54+00:00 flipsideza #418153: I might be doing this wrong, but something I have been playing with is using a `for each` loop.<br>It has an option `For Each Variant in First Input`<br><img src="/forum/attachment/0a4e0b61505e0b40b59710244855ce6ede2ca0c3/"><br><img src="/forum/attachment/b6d59ec5ec46e88bb5b64bdc43f0a48765c86b05/"><br>and then on a `setvariant` you specify `@ITERATIONVALUE`.<br><br>But the outcome in the viewport only shows the last variant in the loop, and does not produce 4 unique prims that could then be sent to the instancer.<br><br>I also attempted to include a `restructure scene graph` node to output 4 unique prims on each loop, but that did not work either.
Auto assign material with shop_materialpath in vex
2024-03-28T07:35:57+00:00 dyts #418151: <blockquote><em>eikonoklastes</em><br>Something like this?<br><a href="https://i.imgur.com/nU6qpYf.mp4" rel="nofollow noopener">https://i.imgur.com/nU6qpYf.mp4</a> [<a href="http://i.imgur.com" rel="nofollow noopener">i.imgur.com</a>]<br><br>I've attached a higher quality video of the same thing.</blockquote><br>First, thanks for your help :)<br><br>That's the principle, but I wasn't precise enough. In fact, I have this network at /obj level (images missing).<br><br>My matnet (image missing):<br><br>Here the box is green.<br><br>In my LOP (images missing):<br><br>But my box doesn't bind to my material library material (images missing).
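[Editor's note] Since the screenshots are missing, here is one common cause of this symptom, offered only as a guess: the SOP-level `shop_materialpath` values point at /obj-level matnet paths, while the Material Library LOP publishes its materials under a different scene graph prefix, so the binding paths never match. A minimal sketch of the remapping idea (all paths are hypothetical examples, not values from the original scene):

```python
# Hypothetical sketch: remap SOP-style shop_materialpath values to the
# prim paths a material library might publish under. All paths are
# invented examples; this is not Houdini's actual import logic.

def remap_material_path(shop_path, lop_prefix="/materials"):
    """Keep the material's name, swap the /obj matnet prefix for the
    LOP-side material prefix."""
    name = shop_path.rstrip("/").split("/")[-1]
    return f"{lop_prefix}/{name}"

print(remap_material_path("/obj/matnet/green"))  # /materials/green
```

If the bindings on the geometry still say `/obj/matnet/green` while the materials live under `/materials`, the box renders unbound, which matches the behavior described.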
Auto assign material with shop_materialpath in vex
2024-03-28T06:19:28+00:00 eikonoklastes #418145: Something like this?<br><a href="https://i.imgur.com/nU6qpYf.mp4" rel="nofollow noopener">https://i.imgur.com/nU6qpYf.mp4</a> [<a href="http://i.imgur.com" rel="nofollow noopener">i.imgur.com</a>]<br><br>I've attached a higher quality video of the same thing.
timesampled hairprocedural motionblur
2024-03-28T01:03:22+00:00 goldleaf #418111: <blockquote><em>ronald_a</em><br>Is time-sampled motion blur supposed to work with the hair procedural in H20, or is it still just vector motion blur?</blockquote><br>Still just velocity or acceleration blur.
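[Editor's note] For context on the distinction: time-sampled blur stores the geometry at several sub-frame times, while velocity/acceleration blur extrapolates each point from a single sample using its `v` (and optionally acceleration) attributes via p(t) = p0 + v·t + ½·a·t². A minimal sketch of that extrapolation (the attribute meanings follow the usual Houdini convention; the shutter numbers are illustrative, and this is not Karma's implementation):

```python
# Sketch of velocity/acceleration motion blur: instead of reading the
# geometry at several sub-frame times, extrapolate each point from one
# sample with p(t) = p0 + v*t + 0.5*a*t^2. Illustrative only.

def blur_position(p0, v, a, t):
    """Extrapolated position at shutter offset t (seconds)."""
    return tuple(p + vi * t + 0.5 * ai * t * t
                 for p, vi, ai in zip(p0, v, a))

p0 = (0.0, 1.0, 0.0)    # point position at the frame sample
v = (2.0, 0.0, 0.0)     # velocity attribute, units per second
a = (0.0, -9.8, 0.0)    # acceleration attribute (e.g. gravity)

# Illustrative shutter offset: half a frame at 24 fps = 1/48 s.
p_close = blur_position(p0, v, a, 1.0 / 48.0)
print(p_close)
```

Because the whole sub-frame path is a single quadratic per point, this kind of blur cannot capture motion that changes shape within the shutter (e.g. a fast whip of deforming hair), which is what true time-sampled blur would buy you.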
Custom Husk Procedurals
2024-03-28T01:02:56+00:00 goldleaf #418110: <blockquote><em>Ben Toogood Aardman</em><br>@robp_sidefx - are you able to give a steer on whether it would be possible to enable the SOP GuideDeform 'Arbitrary Geometry' mode within the current Solaris Hair Procedural? We'd be up for customising the <a href="http://hair_graph.bgeo.sc" rel="nofollow noopener">hair_graph.bgeo.sc</a> (we've had success following the steps mentioned on the other thread to avoid the issue of moving roots to the skin), but only to the level of bypassing/enabling parts of the existing graph.<br>Does the existing embedded graph have the logic for enabling 'Arbitrary Geometry' in deformation mode and it's just not exposed at the user level, or is the graph a more stripped-back implementation?<br><br>If it's a no-go with hacking the existing <a href="http://hair_graph.bgeo.sc" rel="nofollow noopener">hair_graph.bgeo.sc</a>, would implementing that behaviour in the Hair Procedural be a quick and easy thing for SideFX to do, or is there a good reason why it's not there already?<br><br>Many thanks</blockquote><br>Well, it's not there mainly because it's the <em>Hair</em> Procedural ;)<br><br>It doesn't look like the embedded graph has removed that logic, so customizing the existing graph may work, though it's hard to tell whether it's a quick and easy thing or not. We have other ideas/plans in this space (nothing to announce/discuss yet, though), so it isn't likely that we'd change the hair procedural this way. But feel free to give it a try, and keep us posted if you wish!
Karma XPU background plate
2024-03-27T22:45:13+00:00 jsmack #418087: <blockquote><em>timjan</em><br>Wonderful news! Does that mean the next production build? Thanks :)</blockquote><br>No, that means something more like v21.
timesampled hairprocedural motionblur
2024-03-27T22:32:11+00:00 ronald_a #418084: Is time-sampled motion blur supposed to work with the hair procedural in H20, or is it still just vector motion blur?
Network Boxes not supported in USD and Edit Material Network
2024-03-27T22:27:06+00:00 kskovbo #418083: Hi,<br><br>I have an issue with writing out a .usd material and the "Edit Material Network" node. Is there a way for USD / the Edit Material Network node to recognize network boxes and the placement of nodes within the node graph? Is there a future plan to support this?<br><br>Here is the original network with the network box:<br><img src="/forum/attachment/fa048604b247df9591e8cdc59b06514daf3e4ff6/"><br><br><br>Here is the .usda file read in again and viewed through an "Edit Material Network" node:<br><img src="/forum/attachment/d21d2c3a4d103948fd2fa9343d67a22bdb5ba1c3/"><br><br><br>Thanks
Karma XPU background plate
2024-03-27T22:00:37+00:00 timjan #418079: Wonderful news! Does that mean the next production build? Thanks :)