Currently there's no way to tell the service to reset and clear its cache. The workaround for now is to restart the service whenever the memory needs to be freed up. We're looking into adding a reset command for services, however, which would provide a way to clear the contents of the scene file and empty out any caches the service maintains without needing to restart it.
Are there any updates on this? I'm running into the same problem of quickly running out of RAM on a long-running service processing lots of geometry/work items.
Is there anything planned to combine batching/services in some way, where you could respawn a service for every nth work item, or after a certain RAM percentage is hit, etc.?
The issue could be that since you're setting the Unreal material to the same values, the plugin thinks you want to use the same material instance for both groups. So, unfortunately, you may have to duplicate your master material.
Has this been resolved / is there any fix for this? I need to generate hundreds of material instances from a single parent material, all with different parameters, with multiple material slots on a single merged static mesh, and performance is critical so duplicating a master material hundreds/thousands of times is not going to work.
Is there any workflow that supports this? Would this be an RFE candidate: having the plugin automatically generate multiple material instances when it detects different ‘unreal_material_parameter_*' attribute values on different prims?
Is there any support for this kind of functionality in newer versions of Houdini Engine 2.0? I'm looking to do something similar: automatically generate textures and material instances from a material_override attribute, without having to create tons of shaders in Python, etc., which can be problematic when using PDG. Thanks!
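While waiting for plugin-side support, one could at least author the per-prim attributes this thread mentions from a plain dict of parameter values. This is only a sketch: the `unreal_material_parameter_*` prefix comes from the convention named above, and the helper name is hypothetical:

```python
def material_parameter_attribs(param_values, prefix="unreal_material_parameter_"):
    """Map Unreal material parameter names to the per-prim attribute
    names following the prefix convention discussed above (assumed)."""
    return {prefix + name: value for name, value in param_values.items()}
```

In a Python SOP you could then create these as prim attributes per material group, e.g. `geo.addAttrib(hou.attribType.Prim, name, default)` followed by `prim.setAttribValue(name, value)` for each prim in the group.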
Did you ever find a solution for this? I'd like to have a shared keymap.overrides file but selecting a custom keymap always saves to $HOUDINI_USER_PREF_DIR
Does anyone have any updates on this? Autocomplete was working fine in PyCharm a few weeks ago, but after upgrading to newer Houdini daily builds and updating PyCharm, hou autocomplete is completely broken.
I've tried the hython interpreter, Houdini's Python 2.7, and system Python 2.7; added $HFS/houdini/python2.7libs to the extra paths in PyCharm; added $HFS/bin to $PATH in the system environment variables; rolled back Houdini installs; rolled back PyCharm installs; and tested on two different machines. Nothing works.
The stubs file works OK for first-level autocomplete (hou.) but stops working at any level deeper. VSCode works perfectly with exactly the same configuration as PyCharm, but VSCode's PySide2 autocomplete isn't great…
This makes me think it's a PyCharm problem rather than a Houdini problem, but has anyone solved it?
I have an object-level, batch-asset-processing-type HDA with a TOP network inside that contains several SOP-level HDA Processors.
I promoted all the necessary parameters from the HDAs themselves onto the top-level object asset so the user has control over everything, but I need to link the PDG HDA Processors to those parameters as well.
There don't seem to be any parameter references created when you hit ‘Update HDA Parameters' on the HDA Processor, even when pointing to a template SOP node. So far I've only been able to copy/paste a relative reference for each parameter manually, and the assets contain multiparms, so that no longer works.
On a normal HDA, for instance, you could use the ‘Import Settings' toggle on a folder pointing to the template HDA, refresh the imports, and you're good. But since the HDA Processor dynamically generates spare parameters for everything (and not all of them come across correctly; things like buttons and hide/disable-when rules don't link up perfectly), what's the preferred workflow here?
If it's through the Filter HDA Parameters window, I can't find any documentation on it. I tried checking Enable Expression, but it just creates an empty Python expression. Is the idea that you then fill in the expressions yourself, or how is that meant to work?
Finally, I remember reading somewhere that you can control parameters using work item attributes. Would you create an attribute with the same name and type as the HDA parameter so it gets picked up automatically, or do you need to fill in all the HDA parameters manually with `@my_attrib` for it to work? That again would be problematic due to the dynamic multiparms, etc.
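One rough sketch of automating the relative-reference step described above: pair each spare parm name on the HDA Processor with an identically named promoted parm on the parent asset, then link the pairs. The matching helper below is plain Python (the function name is hypothetical); the Houdini-specific linking is left to the usage note:

```python
def match_parms(processor_parm_names, promoted_parm_names):
    """Pair spare parm names on the processor with identically named
    promoted parms on the parent HDA; report what couldn't be matched."""
    promoted = set(promoted_parm_names)
    matched = [n for n in processor_parm_names if n in promoted]
    unmatched = [n for n in processor_parm_names if n not in promoted]
    return matched, unmatched
```

In a shelf tool you might then do `proc.parm(n).set(hda.parm(n))` for each matched name, since `hou.Parm.set()` given another parm creates a channel reference; dynamically generated multiparm instances would still need per-instance handling, which is exactly the painful part.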
I'm looking to write a Python script that will scan the current hip file for unlocked HDAs matching certain criteria and detect whether modifications have been made to them, i.e. whether the unlocked instance differs from its associated HDA library file.
I can easily detect whether they are unlocked, but in the Python docs I'm not finding any good way of analyzing the contents of the network, like an unsaved Word document. For instance, I would want the script to flag even an unconnected null that's been put down as unsaved changes.
The first thing that comes to mind would be to generate a hash/GUID and compare the two, but other than writing my own hash routine (and possibly storing the result as user data?) I don't see any built-in way to do this in Houdini, and since the slightest change in the HDA library file would invalidate the hash, I'm not super confident in that solution.
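A minimal sketch of the hash idea, assuming the textual dump from `hou.Node.asCode(recurse=True)` is a stable enough proxy for the network contents:

```python
import hashlib

def network_hash(code_text):
    """Hash a textual dump of a node network so two states can be compared."""
    return hashlib.sha1(code_text.encode("utf-8")).hexdigest()
```

Inside Houdini you could compare `network_hash(node.asCode(recurse=True))` against a hash stored with `node.setUserData(...)` at the last sync point. It may also be worth checking whether `hou.Node.matchesCurrentDefinition()` already covers the unlocked-instance-vs-definition comparison before rolling your own.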
I have an FBX subnet with a bunch of different models that I'm object-merging into a separate geo network, with wildcards in the object_path to grab all the file nodes in the FBX subnet, and Create Path Attribute enabled on the Object Merge.
I need the shop_materialpath to correctly point to the corresponding FBX material that was created. While most objects are read in with the shop_materialpath attribute correct, the Houdini FBX importer will randomly assign the material at the object level depending on how the model was prepped, so the resulting shop_materialpath from the Object Merge is missing. The textures and materials are there at the object level (in the FBX subnet), but the SHOP path is wrong, and thus textures break in the Object Merge.
So, in a Python SOP, I need to loop over each prim, eval the object-level shop_materialpath parameter from the prim's path attribute (i.e. its source geo network in the FBX subnet), and write that string into the geometry's shop_materialpath for the missing prims.
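A sketch of that loop, with the Houdini-specific lookup factored out so the fallback logic is testable on its own. `lookup_obj_material` stands in for evaluating the object-level `shop_materialpath` parm (the helper names are hypothetical):

```python
def resolve_shop_paths(prims, lookup_obj_material):
    """For each prim dict with 'path' and 'shop_materialpath' keys,
    fill a missing material path from its source object, if one is found."""
    resolved = []
    for prim in prims:
        shop = prim["shop_materialpath"]
        if not shop:
            # fall back to the object-level material of the source network
            shop = lookup_obj_material(prim["path"]) or ""
        resolved.append(shop)
    return resolved
```

In a Python SOP the lookup could be something like `lambda p: hou.node(p).parm("shop_materialpath").eval() if hou.node(p) else None`, writing the results back per prim with `prim.setAttribValue("shop_materialpath", value)`.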
jsmack Hotkeys for items in menus require the menu to be open for the hotkey to work. Open the gear menu first, and then hit the assigned hotkey to get the edit parameter dialog to appear.
Well, that defeats the purpose, haha; it's 1mm away at that point. I guess you could add it as a menu parm script in your main right-click menu and hotkey that.
I updated to H17.5 and all my custom HDAs broke. It seems SideFX updated the Bound SOP, and now I'm getting this error message on all my HDAs:
/obj/path/to/my/HDA/bound1:
"Too many elements found for parameter 'foo/bound1 Bounding Type'"
This is across about 20 different HDAs in this scene that all contain the bound sop, with probably 100 total instances throughout the scene.
Is there any way to go about fixing this sort of issue without manually updating every single HDA by hand?
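As a first triage step, one could at least collect every instance of the affected node types before fixing anything. The grouping helper below is plain Python; the usage note shows where `hou` would supply the data:

```python
def group_instances(instances):
    """Group (node_path, type_name) pairs by type name so each broken
    HDA can be reviewed in one pass."""
    by_type = {}
    for path, type_name in instances:
        by_type.setdefault(type_name, []).append(path)
    return by_type
```

Inside Houdini, the pairs could come from something like `[(n.path(), t.name()) for t in types for n in t.instances()]`, using `hou.nodeType(hou.sopNodeTypeCategory(), "bound")` to look up the affected types.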
Furthermore, when working with nested HDAs (where you have one HDA inside another), and the nested one updates in a way that breaks the parent HDA, what are some strategies to deal with or avoid situations like that as well?
I'm trying to create network boxes with padding in Python.
However, I'm having problems getting the padding to be even and constant on all sides. I can recreate the built-in functionality with fitAroundContents(), but I can't figure out how to correctly derive the bounding rect from that and then expand it evenly on all sides. I think this is partially because I'm not correctly calculating the center point from sel.position().
Here is my code so far (my python knowledge is very basic). Any help is much appreciated!
```python
sel = hou.selectedNodes()[0]
all = hou.selectedNodes()
pos = sel.position()
parent = sel.parent()
box = parent.createNetworkBox()
box.setPosition(sel.position())
# add selected nodes
for node in all:
    box.addItem(node)
# set color
box.setColor(hou.Color(0.3, 0.3, 0.3))
# fit box
box.fitAroundContents()
size = box.size()
center = box.position()
pad = hou.Vector2(2, 2)
# set bounding box
bbox = hou.BoundingRect((center[0] - size[0]) - pad[0],
                        (center[1] - size[1]) - pad[1],
                        (center[0] + size[0]) + pad[0],
                        (center[1] + size[1]) + pad[1])
# set to bbox
box.setBounds(bbox)
```
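One possible fix, assuming `hou.NetworkBox.position()` returns the box's lower-left corner rather than its center (which would explain the uneven padding): expand the fitted rect from the corner and the size, not from a presumed center. The pure geometry part:

```python
def padded_bounds(pos, size, pad):
    """Expand a rect defined by its lower-left corner and size by
    `pad` units on all four sides; returns (xmin, ymin, xmax, ymax)."""
    return (pos[0] - pad, pos[1] - pad,
            pos[0] + size[0] + pad, pos[1] + size[1] + pad)
```

After `box.fitAroundContents()`, the wiring might then be `box.setBounds(hou.BoundingRect(*padded_bounds(box.position(), box.size(), 2.0)))`.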
I have been having this issue a lot lately too: when I perform an operation on normals, like computing tangent normals on a curve, they will not update until I close the viewer and open a new one.
My guess is it's an NVIDIA thing; I have a Quadro K2200 at home and a GTX 980 at work, same issue.
Try downloading the latest daily build and updating your graphics drivers, see if that helps
Short answer: I want to recreate the pyro dual rest solver in SOPs to output a new position value from the dual rest fields that I can use with standard Houdini VOPs.
I have a smoke sim from pyro with dual rest fields, and I want to use the rest fields as my sample source for a custom noise in a Volume VOP, amongst other things. However, in Houdini 14+ the dual rest solver only seems to work with Unified Noise, as it has two outputs, or in Houdini 15, a signal output. It doesn't play well with any standard VOP nodes; i.e. if you feed it into a regular noise, it will error out like crazy.
In older versions of Houdini you could use a Mix to blend between rest and rest2 using rest_ratio as the bias (see attached for a simple example), but this no longer seems to work and causes popping in the noise as the rest positions change, since it doesn't account for rest2_ratio, or rest2 in general (or so I'm guessing).
Both the dual rest solver and Unified Noise have been black-boxed in Houdini 14+, so does anyone know how to either:
A) recreate the dual rest solver using regular vops
or
B) get the dual rest solver to work with regular turbulent or anti-aliased noise?
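For (A), a guess at the blending math, hedged since the solver is black-boxed: the usual dual-rest idea is to sample the noise twice, once per rest field, and crossfade with the ratio fields so each field's periodic re-anchoring happens while its weight is near zero. If rest_ratio and rest2_ratio behave as complementary weights, the per-voxel combine might look like:

```python
def dual_rest_combine(noise_at_rest, noise_at_rest2, rest_ratio, rest2_ratio):
    """Weighted blend of two noise samples; normalizes in case the
    ratio fields don't sum exactly to one (assumed semantics)."""
    total = rest_ratio + rest2_ratio
    if total <= 0.0:
        # degenerate case: fall back to an even mix
        return 0.5 * (noise_at_rest + noise_at_rest2)
    return (noise_at_rest * rest_ratio + noise_at_rest2 * rest2_ratio) / total
```

In VOPs this would be two noise nodes fed by rest and rest2, blended with a Mix whose bias is rest_ratio divided by the sum of the two ratios, which is what accounting for rest2_ratio would add over the old single-ratio Mix setup.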
Hey guys, I have attached an Alembic file of an animated, deforming creature from Maya. I am converting it into a VDB volume and applying noise via a VOP SOP. I created rest.x, rest.y, and rest.z volumes to compute the volume noise rest fields for animation. The problem is that after I create these volume rest fields, I need to deform them to match the Alembic animation, so the rest fields deform along with the geometry and the noise sticks to it without any slipping.
At the most basic level I can manually keyframe an xform SOP, but the noise slips when there is deformation; i.e. this only works for translation or rotation.
So is there any way to somehow transfer the deformation of the Alembic animation over to the volume? Ideally it would be some kind of lattice deformer, but that doesn't work with volumes. I tried transferring density to points, deforming via lattice, and then transferring back, but this caused a huge loss of detail and unpredictable results. I also tried transferring volume attributes, along with a thousand other things, with no luck. Any ideas?