Hello.
I am trying to render crowds in Solaris using USD Skel and I can't figure out how to generate subsamples. In 19.5 there's the Motion Blur LOP, which works for most things, but it doesn't seem to work for USD Skel prims. Is there something I might have missed?
Thanks.
Technical Discussion » Solarice USDSkel motion blur subframes
- MathieuLeclaire
- 79 posts
- Offline
Technical Discussion » Exporting SubD in Alembic ROP
Is there a way to force a mesh to be exported as a SubD mesh when exporting through an Alembic ROP in SOP?
I have this Alembic file that contains meshes with intrinsic:abctypename = SubD while they are packed Alembic primitives. Once I unpack them, they become Polygons, and when I re-export them to a new Alembic file using the Alembic ROP and re-import it, they always come back with intrinsic:abctypename = PolyMesh. I can't seem to find it, but I'm guessing there must be a way to force the Alembic ROP to export them as SubD meshes?
Edited by MathieuLeclaire - March 13, 2023 16:04:36
Technical Discussion » UV Layout between locked pieces
Hello... I'm wondering if there is a way to tell the UV Layout SOP to lock the UV position of certain pieces and lay out the rest around these locked pieces?
I have a workaround: I convert the UV coordinates to 3D space to get a 3D representation of each UV island, boolean-subtract the 3D locked islands from a grid to extract the available space, use that mesh with "Pack Into: Island From Second Input" to lay out the rest of the pieces in 3D space, and finally copy the new positions back to UV space to get the desired result.
That approach works, but something tells me there must be an easier, more efficient way to do this. Any recommendations?
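For what it's worth, the core of that workaround can also be sketched without leaving UV space: rasterize the locked islands' bounding boxes into a coarse occupancy grid, then greedily drop each remaining island into the first free spot. This is a hypothetical pure-Python sketch (islands reduced to axis-aligned boxes, grid resolution arbitrary), not the UV Layout SOP's actual algorithm:

```python
import math

def pack_around_locked(locked, loose, grid=16):
    """Greedy packing sketch. `locked` boxes are (u, v, width, height)
    in 0..1 UV units; `loose` boxes are (width, height). Returns an
    (u, v) origin per loose box, or None when no spot is free."""
    cell = 1.0 / grid
    occupied = [[False] * grid for _ in range(grid)]
    # rasterize the locked islands' bounding boxes into the grid
    for (u, v, w, h) in locked:
        for y in range(int(v / cell), min(grid, math.ceil((v + h) / cell))):
            for x in range(int(u / cell), min(grid, math.ceil((u + w) / cell))):
                occupied[y][x] = True
    placed = []
    for (w, h) in loose:
        cw = max(1, math.ceil(w / cell))  # box size in cells
        ch = max(1, math.ceil(h / cell))
        spot = None
        for y in range(grid - ch + 1):    # scanline search for a free area
            for x in range(grid - cw + 1):
                if all(not occupied[yy][xx]
                       for yy in range(y, y + ch)
                       for xx in range(x, x + cw)):
                    spot = (x, y)
                    break
            if spot is not None:
                break
        if spot is None:
            placed.append(None)           # no room at this grid resolution
            continue
        x, y = spot
        for yy in range(y, y + ch):       # claim the cells
            for xx in range(x, x + cw):
                occupied[yy][xx] = True
        placed.append((x * cell, y * cell))
    return placed
```

Bounding boxes obviously waste space compared to real island outlines, but the occupancy-grid idea is the same one the boolean/pack-into workaround approximates in 3D.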
Technical Discussion » UV Layout around locked islands?
I mean, I see a workaround where I could convert my UV islands to point positions to create 3D islands, boolean a grid to generate a new mesh representing the area between the islands, use that mesh with the "Pack Into: Islands From Second Input" option to lay out onto it, and finally convert the new point positions back to UV values... but it feels like there may be an easier way to do this that I'm not finding.
Technical Discussion » UV Layout around locked islands?
I was under the impression there was an option to do this, but I can't seem to find it. I have a selection of UV Islands I don't want to move. I want to use the UV Layout tool to layout the rest of the UV islands around those locked islands. How should I do something like this?
Edited by MathieuLeclaire - April 1, 2022 15:36:01
Technical Discussion » Deform HairGen by new guides...
So we have this groom that was done using just a few guide curves (let's say 100 guide curves that generate 10,000 hair strands for rendering). Now we want to simulate these guide curves, but we realize there aren't enough guides for an accurate simulation. Simulating the full hairGen is overkill, so what we would like to do is adjust the number of guide curves to our needs (let's say 1,000 guides), simulate these new guides and have them deform the old hairGen. Generating the new guides is not a problem; it's using them to drive the hairGen that is causing us issues. When we use more guides, the number of curves generated by the hairGen changes, since the influence radius value becomes too big, and even if we adjust that value, we still can't quite get back the original groom generated by the original hairGen.
So my question is this: is there a way to deform the old hairGen groom using new guide curves? Is there an equivalent of the Point Deform SOP that would advect these old curves with the new guides? Does a solution already exist for this type of scenario, or should I write my own VEX code?
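For reference, the capture/deform idea behind a point-deform-style approach can be sketched in plain Python. This is a simplified illustration with hypothetical function names: it stores weighted rigid offsets only, whereas a real Point Deform also builds a local frame per capture region and rotates the offsets with it.

```python
import math

def capture(hair_pts, rest_guides, radius):
    """For each hair point, record (guide index, offset, weight) for every
    rest-pose guide point inside the capture radius."""
    captured = []
    for hp in hair_pts:
        entries = []
        for gi, gp in enumerate(rest_guides):
            d = math.dist(hp, gp)
            if d < radius:
                off = tuple(h - g for h, g in zip(hp, gp))
                entries.append((gi, off, 1.0 - d / radius))
        captured.append(entries)
    return captured

def deform(captured, guides):
    """Rebuild each hair point as the weighted average of (moved guide
    point + stored rest offset). Offsets are not re-oriented here, so
    this handles translation well but not guide rotation."""
    out = []
    for entries in captured:
        total_w = sum(w for _, _, w in entries)
        p = [0.0, 0.0, 0.0]
        for gi, off, w in entries:
            for k in range(3):
                p[k] += w * (guides[gi][k] + off[k])
        out.append(tuple(c / total_w for c in p))
    return out
```

The key point is that the capture happens once against the rest-pose guides, so swapping in a denser guide set does not change which hairs exist, only what drives them.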
Technical Discussion » Push curve to surface
Thanks all for the tips and recommendations. I ended up blending a few of your ideas and cooking up my own solution, which seems to do the job. We'll see when it's applied to multiple shots whether I need to revisit it, but your input has been highly useful, so thank you very much.
Technical Discussion » Push curve to surface
So I have these curves that are penetrating in and out of some geometry, and I'm looking for a way to push their points out to the surface of that geometry without completely destroying the smoothness of the curve. Here is a simplified example of the type of situation I have to deal with:
Now, I usually convert the mesh to a VDB and use an attribute wrangle to push the points outside the mesh using VEX code that looks like this:
float dist = volumesample(1, 0, @P);
vector dir = volumegradient(1, 0, @P);
if (dist < 0)
    @P += -1 * dist * dir;
...problem is, certain points on the curves are closer to the surface on one side of the mesh while others are closer to the other side of the mesh, so this gives me a broken curve that goes through the volume and looks like this:
..What I am looking for is something that would look more like this:
Any tips on how to get these type of curves pushed out to the same side of a mesh?
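One possible fix, sketched in plain Python rather than VEX: instead of pushing each point along its own SDF gradient (which exits on whichever side happens to be nearest), pick one direction per curve, for example the average gradient over the whole curve, and march every inside point along that shared direction. The sdf/grad callables below are hypothetical stand-ins for the VDB volumesample/volumegradient lookups.

```python
import math

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def push_curve_one_side(points, sdf, grad, step=0.01, max_steps=10000):
    """Push every inside point of one curve out of an SDF on a single,
    shared side, so the curve does not break across the volume."""
    # one common direction for the whole curve: the average gradient
    avg = [0.0, 0.0, 0.0]
    for p in points:
        g = grad(p)
        for k in range(3):
            avg[k] += g[k]
    direction = unit(avg)
    pushed = []
    for p in points:
        q = list(p)
        steps = 0
        # march along the shared direction until outside the surface
        while sdf(tuple(q)) < 0.0 and steps < max_steps:
            for k in range(3):
                q[k] += step * direction[k]
            steps += 1
        pushed.append(tuple(q))
    return pushed
```

A smoothing pass afterwards (a Smooth SOP, say) would still help, but because all points exit on the same side the curve no longer jumps through the volume.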
Technical Discussion » Seed from String
Great! That works perfectly.
Thanks guys.
BabaJ, I'm not sure what you consider long strings, but my path strings are between 100 and 200 characters, and this piece of code works perfectly for my needs:
int seed = random_shash(s@path);
@rnd = rand(seed);
But thanks for the example. I will do it your way if ever it becomes an issue.
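The same idea in plain Python, for anyone doing this in a Python SOP instead of VEX: hash the string with a deterministic function and seed a generator with it. Note that Python's built-in hash() is salted per process and so is not stable across sessions; zlib.crc32 is.

```python
import random
import zlib

def seed_from_string(s):
    """Deterministic integer seed from a string, stable across runs
    (unlike the built-in hash(), which is salted per process)."""
    return zlib.crc32(s.encode("utf-8"))

def stable_rand(path):
    """A random float in [0, 1) that depends only on the path string."""
    return random.Random(seed_from_string(path)).random()
```

Because the seed depends only on the path attribute, inserting or removing meshes leaves every other mesh's random value untouched, which is exactly the shifting-@primnum problem described below.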
Edited by MathieuLeclaire - Feb. 17, 2021 14:48:36
Technical Discussion » Seed from String
I'm wondering if there's an easy way to generate a float or int value from a string to use as a seed in a random function?
I usually use @ptnum or @primnum as the seed in the rand() function to generate random numbers per point or packed primitive, but now we have an asset with a unique string path attribute per mesh, and this asset will continuously be updated as meshes are added or removed. That means certain meshes will have shifting @primnum values, which changes the random value that comes out of the function if I use @primnum as the seed. The path string attributes, though, stay consistent, so it would make more sense to use the path attribute to drive the random number generation, but the rand() function only takes float values. So I'm looking for a way to generate a unique float (or int) value from a string that I can use as the per-mesh seed and ensure consistent random values as our assets get updated.
Does anybody have any suggestion on how I could manage that?
Technical Discussion » Poly-reduce with holes
I just figured out a solution…
I'm measuring the total area of the mesh, then capping it. Then I measure the area per element, divide each element's area by the total area, and if the ratio is bigger than a threshold, it's the large bottom cap, so I delete that polygon.
I'm not sure if it's the best solution, but it works. I'm open to other suggestions if there's an easier one.
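The ratio test described above boils down to something like this pure-Python sketch (face areas as a plain list; the threshold value is arbitrary):

```python
def find_large_caps(face_areas, ratio_threshold=0.1):
    """Return the indices of faces whose area is a large fraction of the
    total surface area. After a PolyCap-style fill, these are the big
    bottom caps to delete; genuinely small hole-fills stay below the
    threshold and survive."""
    total = sum(face_areas)
    return [i for i, a in enumerate(face_areas) if a / total > ratio_threshold]
```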
Edited by MathieuLeclaire - Nov. 3, 2020 18:46:51
Technical Discussion » Poly-reduce with holes
I have a problem I'm not sure how to get around. I'm sure one of you guys will have a simple and elegant solution to this…
I need to poly-reduce some assets, and the PolyReduce SOP works very well in most situations, but here is a situation that causes me a lot of headaches:
Sometimes, my high-resolution meshes end up with little holes/missing polygons like this:
…it doesn't really matter for us when we render the high-res mesh since we never see these holes. They are occluded and super small.
But when we poly-reduce this mesh, these small holes become big holes like this:
…which now becomes a big problem.
How can I make sure my poly-reduce operation ignores small holes like this and treats the surface like a fully closed surface?
I thought of using a PolyCap, but that closes the bottom surface as well and creates a weird shape in the end. I don't want the bottom to be capped, but I do need these small holes closed so my poly-reduce gives me clean results.
In this particular case, I could go and close those holes up by hand, but I have thousands of meshes to process, some with thousands of these small holes. So I'm looking for an automatic process that can close these small holes so that my poly-reduce operations can give me clean results.
Any suggestions?
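The automatic hole-closing step can be sketched in plain Python: find boundary edges (edges used by exactly one face), walk them into loops, and fan-fill only the loops below a size limit, so a large open bottom is left untouched. This is a simplified illustration on index-list faces, assuming a manifold mesh, not a production mesh library:

```python
from collections import defaultdict

def fill_small_holes(faces, max_edges=6):
    """Fan-fill boundary loops with at most `max_edges` edges; larger
    open boundaries (like an intentionally open bottom) are left alone.
    Faces are lists of vertex indices."""
    # count undirected edges: boundary edges belong to exactly one face
    edge_count = defaultdict(int)
    for f in faces:
        for i in range(len(f)):
            edge_count[frozenset((f[i], f[(i + 1) % len(f)]))] += 1
    # collect directed boundary edges, reversed so fills oppose the winding
    next_vert = {}
    for f in faces:
        for i in range(len(f)):
            a, b = f[i], f[(i + 1) % len(f)]
            if edge_count[frozenset((a, b))] == 1:
                next_vert[b] = a
    # walk each boundary loop once and fan-fill the small ones
    new_faces = list(faces)
    seen = set()
    for start in list(next_vert):
        if start in seen:
            continue
        loop, v = [start], next_vert[start]
        while v != start:
            loop.append(v)
            v = next_vert[v]
        seen.update(loop)
        if len(loop) <= max_edges:
            for i in range(1, len(loop) - 1):  # triangle fan over the loop
                new_faces.append([loop[0], loop[i], loop[i + 1]])
    return new_faces
```

In Houdini itself the same "fill only small loops" policy can usually be expressed by measuring boundary loop perimeters and capping selectively, but the loop-size cutoff is the part that keeps the open bottom open.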
Edited by MathieuLeclaire - Nov. 3, 2020 18:35:18
Technical Discussion » Cluster by budget
I have a bunch of points that I'm trying to cluster together based on position and budget. Similar to what the Cluster SOP is doing, but I want the sum of the budget attribute values in each cluster to be as close as possible to that of the other clusters.
For example, let's say I have 1,000 points with random budget values, and when we sum the budget values of all the points we have a total of 20,000 units, and we want to split those points into 100 clusters... then the sum of the budget of each cluster should be as close as possible to 200 units. I don't care how many points are in each cluster: some could have 1 or 2 points and another could have 50. I just want each cluster to consist, as much as possible, of neighboring points, with a budget sum as close as possible to 200 units.
Any suggestions on how I should go about getting this kind of result? I'm not seeing an easy solution right now and I'm looking for ideas.
Thanks.
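One greedy way to approach this, sketched in plain Python: seed k clusters, then repeatedly hand the cluster with the smallest accumulated budget its nearest unassigned point. This is an illustrative heuristic (seeding and nearest-point choice are naive), not an optimal partition:

```python
import math

def budget_clusters(points, budgets, k):
    """Greedy balanced clustering: the lightest cluster always grabs its
    nearest unassigned point, so per-cluster budget sums stay close while
    clusters stay spatially compact."""
    n = len(points)
    seeds = list(range(0, n, max(1, n // k)))[:k]  # crude spread of seed points
    clusters = [[s] for s in seeds]
    sums = [budgets[s] for s in seeds]
    centers = [points[s] for s in seeds]
    unassigned = set(range(n)) - set(seeds)
    while unassigned:
        c = min(range(k), key=lambda i: sums[i])       # lightest cluster so far
        p = min(unassigned, key=lambda j: math.dist(points[j], centers[c]))
        clusters[c].append(p)
        sums[c] += budgets[p]
        unassigned.discard(p)
    return clusters, sums
```

A refinement pass that swaps boundary points between neighboring clusters would tighten both the budget balance and the compactness, but even the greedy pass gets the sums close when budgets are not wildly skewed.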
Technical Discussion » Detect packed alembic deformed vs transformed
I have this Alembic file where most packed Alembic primitives are static shapes that are only moved through transformations... but there are a few packed Alembic primitives whose geometry does change through time. I need a way to detect which packed Alembic primitives are deforming versus which ones are simply transformed. How do I do that? Is there an intrinsic attribute that tells you this? I can't seem to find any info on it.
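Absent a dedicated intrinsic, one workable test is to compare the packed geometry in its own local space at two frames, i.e. with the packedfulltransform removed: a purely transformed prim has identical local-space points on every frame, while a deforming prim does not. A minimal sketch of that comparison, assuming you have already extracted local-space point lists per frame:

```python
def is_deforming(local_pts_a, local_pts_b, tol=1e-6):
    """True if the prim's geometry changes in its own local space between
    two frames (i.e. after the packed transform is removed)."""
    if len(local_pts_a) != len(local_pts_b):
        return True  # changing topology certainly counts as deforming
    return any(
        max(abs(a - b) for a, b in zip(pa, pb)) > tol
        for pa, pb in zip(local_pts_a, local_pts_b)
    )
```

Sampling two or three well-spaced frames is usually enough to classify a prim; a shape that only deforms on frames you did not sample would slip through, so the frame choice is the judgment call.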
Technical Discussion » VEX fetch various timeshift values of packed alembic
I have this situation where I have two sets of packed Alembics, one static and one animated, and I want to copy the transform of the animated set onto the static one, using primintrinsic to extract packedfulltransform. This allows me to extract transform animation from a lower-res mesh and copy it onto a higher-res mesh.
As long as I have a way to find the matching id of the primitive to extract the transform from low-res to high-res, I can use this piece of VEX code to copy the transform from the animated prim to the static prim:
// set position
int anim_ptnum = primpoint(1, id, 0);
vector P = point(1, 'P', anim_ptnum);
setpointattrib(0, 'P', @ptnum, P);

// set scale & rotation
matrix T = primintrinsic(0, "packedfulltransform", @primnum);
matrix anim_T = primintrinsic(1, "packedfulltransform", id);
T = T * anim_T;
setprimintrinsic(0, 'transform', @primnum, (matrix3)T);
This works well, but now I have this situation where I have multiple copies of the static mesh, each containing a time attribute value, and I want to extract and copy the transform at that time value.
If I was working on the animated packed alembic, I could simply set the abcframe primintrinsic like this:
setprimintrinsic(0, 'abcframe', @primnum, @time);
…but since the transform has been copied onto the static packed alembic, changing the abcframe value does nothing.
The only solution I see is to first pre-process the animated Alembic to set the abcframe values before the static Alembic copies them over, but since multiple static primitives might pull transforms from the same animated prim at different time stamps, I would need to duplicate each animated prim and offset it so each static prim can find the right time-offset transform to copy over.
All this is very doable, but I can't help but wonder if there exists a simpler, more elegant solution for something like this. Maybe some of the crowd tools already have similar features I could use? What do you think? Is my approach the right one, or are there more elegant and efficient solutions I should explore?
Cheers!
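The pre-processing idea above can avoid duplicating prims entirely if the animated transforms are baked once into a per-frame table that each static copy then samples at its own time attribute. A minimal sketch, with a hypothetical sample_fn standing in for the abcframe/packedfulltransform query:

```python
def bake_transforms(sample_fn, frames):
    """Sample an animated prim's transform once per frame into a table."""
    return {f: sample_fn(f) for f in frames}

def transform_at(table, t):
    """Fetch a baked transform at an arbitrary time via the nearest
    baked frame (interpolation between the two neighbors would be the
    obvious refinement)."""
    nearest = min(table, key=lambda f: abs(f - t))
    return table[nearest]
```

Each animated prim is then sampled once per frame regardless of how many static copies reference it, instead of once per (copy, offset) pair.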
Technical Discussion » destroy in python sop
Mmmm… that's actually a great idea. Storing it in a Stash does make a lot of sense. I'll give this a shot. Thanks for the suggestion.
Technical Discussion » Missing UDIM with Texture VOP
I want to read a texture value and copy the colors onto a point attribute, so I use the Texture VOP with the <UDIM> token in the texture path, and everything reads OK... in most cases. The problem is I have this asset that has its UVs split across 5 UDIM tiles, but we only have 4 texture files. We do this because we render with Arnold, and Arnold has a default color value option you can use when the texture is missing. Since the geometry on tile 1005 is a constant color, we use this value instead of creating a map for it, which works well in Arnold... but in Houdini, this situation raises an error.
The only solution I see is to use a Python SOP to first check which UDIM files actually exist and then use a switch to decide whether the current UVs are on an existing tile or not.
Is this the simplest solution to this problem or does anybody have other ideas or suggestion on how to deal with such a scenario?
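The existence check itself is straightforward to sketch in Python. Here the filesystem test is injected as a predicate so the logic stays independent of where the textures live (the path, tile range, and predicate are hypothetical; in practice the predicate would be os.path.isfile):

```python
def existing_udim_tiles(texture_path, exists, tile_range=range(1001, 1100)):
    """Expand a '<UDIM>' token over the standard 1001+ tile numbers and
    return the set of tiles whose texture file exists."""
    return {
        t for t in tile_range
        if exists(texture_path.replace("<UDIM>", str(t)))
    }

def tile_for_uv(u, v):
    """UDIM tile number that a UV coordinate falls on."""
    return 1001 + int(u) + 10 * int(v)
```

With those two pieces, the switch condition is just `tile_for_uv(u, v) in existing_udim_tiles(...)`, and geometry on the missing tile can be routed to a constant-color branch, mirroring Arnold's default-value behavior.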
Technical Discussion » destroy in python sop
I need to write a Python script that goes through a list of points containing asset information. For each point, it needs to load the referenced asset, extract some information, copy that info back onto a custom attribute on the point, and then delete the asset before loading the next one. The assets are heavy, which is why I need to load them one at a time and delete/destroy each before loading the next.
Since my asset info are on a list of points, I figured I would use a Python SOP to do this, but I can't delete nodes/assets in a Python SOP. If you try and run this code:
obj = hou.node('/obj')
sub = obj.createNode('subnet')
sub.destroy()
You'll get this error message :
Error
Python error: Traceback (most recent call last):
  File "", line 3, in
  File "D:/PROGRA~1/SIDEEF~1/HOUDIN~1.348/houdini/python2.7libs\houpythonportion\ui.py", line 927, in decorator
    return func(*args, **kwargs)
  File "D:/PROGRA~1/SIDEEF~1/HOUDIN~1.348/houdini/python2.7libs\hou.py", line 10553, in destroy
    return _hou.Node_destroy(*args, **kwargs)
OperationFailed: The attempted operation failed.
Cannot delete nodes while cooking
So instead I tried to use a pre-render script in a geometry ROP. That allows me to destroy my assets, but I can't create and copy the attributes I need on my point cloud as it gives me this error message:
Error
Python error: Traceback (most recent call last):
  File "", line 1, in
  File "opdef:/Sop/fetch_material_info?PythonModule", line 16, in fetchMaterials
  File "D:/PROGRA~1/SIDEEF~1/HOUDIN~1.348/houdini/python2.7libs\houpythonportion\Geometry.py", line 67, in addAttrib
    create_local_variable)
  File "D:/PROGRA~1/SIDEEF~1/HOUDIN~1.348/houdini/python2.7libs\hou.py", line 29644, in addAttrib
    return _hou.Geometry_addAttrib(*args)
GeometryPermissionError: Geometry is read-only.
So how can I write a Python script that will load and delete assets as it sets new data per point attribute? What's the proper way to do something like this?
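Whatever node ends up hosting the script, the load/extract/unload pattern itself can be kept generic so that only one heavy asset is ever alive at a time. A minimal sketch with hypothetical load/extract/unload callables (in Houdini these would wrap node creation, the per-asset info extraction, and node.destroy(), run from a context that is allowed to delete nodes):

```python
def collect_info(asset_paths, load, extract, unload):
    """Visit heavy assets strictly one at a time: load, extract the info,
    and unload before touching the next, so at most one asset is ever in
    memory. load/extract/unload are supplied by the caller."""
    results = {}
    for path in asset_paths:
        node = load(path)
        try:
            results[path] = extract(node)
        finally:
            unload(node)  # always freed, even if extraction fails
    return results
```

The try/finally is the important part: a single failing asset should not leave its heavy node behind for the rest of the batch.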