Hi,
without having tried this one myself, experience says that MS tools tend to have the machine selection (x86 vs. x64, etc.) set wrongly when using CMake, unless you pay EXTREME attention to all the options.
Also, make sure you're not mixing debug and release builds (another “goodie” that I run into every other compile run).
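To rule out those mismatches, it can help to pin the architecture and configuration explicitly on the CMake command line instead of relying on defaults - a minimal sketch (the generator name and directory names are just examples, adjust to your setup):

```
# configure: pin the generator and the x64 architecture explicitly
cmake -S . -B build -G "Visual Studio 16 2019" -A x64
# build: pick ONE configuration and use it consistently for all dependencies
cmake --build build --config Release
```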
Maybe this helps?
Marc
Technical Discussion » Unresolved external symbol while building command.c
-
- malbrecht
- 806 posts
- Offline
Technical Discussion » Capturing BlendShape
Works for me …


Simply connect the output of the blendshape node to the input of the deform-node to combine both morphing and deforming.
BTW: I would find it helpful if you could either store the geo in the file node (“lock” it) or provide the geo you are using alongside the asset … just a thought to make things easier. Sure, I can grab the geo from the stash node … but if I accidentally restash, it's all gone.
Marc
3rd Party » Substance Plugin Redshift Workflow
Update: SideFX say they have fixed a bug on their side and asked me to contact Redshift/Maxon about fixing their side of things. A direct contact between the two companies involved does not seem like a desired thing (this is from both sides, as far as my impression goes), so I tried to negotiate.
Anyway, Redshift will have a crash-fix in the next official 3.x update. The C-channel still has to be provided correctly. That, however, might be something I could look into if still required.
Marc
Technical Discussion » create locomotion clips for walk cycles at origin?
If you wanted a tech-geek solution, you could check for points on the ground at frame 0 and track their positions throughout the animation cycle. With some luck, what you find there are the feet of the character - so adding those positions' average frame-to-frame displacement to the object's world space (sign inverted) should give you “walk forward/backward/sideways” fully automatically.
Obviously this won't work with long clothing touching the ground - you could get better results using the maximum distance between frames to cover that case, but I am a fan of averaging, so I suggested that first :-)
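To make the idea concrete, here is a minimal sketch in plain Python (outside Houdini; the tracked foot positions below are made-up sample data - in practice you would feed in the per-frame positions of the points you found on the ground):

```python
# Sketch of the "track ground points, subtract their average drift" idea.

def root_offsets(tracked, ground_axis=(1.0, 0.0, 1.0)):
    """tracked: list of frames, each a list of (x, y, z) foot-point positions.
    Returns the cumulative offset to ADD to the object's world transform
    (sign-inverted average drift) so the walk stays at the origin."""
    offsets = [(0.0, 0.0, 0.0)]
    for prev, cur in zip(tracked, tracked[1:]):
        # average displacement of the tracked points between the two frames
        n = len(cur)
        avg = tuple(sum(c[i] - p[i] for p, c in zip(prev, cur)) / n
                    for i in range(3))
        # invert the sign and mask out the up axis (feet bob up and down)
        last = offsets[-1]
        offsets.append(tuple(last[i] - avg[i] * ground_axis[i]
                             for i in range(3)))
    return offsets

# Two feet drifting +0.5 in z per frame -> the root must move -0.5 per frame
frames = [[(0, 0, 0.0), (1, 0, 0.0)],
          [(0, 0, 0.5), (1, 0, 0.5)],
          [(0, 0, 1.0), (1, 0, 1.0)]]
print(root_offsets(frames)[-1])  # -> (0.0, 0.0, -1.0)
```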
Marc
Technical Discussion » Capture Geometry Issue
Hi, unknown user,
you are resampling the channels to 240 steps. If you set your CHOPSAMPLERATE lower, you don't get that “rasterizing” effect.
Marc
Houdini Lounge » Once again another GPU question.
Although it's probably useless to answer yet another 1-post-user question (no hard feelings, it's just a matter of exclusively bad experience) … my purely analogue two cents would be:
It depends.
When I write CUDA kernels, I often run out of memory quicker than I can say “stop, wait, hold on a second” (which may be because that phrase is way too long anyway) on my RTX2080. Therefore, I would always suggest getting as much GPU memory as you can, for RAM can only be replaced by more RAM. (And yes, I admit that sometimes those memory-overflows are thanks to me having forgotten to actually FREE the memory after using it :-P )
Most of the time spent on GPU usage goes into transferring data back and forth anyway. So if your sim has to shuffle data a lot, more memory on the GPU might - again - be the better choice than a few more ticks on the clock.
That said, it also depends on whether the simulation you intend to do actually USES the GPU in the first place. Most of Houdini's user interaction is very single-core-ish (synonymous with “incredibly laggy”) - what good is a fast sim if you can't even click the button to start it because a few billion particles are bogging your CPU down already …
I do understand that some simulations in Houdini can and do utilize the GPU; however, I am not sure how much data transfer happens between steps. I would HOPE that “every vertex that might play a role further down the line” gets transferred in one huge chunk of just-about-everything. If not, memory bandwidth would be the bottleneck, and you might want to go for the fastest PCIe route instead of looking at memory size.
TLDR: My personal suggestion, without knowing precisely what you want to do, is to go for more memory over higher clock speed. That comes with the caveat that MAYBE Houdini spends more time transferring data back and forth than on the actual (GPU-side) simulation, so a faster (memory) clock COULD be more important.
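For a feel of the numbers, a back-of-envelope sketch (the attribute layout of nine float32 values per particle - position, velocity, color - is just an assumption for illustration):

```python
# Back-of-envelope arithmetic: how fast particle attributes eat GPU memory.

def sim_bytes(n_particles, floats_per_particle=9, bytes_per_float=4):
    """Raw attribute storage, before any solver scratch data."""
    return n_particles * floats_per_particle * bytes_per_float

gib = 1024 ** 3
# 100 million particles with 9 float32 attributes each:
print(sim_bytes(100_000_000) / gib)  # roughly 3.35 GiB, attributes alone
```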
I realize that I am not helpful. Again.
Marc
Work in Progress » WIP: OBJ-MTL handler
Redshift render using the output of that HDA; the only adjustments are camera/atmosphere settings, LUT and grass opacity. Setup time, including loading the OBJ file: less than one minute. Render time: about 2 minutes.
Marc
Edited by malbrecht - Feb. 20, 2020 12:27:31
Work in Progress » WIP: OBJ-MTL handler
Moin,
a “Side Effect” (pun intended) of my “DAZ to Houdini” (or general “FBX-RIG-converting-loader”) has been a material-setup routine.
I created an additional HDA that sets up a material network for Mantra and/or Redshift, populating material nodes and setting values.
While the “Lab-OBJ-Loader” doesn't work for me on most of my test OBJ files (materials don't get created correctly and/or paths are not adjusted), this one tries to be smart and allows for “material collection” in a separate folder or “rewrite path” where desired.
If the MTL file provides shader information like specularity/opacity, those values are set as well. If those values are not provided in the MTL file, the user can tell the HDA to create dummy nodes (for Redshift) or set up dummy paths for principledshader nodes.
Wrangle-Rewriter and Material-Node are both created so that the user can easily pick what to use, what to change or where to stash out.

Read in an OBJ through a file node, wire the file node out to the HDA.

Specify if “forced” parameters/nodes are created and define whether Mantra or RS materials are set up. Also specify a replacement directory (or not) and whether you want material files to be copied into that directory.

After “RUN” is executed, a complete network structure is created.

A Rewrite-Wrangle takes care of adjusting “shop_path” in the geo data, conveniently laid out for adjustment after the fact.

A Material Node takes care of assigning materials to groups, also laid out for manual adjustment, if so desired.

Materials get populated with specular, translucency/opacity and diffuse color information, if those are provided in the MTL file.

Works well with hundreds of paths/materials as well.
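For anyone curious what the MTL-value part boils down to, here is a tiny, simplified sketch of reading shader values from MTL text - not the actual HDA code, just an illustration of the idea (only `newmtl`, `Kd`, `Ks` and `d` are handled; real MTL files have many more keywords):

```python
# Minimal MTL value extraction: diffuse (Kd), specular (Ks), opacity (d).

def parse_mtl(text):
    materials, current = {}, None
    for line in text.splitlines():
        tokens = line.split()
        if not tokens or tokens[0].startswith('#'):
            continue  # skip blanks and comments
        key, args = tokens[0], tokens[1:]
        if key == 'newmtl':
            current = args[0]
            materials[current] = {}
        elif current and key in ('Kd', 'Ks'):
            materials[current][key] = tuple(float(v) for v in args)
        elif current and key == 'd':
            materials[current][key] = float(args[0])
    return materials

sample = """
newmtl grass
Kd 0.1 0.6 0.1
d 0.8
newmtl stone
Ks 0.5 0.5 0.5
"""
print(parse_mtl(sample)['grass']['d'])  # -> 0.8
```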
Marc
Work in Progress » Procedural generator of random castles
Amazing results - that looks really, really fun!
If you wanted to pursue this idea further, what I would suggest is to allow the user to pick structural points like “here's the well, here's bad-lands etc” and then “weight” certain parts/elements of the structure set accordingly (i.e. create court-yard, create back-side/kitchen-area etc).
You could also think about allowing for components to be edited, so that, additionally to the “19th century reconstruction style” you are currently building, the tool could do various “styles” (you would need to limit heights and block sizes etc).
Not meant as a critic in any way, just throwing in ideas to keep you going - and I LOVE your statement about not being interested in the rendering but in constructing the tool. Very much my world-view!
Marc
3rd Party » WIP: FBX HD & SD importer, Joints to Bones Converter and Morph-Helper (was: DAZ to Houdini converter)
This is in no way a final demo, just a quick and dirty overview of the general workflow. I am working on a bunch of features (e.g. cloth/props/eyelash/hair integration).
Marc
3rd Party » Substance Plugin Redshift Workflow
Moin,
crashes for me, too, using Houdini 18.0.348. I was able to get a segmentation fault error and a crash-report. I will file a bug report to SeSi.
Hacking around problems I might be able to do. But plain crashes with segmentation faults … that's something for those who made the booboo in the first place to fix :-)
Marc
Houdini Lounge » Creating a "mud monster" for Unreal Engine?
Ha, I passed 50 myself … and have given up on University a LONG time ago. No wonder I don't know squat :-)
If UE has a particle system, combining that with morphs/texture-animated displacement should get you really far really fast, I think.
Marc
Houdini Lounge » Creating a "mud monster" for Unreal Engine?
Hi, Paul,
I have no idea what an “MFA” might be and the only computer game I ever play is Solitaire, but of course I know what a “mud monster” is, with Spirited Away being one of my (if not the) all-time-favorite movies.
Kevin Smith had a “Golgothan Crip Demon” in “Dogma” that might, maybe, help as a starting point - I don't know what is possible in Unreal Engine as far as vector displacement goes (which I think would be the best approach, having a suitably “muddy” base mesh and then layers of randomized mud blobs peeling/moving along limited areas with “hero-mud-blobs” being able to get instanced and move off from the main mesh).
The most simple approach I can think of is a bunch of (maybe even procedurally created) morphs that allow you to let “muddy blotches” move downwards an area of the skin, so that you can vary the speed of movement and even layer such morphs on top of each other.
Hopefully this gives you some inspiration.
Marc
Houdini Lounge » Layer from FBX file
“Layers” the way you use them in Photoshop or other programs aren't fully compatible with the way Houdini structures its scene (this might, to some extent, change as USD integration progresses over time).
However, there are multiple ways to “fake” a layer structure in Houdini. You can use subnetworks inside any structure that basically “are layers” (or folders, if you will):

Or you can use selection groups (press Shift+Z in your network view) and freely add/remove/whatever nodes to any wild topology of selections that you can imagine:

My perspective would be that the “layer view” really isn't helpful in Houdini, except for general scene builds - and there you might want to look into USD.
I hope this helps.
Marc
3rd Party » WIP: FBX HD & SD importer, Joints to Bones Converter and Morph-Helper (was: DAZ to Houdini converter)
Thank you! While, yes, it isn't easy - it is a fun way of learning new tricks and of better understanding the workings of Houdini (and especially the horror that is cooking)! I can only recommend taking on a project/experiment like this if one wants to “get it” :-)
Marc
3rd Party » WIP: FBX HD & SD importer, Joints to Bones Converter and Morph-Helper (was: DAZ to Houdini converter)
Small update (about huge updates): I am back at almost 1000 lines of Python code now … with lots of “featuritis” (German-ish pun on “way too many features”):
Two stupid bugs fixed, naming “beautified”, the capture pipeline is created automatically, eye rigs are added (I don't like the DAZ way of morphing eye directions), a last-bone-plus-one patch (to include bones at the ends of chains where the original FBX rig only has a joint that captures the final geometry), an animation rig on low-resolution geometry (switchable to hires geo for rendering), a bone-visibility switch for rendering, an automatic fix for the UDIM mess DAZ exports on combined geometry (overlaid UV patches that break the morphing) and lots more. Some of the features will only work on “suitable” figures since I couldn't avoid having some hardcoded anchor names, but I made most features switchable on the HDA. I do think that most of this setup should work fine on non-DAZ FBX as well.
I would love to get some real-world feedback on this now. If there's anyone who is interested in cooperating with me on this (and has the time to do so), I would love to get in touch and bounce ideas back and forth, test, experiment, improve. If such a someone should exist, please do contact me through PM or my website.
Marc
P.S. On my to-do-list is having a closer look at makeHuman's API. Originally I wanted to integrate that package into a “character designer” for Houdini, with what I learned from the DAZ2H project I hope to be able to do that. Looking for cooperation on that one, too.
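As an illustration of the UDIM-fix idea (not the actual HDA code - the function and data names here are made up): when several material groups share the same 0-1 UV square, each group can be shifted into its own UDIM tile by offsetting U:

```python
# Hypothetical sketch: un-stack overlapping per-material UV patches into
# separate UDIM tiles so morphing/texturing no longer breaks.

def separate_udims(uvs_by_material):
    """uvs_by_material: {material: [(u, v), ...]} all crammed into 0-1.
    Returns a new dict with each material offset into UDIM tile 1001 + index
    (i.e. the Nth material gets u shifted by N)."""
    out = {}
    for tile, (mat, uvs) in enumerate(sorted(uvs_by_material.items())):
        out[mat] = [(u + tile, v) for u, v in uvs]
    return out

packed = {'face': [(0.2, 0.3)], 'torso': [(0.2, 0.3)]}
print(separate_udims(packed))  # 'torso' lands in the second tile: u = 1.2
```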
3rd Party » Substance Plugin Redshift Workflow
Hmm … doesn't seem to be that important to you. I would have loved to help/cooperate, but maybe you figured it out by yourself.
Fair enough.
Marc
Houdini Lounge » Solaris and USD advantages for one-person teams?
If I may chime in on this not-really-discussion, maybe a different angle on how “single-user-teams” work can broaden some people's horizons:
… most of the (experimental and relaxational) projects I do with Houdini involve several different scene(-files) utilizing the same assets. Having to reimport, re-sync, re-do, re-sim, re-attach etc. all and everything, re-export, re-sync, adjust elsewhere, reconnect with scene dependencies again and again AND AGAIN is a major pain in what I use when I sit down and think.
In that respect, ANYTHING that provides a WORKING sync- and distribution process is not only highly welcome but might actually be “a dream come true”. I personally cannot imagine a scenario where I would not have assets reused in several scenes - ever.
Think about a scanned head, medium resolution (90 million triangles), with a standard texture set applied (12 UDIMs at 8192x8192 each). I may have one scene file with specific light setups, geometry blasted away for performance reasons and normal maps converted to high-detail geometry; another one with different setups for the same thing, but exportable for inspection elsewhere; another one with a different renderer. No, I do not do all that in one scene file - that would be a horrible mess. But being able to “reference” a major/main scene file, where changes to the core geometry would distribute cleanly across dependent scene files, not limited to geometry alone (but including lights, maps etc.) … that would indeed be nice.
Yes, to some degree I can construct that with Houdini scene files. Which ties me into Houdini exclusively, breaking any inter-cooperation options I might want to test.
Even with this very simplistic scenario, the “idea” of “many people working on one scene file” is absolutely applicable to my Houdini usage - with “many people” simply being me, manipulating reference assets elsewhere.
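For the record, this kind of cross-file reuse is exactly what USD's referencing is built for - a minimal .usda sketch (file and prim names are just examples):

```
#usda 1.0

def Xform "ScannedHead" (
    references = @assets/head_core.usda@</Head>
)
{
}
```

Any scene file referencing head_core.usda picks up changes to the core geometry automatically, while local overrides (lights, blasted geo, render settings) stay in the referencing file.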
Marc
3rd Party » Substance Plugin Redshift Workflow
Moin, Fred,
I haven't looked into anything substance-related lately, but since I am “dabbling” a lot with RS and scripting in Houdini these days, maybe I can help you out.
Would you be willing to share a test file that I can play with? I don't have any substances stored anywhere any longer (basically left that ship quite some time ago), but I developed some substance-related tools a few years ago, so I do have a basic understanding of what they are and how they work. I can't promise anything and I won't be able to do it “right away”, but I am interested in this from the RS point of view.
Marc