> Why not leave the scale alone in houdini, it would make round tripping a lot cleaner
… well, scaling of other assets is one reason for sure, but most importantly, simulation usually requires “more or less natural” scales. The moment you want to sim cloth, destruction etc., your scale needs to be “sane”.
Then, measuring comes to mind. If you're not “eye-balling” your stuff but have to deliver correct-ish results (I used to do animations for documentation back in the day, before my Houdini times), you may have pre-established camera extrinsics and intrinsics, so you need to “stay in focus” (pun intended). Not sure if something like that is “a thing” for game engines, not my area of expertise.
But in general, I wouldn't like to work in the wrong scaling. Ever. :-)
Marc
Houdini Indie and Apprentice » Applying FBX scale (pre) transforms. Like in Blender.
- malbrecht
- 806 posts
- Offline
> The pre transforms seem to be a bit of a mystery to me as I cannot find them in the parameter interface
Oh, I so very much agree … not having direct access to the “pre transform” (like in that other application I won't mention and it's not Blender) has always hit me like a wet sack of plushfrogs.
You can use “node.moveParmTransformIntoPreTransform()” to send the current transform into “pre” and “node.movePreTransformIntoParmTransform()” to move the pre-transform into the current transform. You can also “back up” the current world transform into a matrix using “node.worldTransform()” as a source. Shifting stuff around in Python works quite well, but you don't really have that “freedom” that you'd expect from within the parameter interface.
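Conceptually, the world transform is just the pre-transform composed with the parameter transform, and “moving” one into the other merely re-factors that product without changing the result. A rough plain-Python sketch of the idea (the matrices and the row-vector convention world = parm @ pre are illustrative assumptions; the actual hou calls only exist inside a running Houdini):

```python
import numpy as np

# Stand-ins for a node's transforms as 4x4 matrices (row-vector convention
# assumed here: world = parm @ pre). Values are made up for illustration.
pre = np.diag([0.01, 0.01, 0.01, 1.0])        # e.g. an FBX unit-scale pre-transform
parm = np.array([[1.0, 0.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0, 0.0],
                 [2.0, 3.0, 4.0, 1.0]])       # a translation set in the parameters

world_before = parm @ pre

# What "moveParmTransformIntoPreTransform" amounts to:
# fold the parameter transform into the pre-transform, reset the parameters.
pre = parm @ pre
parm = np.eye(4)

world_after = parm @ pre
assert np.allclose(world_before, world_after)  # the world transform is unchanged
```

The “backup” via “node.worldTransform()” corresponds to stashing world_before here, so you can always restore the net result even after shuffling the factors around.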
Marc
Houdini Indie and Apprentice » Applying FBX scale (pre) transforms. Like in Blender.
Hmm … looks like you're using an older version of Houdini - my screenshot was from 18.0.5xxx, I think the unit conversion was introduced somewhere around the .400s.
That conversion, I think, is the cleanest way to go.
Otherwise, for simulation I suggest importing the collision/simulation geometry into a new geo-object (Houdini calls them “SOP”, I think). Use an object-merge node set to “into this object” to apply the “global scale” (see my screenshot above, the scaling on the FBX import network at “/obj” level) and use THAT geometry for collision.
Marc
Houdini Indie and Apprentice » Applying FBX scale (pre) transforms. Like in Blender.
Poking in the dark (as I don't use Unreal): Have you tried using the unit-conversion in FBX import? That, in theory, should take care of scaling issues.
Otherwise, what happens if you use the FBX global scale (on the network node itself) instead of “parent-scaling” from the top/root null?
Marc
Houdini Indie and Apprentice » how to camera project meshes
Focal length doesn't “change the direction of the projection” (as long as you are in 3d space), but it DOES affect your projection if you are going from 2d to 3d or vice versa (that's what I would consider a “camera projection”, involving a 2d-step).
A longer focal length would project onto a smaller area of your target, creating a smaller and less skewed projected mesh; a shorter focal length would stretch your projected mesh across a wider area of your target.
Read: *IF* you are doing a 2d-step in between, focal-length does actually change your rays' directions. If you are NOT doing a 2d-step (“camera”) and only consider the camera's position your pivot, focal length is “out of the picture” (pun intended).
The resulting projection would also depend on how you consider your source (projected mesh) placed within the camera: Is it within the camera's field of view (then you could actually calculate your field of view from there), is it on your camera plate (which might also help) or is it “independent” from the camera's transform/matrix?
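As a sanity check of that relationship, the pinhole model ties focal length to field of view; a small sketch (the 36 mm horizontal aperture is an assumption here, full-frame style - Houdini's default aperture value differs):

```python
import math

def focal_to_hfov(focal_mm, aperture_mm=36.0):
    # Pinhole relation: fov = 2 * atan(aperture / (2 * focal)),
    # with "aperture" being the horizontal film-back width in the same units.
    return math.degrees(2.0 * math.atan(aperture_mm / (2.0 * focal_mm)))

# A longer focal length means a narrower field of view, hence the smaller,
# less skewed footprint on the projection target:
assert focal_to_hfov(50.0) > focal_to_hfov(100.0)
# With an 18 mm lens on a 36 mm back, the horizontal FOV is exactly 90 degrees:
assert abs(focal_to_hfov(18.0) - 90.0) < 1e-9
```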
Marc
Houdini Indie and Apprentice » how to camera project meshes
Moin,
I guess “correctly camera projecting meshes” depends on what you consider “camera projection” - from my perspective (pun intended) that *requires* a focal length, else it wouldn't be a “camera projection”.
Do you want to “parallel project” (orthogonally ray-cast) a mesh onto a projection target or do you want to “perspective-project” from a single point (pin-hole camera model)? If the latter, you *are* talking about “camera projection” and you *would* need a focal length, since that would define your field of view (or “what part of your projection target can actually receive any points from the projection”).
If you wanted to do a parallel projection, one approach might be to take the vector between the two bounding boxes (projected mesh and projection target) as your direction, then ray-cast every point from the projection mesh onto the projection target and do an “intersect()” (VEX) to find the target point (if there is one).
If you “only” want to “project” through your camera's pinhole, you would get a “projection result” using intersect with rays created from your projection-mesh through the pinhole position against the projection target. Technically, this doesn't require a focal length, but it's not exactly a “projection” I think.
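To make the distinction concrete, here is a minimal plain-Python sketch of the two cases (hypothetical helpers, not a hou/VEX API; camera pinhole at the origin looking down +z):

```python
def perspective_project(p, focal):
    # Project point p = (x, y, z) through a pinhole at the origin onto the
    # image plane z = focal (requires z != 0).
    x, y, z = p
    s = focal / z
    return (x * s, y * s)

def parallel_project(p, direction=(0.0, 0.0, 1.0)):
    # Orthographic: slide p along `direction` onto the plane z = 0;
    # the camera / focal length plays no role at all.
    x, y, z = p
    dx, dy, dz = direction
    t = -z / dz
    return (x + t * dx, y + t * dy)

# The perspective result scales with focal length ...
assert perspective_project((1.0, 0.0, 4.0), 2.0) == (0.5, 0.0)
assert perspective_project((1.0, 0.0, 4.0), 4.0) == (1.0, 0.0)
# ... while the parallel projection ignores it entirely:
assert parallel_project((1.0, 0.0, 4.0)) == (1.0, 0.0)
```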
Marc
Houdini Lounge » When is houdini 19 coming out?
Dear Mrs FDX3245,
for lack of any information about who you are or are not, I am probably free to assume you are a lady, therefore the addressing. If I should be mistaken, I am sorry - simply trying to give some balance here. I don't consider women less worthy of being addressed than men.
> Also, you might want to reconsider addressing a group of people, some of whom are definitely men, as ‘ladies’,
… Why? Because usually the ladies among us are being addressed as “guys” and that's the “natural” way? I think some of us men need to get used to being “meant inclusively” when someone says “hey, ladies” instead of “hey guys”. As you say, it's a “matter of taste”.
I apologize to anyone who doesn't like to read the brand “Adobe”. I also apologize to anyone who doesn't like to be reminded of development being a lot of work and sometimes things needing longer than anyone would wish for. I also apologize to anyone who is sick of reading my shit here. I wish for an ignore feature on this forum like the next one.
However, I do not apologize for voicing my opinion. THAT is something that some people need to get accustomed to as well: Freedom of speech is not just the freedom of ONE perspective alone. If I consider a statement not fair, I too have the right to say so. And I think I made it clear that I merely expressed my own impression, not eternal wisdom.
Finally, humour *IS* a matter of taste and I consider it possible and not necessarily a bad thing that we do not agree on this either.
Marc Albrecht
Houdini Lounge » When is houdini 19 coming out?
People, developing software most of the time is a team effort. With SideFX's developers mostly working from home and trying to communicate through whatever technical means they have available, things can't help but slow down MASSIVELY.
Yet, they have provided us with a “back-port” of some goodies PLUS they constantly push out builds for you to try out or not.
> expectations from them about houdini 18.5 are enormous.
… I don't consider this a fair statement. Not to the least.
Houdini is evolving all the time, if you look back just two years, a LOT has improved, exciting new features have been added - and it has not all been in .x releases, some of the stuff trickles in through production builds.
Be it a .5 or a .0 release, I am certain the changes and improvements coming will be a “game-changer” (no pun intended) for many of us. If we have to wait a few more months for it to happen: So what? It's not like we are in a “typical situation” right now. That virus-thing is still as active as it was in February/March, there's no cure/vaccine to it, Hollywootz is still on their knees and yet, the moon chases the sun as if purple was the new black. Or was it green.
If you're unhappy with the stream of updates AND the “do something constructive/creative” show going on right now here in the forums, buy into an Adobe subscription. That'll teach you patience. Photoshop STILL isn't fully 16-bit or, shush, linear floating-point capable. And that's about 20 years after most of us in the printing/photo industry adopted at least 16-bit color workflows.
Marc Albrecht
Edited by chrism - Oct. 1, 2020 12:06:36
Technical Discussion » Working with MIDI data
Moin,
in your sample you are scoping to note 60, but your midi file only uses notes 38 and 40. You need to scope to the notes you are using.
Marc
Houdini Lounge » karma or mantra which is better for PBR accuracy ?
Anonymous user/ess “pickled”, you are, of course, entitled to consider me a waste of time, just like I consider most of your comments the same. Nevertheless, for the protocol, let me state that:
> we ought to nevertheless use low-res concepts and ideas when we're casually communicating.
… you can take your low-res concepts to Facebook and the likes.
In my experience I have lost way too much precious lifetime talking to or with people endorsing their “low-res concepts” and casual communication. I know some people are really proud of not caring a bleep for anything and refusing to learn as they go along (I worked for “the industry”, you know).
Here, I would have preferred learning about other people's definition of “photo-realism” instead of being told to shut up if I don't write a paper about it.
Marc Albrecht
3rd Party » Redshift Crashes on startup
You are trying to load a GoZ plugin that has been compiled for Houdini 17.0. As the requester clearly says, that plugin needs to be (re-)compiled for Houdini 17.5. I am pretty sure that if you follow the information that is given by the requester, you won't get that requester (which isn't exactly a crash but an incompatibility warning).
As for Redshift: Make sure you have MATCHING VERSIONS. Houdini has a tendency to change its HDK constantly, and if your Redshift plugin isn't compatible with the (possibly daily-build) Houdini version you are running, you *will* actually experience crashes. If you want to be sure that it's actually Redshift that is causing crashes (like you say), simply deactivate Redshift, start Houdini and check if it still crashes.
But as long as you are seeing the GoZ requester, you're not talking about Redshift.
Marc
Houdini Lounge » karma or mantra which is better for PBR accuracy ?
What's your definition of “photo-realistic”? Sounds like an oxymoron to me :-) A photo is a subjective representation of what some people consider reality. It depends on everyone involved to define “reality” in an agreeable way to even MEASURE “realistic”, and most photos aren't MEANT to be “realistic”. Take a look at the “movie industry”, they aren't exactly about “realistic” representations; if they were, some well-paid actors and actresses would behave rather unacceptably after having seen how they, suddenly, look on the big screen.
That said: Both render engines you mention call themselves “unbiased” (to me there is no “real” unbiased rendering, since you will never REALLY cover all possible photons in a calculation, it's more about “more or less unbiased rendering”, but that's nit-picking). Therefore, in theory, both renderers can give you the same “real” (as in physically-correct-ish) look. Be careful with believing in that, in order to be physically correct, one needs to understand the basics of physics, and so far physicists are still debating that.
In the end, what most people mean by using the phrase “photo-realistic” is the exact opposite of “realistic”: They want to render what they consider “visually pleasing” to their own, subjective, mood-depending preferences. If you hand over a “physically plausible” render to them, they usually say “that looks mediocre”. If you hand over a photoshopped, artistically “improved” render, they'll go out of their way to praise you for the “realism”.
What you want is to LEARN how to use a render engine, any render engine, and create the images you want. What you PROBABLY want is a render engine that is easy to use, offers lots of tweaks and is fast enough to allow you to experiment. If you really wanted “realistic” (omitting “photo-realistic”), you wouldn't need to experiment, you'd simply hit “go” and be done. That's not going to happen.
Photos are ART. Art isn't done by clicking on a button and going home.
Marc Albrecht
(Note to self: Stop responding to single-post users, it's usually fruitfree.)
Edited by malbrecht - July 17, 2020 02:16:11
Technical Discussion » VEX get the vertex number
You need vertexprimindex [www.sidefx.com] to convert from linear (“global”) vertex ID to “local” (within the given primitive).
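For illustration, the mapping vertexprimindex performs can be sketched in plain Python (a hypothetical helper, not the hou API - given each primitive's vertex count, it walks the running offsets):

```python
def vertex_prim_index(prim_vertex_counts, linear_vertex):
    # Map a linear ("global") vertex number to (primitive number,
    # local vertex index within that primitive).
    offset = 0
    for prim, count in enumerate(prim_vertex_counts):
        if linear_vertex < offset + count:
            return prim, linear_vertex - offset
        offset += count
    raise IndexError("linear vertex out of range")

# Two triangles and a quad: linear vertex 7 is local vertex 1 of primitive 2.
assert vertex_prim_index([3, 3, 4], 7) == (2, 1)
```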
Marc Albrecht
Technical Discussion » Opening a network path with Houdini
If your network naming doesn't get recognized by the file requester, try using the IP path: //192.168.1.222/my3dstuff/… in the “Look in” line works fine on my setup (only that both the IP address and the folder path are mock-ups).
Marc Albrecht
Houdini Lounge » How to know if a node is camera (include cam HDA) ?
Hi,
have you tried reading the documentation? Node type for “cam” is … wait for it … “cam” :-)
Documentation says “node.type()” gives you the node type, so in Python you would do something like
hou.node("/obj/myCam").type().name()
which would return “cam”. You could traverse all your “/obj” nodes and check each node's type against “cam”.
Marc
Houdini Indie and Apprentice » Error: L_MEM_OBJECT_ALLOCATION_FAILUR
Moin,
> Anyone have any idea why this is happening?
Yes, sure.
However, having an idea doesn't help with solving the problem. Would you mind giving a bit of a CONTEXT to the error message?
2 GB of RAM isn't exactly comfortable for running OpenCL stuff; it's quickly eaten up by texture memory, for example. How much data are you sending to the GPU?
Have you tried switching to CPU OpenCL instead?
Again, some context might help in reading your mind.
Marc
Houdini Lounge » Well it's finally happening. I have word DIRECT FROM OTOY...
What Kays says … some companies are great at big words, but aside from selected “closed beta users' lab experimental show-offs”, their products have a hard time standing up to their claims.
I'll keep my mouth shut about Redshift, but Octane has been on my watch-list for some time now to replace the former. User experience I have read has not been THAT great and I am getting tired of companies promising or “mysteriously hinting at” the next-big-sliced-bread-invention. Actually delivering and actually making people OUTSIDE the “closed beta” laboratories make proper use of their tools would convince me. THAT I have to see.
For both render engines mentioned.
Marc
3rd Party » WIP: FBX HD & SD importer, Joints to Bones Converter and Morph-Helper (was: DAZ to Houdini converter)
RE: Contact/“Testing”
Please read what I wrote above, I am only and exclusively interested in dealing with real people, not with anonymous avatars via the web. If you don't want to contact me by mail, skype, phone or whatever, at least send me a PM through this forum and give some background about what you do, what experience with rigging and Houdini you have and what exactly you expect from the HDA (let me repeat that I am NOT doing a “DAZ-only” tool here, I am looking into supporting as many FBX rig flavours as I can - I do know, for example, that Blender exports are a nightmare and, typically Blender, not pipeline-friendly in any sense of the word).
Animation Import Support
While this is technically doable and I did have some success with tests, I am not following this road at the moment due to changes in the hosting platform. I am concentrating on supporting parallel deformation (e.g. for high res meshes from DAZ, but also for multi-layer and other types of “special case” workflows), morphs and, hopefully soon, animator-friendly “starting points” (rig-supports).
Material magic is on my list, but since Redshift is a definite no-go and Mantra is a tiny bit on the slow side, I am looking into creating a render-agnostic material-setup system. Lots of experimentation to do there and this is unpaid time, so it may take me several months.
Marc
Technical Discussion » Rigged Character- Rigging Bug when moving in World Space
Ah, I see what you mean - when you wrote that “the arms collapse”, I really only looked at “the arms”, not at the elbows.
The problem seems to be somewhere in the “arms noodle” setup, I assume there is a rotation or maybe a scale messed up. As soon as I have a free system available, I'll try to figure it out …
Marc