I updated this for Houdini 20.5 and added a "stopped" attribute, set to 1.0 when a particle is less than 0.05 units from its target.
Seems to work.
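A minimal sketch of how that could look in a POP Wrangle, assuming each particle carries a `v@target` point attribute holding its goal position (the attribute name is an assumption, not from the original setup):

```vex
// POP Wrangle: flag particles that have (nearly) reached their goal.
// v@target is assumed to hold each particle's goal position.
if (distance(@P, v@target) < 0.05) {
    f@stopped = 1.0;
    v@v = {0, 0, 0};  // optionally freeze the particle in place
}
```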
Search results: found 7 posts.
Houdini Indie and Apprentice » Pile of particles form another object
- Ed B
- 7 posts
- Offline
Technical Discussion » How could I rotate "Copied to Points" geos randomly...?
That instanced-geo link on SideFX has moved; it's now here: Copying and instancing point attributes [www.sidefx.com]
Technical Discussion » Redshift + Triplanar reference?
Bump for April 2024, now that Redshift supports rest position natively; I'm trying to set up the RS OSL Triplanar Coordinates node from their GitHub [github.com]
My textures are swimming because my geo deforms over time. I have a load of cloned softbodies going through Vellum, so at some point in the network I've added a Rest SOP, with a TimeShift locked to frame 0 wired into its second input, set to store the rest position and rest normal. That should freeze frame zero as the reference frame, letting the RS TriPlanar node stick to the deforming geometry. The point of using the OSL node is just to eke out more control over the setup.
Supposedly this is all you need in Houdini for the State node to pick up the rest and rest normal as the Reference Object, so that the TriPlanar node (or indeed the TriPlanar Coordinates OSL node) pins the triplanar to deforming geo.
Alas, this is not a documented workflow in Houdini (it's barely documented in C4D either, to be honest). Any ideas?
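For reference, the Rest SOP setup described above can also be reproduced by hand in a Point Wrangle whose second input is the TimeShift locked at frame 0. A minimal sketch; the attribute names below are assumptions, and the part to verify against the Redshift docs:

```vex
// Point Wrangle, second input wired to a TimeShift locked at frame 0.
// Copies the frame-0 position and normal onto the deforming geometry.
// Attribute names ("rest", "restN") are assumptions; match them to
// whatever names the renderer actually reads.
v@rest  = point(1, "P", @ptnum);
v@restN = point(1, "N", @ptnum);
```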
Technical Discussion » PDG used to evolve
I'd like to attempt to bump this thread (failing that, I'll repeat Simon's question).
I want to use PDG to version out trees, animals, insects, jellyfish for a film about using AI for evolution.
For each base model, the TOPs parameters that get randomised will be the base form (a, b or c), the number of legs/branches/eyes, scale, point colour, texture, and scales/skin. This could be further complicated by choosing from a variety of base models for each auxiliary feature.
I'd like, as a principle of my project, to design the flora and fauna, but augmented by AI bias or randomised biases. The alternative is to use a gen AI to just concept them as images, which I don't love. In that case I'd prefer to get an output text list of attributes for each animal and just design it; if it came to that, I could do it with just a detail wrangle in a for-loop.
No: I want a bunch of semi-random designs output as 3D models with randomised attributes, preferably with the point/prim attributes and groups intact so I can edit them and apply materials.
Am I crazy, or is this the exact use case for which PDG could be useful?
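The "text list" fallback mentioned above (a detail wrangle in a for-loop) could be sketched roughly like this; the attribute set and counts are illustrative, not a finished design:

```vex
// Detail Wrangle: print one line of randomised design attributes
// per creature. All ranges and the creature count are arbitrary.
int count = 10;
string forms[] = {"a", "b", "c"};
for (int i = 0; i < count; i++) {
    string base  = forms[int(rand(i + 0.1) * 3) % 3];   // base form a/b/c
    int    legs  = 2 + int(rand(i + 0.2) * 7);          // 2..8 legs
    float  scale = fit01(rand(i + 0.3), 0.5, 2.0);      // overall scale
    printf("creature %d: form %s, legs %d, scale %.2f\n",
           i, base, legs, scale);
}
```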
Solaris and Karma » How to use json file to make Redshift custom builds work?
This is the JSON that actually works! Excuse me while I go comment under some YouTube videos on the topic.
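The working JSON itself isn't quoted above, but for anyone landing here from a search: a Houdini package file for a Redshift build typically takes this shape. The version numbers and paths below are placeholders, not the actual working values:

```json
{
    "env": [
        { "HOUDINI_PATH": "C:/ProgramData/Redshift/Plugins/Houdini/20.5.278" },
        { "PATH": "$PATH;C:/ProgramData/Redshift/bin" }
    ]
}
```

Dropped into the user Packages folder as e.g. `redshift.json`, this is the packages-based alternative to editing houdini.env directly.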
Technical Discussion » Cannot determine preloaded icons for shelf error
I just got this error after installing Megascans into the Packages folder and appending Redshift to the houdini.env file.
Undoing that and reporting back... yep, deleting the Redshift lines (which may have been erroneous) restored the icons.
So maybe take a look at editing the .env before deleting all your stuff.
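For comparison, Redshift lines in houdini.env look roughly like the following (paths are placeholders); a malformed line here is exactly the sort of thing that can trigger the shelf icon error:

```
# houdini.env -- example Redshift entries (paths are placeholders)
PATH = "$PATH;C:/ProgramData/Redshift/bin"
HOUDINI_PATH = "C:/ProgramData/Redshift/Plugins/Houdini/20.5.278;&"
```

The trailing `;&` on HOUDINI_PATH appends Houdini's default search path; omitting it is a common cause of missing shelves and icons.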
Technical Discussion » Exporting cameras from Maya into Houdini
Finally someone says to use the File > Import menu rather than a File node.
(For search engines:) this is the way to get cameras from Maya / Cinema 4D imported into Houdini.
Thank you, Peter.