Without seeing the actual asset, it's difficult to guess what might be wrong. I suspect some of the node paths inside the asset are absolute paths, which generally won't work. Could you check whether the node paths that the RBD object uses are relative? For example, the shelf tool will use a node path like `opinputpath("/obj/box_object1/dopimport1", 0)` to import geometry from SOPs into DOPs. However, a path like that won't work in general inside an asset. It'd have to be changed to something like `opinputpath("../../box_object1/dopimport1", 0)`. I've attached a simple RBD asset as an example.
This question is probably better suited in the “Technical Discussion” board, since this board is for the Houdini Engine for Maya plugin. I can try to answer the question though.
The “hairgen” object is actually being exported as a single point. This is because the hairgen object is set up to render using procedural geometry by default. If you change “Render -> Hair Generation” to “Use SOP Geometry”, you should be able to export the dense hair to Alembic.
The Maya plugin imports Houdini particles as a standard Maya nParticle, so Maya's built-in nParticle cache works on the imported particles. This essentially converts the .bgeo cache into Maya's own caching format. You can then attach the Maya cache to an nParticle without needing Houdini Engine.
Note that when you use Maya's nParticle cache to cache the Houdini Engine nParticle, Maya will likely print an error at the end saying it can't attach the cache to the nParticle. This happens because the nParticle is still connected to Houdini Engine. The caching has actually completed at that point, so it's safe to ignore the error.
If you have custom point attributes on the particles, Maya will cache them as well. However, when you attach the Maya cache to an nParticle, you have to manually create the equivalent per-particle attributes on the nParticle; otherwise, Maya won't read them in. The easiest way to get all the per-particle attributes imported is to duplicate the nParticle node that was created by the Maya plugin, since that node already contains all the necessary per-particle attributes. Then, attach the nCache to the duplicated particle.
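As a rough sketch of that duplication step (the node names here are placeholders — your asset's nParticle node will be named differently):

```mel
// Duplicate the nParticle node created by the plugin so the copy
// keeps all the per-particle attributes. Node names are hypothetical.
duplicate -name "cachedParticle" "houdiniAsset1_particleShape";
// Then attach the nCache to the duplicate, either via
// nCache > Attach Existing Cache File, or with the cacheFile command.
```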
You're not missing anything. Unfortunately, handles aren't supported in the Maya plugin (yet). The usual workaround is to attach something like Maya locators to those attributes manually. To automate these setups, some users keep scripts on the Maya side that instantiate assets and hook up various things, like creating and attaching handles.
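The locator workaround is just a standard attribute connection. A minimal sketch — note that "myAsset" and the parameter attribute name are placeholders; check the actual attribute names on your asset node with `listAttr`:

```mel
// Create a locator to act as a stand-in handle.
spaceLocator -name "handleLoc";
// Drive the asset's translate-style parameter from the locator.
// "houdiniAssetParm_t" is a hypothetical attribute name.
connectAttr -force "handleLoc.translate" "myAsset.houdiniAssetParm_t";
```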
You might need to click the “Sync” button after the asset cooks, so that the plugin creates a mesh node to output the fluid surface. When the asset first loads, it doesn't output any geometry, so no mesh node was created.
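If you'd rather do this from a script, the same sync can be triggered with the `houdiniAsset` command (the node name here is a placeholder for your asset node):

```mel
// Recreate the output nodes after the asset has cooked.
houdiniAsset -sync "myAsset";
```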
This may be caused by the evaluation mode. In Preferences -> Animation, under “Evaluation Mode”, make sure “DG” is selected. The new evaluation mechanism is not supported yet.
Because an HDA can contain multiple assets, you need to pass the name of the asset as the second string. For example:
`houdiniAsset -loadAsset "T:/foo.hda" "Object/foo"`
`houdiniAsset -loadAsset "T:/foo.hda" "Sop/foo"`
You can list the assets in an HDA by running: `houdiniAsset -listAssets "T:/foo.hda"`
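Putting the two commands together, a typical session looks like this (using the same example paths and asset names as above):

```mel
// First list the assets contained in the HDA...
houdiniAsset -listAssets "T:/foo.hda";
// ...then load the one you want by name.
houdiniAsset -loadAsset "T:/foo.hda" "Object/foo";
```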
Unfortunately, it's not possible to output attributes for curves at the moment. This is a limitation of how curve output is set up in the Maya plugin.
The Maya plugin that comes with Houdini 16.0 is only compatible with 16.0, so it won't run in 15.5. Since you probably want the change in 15.5, I've backported it to 15.5. The change should be in the next 15.5 build.
In 15.5 (and older), `AEhoudiniAssetSetInputToSelection` is the best way. In the upcoming 16.0, the input mechanism has been largely rewritten, and there's a more straightforward function that doesn't rely on selection.
By default, Houdini Engine loads Houdini's libraries into the Maya process, so the problem could be due to library conflicts between Houdini and Maya. Could you start Maya with no plugins loaded at all, and then try loading just the Maya plugin?
Another thing to try is switching to a named pipe (auto-start server). That would avoid loading Houdini's libraries into Maya's process.
I just tried fairly dense particles with the scene that I uploaded, but I still can't reproduce the issue you described. Can you try using Maya's own particles and see if you can reproduce the same issue?
I was able to reproduce this. It's actually a bug in how the file type filters were handled; the file browser should have used the `*.*` filter. I've just fixed the issue, and the fix will be in Thursday's build (15.5.716).
kahuna031 wrote: Trying the ramps. It seems like the first segment outputs NaNs. Not until I have two points at x=0 (thus removing the first segment) do I get correct evaluations. The points that query the first segment disappear.
The Maya plugin can't output per-face material assignments yet. However, it is possible to output material assignment information that refers to a Maya material already existing in the Maya scene. This is done by setting the “maya_shading_group” primitive attribute. When the output geometry contains this attribute, the plugin assigns the existing Maya material to the corresponding faces. See the attached scene file for an example.
The original intent of this attribute was to save the per-face material assignment of an input geometry and restore it when the geometry is output back into Maya. However, it can also be used (abused?) to output material assignments.
The plugin provides just two MEL commands: `houdiniEngine` and `houdiniAsset`. You can use the `help` command to see the flags: `help houdiniEngine` and `help houdiniAsset`.
For simply changing parameters, you just need `getAttr`, `setAttr`, and `connectAttr`. For syncing to create the output nodes, you'd need `houdiniAsset -sync myNodeName`. These should cover most cases.
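A small sketch of that workflow — "myAsset" and "houdiniAssetParm_scale" are placeholder names, standing in for your asset node and whatever attribute its parameter maps to (check with `listAttr myAsset`):

```mel
// Read a parameter, change it, then sync the outputs.
// Attribute names are hypothetical; inspect your node for the real ones.
float $s = `getAttr "myAsset.houdiniAssetParm_scale"`;
setAttr "myAsset.houdiniAssetParm_scale" ($s * 2.0);
// Recreate the output nodes after the asset cooks:
houdiniAsset -sync "myAsset";
```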
These can be run from Python by using `maya.cmds` and `maya.mel`.
The plugin also uses several MEL scripts that can be found in the scripts directory, but they are more for internal use.