Found 45 posts.
Technical Discussion » How to export even the most basic FBX animation?
- localstarlight
- 45 posts
- Online
Hey, thanks so much. Yeah, I was able to export it using the File/Export menu. Thanks!
Technical Discussion » How to export even the most basic FBX animation?
- localstarlight
- 45 posts
- Online
I've used Houdini for a few years now, but never actually needed to export any animation from there.
Now I'm trying to do the most absurdly simple animation, and cannot figure out how to export it as an FBX.
For testing purposes I simply want to animate a cube around a scene and export both the cube and the animation clip for bringing into a game engine (Playcanvas).
I'm aware that FBX animation requires a bone hierarchy, so I've tried creating a single bone and pairing it with the cube. I've then animated the chain root.
I've tried using the ROP FBX Output node to export the animation using the 'Export Animation Clips (Takes)' section, but that doesn't seem to work. I've also tried the ROP FBX Animation Output node, but that complains that it "can't find 'name' nor 'path' point attribute".
I'm totally lost. How can I export this super simple FBX animation?!
I have attached the simple example hip file.
PDG/TOPs » Problem with TOPs and texture baking with Arnold + Redshift
- localstarlight
- 45 posts
- Online
OK, so I've figured out what's causing the problem, and it's nothing to do with Arnold or the filename as speculated above, because the same error occurs with Redshift.
It actually has to do with an HDA I am using to light the scene. I have a 'stage light' (attached) which consists of a mesh and either an Arnold or Redshift spotlight. Simply having one of these in the scene, even with both lights switched off, causes the error I outlined above and makes the baking process fail.
Does anyone know why this would be? Could it be to do with having the lights not on obj level?
PDG/TOPs » Problem with TOPs and texture baking with Arnold + Redshift
- localstarlight
- 45 posts
- Online
I'm trying to use TOPs to run a series of bakes using Arnold. I've successfully done the same process using Redshift, but have hit a problem with the way the texture baker seems to work with filenames when trying to use TOPs.
With Arnold, you set an output filename (in Output/Output Picture), such as: $HIP/bake/LM_Light_B.exr
And then in the 'Baking' tab you set your list of objects to bake, for example: /obj/TestSphere
It doesn't seem to give you any control over the final filename (unlike Redshift), so the final output file from this example would be called:
'Light_A_obj_TestSphere.exr'
The baking process works fine when running off the ROP itself, but when using a ROP Fetch and trying to run it, it gives this error:
Work item 'ropfetch1_23' lists file 'F:/XXX/bake/LM_Light_B.exr' as an expected output file, but it wasn't found when cooked
So for some reason it is looking for a file named after what's set in Output/Output Picture, rather than the concatenated filename the output will actually have. This stops it from even rendering the bake.
Anyone know if there's a way to fix this, or a way around it?
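For reference, the renaming I'm seeing could be sketched like this (the exact rule is my guess from one example, not documented behaviour):

```python
from pathlib import Path

def arnold_bake_filename(output_picture: str, obj_path: str) -> str:
    """Sketch of the name the baker seems to produce: the stem of
    Output Picture plus the baked object's path with '/' turned into '_'.
    Assumption inferred from observed output, not from any docs."""
    p = Path(output_picture)
    return f"{p.stem}{obj_path.replace('/', '_')}{p.suffix}"

# e.g. arnold_bake_filename("LM_Light_B.exr", "/obj/TestSphere")
#      → "LM_Light_B_obj_TestSphere.exr"
```

So the ROP Fetch expects the literal Output Picture path, while the file on disk gets the concatenated name.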
Edited by localstarlight - March 9, 2021 11:00:16
PDG/TOPs » How to set 'render with take' with an attribute?
- localstarlight
- 45 posts
- Online
I want to wedge multiple versions of something using the take system in Houdini, and although the 'Render with take' parameter on a ROP node is a string parameter, I can't figure out how to drive it from a TOPs attribute. It only seems available as a drop-down menu with all available takes in it. How can I access this parameter as a string so that I can drive it from my TOPs net?
Houdini for Realtime » Houdini Data Interface for UE4's Niagara
- localstarlight
- 45 posts
- Online
I had been hoping to use the Houdini -> Niagara workflow to bring a particle (whitewater) sim into Unreal, but I'm starting to get the sense that might not be possible with this system?
Does this only work with a set number of particles? Whitewater has varying numbers of particles per frame (though each one has an id).
I want to bring the ids and positions over time into Niagara and use them to spawn bubble meshes as the particles (not doing anything with foam or spray, just a few hundred bubbles).
The Niagara ROP doesn't seem set up for this, it seems like it's only for RBD sims, or am I misunderstanding something?
Is there a way to do this?
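To be concrete about the data I want to carry across, here's a rough sketch (made-up numbers, hypothetical layout) of the per-frame shape — note the particle count changes between frames:

```python
# Hypothetical whitewater export: per frame, a dict of particle id → position.
# Counts differ per frame; an id persists for as long as that particle is alive.
frames = {
    0: {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0)},
    1: {1: (0.1, 0.2, 0.0), 2: (1.1, 0.1, 0.0), 3: (0.5, 0.0, 0.5)},
    2: {2: (1.2, 0.2, 0.0), 3: (0.6, 0.1, 0.5)},  # particle 1 has died
}

def alive_ids(frames, frame):
    """Ids present on a given frame."""
    return sorted(frames[frame])
```

That varying per-frame count is exactly what the Niagara ROP doesn't seem set up for.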
Technical Discussion » How to export normal information in blend shapes?
- localstarlight
- 45 posts
- Online
I'm trying to export an FBX with blendshapes for use in Unreal Engine 4, and although the vertex deformation is getting exported correctly, the corresponding normal blending is not happening - the mesh is retaining the normal information from the first frame of the animation.
I've attached a hip file which shows a very basic setup illustrating the problem. As you can see, in Houdini, the normals are being correctly blended - if you scroll from frame 0 to 100 you can see the normals changing from the first blendshape to the second one.
However, no matter how I try to export this, when I bring it into any other piece of software (Modo, Maya, UE4) there is no blending of normal information.
I've tried having the normals as vertex normals, point normals, and both at the same time, but it makes no difference.
What is the correct way to get blendshapes to carry normal information for use in other software?
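For clarity, the blending I can see happening inside Houdini is essentially this (a sketch of the viewport behaviour, not of how FBX actually stores it):

```python
import math

def blend_normal(n0, n1, t):
    """Linearly blend two normals and renormalize — the behaviour I see
    in the viewport as the blendshape weight goes from 0 to 1."""
    v = [a + t * (b - a) for a, b in zip(n0, n1)]
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)
```

It's this interpolated normal that's missing once the FBX is opened elsewhere — the mesh keeps the t=0 normals.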
Houdini Engine for Unreal » HDA won't create multiple material instances
- localstarlight
- 45 posts
- Online
Hi,
I had to leave this part of my project for a while and I'm now just coming back to it, thanks for your response (and the fix!) and sorry for the slow response.
However, I'm having a slight issue with this now:
When I bring in the Sectioned Sphere HDA with only 1 subdivision, it assigns all the textures correctly, but if I change the subdivisions to a higher number (as I need to), then it doesn't correctly assign the “diffuse” parameter to the correct texture any more. I have attached a UE4 project for you to take a look at. You'll see that when Subdivisions is set to 1, the top row of textures run A0 to A3 and the bottom row run B0 to B3. However, setting the Subdivisions to 2 or higher, the bottom row now also runs from A0 to A3.
Looking at the HDA in Houdini, in the geometry spreadsheet it does seem like the values are correctly assigned there, so I'm not sure where the error is creeping in.
It's not an insurmountable problem for now, as I can just go through and manually change all the diffuse assignments for the lower row myself, but I'm hoping to be able to automate the creation of a sphere with a large number of tiled texture assignments in the future, where manually changing them will be annoying.
Any idea what's happening here?
Houdini for Realtime » How to export ROP Volume Texture with colour information?
- localstarlight
- 45 posts
- Online
I want to be able to export a volume texture using the ROP Volume Texture, but retaining the colour information in the volume.
But even with colour information as a Cd channel in VDB (which I can see is there in the Volume Visualization SOP), the output is always black and white.
Is there a way to retain and export the colour information?
As a possibly related question, when I try to connect up a pyro sim to the ROP Volume Texture node, it doesn't work and throws this error:
Invalid source /obj/geo1/rop_volume_texture1/vdb2
Error: ArithmeticError: Non-zero scale values required
This is from running a basic test creating a sphere and using the shelf ‘Explosion’ tool.
Any ideas?
Houdini for Realtime » Imposter resolution
- localstarlight
- 45 posts
- Online
Hey Mike,
Thanks! I'm not trying to do animated imposters right now. Good to know about the materials too. Was only using the pig head to test it out anyway.
Thanks again.
LS
Houdini Engine for Unreal » HDA won't create multiple material instances
- localstarlight
- 45 posts
- Online
I have an HDA which is essentially a sphere partitioned into a number of sections. For each section (which might consist of a number of primitives), I would like to assign a different material instance with a different diffuse texture.
So, if I create a sphere with 8 sections (and for simplification here imagine each section is just one primitive), then here is what I have in my 'unreal_material_instance' attribute column:
Material'/Game/Materials/EarthHoudiniTile_0.EarthHoudiniTile_0'
Material'/Game/Materials/EarthHoudiniTile_1.EarthHoudiniTile_1'
Material'/Game/Materials/EarthHoudiniTile_2.EarthHoudiniTile_2'
Material'/Game/Materials/EarthHoudiniTile_3.EarthHoudiniTile_3'
Material'/Game/Materials/EarthHoudiniTile_4.EarthHoudiniTile_4'
Material'/Game/Materials/EarthHoudiniTile_5.EarthHoudiniTile_5'
Material'/Game/Materials/EarthHoudiniTile_6.EarthHoudiniTile_6'
Material'/Game/Materials/EarthHoudiniTile_7.EarthHoudiniTile_7'
And here is what I have in my 'unreal_material_parameter_diffuse' column:
Texture2D'/Game/Textures/A_0.A_0'
Texture2D'/Game/Textures/A_1.A_1'
Texture2D'/Game/Textures/A_2.A_2'
Texture2D'/Game/Textures/A_3.A_3'
Texture2D'/Game/Textures/B_0.B_0'
Texture2D'/Game/Textures/B_1.B_1'
Texture2D'/Game/Textures/B_2.B_2'
Texture2D'/Game/Textures/B_3.B_3'
I have created all those material instances in UE4, and they have a parameter called ‘diffuse’ which is a texture parameter.
Most of this seems to work - I get the geometry in, it has the right number of material slots, with materials assigned to them. However, only the first material is correctly created as an instance with the diffuse texture parameter set. All the other materials are directly linked to the UE4 materials I created, and the parameter is not changed. If I look in the HoudiniEngine/Temp folder there is just one material instance there, called ‘EarthHoudiniTile_0_instance_DFA94D48’.
So it seems like the first material slot has had its material instance created, with the parameter set, but all the rest have been skipped somehow.
Is this a bug, or am I just going about this the wrong way?
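To make the mapping explicit, the eight values pair up section-by-section like this (sketched in Python just to show the pattern — hypothetical helpers, not part of my HDA):

```python
def material_refs(n):
    """Per-section unreal_material_instance values, one per section."""
    return [
        f"Material'/Game/Materials/EarthHoudiniTile_{i}.EarthHoudiniTile_{i}'"
        for i in range(n)
    ]

def diffuse_refs(rows=("A", "B"), cols=4):
    """Per-section unreal_material_parameter_diffuse values:
    row A tiles 0-3, then row B tiles 0-3."""
    return [
        f"Texture2D'/Game/Textures/{r}_{c}.{r}_{c}'"
        for r in rows
        for c in range(cols)
    ]
```

Only index 0 of that pairing ends up as a real material instance in HoudiniEngine/Temp; indices 1-7 fall back to the base materials.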
Houdini for Realtime » Imposter resolution
- localstarlight
- 45 posts
- Online
Hey Mike,
That's what I assumed at first as well, but when I have the project open, it says ‘Houdini Indie Limited Commercial’ in the top left corner.
I have just created another project from scratch with the basic setup, definitely in Indie not Apprentice, and it's got the same issues. See attached.
It outputs 1440x1440 and the BaseColor pass is totally black.
Houdini for Realtime » Imposter resolution
- localstarlight
- 45 posts
- Online
Hi,
I'm trying out the new imposter workflow, following the tutorial here: https://www.sidefx.com/tutorials/generating-impostor-textures/
I'm trying to render out using the following settings:
Sprite Resolution: 256x256
Frames Around Z: 16
So this should give me a 4096x4096 texture atlas. However, no matter what I try, it gives me a 1440x1440 atlas.
I can't see anything in the settings which should be reducing the resolution, and this isn't mentioned in the tutorial (I don't think). I'm working on Houdini Indie, so should have unlimited output resolution for still images.
What's going on?
Additionally, Beauty and Normal are rendering out fine but the BaseColour is rendering out totally black.
Thanks for any help!
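The arithmetic I'm expecting, assuming the atlas is a square grid of sprites with one column per frame around Z:

```python
sprite_res = 256        # Sprite Resolution (square sprites)
frames_around_z = 16    # columns (and, I'm assuming, rows) in the atlas

# Assuming a square 16x16 grid of 256px sprites:
atlas_res = sprite_res * frames_around_z
print(atlas_res)  # 4096 — not the 1440 I'm actually getting
```

1440 isn't even a multiple of 256, which is why I suspect something is clamping the render resolution rather than the sprite layout.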
Technical Discussion » Why can't I export FBX?
- localstarlight
- 45 posts
- Online
Ah, you're totally right. Just upgraded to 16.5 but hadn't sorted out the license server properly.
Thanks!
Technical Discussion » Why can't I export FBX?
- localstarlight
- 45 posts
- Online
I'm using Houdini Indie 16.5 and I'm pretty damn sure that I have successfully exported FBX in previous versions, but I cannot get it to work at all now.
If I try to do it in ROPs then I get this error:
“FBX Export is only supported in Houdini Core and Houdini FX versions.”
If I try to export using File -> Export -> Filmbox FBX then nothing happens at all, no error, but no saved FBX file.
What's going on?
As I said, I'm sure that I've successfully saved out FBX from an earlier version of Houdini Indie.
Has FBX export been removed for some reason?
Houdini Indie and Apprentice » Cannot extrude front or back from Alembic imported curve
- localstarlight
- 45 posts
- Online
I am trying to create a vector curve in Adobe Illustrator to extrude in Houdini. I read elsewhere that you need to convert to Alembic via Cinema 4D in order to be able to do this. So I've exported from AI to C4D (Illustrator 8 file), then exported .abc from C4D and imported into Houdini using the Alembic node.
On the Alembic node, I have set ‘Load As’ to be ‘Unpack Alembic Delayed Load Primitives’.
Then I am trying to use the PolyExtrude node, which works fine except for the fact that it doesn't allow me to output the front or back - ticking them does nothing.
Attached are two images, one showing a curve created inside Houdini which has been extruded and has a front and back, and then the curve created in AI which does not have front or back.
What do I need to do to be able to output front and back on an extruded imported curve?
Edited by localstarlight - June 23, 2017 12:17:22
Technical Discussion » Geometry Cache Rendering Workflow
- localstarlight
- 45 posts
- Online
I just found this thread on odforce: http://forums.odforce.net/topic/19857-render-dependencies-bug/
Which also suggests using the Batch ROP as a workaround, but says that SESI confirmed this as a bug. That was back in 2014.
So shouldn't this work without needing to use the Batch ROP?
Technical Discussion » Geometry Cache Rendering Workflow
- localstarlight
- 45 posts
- Online
I am trying to set up a render dependency workflow where the Mantra render node is dependent on a DOP simulation being cached. For some reason, even with “Node by Node” selected on the Mantra ROPs overrides, it seems to be going frame by frame.
Here's what I have:
1. A geometry ROP which is reading from the DOP network via DOP Import Field node, and then writing bgeo out.
2. The bgeo cache is then read back in to OBJ level using a File node.
3. The Mantra node is then rendering this OBJ node.
4. The Geometry ROP is connected into the Mantra ROP.
5. The Mantra ROP has the setting Dependency Render Settings -> Order -> Node by Node
See attached image which shows most of this.
My understanding was that with this setup, when I tell the Mantra node to render, it will first cache out the bgeo sim before rendering a single frame. What actually happens is that it sims each frame and then renders it.
Anyone know what I'm doing wrong?
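To spell out the two orderings: 'Node by Node' should finish the whole cache before any render frame starts, whereas what I'm seeing is frame-interleaved. A sketch of the difference:

```python
def node_by_node(nodes, frames):
    """Each dependency node finishes all frames before the next node starts
    — what I expected from the 'Node by Node' setting."""
    return [(n, f) for n in nodes for f in frames]

def frame_by_frame(nodes, frames):
    """Every node runs for frame 1, then frame 2, ... — what I'm observing."""
    return [(n, f) for f in frames for n in nodes]

nodes = ["geometry_rop", "mantra"]
frames = [1, 2, 3]
# node_by_node:   sim frames 1-3, then render frames 1-3
# frame_by_frame: sim 1, render 1, sim 2, render 2, ...
```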
Technical Discussion » Basic question about procedural modelling in Houdini
- localstarlight
- 45 posts
- Online