Brian, this is a very interesting scene.
I see you have a pre-transform on the “hip” node. Not saying it's actually causing any problems - it seems to be properly accounted for (in solve_ik, via origin expressions) - it's just that I have some prejudice against pre-transforms. Anyway, I'm going to take a closer look at the scene.
I did some initial tests with VOP SOP before turning to Python - essentially the same as in the scene I've posted but coded in Vex. Though, in contrast to your scene, my Vex was mostly a single inline with a chunk of code in it…
BTW, I guess the solution can be calculated in COPs as well (again, using either Vex or Python) - store solution in pixels, use pic/picni to read the values. Perhaps not really practical, but seems amusing nonetheless.
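To make the pixel-storage idea a bit more concrete, here's a toy Python stand-in - one scanline holds the solved values, and the two lookups mimic a non-interpolated vs an interpolated read. This is only an illustration of the idea, not how pic/picni actually address a COP:

```python
def sample_nearest(row, u):
    """pic()-style lookup: nearest pixel for parametric u in [0, 1]."""
    i = min(len(row) - 1, int(u * len(row)))
    return row[i]

def sample_linear(row, u):
    """picni()-style lookup: linear interpolation between pixel centers."""
    x = u * (len(row) - 1)
    i = min(len(row) - 2, int(x))
    t = x - i
    return row[i] * (1.0 - t) + row[i + 1] * t
```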
Technical Discussion » Custom IK solver
- axebeak
- 51 posts
- Offline
jason_iversen
Just in case you didn't know about it, there are such things as “Python Objects”.
Ah yes, I tried this but ran into some problems… This was some time ago though, I can't remember the exact details right now, but it was something to do with forcing it to recook properly due to chain dependencies.
Still, the problems were most likely due to my own error, so this must be investigated further.
Technical Discussion » Custom IK solver
arctor
can I ask why you're implementing a custom solver?
I want to export unbaked IK animations for real-time use (in a game). So I must replicate the IK solution exactly at run-time, and this seems like the easiest (if not, indeed, the only) way to achieve it.
The scene shows just the simplest two-bone chain, but I need around a dozen such specialized solvers - mostly short chains for limbs; a reverse-foot leg is the most obvious example.
There are numerous reasons why I want unbaked IK, some very specific to my real-time system… But, in general, it makes certain things easier to code, faster and more predictable - things like character limbs interacting with the environment, adjusting legs to slopes, aiming with a weapon etc.
Now, I guess, this can also be used to build fully equivalent rigs and animate in different packages. I tried something like this several years ago between LW and XSI, albeit with limited success.
Technical Discussion » Custom IK solver
I'm trying to implement a custom IK solver, CHOPs seems like the natural place for such things and this is where I started. I've got an almost working CHOP prototype, but the whole setup is rather cumbersome.
The scene I'm posting here appears to be a much simpler solution. However this seems like a somewhat unorthodox way of doing things, so my question is: what options exist for implementing such a system in Houdini? Also is there something obviously wrong with my approach of doing calculations in SOPs?
About the scene:
I'm a programmer, not an artist, so modeling is rough and weighting is obviously defective.
There are no bones at all in this scene - the joints are just Nulls with cregions.
Paths to chain nodes are hard-wired in the Python code for simplicity.
The schematic picture below is supposed to illustrate the setup, but it's quite simple - I have a Python operator in SOPs that calculates the chain angles based on the control Nulls' transforms, the results are stored in geometry attributes and then read into appropriate joints via simple expressions.
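In case a concrete reference helps: for the plain two-bone case, the angle math such a Python operator has to do boils down to the law of cosines. A minimal, Houdini-free sketch - function and parameter names are mine, not from the posted scene:

```python
import math

def two_bone_ik(l1, l2, target_dist):
    """Analytic two-bone IK in the chain plane (law of cosines).

    Returns (root_angle, mid_angle) in radians: the root angle is
    measured from the root->target direction to the first bone, the
    mid angle is the interior bend between the two bones.
    """
    # Clamp the target into the reachable annulus [|l1 - l2|, l1 + l2].
    d = max(abs(l1 - l2), min(l1 + l2, target_dist))
    # Interior angle at the middle joint.
    cos_mid = (l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2)
    mid = math.acos(max(-1.0, min(1.0, cos_mid)))
    # Angle at the root between the first bone and the target direction.
    cos_root = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d)
    root = math.acos(max(-1.0, min(1.0, cos_root)))
    return root, mid
```

With equal bone lengths and the target at full reach the chain goes straight (root angle 0, interior angle pi), which is an easy sanity check before wiring the values into joint expressions.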
Technical Discussion » Enforcing dependency
Houdini Lounge » GLSL materials
Since H10 I've been working on a GLSL material preview system for my personal project. Basically it allows you to quickly set up and preview materials in Houdini, then export and do the final tweaks in the real-time rendering system. This essentially means implementing all the material rendering techniques available in the real-time system, as well as bypassing Houdini lighting and doing your own.
The system is rather big and still in development, but I have a number of small self-contained tests produced while working on it. I'm posting some of these scenes here in case anyone is interested.
H11_mtl_test - this essentially demonstrates the whole setup, albeit in a very simplified form. You can see the material parameters GUI here; the shader implements simple bump-mapping and specularity techniques without any lighting. Note that I'm not using h/w cubemaps in the shader - instead it does a software lookup from a strip-formatted 2D image. Yes, I know about isixpack, but I want to be able to read cubemaps from COPs and this is the only way I have found so far. The environment map here is one of the standard textures that come with Houdini - the cubemap faces are extracted in COPs, then downscaled and reassembled as a strip. The base texture in this scene is the standard Houdini UV chart; the specular mask and normal map are both generated in COPs;
H11_mtl_test_wall2 - basically the same scene as above, but using a texture set from cgtextures.com;
H11_plx_test - a basic parallax mapping test (not parallax occlusion) using sample textures from the DX SDK; this one also shows how to do a custom head-light;
I have a number of other potentially interesting scenes, but they require some clean-up - these include more material rendering techniques, custom lighting (including image-based from light-probes), custom fog (including light-scattering fog) and more. So if anybody is interested in talking about these things, please post here or contact me directly (see the GLSL code for my e-mail).
These scenes are for H11 but this also works in H10 with minor modifications.
Note: if after opening one of these scenes viewport stays blank, try the following - enable “High quality lighting” in the Display options, Render/Update textures from the main menu, optionally turn HQ lighting off (may be faster on some cards).
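Since the software cubemap lookup tends to raise questions: the face-selection and strip-addressing math can be sketched in plain Python. This is only an illustration - the face order (+X,-X,+Y,-Y,+Z,-Z) and orientations here are my assumptions, not necessarily what the posted shader uses:

```python
def strip_uv(dx, dy, dz):
    """Map a 3D direction to (u, v) in a horizontal 6-face cubemap strip."""
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if ax >= ay and ax >= az:   # X-major direction
        face, sc, tc, ma = (0 if dx > 0 else 1), (-dz if dx > 0 else dz), -dy, ax
    elif ay >= az:              # Y-major direction
        face, sc, tc, ma = (2 if dy > 0 else 3), dx, (dz if dy > 0 else -dz), ay
    else:                       # Z-major direction
        face, sc, tc, ma = (4 if dz > 0 else 5), (dx if dz > 0 else -dx), -dy, az
    # Per-face coordinates in [0, 1].
    u = (sc / ma + 1.0) * 0.5
    v = (tc / ma + 1.0) * 0.5
    # Squeeze u into this face's sixth of the strip.
    return (face + u) / 6.0, v
```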
Technical Discussion » Viewport facing primitives
I think you can also use (a really simple) GLSL material for this.
See attached scene (requires H11).
But it seems this only works properly while in Smooth Shaded view - in Smooth Wire Shaded you'll get a wireframe “ghost” of your geometry, and Wireframe mode will simply display the untransformed object…
But anyway, just an idea.
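For reference, the transform such a material effectively applies is just a look-at basis. A minimal CPU-side sketch in Python - the up-vector convention and names are my assumptions, not taken from the attached scene:

```python
import math

def face_camera(point, cam, up=(0.0, 1.0, 0.0)):
    """Build an orthonormal basis (x, y, z axes as tuples) that orients
    geometry at `point` toward the camera position `cam`."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def norm(a):
        l = math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2])
        return (a[0] / l, a[1] / l, a[2] / l)
    z = norm(sub(cam, point))   # z axis points at the camera
    x = norm(cross(up, z))      # right vector
    y = cross(z, x)             # recomputed up, already unit length
    return (x, y, z)
```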
Technical Discussion » Enforcing dependency
I have a setup where material nodes reference textures from COPs. These textures are either processed images or completely generated inside COPs (e.g. auto-generated normal maps). In other words, SHOP nodes use op:$SHOP_PATH style reference.
My problem is that I'd like to get immediate feedback in the OGL viewport when I'm editing values in COPs. However, it seems that simply having a reference from SHOP to COP is not enough to force an automatic update. In fact, I ran into this problem back in H9. The texture update can be forced manually - “Render -> Update textures”, switching between panes, moving on the time-line etc.
I found a workaround some time ago, but it's more of a hack than a real solution.
See the attached scene and screenshots (warning: programmer art). The pictures show how texture tint and Bricker params are edited in COPs and the texture updates interactively in OGL.
To achieve this I create a sort of forced dependency inside SOPs, see /obj/COLUMN/attribcreate1 - a Detail vector attribute is created, then populated with arbitrary pixel values read from the referenced COP networks (using pic() expressions). It's rather cumbersome though, and I really hope I'm missing something obvious.
Any suggestions?
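For reference, the forced dependency amounts to an expression of roughly this shape in the AttribCreate's value fields (the COP path here is a made-up example, not the one from the attached scene):

```
pic("/img/cop2net1/OUT", 0.5, 0.5, "R")
```

The channel and sample position don't matter - the point is only that evaluating the expression makes the SOP cook depend on the COP's output.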
Technical Discussion » SOP quadtrees
I'm experimenting with spatial data structures in SOP. Briefly, I want to do the following:
1) given some polygonal geometry and some control parameters, apply a spatial partitioning scheme and store the resulting structure with the geometry;
2) visualize the generated structure;
3) perform queries against the structure and visualize the results;
I'm posting my solution below, so maybe someone would like to comment on it, or in case it's useful to someone.
Some details:
This is all in the context of collision detection for static game environments.
That is, given a collision model for a scene, I want to be able to generate an acceleration structure, improve it interactively by simulating common types of collision queries and tweaking some parameters, then export the result for real-time use.
The attached scene uses quad-trees for simplicity, but the general idea is the same for other tree-like structures - octrees, kd-trees, BSP and all sorts of BVH. Basically, the biggest question for me was how to encode these hierarchical structures in SOP.
My solution encodes the tree using geometry attributes. New points are generated to represent tree nodes, polygon lists are encoded in existing primitives, some meta info is stored in detail attributes.
Initially, I was considering using Node.setUserData(), available in Hou11, to associate a string-encoded tree with a node. But this attribute-based method seems better on all accounts - any comments on this?
Some illustrations:
SOP_qtree1: “Quad-tree Viz” is a SOP Python operator that generates a quadtree for the input geometry; the maximum depth for the tree can be selected here. It also provides some simple visualization by coloring polygons according to their depth in the tree (lighter color means deeper in the tree);
SOP_qtree2: the view_lvl node is a Delete with a simple expression that allows visualizing all polygons at a particular level in the tree;
SOP_qtree3: “Quad-tree Check” is another Python operator that checks a sphere against the model, using the generated quad-tree to select only those polygons that may be colliding with the sphere; “Check Color” is used to visualize polygons that pass the broad-phase test with the tree, “Hit Color” visualizes polygons the sphere actually intersects. This node also records the number of polygons that passed the broad-phase test in the “nbCk” detail attribute. The sphere can be moved around and the results will be updated in real-time;
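A minimal 2D version of the same idea (region quadtree plus broad-phase query) can be sketched in plain Python. The attribute encoding in the actual scene is Houdini-specific, so this only illustrates the data structure itself - all names here are hypothetical:

```python
def _overlap(a, b):
    """Axis-aligned box overlap test; boxes are (minx, miny, maxx, maxy)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

class QuadTree:
    """Minimal region quadtree over 2D bounding boxes.

    Items are (id, box) tuples; each node keeps the items overlapping
    its cell and subdivides into four quadrants down to max_depth.
    """
    def __init__(self, bounds, items, max_depth=4, depth=0):
        self.bounds = bounds
        self.children = []
        self.items = [it for it in items if _overlap(bounds, it[1])]
        if depth < max_depth and len(self.items) > 1:
            x0, y0, x1, y1 = bounds
            mx, my = (x0 + x1) * 0.5, (y0 + y1) * 0.5
            for b in ((x0, y0, mx, my), (mx, y0, x1, my),
                      (x0, my, mx, y1), (mx, my, x1, y1)):
                self.children.append(QuadTree(b, self.items, max_depth, depth + 1))

    def query(self, box):
        """Broad-phase query: ids of items whose box overlaps `box`."""
        if not _overlap(self.bounds, box):
            return set()
        if not self.children:
            return {i for i, b in self.items if _overlap(b, box)}
        out = set()
        for c in self.children:
            out |= c.query(box)
        return out
```

Swapping the quadrant split for an octant split (or a splitting plane) gives the octree/kd-tree variants mentioned above; only the subdivision loop changes.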
Technical Discussion » How reduce geometry
I don't think there is a software in the world that will do that.
Well, it's probably true, but the Polygon Reduction operator in XSI is perhaps the closest thing to it.
At least it has this nice Quad Preservation feature.
Houdini Lounge » Contrail Tutorial