There are probably better ways to deal with this, but personally I would set orientations with VEX.
Take a look at the example file.
Found 477 posts.
Technical Discussion » Robot arm rig best practices
- ajz3d
- 477 posts
- Offline
Houdini Engine for Unreal » Flipping Normals
Technical Discussion » Network view bug
There's a chance you will see it once you upgrade to Debian 12 (XFCE), which I'm currently on.
Technical Discussion » Network view bug
It seems to be caused by the MMB-invoked "Node Information" window. Occurrence appears to be random.
Houdini Engine for Unreal » Flipping Normals
I added two Reverse SOPs (one per method) to the network, which reverse the candidate group, and now PolyExtrude returns the correct result.
There were ill-formed primitives in the input geometry, so I cleaned them up by recreating primitives from their edges. This might not be a perfect solution, but it works on the geometry data you provided.
Technical Discussion » SSS Anisotropy (MaterialX) - negative values not available
Possible bug report.
The MaterialX Standard Surface shader has a Subsurface Anisotropy parameter (subsurface_anisotropy). The tooltip of this parameter, as well as its docs entry, states the following: "The direction of subsurface scattering. 0 scatters light evenly, positive values scatter forward and negative values scatter backward."
In Houdini 19.5.640 it's not possible to set this parameter to negative values (and I recall that it was possible in older versions). For some reason, the lower end of the range this parameter accepts is hard-locked to 0.
Technical Discussion » Network view bug
From time to time I'm getting this weird bug while working with the Network View.
Has anyone experienced something similar?
Houdini Engine for Unreal » Flipping Normals
gallerykim: Do I need to modify @N to change the normal vector?
Depends.
- If you have no existing vertex normals and only the primitive winding is wrong, then obviously no.
- If you have existing vertex normals, then regardless of primitive winding you will also need to reverse them.
gallerykim: When instructed to multiply by -1 when @N.y < 0, in the spreadsheet the N value became positive, but the actual normal did not flip.
It would be best if you could post a sample scene.
Maybe it's a viewport bug? Or perhaps you're multiplying only one vector component instead of negating the full vector?
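To illustrate the distinction, here is a minimal point-wrangle sketch, assuming the condition is the @N.y < 0 test described above:

```vex
// Point wrangle: flip downward-pointing normals.
// Note: multiplying a single component, e.g. @N.y *= -1;,
// only mirrors the vector across the XZ plane.
// Negating the whole vector reverses its direction.
if (@N.y < 0)
    @N = -@N;
```

If the primitive winding is also wrong, a Reverse SOP is still needed in addition to this.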
Houdini Engine for Unreal » Flipping Normals
If this is the case, then perhaps it will be enough to reverse all primitives which have normals within a specific spread angle from the (0, -1, 0) vector? It should work for this kind of data.
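A primitive-wrangle sketch of that idea (the group name "reverse_me" and the spread-angle spare parameter are assumed names of my own):

```vex
// Primitive wrangle: tag prims whose normal lies within a given
// spread angle of the down vector (0, -1, 0), so a downstream
// Reverse SOP can flip just that group.
float spread = radians(chf("spread_angle")); // spare parameter, e.g. 45
vector down = set(0, -1, 0);
vector n = normalize(prim_normal(0, @primnum, 0.5, 0.5));
if (acos(clamp(dot(n, down), -1.0, 1.0)) < spread)
    setprimgroup(0, "reverse_me", @primnum, 1);
```

Then point a Reverse SOP's Group field at reverse_me.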
Edited by ajz3d - July 6, 2023 06:30:17
Houdini Engine for Unreal » Flipping Normals
Do prims that you're generating always lie on a flat plane, like on the screenshot?
Edited by ajz3d - July 5, 2023 08:56:20
Houdini Indie and Apprentice » Random switch :
How random does this polygon need to be?
The code below, when pasted into a Detail Wrangle node, will create a random polygon within a (-1, 1) range on each of the XYZ axes. Point count and seed are controlled by spare parameters, which need to be created first (hit the plus button in the wrangle's UI).
int point_count = chi("points") + npoints(0);
int points[];
int seed = chi("seed");

for (int i = npoints(0); i < point_count; i++) {
    vector pos = set(
        fit(rand(seed + i - npoints(0)), 0, 1, -1, 1),
        fit(rand(seed + i + 1 - npoints(0)), 0, 1, -1, 1),
        fit(rand(seed + i + 2 - npoints(0)), 0, 1, -1, 1)
    );
    addpoint(0, pos);
    push(points, i);
}

if (len(points) > 0)
    addprim(0, "poly", points);
Edited by ajz3d - July 4, 2023 15:06:35
Technical Discussion » Generating solid (filled) collider for a car ?
Depends on how and where you want to use the asset. You might want to try Convex Hull (Shrinkwrap) or Convex Decomposition applied on the whole model or per piece. Using a combination of simple convex shapes, like boxes, capsules and cylinders is also an option.
DaVlas: Or the way out for you may be to use the opportunity to cooperate with outsourced tech support ww*.helpware.com companies
I already added this website to my hosts blacklist.
Technical Discussion » How to get assets for simulation
The topology, at least judging from the available screenshots, looks clean. But it's a subdivision model, which isn't good for Vellum, so you need to process it first: remesh it to uniform triangles before feeding it to Vellum, then use Point Deform to drive deformations of the original geometry with the simulated proxy.
freewind: Do you know how you can be sure an asset works well before buying?
You can't. There's always a risk of buying a pig in a poke, unless the asset is well documented (3D viewer, lots of reference images depicting topology) or you know that the source supplies quality assets.
freewind: Or would you rather model everything on your own because you know the requirements and don't want to rely on others?
That depends on many circumstances, like available time, funds, asset complexity, project type, etc.
Houdini Engine for Unreal » Flipping Normals
PolyDoctor's "Correct Winding of Polygons to Majority in their Manifold Patch" is one way to do it.
Houdini Learning Materials » Titan Train Tutorial Project Files Down
thoseEyes: Alas, actually the zip is corrupt
I can confirm this:
▶ unzip project_titan_train_destructionfx_scene_files.zip
Archive: project_titan_train_destructionfx_scene_files.zip
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of project_titan_train_destructionfx_scene_files.zip or
project_titan_train_destructionfx_scene_files.zip.zip, and cannot find project_titan_train_destructionfx_scene_files.zip.ZIP, period.
Houdini Indie and Apprentice » Exporting USD files from Houdini Indie?
This restriction was removed four years ago:
https://www.sidefx.com/forum/topic/71019/ [www.sidefx.com]
Solaris and Karma » Align Transform LOP's pivot to existing primitive
Thanks!
I ended up referencing the camera's "Translate" and "Rotate" parameters from the Transform's "Pivot Translate/Rotate", which effectively matched the Transform's pivot to the camera's.
But I just tested what you suggested, and it works very well. Even though the pivot doesn't visually snap to the camera's, I can affect the target camera in exactly the same way as if I were using references. And that's more than enough for CHOPs, so thanks again!
Solaris and Karma » Align Transform LOP's pivot to existing primitive
I'd like to drive transformations of an existing camera primitive (which has non-zero transforms) with a Transform LOP (ultimately driven by CHOPs). I tried to snap the Transform LOP's pivot to the existing camera using the Align Handle, but all I'm getting are segfaults. Every single time.
Any advice on how to align transform's pivot to camera, other than using ye olde eyeballing technique?
Houdini Lounge » Best Mini PC for Houdini?
I thought it would be interesting to mention that one doesn't even need VNC for this kind of task.
It is possible to do practically the same with just FFmpeg and USB/IP via SSH tunneling. It requires some configuring: setting up firewall rules and port forwarding, two SSH keys, determining optimal FFmpeg settings for downscaling and encoding the captured KMS buffer, collecting usbip indexes, and finally a decent network connection to support a lightly compressed video stream. Oh, and an extra USB keyboard and mouse on the client machine! Because once those devices are bound to USB/IP, they are passed through to the machine we want to control, and as such can no longer be used to control the computer they are physically connected to until they're unbound.
Most GPUs support hardware H.264 and/or HEVC video encoding, so encoding should consume minimal resources on the host. Pretty much all CPUs support hardware decoding (even ARM ones), so we can use virtually any machine to display the stream hosted by the workstation. Possibly even ODROID, ROCKPro64 or a Pi.
Heavy-duty 3D rendering is done on the host that serves the FFmpeg stream, while the client only decodes the video and hosts the USB controls.
Edited by ajz3d - June 19, 2023 07:57:13