Found 22 posts.
Search results
Technical Discussion » Set render view background image with python
- mike.battcock
- 22 posts
- Offline
Solaris and Karma » Oceans with Karma XPU
Technical Discussion » Oceans with Karma XPU
Technical Discussion » Custom Karma XPU pyro shader
I've not delved into Karma much at all yet, but I'm wondering if it's possible to build a custom pyro shader for Karma XPU? I'm trying to add a specular component using the VDB gradient. I've seen there's the Karma Pyro Preview shader, but it's fairly limited.
Cheers,
Mike
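A common approach to this (not Karma-specific) is to treat the gradient of the density field as a shading normal for the specular lobe — in Houdini that would typically come from a VDB Analysis SOP set to Gradient, or `volumegradient()` in VEX. Purely to illustrate the math, here is a pure-Python sketch using central differences; the field and sample point are hypothetical:

```python
import math

def gradient(field, x, y, z, h=1e-3):
    """Central-difference gradient of a scalar field at (x, y, z)."""
    gx = (field(x + h, y, z) - field(x - h, y, z)) / (2 * h)
    gy = (field(x, y + h, z) - field(x, y - h, z)) / (2 * h)
    gz = (field(x, y, z + h) - field(x, y, z - h)) / (2 * h)
    return (gx, gy, gz)

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    if length == 0.0:
        return (0.0, 0.0, 0.0)
    return tuple(c / length for c in v)

# Example: a spherical density falloff. Density decreases outward, so the
# negated, normalized gradient points out of the volume like a surface normal.
density = lambda x, y, z: max(0.0, 1.0 - math.sqrt(x * x + y * y + z * z))
N = normalize(tuple(-c for c in gradient(density, 0.5, 0.0, 0.0)))
```

That normal can then feed a standard specular BRDF; the open question is whether Karma XPU exposes a shader hook to do this, which the Pyro Preview shader apparently doesn't.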
PDG/TOPs » PDG Wedges on Deadline
I'm trying to submit wedges to Deadline using PDG (with the Deadline Scheduler), however it's only submitting one job. Is there a way to submit each wedge as a separate job so they can be rendered on multiple machines?
I'm using Houdini 19.0.622 (Python 2) on Linux
Edited by mike.battcock - Sept. 16, 2022 06:12:30
PDG/TOPs » Flipbook wedge work items as sequence
Thanks for your reply. I didn't know about PDG services but I'll take a look. I was trying to trick the ROP Fetch into thinking each work item is a frame by giving it a frame attribute, but it wasn't fooled... Can you think of another workaround? From a quick glance, PDG services look a bit overcomplicated for something that should be fairly simple. Is there a way to activate a work item through a pre-render frame Python script? Then, rather than rendering a single frame, I would set the number of frames to the number of work items and render from ROPs rather than TOPs -> ROPs. Seems a bit counterintuitive, but it might be the simplest option.
PDG/TOPs » Flipbook wedge work items as sequence
I've created a bunch of wedge variations in TOPs and I want to flipbook each one, so I'm using a ROP Fetch pointing to an OpenGL ROP. This works fine, however it seems quite slow, and it looks like Houdini is opening a separate hython instance for each work item. I think it would be more efficient to batch work items into groups of 10, for example, and render each group as a sequence. I've tried setting Frames Per Batch or All Frames in One Batch but it doesn't seem to work. The OpenGL ROP is set to only render one frame, so maybe that's the issue?
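The batching idea itself is simple to state; as a pure-Python illustration (function and parameter names are hypothetical, this is not the PDG API), grouping work-item indices into contiguous batches of 10 would look like:

```python
def batch_work_items(item_count, batch_size=10):
    """Split work-item indices into contiguous batches so each hython
    session could render one batch as a sequence instead of one frame."""
    batches = []
    for start in range(0, item_count, batch_size):
        end = min(start + batch_size, item_count)
        batches.append(list(range(start, end)))
    return batches

# 25 wedge variations -> 3 batches of 10, 10 and 5 work items.
batches = batch_work_items(25, batch_size=10)
```

The per-batch hython startup cost is then amortized over ten renders rather than paid per work item, which is presumably what Frames Per Batch is meant to achieve.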
Technical Discussion » Querying Selected Keyframes in Python
Do you know how to query selected keyframe handles? If I select just the incoming handle of a keyframe, the above doesn't return anything.
Technical Discussion » HtoA - Lentil Plug In
Did you resolve this? I'm having a similar issue with Lentil: it renders fine locally, but I get similar errors when rendering on our farm (Deadline):
2021-12-20 13:56:50: 0: STDOUT: 00:00:00 100MB ERROR | Couldn't find imager input. Is your imager connected?
Houdini Engine for Unreal » Landscape UV Coordinates
Hi Hektor, yeah, that's what I did in the end. The trouble was the geometry was actually an RBD sim, so I couldn't use world space as the UVs would swim. I ended up using Pre-Skinned Local Position, which effectively acts like a rest position. Then I just needed one material instance for my landscape and one for my geo, so I could switch between world position and pre-skinned position. A bit of a pain, but it works.
Houdini Engine for Unreal » Landscape UV Coordinates
I'm using Houdini Engine to load my heightfield into Unreal as a landscape. I also have some geometry right next to the heightfield that needs to blend seamlessly with it, so the UVs need to line up.
In Houdini this is easy, since the heightfield UVs are unitized, but I can't figure out how the UVs are generated for the landscape in Unreal. They seem to be based on the heightfield dimensions and grid spacing, but don't seem very consistent.
Could someone enlighten me on how the landscape UVs are generated through Houdini Engine, so I can procedurally transform the UVs on my geometry to line up with my landscape in Unreal?
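Without knowing Unreal's exact landscape UV convention, one workaround is to compute the unitized UVs yourself from the heightfield's bounding box and apply them to the neighbouring geometry. A minimal pure-Python sketch of the mapping (assuming a Y-up heightfield spanning the XZ plane; all names are hypothetical):

```python
def unitized_uv(px, pz, bbox_min, bbox_max):
    """Map a world-space XZ position inside the heightfield's bounding
    box to a 0-1 UV, matching Houdini's unitized heightfield UVs.
    bbox_min/bbox_max are (x, z) corners of the heightfield."""
    u = (px - bbox_min[0]) / (bbox_max[0] - bbox_min[0])
    v = (pz - bbox_min[1]) / (bbox_max[1] - bbox_min[1])
    return (u, v)

# A 2000 x 1000 unit heightfield centred at the origin:
uv = unitized_uv(500.0, 250.0, (-1000.0, -500.0), (1000.0, 500.0))
```

Since both the landscape and the geometry would then derive UVs from the same bounding box, they stay aligned regardless of how Unreal internally tiles the landscape components.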
Houdini for Realtime » Vertex Animation Textures ROP 2.1 Released
Is there a solution for the initial offset? I have some static pieces that never move, so the offset is quite obvious on those.
Technical Discussion » Transfer Guide ID to Hair
I got a response from SideFX that this has been fixed in a later build:
This was fixed in Houdini 18.0.517:
Guide primitive attributes of data types other than float will now transfer
to hairs reliably. The value of the closest guide is used.
Too late for our job but good to know it will work in the future!
Technical Discussion » Transfer Guide ID to Hair
I did try that, but the hairgen interpolates the values, so if a hair is between two guides with IDs 1 and 100, the hairs end up with a whole range of values between 1 and 100. What I actually want is to transfer the ID from the nearest guide (i.e. either 1 or 100).
I tried using xyzdist to get the nearest guide and promoting to prims using "mode", which gives a decent result, but still not perfect.
I might try a similar thing on the root point only, then transfer that along the rest of the curve.
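The root-point idea — take the ID from the single closest guide root, never an interpolated in-between value — can be sketched in pure Python (in Houdini this would be xyzdist/prim in a VEX wrangle on the hair roots; all names here are hypothetical):

```python
def nearest_guide_id(hair_root, guide_roots, guide_ids):
    """Return the id of the guide whose root is closest to the hair root.
    This is a hard assignment: the result is always one of guide_ids,
    never a blended value like the hairgen's interpolation produces."""
    best_id, best_d2 = None, float("inf")
    for (gx, gy, gz), gid in zip(guide_roots, guide_ids):
        d2 = (hair_root[0] - gx) ** 2 + (hair_root[1] - gy) ** 2 + (hair_root[2] - gz) ** 2
        if d2 < best_d2:
            best_d2, best_id = d2, gid
    return best_id

# A hair rooted between guides 1 and 100, but nearer the second guide:
gid = nearest_guide_id((0.8, 0.0, 0.0),
                       [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
                       [1, 100])
```

Copying the root's result down the rest of the curve then guarantees every point on a hair carries the same, whole guide ID.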
Technical Discussion » Transfer Guide ID to Hair
I need to get the ID of the guide each hair was created from. I've created a prim attribute called "guide_id" and added it to Guide Attribute Transfer and Output Attributes on the hairgen node, but the attribute isn't transferred across properly. I get the attribute on some hairs, but only about 0.01%... Same thing with the SOP-level hairgen.
Can someone tell me what I'm doing wrong, or is this a bug?
Technical Discussion » Redirect stdout to custom pyside console
I've created an app launcher with PySide2 to start Houdini (and Maya/Nuke) with job-specific environments, but I'm struggling to capture console messages from stdout.
I want to get stdout and append it to a QTextEdit widget in the PySide UI.
Has anyone done anything similar who could share some advice?
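One common pattern is to launch the app with `subprocess.PIPE` and pump its stdout on a background thread, forwarding each line to a callback. A minimal sketch, with the `append` callback standing in for `QTextEdit.append` (in a real PySide2 app you would emit a Qt signal from the thread so the append happens on the main thread):

```python
import subprocess
import sys
import threading

def stream_stdout(cmd, append):
    """Run cmd, forwarding each stdout line to the append callback.
    stderr is merged into stdout so nothing is lost."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True)

    def pump():
        for line in proc.stdout:   # blocks until the process writes a line
            append(line.rstrip("\n"))
        proc.stdout.close()

    t = threading.Thread(target=pump, daemon=True)
    t.start()
    return proc, t

# Stand-in for the UI: collect lines in a list instead of a QTextEdit.
lines = []
proc, t = stream_stdout([sys.executable, "-c", "print('houdini started')"],
                        lines.append)
proc.wait()
t.join()
```

An alternative worth considering is Qt's own `QProcess`, which delivers output via the `readyReadStandardOutput` signal on the main thread, so no manual threading is needed.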
Technical Discussion » Crowd blend shape
Could you upload an example hip file?
It would also be good to get more comprehensive documentation at some point.
Technical Discussion » Redirecting the Houdini Console...
Technical Discussion » Crowd blend shape
Technical Discussion » Crowd blend shape
Is it possible to have a blend shape with random weights in the crowd system? I'm trying to drive facial expressions on the agents, similar to what's discussed here with Miarmy:
https://basefount.atlassian.net/wiki/spaces/MDE/pages/58523650/Action+Attribute+and+Blendshape
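Whatever mechanism drives the blend shape, the weights themselves are usually derived deterministically from the agent ID (the VEX equivalent would be something like `rand(@id)`), so each agent keeps the same random expression across frames. A pure-Python sketch of that seeding, with hypothetical names:

```python
import random

def agent_blend_weight(agent_id, seed=0):
    """Deterministic pseudo-random blend-shape weight in [0, 1) for an
    agent: the same id always yields the same weight, so the facial
    expression stays stable from frame to frame."""
    rng = random.Random(agent_id * 100003 + seed)  # large prime spreads seeds
    return rng.random()

weights = [agent_blend_weight(i) for i in range(4)]
```

Changing `seed` gives a fresh set of expressions without touching the agent IDs.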