brians
Recent Forum Posts
Houdini not using one GPU (March 27, 2024, 19:04)
Another thing you can do is run some render tests.
So... perform renders with the following environment variables set.
This will test with only the first GPU enabled
KARMA_XPU_DISABLE_DEVICE_0=0 KARMA_XPU_DISABLE_DEVICE_1=1 KARMA_XPU_DISABLE_DEVICE_2=1
This will test with only the second GPU enabled
KARMA_XPU_DISABLE_DEVICE_0=1 KARMA_XPU_DISABLE_DEVICE_1=0 KARMA_XPU_DISABLE_DEVICE_2=1
This will test with both of the GPUs enabled.
KARMA_XPU_DISABLE_DEVICE_0=0 KARMA_XPU_DISABLE_DEVICE_1=0 KARMA_XPU_DISABLE_DEVICE_2=1
More instructions here:
https://www.sidefx.com/docs/houdini/solaris/karma_xpu.html#disablingdevices
I'm guessing you'll find the first two take approximately the same time, and the last will be approximately twice as fast. Please note that it's best to run the tests twice, to ensure all GPU shaders are compiled and cached.
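As a concrete sketch, the three test runs above could be driven from a shell loop like the one below. The render command itself is commented out and hypothetical (the scene file and exact `husk` arguments depend on your pipeline); only the environment-variable setup is taken from the post:

```shell
#!/bin/sh
# Run the three device configurations in turn:
#   0 1 1  -> only the first GPU enabled
#   1 0 1  -> only the second GPU enabled
#   0 0 1  -> both GPUs enabled
# (a value of 1 disables that device)
for config in "0 1 1" "1 0 1" "0 0 1"; do
    set -- $config
    export KARMA_XPU_DISABLE_DEVICE_0=$1
    export KARMA_XPU_DISABLE_DEVICE_1=$2
    export KARMA_XPU_DISABLE_DEVICE_2=$3
    echo "Rendering with DISABLE_DEVICE_0=$1 DISABLE_DEVICE_1=$2 DISABLE_DEVICE_2=$3"
    # time husk myscene.usd   # hypothetical render call; substitute your own and compare wall-clock times
done
```

Remember to run each configuration twice, as noted above, so shader compilation doesn't skew the timings.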
Houdini not using one GPU (March 27, 2024, 17:14)
Are you determining this by looking at the Windows Task Manager performance tab?
We've found that with two GPUs (sorry, we haven't tested with more), the second GPU doesn't seem to register any usage, even though it is running at full steam. One hint that it is working is that you can see its temperature rise.
So this is an issue with the Windows performance UI (probably something to do with the way OptiX talks to the GPU under the hood :/ )
Evidence that it's running fine can be seen by looking at the header of the EXR image. If you render with the new driver (i.e. not the legacy driver), Karma will put stats in the EXR header. You can then view those stats with this command-line call:
iinfo -v myfile.exr
The data is fairly dense, sorry, but what we're looking for is the "xpu_device_samples" value associated with each device (which indicates how many passes each device contributed to the frame). So for a 128-sample image, all three devices should add up to 128. I'm guessing the two GPU devices will have approximately the same value (indicating they're both working fine).
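As a rough sketch, the check described above could be automated by scanning the `iinfo -v` output for `xpu_device_samples` entries and summing them. The attribute names and line layout in the example text below are made up for illustration; adjust the pattern to match what your Houdini version actually writes into the header:

```python
import re

def sum_xpu_device_samples(iinfo_text: str) -> dict:
    """Collect every xpu_device_samples value from `iinfo -v` output.

    Returns the per-device values (in order of appearance) and their
    total, which should match the image's sample count.
    """
    # Assumed format: each stat line contains "xpu_device_samples",
    # possibly with a device suffix, then whitespace, then an integer.
    values = [int(v) for v in re.findall(r"xpu_device_samples\S*\s+(\d+)", iinfo_text)]
    return {"per_device": values, "total": sum(values)}

# Hypothetical stats for a 128-sample image rendered on CPU + 2 GPUs:
sample_output = """
karma:xpu_device_samples_cpu: 12
karma:xpu_device_samples_gpu0: 58
karma:xpu_device_samples_gpu1: 58
"""
print(sum_xpu_device_samples(sample_output))
```

If the two GPU values come out roughly equal, both cards are contributing as expected.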
Stripes in fire sim after rendering in Karma (March 19, 2024, 18:25)
You could try tweaking the volume step size control. A smaller step size samples the volume more finely, which can reduce banding artifacts at the cost of render time.