XPU test UPDATED (with file)

GCharb (Member, 123 posts, joined June 2016)
I have updated my Rhino test scene for Houdini 19.5; link to the file below.

My results...

AMD 8-core CPU + Nvidia GTX 1650 4 GB
Linux Mint 21 with Nvidia drivers 515.65.01: 11 minutes 06 seconds, memory used 330.76 MB
Windows 10 Workstation, Nvidia drivers 516.94: 15 minutes 34 seconds, memory used 1.22 GB

AMD 8-core CPU + Nvidia RTX 3060
Linux Mint 21 with Nvidia drivers 515.65.01: 4 minutes 36 seconds, memory used 331.12 MB
Windows 10 Workstation, Nvidia drivers 516.94: 5 minutes 02 seconds, memory used 1.33 GB

On Windows the RTX 3060 is about 3 times as fast as the GTX 1650, while on Linux it is about 2.4 times as fast. Both systems use the latest Nvidia drivers. The reported memory usage is also quite different between the two operating systems; I have no idea why.
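For anyone who wants to check the ratios, the speedups quoted above can be reproduced with a few lines of Python, using the timings posted in this thread:

```python
# Speedup of the RTX 3060 over the GTX 1650, from the timings posted above.
def seconds(minutes, secs):
    return minutes * 60 + secs

gtx1650 = {"linux": seconds(11, 6), "windows": seconds(15, 34)}
rtx3060 = {"linux": seconds(4, 36), "windows": seconds(5, 2)}

for platform in ("linux", "windows"):
    speedup = gtx1650[platform] / rtx3060[platform]
    print(f"{platform}: {speedup:.2f}x")  # linux: 2.41x, windows: 3.09x
```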

I must say I am a bit disappointed with my new RTX 3060; I was expecting it to be at least 4 times as fast as the GTX 1650 with XPU.

EDIT: I ran some tests with Blender's Cycles, and there the RTX 3060 can be as much as 4.6 times faster than the GTX 1650, depending on settings, so my guess is that Houdini XPU will get better over time!
Edited by GCharb - Sept. 3, 2022 09:00:37

Attachments:
RhinoScene_19.5.zip (8.0 MB)
Rhino-Linux-RTX3060.png (2.6 MB)
Rhino-Win10-RTX3060.png (2.6 MB)

GCharb
I made an SSS test by turning off transmission and turning on subsurface scattering. Looking good for a quickie, rendered on an AMD 8-core CPU and an RTX 3060!

Attachments:
RhinoJade-Win10-RTX3060.png (2.5 MB)

jsmack (Member, 6076 posts, joined Sept. 2011)
GCharb
AMD 8 cores CPU + Nvidia GTX 1650 4 GB
Linux Mint 21 with Nvidia drivers 515.65.01: 11 minutes 06 seconds, memory used 330.76 MB
Windows 10 Workstation, Nvidia drivers 516.94: 15 minutes 34 seconds, memory used 1.22 GB

AMD 8 cores + Nvidia RTX 3060
Linux Mint 21 with Nvidia drivers 515.65.01: 4 minutes 36 seconds, memory used 331.12 MB
Windows 10 Workstation, Nvidia drivers 516.94: 5 minutes 02 seconds, memory used 1.33 GB

Are these times pre-warmed, or cold? I've noticed a 30-40 second spin-up time for the OptiX library regardless of scene complexity.

I ran your test on my system and got these times (the tests were done on a machine that wasn't exactly idle, so they aren't that scientific):
2:57 (cold) 2.19GB Mem
2:26 (warm) 2.16GB Mem

Embree device: i9 10850K
OptiX device: RTX 3090
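The cold/warm distinction above can be measured generically: run the same workload twice and treat the difference as the one-time startup cost. A minimal sketch, where `measure_cold_warm` and the dummy workload are hypothetical stand-ins for an actual Karma XPU render launch (not a Houdini API):

```python
import time

def measure_cold_warm(render):
    """Time two consecutive runs of the same workload.

    The first ('cold') run includes any one-time startup cost,
    such as OptiX library loading and kernel compilation; the
    second ('warm') run does not.
    """
    start = time.perf_counter()
    render()
    cold = time.perf_counter() - start

    start = time.perf_counter()
    render()
    warm = time.perf_counter() - start

    return cold, warm

# Dummy CPU workload standing in for a real render:
cold, warm = measure_cold_warm(lambda: sum(i * i for i in range(10**6)))
print(f"cold: {cold:.2f}s, warm: {warm:.2f}s, estimated spin-up: {cold - warm:.2f}s")
```

For a real test you would substitute a headless render invocation; the cold-minus-warm difference then approximates the 30-40 second spin-up mentioned above.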
GCharb
I made several renders; the first one always takes a bit longer!
vinyvince (Member, 198 posts, joined Sept. 2012)
jsmack
Are these times pre-warmed, or cold? I've noticed a 30-40 second spin-up time for the OptiX library regardless of scene complexity.

Is it the same on Linux? The 30-40 sec spin-up?
Vincent Thomas   (VFX and Art since 1998)
Senior Env and Lighting  artist & Houdini generalist & Creative Concepts
http://fr.linkedin.com/in/vincentthomas
jsmack
vinyvince
Is it the same on Linux? The 30-40 sec spin-up?

I don't know; I don't have access to a Linux system with hardware drivers that support XPU.