SLI Support with CUDA Calculation

Member
3 posts
Joined: March 2014
Hi,

We are getting two Titan Z cards to build a very powerful workstation. I have discovered that Houdini doesn't support SLI. Do you plan on implementing this soon?

Also, do you know if Houdini will support the dual GPUs on the Titan Z?
Member
1908 posts
Joined: Nov. 2006
A recent odforce post about the Titan Z

http://forums.odforce.net/topic/19852-is-nvidia-titan-z-a-worthwhile-card-for-houdini/ [forums.odforce.net]
Graham Thompson, Technical Artist @ Rockstar Games
Member
3 posts
Joined: March 2014
Hi,

The discussion is quite interesting, but I wonder if SideFX actually has this on their timeline. If they plan to support SLI or dual GPUs within the next three months, this expense would be well worth it. But if it isn't even coming within the next year, it isn't worth considering for now, because the card will lose value during that year.

I also have some concerns about the dual GPU. If it is the card's driver managing the bridge, shouldn't the software see no difference? If that is true, I could still buy just one card and enjoy the 12 GB of RAM and the 5,700 CUDA cores of computation.

And there is this card as well:
http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/04/08/hail-to-the-king-introducing-the-amd-radeon-r9-295x2 [community.amd.com]

A single GPU, so that could rule out the problem as well.
Staff
5161 posts
Joined: July 2005
If the driver abstracted the two GPUs as one, then yes Houdini would use both. However, at least in the drivers I've seen, it presents them as two separate compute devices. This leaves it up to the developer to manually divide up the work between them. From what I've heard of the DOP OpenCL stuff, this isn't trivial.
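To illustrate the "divide up the work" point, here is a minimal Python sketch (hypothetical, not Houdini or SideFX code) of the bookkeeping a developer takes on when the driver exposes two GPUs as separate compute devices: a 1-D workload has to be partitioned into explicit per-device ranges before any kernels are enqueued, and this is only the easy part, since keeping the devices' results synchronized is the hard one.

```python
def split_work(n_items, device_weights):
    """Divide n_items among devices proportionally to device_weights.

    Returns a list of (start, end) half-open ranges, one per device,
    that together cover [0, n_items) with no gaps or overlaps.
    """
    total = sum(device_weights)
    ranges, start = [], 0
    for i, weight in enumerate(device_weights):
        if i == len(device_weights) - 1:
            end = n_items  # last device absorbs any rounding remainder
        else:
            end = start + round(n_items * weight / total)
        ranges.append((start, end))
        start = end
    return ranges

# Two identical GPUs (e.g. the two halves of a Titan Z) get equal shares:
print(split_work(1000, [1, 1]))  # [(0, 500), (500, 1000)]
```

Unequal weights would model mismatched devices, but even this static split ignores the real difficulty: a solver that needs neighboring data (as DOP simulations typically do) must also exchange boundary regions between the devices every step, which is exactly why a driver-level abstraction would be so convenient.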